Monday, April 29, 2013

Why isn’t language clearer?



Anyone who has spent any amount of time with me knows that I am addicted to puns.  I know that they make everyone groan, but I just can’t help playing around with the various meanings of words.  I remember several years ago going to the airport and forgetting that I had a yogurt in my backpack.  The TSA folks pulled the pack out of the X-ray scanner and took the yogurt from me, because it violated the rules for what you can bring into an airport.  A few days later, I was complaining about this at my lab meeting, and couldn’t resist adding, “I guess they were biased against my culture.”

For punsters, the fact that words, phrases, and sentences can take on many meanings is a blessing.  But why aren’t languages clearer?  Wouldn’t it be better if every word had just one meaning?  That would seem to avoid a lot of problems.

There are a number of reasons why languages aren’t much clearer than they are.  I’ll focus on just three of them.

First, words in language necessarily lose some information about the things they describe.  When you point at a cute four-legged creature on the street and say, “Look at the dog!” you are focusing on some of its properties, like having four legs, being furry, and barking.  If you said, “Look at the poodle,” instead, then you would have added some information about it.  And if you said, “Look at the animal,” then you would have been talking more generally.  It is helpful to have these different levels for talking about objects, but it means that from the beginning we have a choice about how specifically or abstractly we want to talk about things.

You might think that we should always talk about things as specifically as possible.  But how specific should that be?  People who know that particular poodle should say, “Look at Fido!”  But that might not even be specific enough.  Perhaps we should have a particular word for Fido each day, because he is a slightly different dog all the time.  Someone else who doesn’t know that this is Fido might be better off calling it a poodle, but there might be still other people who don’t know enough about dogs to distinguish the breed.  So, we usually try to use words that we assume everyone else will understand, but that are still specific enough to convey the information we intend.  As Roger Brown pointed out in a classic paper in 1958, that leads us to use words at a medium level of abstraction, like dog, rather than specific words like poodle or general ones like animal.

Second, even if we could settle on the way we wanted to talk about things, it is efficient for us to be able to reuse the words and sounds of language.  This issue was discussed in a 2012 paper by Steven Piantadosi, Harry Tily, and Edward Gibson in the journal Cognition.  As they point out, languages have thousands of words, but a much smaller number of sounds that are used to make up those words.  Words that we use frequently tend to be short.  That is why the most common words, like articles, prepositions, and pronouns, tend to be one or two syllables long.

As words get more complex and are used less frequently, they also get longer.  That is why the word complicated is longer than the word the.  However, if every word had to be completely unique, then some words might get very long.  As it turns out, though, in most cases it is pretty obvious what you’re talking about, because the situation helps everyone understand what is being said.  The word cap is used to mean a number of things, including a physical hat that someone can wear as well as a limit placed on something.  While these meanings are related, they are not identical, yet we don’t confuse them.  That allows us to reuse short words and makes our speech more efficient.
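
To get a feel for the pattern that frequent words tend to be short, here is a minimal sketch in Python.  The word list and frequency ranks are made up purely for illustration; nothing here reproduces the corpus analyses in the Piantadosi, Tily, and Gibson paper.

```python
# A toy illustration (not from the paper) of the idea that frequent words
# tend to be shorter than rare ones. The frequency ranks below are rough,
# made-up stand-ins, not real corpus counts.

toy_words = [
    ("the", 1), ("of", 2), ("and", 3), ("to", 4), ("a", 5),
    ("people", 80), ("because", 120), ("language", 900),
    ("complicated", 4000), ("ambiguous", 9000), ("interchangeable", 20000),
]

common = [word for word, rank in toy_words if rank <= 100]
rare = [word for word, rank in toy_words if rank > 100]

def average_length(words):
    """Mean number of letters per word."""
    return sum(len(word) for word in words) / len(words)

print(f"average length of common words: {average_length(common):.1f} letters")
print(f"average length of rare words:   {average_length(rare):.1f} letters")
```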

Finally, it is helpful to be able to say things indirectly.  When you have to give criticism, there are times when you can soften it or at least inject some humor by speaking indirectly.  Rather than telling someone that they really messed up a situation, you can say, “This may not have been your finest moment.”  If everything could only be conveyed in a single direct way, then there wouldn’t be opportunities to avoid direct confrontation.

Of course, this ambiguity can lead to unintended humor.  There are many examples of newspaper headlines that must have seemed perfectly clear to editors when they were written, but can be read in many ways.  For example, “Kids make nutritious snacks.” and “Killer sentenced to die for second time in 10 years.”  And a colleague of mine has a great ‘recommendation letter’ in which every sentence is one that seems positive on the surface, but could be read less positively, like “You’ll be lucky if you can get him to work for you.” 

From a practical standpoint, this means that before you send anything out to be read by a large audience, it is useful to get someone to read it from a fresh perspective to make sure that you haven’t missed an alternative way of interpreting what you have just said.

Friday, April 26, 2013

Meat eaters downplay animal minds


The decision about whether to eat meat has a moral dimension to it.  The animals that we use for food are complex creatures.  Deciding to eat them means accepting that they will be killed so that you can eat them.

That is not to say, of course, that people grapple with this decision at every meal, but in some way everyone has to make some decision about whether to eat animals.  And before I go any further with this discussion, I should mention that I have been a vegetarian for about 10 years now for a combination of economic, health, and moral reasons.

An interesting question about eating meat is how people grapple with the fact that many of the animals they eat are reasonably intelligent creatures.  A paper in the February, 2012 issue of Personality and Social Psychology Bulletin by Brock Bastian, Steve Loughnan, Nick Haslam, and Helena Radke suggests that when people eat meat, they tend to downplay the minds of the animals that they eat.

In one simple study, the researchers asked (meat-eating) participants to rate how willing they were to eat a variety of animals ranging from houseflies to fish to chickens to elephants to gorillas.  They also rated how strongly each of these animals had a number of mental abilities, such as feeling hunger, fear, and pain, and having self-control and planning abilities.  There was a systematic relationship between the animals people were willing to eat and their beliefs about the minds of those animals.  People were much less willing to eat animals that they believed to have complex mental abilities than to eat animals that they believed did not.

Of course, this alone might just mean that the animals that people choose to eat are the ones that are not so smart.  In another study, meat eaters were asked to think about cows and sheep.  Some of them thought about these animals living an idyllic life on a farm.  Others thought specifically about these animals growing up on a farm and then being killed for food.  Later, they also rated the mental abilities of the animals.  When people thought about the animals as food, their ratings of the mental abilities of the animals were lower than when they thought about the animals living on a farm.

It isn’t just thinking about animals being used for food, though.  In one final study, all of the participants had to write about the process of raising and butchering animals for food.  All of the participants thought they were going to do a food sampling task after writing the essay.  Half of the participants were told they would be eating fruit during the food sampling, while others were told they would be eating beef and lamb.  Finally, participants rated the mental abilities of cows and sheep.  The group that was about to eat meat gave much lower ratings of the mental abilities of cows and sheep than the group that was about to eat fruit.

These studies suggest that people who choose to eat meat have to grapple with the moral dilemma of eating an animal with a brain whether they realize it or not.  Because of the importance of eating to our lives, we think about food animals as less complex than other animals.  This effect is particularly strong in the context of meat eating.

Of course, this mechanism is not special to eating.  There are lots of situations in life that cause different goals and moral values to come into conflict.  Eating a piece of chocolate may conflict with a diet.  Buying a new car may conflict with the desire to save for a new home.  Research that I did with Miguel Brendl demonstrates that, when one goal becomes highly engaged, we change our attitudes about things that would conflict with that goal to make them less attractive.

So, generally speaking, we have mechanisms that help us to satisfy our goals, in part by discounting the attitudes we hold that might get in the way of  those goals. 

Monday, April 15, 2013

Why regret makes buying experiences better than buying stuff



In the past, I have written about ways to spend money to make yourself happier.  One general rule that comes from research by Tom Gilovich and his colleagues is that it is better to buy experiences than to buy objects.  That is, if you spend big bucks on a trip to Mexico, you are likely to feel better about that purchase in the long run than if you spend the same amount on clothes.

A paper by Emily Rosenzweig and Tom Gilovich in the February, 2012 issue of the Journal of Personality and Social Psychology shows that an important reason for this difference is that experiences and stuff lead to different kinds of regrets. 

When you buy an object, like a computer, you may experience buyer’s remorse.  That is, soon after buying it, you may regret buying that particular computer, because you could have bought another one (or something else entirely).  You are much less likely to regret buying an experience.  Think about a big concert going on in your town.  You are more likely to regret passing up the opportunity to go to the concert than you are to regret buying a ticket to go.

Why is this?

In one study, Rosenzweig and Gilovich examined the uniqueness of objects and experiences.  One big reason why people regret buying objects is that after they own the object, they can continue to compare it to other objects that are available.  You buy a computer, and a month later, you find another one that is faster, smaller, and cheaper.  So, now you feel like you didn’t get a good deal.  When you go on a vacation, though, that experience is relatively unique.  It is hard to compare a particular trip to Mexico with other trips you might have taken, and so you spend less time comparing your experience to other things you might have done. 

Indeed, in one study in this paper, participants listed specific purchases they had made of objects or experiences.  People listing objects felt that their purchases were interchangeable with other objects.  People listing experiences felt that their purchases were unique.  In addition, the more interchangeable the object, the more likely people were to regret making the purchase.

Looking at regret in this way also suggests two ways to avoid regret from purchases.  First, if you are going to make a significant purchase of an object, try to make it something unique.  In another study in the paper, participants were asked to imagine the purchase of an object that was either fairly common (a dresser) or unique (a particular antique dresser).  In this case, participants were much more likely to regret buying the common dresser, but more likely to regret not buying the unique dresser.  Other participants imagined buying a plane ticket to a common experience (their yearly family reunion) or to a unique experience (the first-ever family reunion).  For the experiences, the same pattern held.  People were more likely to regret buying the ticket to the yearly reunion, and to regret not buying the ticket for the first-ever reunion.

The second way to avoid buyer’s regret is to find objects that can be treated as experiences.  Many objects have an experience component to them.  If you buy an expensive car, for example, you can treat it as an object or you can savor the experience of owning and driving the car.  Indeed, car makers like BMW focus on the driving experience as a way of making the car feel unique.

As support for this view, a final study had people think about two friends, Mark and Joe, who were each considering buying a 3D television.  Ultimately, Mark bought the TV and Joe did not.  For one group, the description of the TV focused on the object itself.  For another group, the description focused on the experience of having a third dimension when watching TV and sharing that with friends.  The group that was focused on the TV as an object assumed that Mark (who bought the TV) would regret the decision more than Joe (who did not).  In contrast, the group that was focused on the experience thought that Joe (who passed on the TV) would regret his choice more than Mark (who bought it).

Obviously, you have to buy a certain number of objects in your life just to survive.  But if you have some extra money around and are looking for a way to spend it to increase your happiness, then you should buy experiences.  And whenever you can, you should think about the great experiences you can have with the objects you buy.

Thursday, April 11, 2013

No more senior moments




I have three teenage boys.  They are about as forgetful as any human being is capable of being.  They routinely forget all kinds of things ranging from appointments, to chores, to homework assignments.  With the older guys, I used to joke that when they forgot something, they were having a senior (in high school) moment.

Of course, the concept of the senior moment is a label that many older people give to the situations in which they forget something.  Most of us assume that our memories are going to get worse as we get older, and so we assume that age must be the reason we forget things after the age of 55.

It is true that there is a general cognitive decline starting in your 20s.  And your memory will get a bit worse as you age.  But unless you have suffered brain injury, those declines are not precipitous.  Indeed, there is some evidence that your beliefs about your memory abilities are at least as important to your ability to remember as any changes in brain function.

An interesting study making this point was presented in a paper by Ayanna Thomas and Stacey Dubois in the December, 2011 issue of Psychological Science. 

These researchers took advantage of a strange memory phenomenon known as the Deese-Roediger-McDermott (DRM) effect, after the researchers who discovered and popularized it.  To get a sense of how this effect works, read the following list of words slowly.

butter, food, eat, sandwich, rye, jam, milk, flour, jelly, dough, crust, slice, wine, loaf, toast

Now, close your eyes for a second and remember as many of the words as you can.  Without looking back at the list, ask yourself, was the word sandwich on that list?  How about the word bread?  The word sandwich was indeed on the list.  But how about bread?  It actually is not on the list, but about half the people given this list will answer that they saw the word bread.

Lists like this are constructed by taking 15 words that are highly associated with some other word (in this case, bread).  When you study the list, the words make you think of the associate, and later you act as if you saw that word as well.

Studies show that college students typically misremember seeing the associated word about 50% of the time.  The researchers speculated that if older adults are worried that their memories get worse with age, they might be even more likely to misremember seeing the associated word.
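
To make the scoring concrete, here is a minimal sketch in Python of how responses on a DRM-style recognition test might be tallied.  The study list comes from the example above, but the test items and the participant’s responses are hypothetical stand-ins, not the materials or data from the study described below.

```python
# A toy scoring example for a DRM-style recognition test. The test items and
# the "yes" responses below are hypothetical; only the study list is taken
# from the example in the text.

study_list = {"butter", "food", "eat", "sandwich", "rye", "jam", "milk", "flour",
              "jelly", "dough", "crust", "slice", "wine", "loaf", "toast"}

tested_old = {"sandwich", "toast", "milk", "crust"}   # studied words shown at test
unrelated_new = {"chair", "river"}                    # new words unrelated to the list
critical_lure = "bread"                               # never shown, but strongly associated

# Hypothetical "yes, I saw that word" responses from one participant
said_yes = {"sandwich", "toast", "milk", "bread"}

hit_rate = len(said_yes & tested_old) / len(tested_old)                # correct recognitions
false_alarm_rate = len(said_yes & unrelated_new) / len(unrelated_new)  # unrelated errors
false_memory = critical_lure in said_yes                               # the DRM effect itself

print(f"hits: {hit_rate:.0%}, unrelated false alarms: {false_alarm_rate:.0%}, "
      f"falsely 'recognized' the lure: {false_memory}")
```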

Participants in this study were either older adults (with a mean age of 70) or younger adults (with a mean age of 19).  They began by studying several lists of words like the one I just showed you.  After that, half the participants had a paragraph read to them about age-related declines in memory.  The other half had a paragraph read to them about psychology research unrelated to age.  The first group was expected to be more concerned about the effects of age on memory than the second.  Finally, participants saw a number of words and were asked whether each had appeared on the lists they studied.  Several of these words were the associates that were expected to lead to false recognitions.

The younger adults were not affected by hearing about age-related declines in memory.  They responded that they had seen the associated word about 50% of the time regardless of which paragraph had been read to them.  Older adults who heard the paragraph about research in general also said they recognized the associated word about 50% of the time.  However, the older adults who heard about age-related declines in memory said they recognized the associated word about 70% of the time.

The idea is that when older adults are concerned that they are experiencing memory problems, they do not focus as carefully on their knowledge about where they encountered words as they do when they are not worried about their memory.  In actuality, older adults had pretty good memory overall.  They correctly recognized about 85% of the words they had actually studied before and only said that they had seen words that had not been studied about 10% of the time.  So, older adults got worse on the memory test just because of their concern about memory.

These findings are one more reason to stamp out the concept of the senior moment.  As you get older, worrying about declines in memory is far more damaging to your ability to think than any actual decline in memory ability.  So relax.  When you get older, you probably aren’t that much more forgetful than your typical teen.

Monday, April 8, 2013

Comparison creates confidence



There are two broad strategies that people use to make choices.  One method is to compare the options to each other and choose the best one.  The other is to evaluate each option individually and then pick the one that is rated as the best. 

These strategies are used in different circumstances.  In his book, Sources of Power, Gary Klein suggests that experts are more likely to evaluate the options individually, while people with less expertise tend to compare the options.

One reason why comparison helps novices more than experts comes from research by Chris Hsee.  This work shows that it is easier for people to evaluate options when they are being compared.  Imagine buying a new dictionary.  You find out that a particular dictionary has 50,000 entries in it.  Is that good or bad?  If you are a dictionary expert, then you might know whether that is a large number of entries.  Suppose, though, that you find out that another dictionary has only 25,000 entries in it.  Now you know that 50,000 entries is a good number for a dictionary to have.

A 2012 paper by Thomas Mussweiler and Ann-Christin Posten in Cognition demonstrates that when people compare options, they also become more confident in their judgments.

To put participants in a mindset to make comparisons, the researchers had them look at a complex picture and write down the commonalities and differences between its two halves.  Other participants evaluated the picture without making comparisons.  Previous work by these researchers shows that this technique reliably gets people to make comparisons in later tasks.

In one study, after looking at the complex picture, participants were shown descriptions of three brands of cell phones (labeled Brands A, B, and C).  They had a chance to study the descriptions.  Later, they were shown fourteen of the features they had seen and were asked whether those features belonged to Brand B.  With each response, participants were allowed to place a bet between 0 and 10 Euros (the study was done in Germany) based on how confident they were in their response.  The higher the bet, the more confident people were that they knew whether the feature belonged to Brand B.

People who were put in a mindset to make comparisons were more confident in their judgments about the features of the cell phones than people who did not make comparisons.  Despite the difference in confidence, the people who made comparisons were not more accurate in their judgments than those who did not make comparisons.
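
As a rough illustration of how a betting measure can separate confidence from accuracy, here is a minimal sketch in Python.  The trial data are hypothetical; this is not the authors’ procedure or data, just a way of seeing how mean bets and accuracy can come apart.

```python
# A toy illustration of a confidence-versus-accuracy split. Each trial pairs a
# true/false judgment about a (hypothetical) Brand B feature with a bet in
# euros from 0 to 10. All numbers below are made up.

trials = [  # (answered_correctly, bet_in_euros)
    (True, 9), (False, 8), (True, 7), (True, 10),
    (False, 9), (True, 6), (False, 8), (True, 9),
]

accuracy = sum(correct for correct, _ in trials) / len(trials)
mean_bet = sum(bet for _, bet in trials) / len(trials)

# High average bets with middling accuracy would mirror the comparison group:
# more confidence, but no more accuracy.
print(f"accuracy: {accuracy:.0%}, mean bet: {mean_bet:.1f} euros")
```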

This confidence can also affect the choices people make.  In another study, participants were shown the menu from the university cafeteria before lunch.  They were asked to select the item from the cafeteria they thought they would want to eat.  As before, some participants were put in a mindset to make comparisons while others were not.  After lunch, participants were asked what they actually ate.  Those who made comparisons ate what they predicted they would eat about 75% of the time, while those who did not make comparisons ate what they predicted they would eat about 50% of the time.  (Because there were about 10 items on the menu, chance would be about 10%.)

Putting all of this research together suggests that when you don’t have a lot of expertise in a domain, you need to be careful when making decisions.  On the one hand, you are quite likely to rely on comparing the options in order to make a choice.  On the other hand, those comparisons will increase your feeling of confidence in the decision.  So, you need to recognize that at least part of that confidence comes from the way the choice was made.

Tuesday, April 2, 2013

Punishment helps kids learn to lie



Here’s a news flash.  Parenting is hard.  There are so many competing goals.  We want to raise happy kids, but also good kids who will do the right thing.  We want our kids to be smart, honest, kind, and generous.  And ideally we would do all of that while being nurturing all the time.

Of course, the real world doesn’t make it easy to be a nurturing parent.  Kids have minds of their own.  They want to explore the world, to try new things, and to make their own mistakes.  They push the limits of the rules we create, and they find ways to push our buttons. 

Most parents manage to strike a balance between being a nurturing and loving parent and having to punish when necessary.

What happens when the balance swings too far toward punishment?

There is some research suggesting that when children grow up in an environment with extensive physical and verbal punishment, they are at risk for behavioral problems as they get older.

A fascinating study by Victoria Talwar and Kang Lee in the November, 2011 issue of Child Development explored how an environment with lots of punishment affects lying in 3- and 4-year-old children. 

The study took advantage of a natural experiment in a West African country.  In this country (not identified in the paper), there was a long history of corporal punishment in the schools including beatings when children did something wrong.  Although corporal punishment has been outlawed in the public schools in the country, private schools are still allowed to use it. The researchers went to one private pre-school that used corporal punishment and a second that did not.   

To explore lying, the children were first given a temptation.  The experimenter told the children that a toy was being hidden behind them.  The experimenter said that she had to leave the room for a moment and that the child should not turn around and peek at the toy while she was gone. 

This situation is quite tempting, and most children end up turning around and looking at the toy.  When the experimenter returned, she asked the children whether they had peeked.

At the school where the children were punished often, about 90% of them lied to the experimenter and said that they did not look at the toy.  In the school that did not use harsh punishments, only about half of the children lied.

Of course, young children are often bad liars.  So, the experimenter asked a follow-up question.  She asked the children who lied to guess what they thought the toy was.  Children who are bad liars will identify the toy that they saw.  Good liars will not let on that they know what the toy is.

In this study, about 70% of the children from the school that did not use harsh punishment identified the toy when asked.  Only about 30% of the students from the school that used harsh punishment identified the toy.

Putting this together, the children who went to the school where they got harsh punishments were more likely to lie and were better liars than the ones who went to the school where they were not punished harshly. 

Ultimately, even young children learn survival skills.  In situations where they are being severely punished, children learn ways to avoid that punishment.  They learn how to lie and how to do it effectively.  As children get older, those lies get better.

In the end, even though punishment seems to work to keep children in line, it ultimately increases the bad behaviors it aims to stop.

Nix, R. L., Pinderhughes, E. E., Dodge, K. A., Bates, J. E., Pettit, G. S., & McFadyen-Ketchum, S. A. (1999). The relation between mothers' hostile attribution tendencies and children's externalizing behavior problems: The mediating role of mothers' harsh discipline practices. Child Development, 70, 896-909.