Wednesday, May 20, 2015

Does rejection make you creative?

When you watch movies about high school, certain stereotypes repeat. The football players hang around in packs wearing their letter jackets with cheerleaders hanging off their arms. The science nerds sit quietly in the cafeteria eating lunch hoping nobody notices them. And the artists sit by themselves—at a distance from all of this social interaction—watching the world go by.

This scene reinforces a stereotype that there is a relationship between creativity and being rejected by society.  Of course, even if this relationship exists, it is hard to know which direction it runs. It is possible that people who are truly creative are rejected by others, because their ideas go against the norm. It is also possible that something about social rejection fuels creativity.

This issue was explored in an interesting paper by Sharon Kim, Lynne Vincent, and Jack Goncalo in the August, 2013 issue of the Journal of Experimental Psychology: General.  They suggest that social rejection can make some people more creative.

In particular, people differ in how strongly they prize independence. Some people really want to see themselves as unique and different from everyone else.  Other people get a lot of their energy from being part of a group.  The more independent people are, the more that social rejection can actually make them more creative.

In one study, the authors measured people’s need to feel unique using a questionnaire.  Then, participants were brought to the lab with five other people and were told that some would be selected for a group exercise, while others would work alone.  They filled out a description of themselves.  Some people were told that they were rejected from the group and would perform tasks alone.  Other people were told that they were accepted into the group and would join the group right after doing some tasks. 

In this first study, participants then did the remote associates test (RAT), which has been used as a measure of creativity.  In the RAT, people see three words (like SALT, DEEP, and FOAM) and they have to come up with a word that goes with all three of these words (in this case, SEA).  Doing well on this test requires people to think divergently. 

The authors found an interaction between social rejection and people’s need to feel unique.  For people with a low need to feel unique, rejection had no influence on their scores on the remote associates test.  People with a high need to feel unique, though, got more correct answers on the RAT following rejection than following acceptance into a group. 

In a second study, the researchers manipulated people’s need to feel unique using a procedure that has been employed in other studies.  Participants read a passage and were asked to circle the pronouns.  For some people, the pronouns were first-person singular (I  and me).  For other people the pronouns were first-person plural (we and our). Participants who circle singular pronouns are more focused on being independent than those who circle plural pronouns.  After that, the rejection manipulation and RAT were done as before.   The people primed to be independent who were rejected scored best on the RAT of all the groups.  Once again, being independent and being socially rejected led to creativity.

A third study repeated the one I just described with the manipulations of independence and rejection, but used a different measure of creativity.  This group was asked to draw alien creatures from a planet not like Earth.  This task has been used by Tom Ward and his colleagues in the past as a measure of creativity.  The drawings were then judged for their creativity. The group that was primed for independence and was socially rejected also made the most creative drawings.

What is going on here?

When people are part of a group or want to be part of a group, then there is social pressure for people’s ideas to conform to those of the people around them.  This conformity makes people less creative, because it decreases the value they put on divergent thinking.  When people are motivated to be independent, though, then having unique ideas further reinforces that independence. The combination of a mindset to be independent and some social rejection is one way to spur this mindset.

Friday, May 15, 2015

Personality and weight gain

There is a tendency to look at people who have put on weight and assume that there is something about their personality that made them gain weight.  We rarely contemplate the opposite possibility, though.  Perhaps behaviors that lead people to gain weight actually lead to changes in people’s personality over time.

This possibility was explored in a fascinating paper by Angelina Sutin and seven co-authors in the July, 2013 issue of Psychological Science.  These authors examined data from about 2000 people taken from two longitudinal studies.  The adults in these studies were generally in their 40s and 50s at the time of the first measurement.  The individuals in these studies took a basic personality inventory and also had their height and weight measured (in one study) or they self-reported their weight (in the other).  The measurements for each individual were taken 8-10 years apart. 

The researchers analyzed the data to see whether significant weight gain (a gain of more than 10 pounds) and significant weight loss (a loss of more than 10 pounds) influenced measures of personality.  Weight loss had no reliable effect on the measures of personality.  However, weight gain had two relationships to personality.

Participants who gained more than 10 pounds were just as impulsive as those who did not at the baseline measure, but were significantly more impulsive in the follow-up test than those who did not gain weight.  Surprisingly, those who gained weight also increased in how likely they were to deliberate about decisions compared to those who did not gain weight.

This pair of findings is interesting for a number of reasons.  First, it suggests that repeated behaviors that lead to bodily changes can ultimately influence personality characteristics as well.  Giving in and eating too much repeatedly over a 10-year period can lead people to become more impulsive overall. 

Second, the combination of results for impulsiveness and deliberation is interesting.  You might think that people who are impulsive do not think about their actions and the consequences of their actions.  In this case, though, people are both more impulsive and more deliberative.  That means that they likely understand the consequences of their impulsiveness, but they cannot stop themselves from acting.

These data suggest that it might be useful to take a different approach to weight loss, particularly with older adults.  Often, we provide a lot of information about healthy eating and weight loss.  The assumption is that if more people understood why their eating habits are leading to weight gain and potential bad health, they would change the way they eat.

These data suggest that information alone is unlikely to help.  The people in this study are able to think about their actions; they simply don’t change their behavior in the face of temptation.  That suggests that we need to help people to change their environment to make the behaviors they want to perform easier to do and the behaviors they want to avoid harder to do.  In addition, it suggests that people need to engage with family and friends to save them from temptation.  Ultimately, when you are likely to be impulsive, the people around you can be a great source of strength.

Friday, May 8, 2015

You Use Body Information to Recognize People, But You Don’t Know You’re Doing It

Standing at the airport waiting for a friend or relative to emerge from a flight can be a frustrating experience.  People come pouring out of the exit, and you are searching for one person in particular.  On a crowded day, you might not even be able to get that close to the exit, and so it can be hard to see the person you are looking for.  Yet, most of the time, you manage to find the person you seek.
Part of what helps you to identify friends and relatives is information about their body.  You recognize their height, body shape, and even their manner of walking.  In fact, all of that may help you to know who you are looking at before the person is close enough to really see his or her face. 
An interesting paper in the November, 2013 issue of Psychological Science by Allyson Rice, Jonathon Phillips, Vaidehi Natu, Xiaobo An, and Alice O’Toole demonstrates that people use information about the body to identify people, but they are not aware that they are doing so. 
In these studies, participants saw pairs of pictures drawn from a large database.  Participants had to identify whether the pair of pictures showed the same person or different people.  The pictures used in the study were carefully selected so that this task was quite difficult.   Many of the pictures of the same person were rather dissimilar, while many of the pictures of different people were similar to each other.  As a result, face information alone was not helpful in determining whether the two pictures in each pair showed the same person.
When participants were given the full pictures, they were reasonably accurate in making the judgments of which pictures were the same or different.  Some participants were shown only the faces from the pictures.  This group was not good at all at distinguishing the same and different pairs.  A third group saw only the bodies with the faces covered by an oval.  This group was about as accurate at identifying the pictures as the group who saw the full pictures. 
So far, this probably doesn’t seem so surprising.
In another study, participants were given the full pictures to judge.  Afterward, they were asked about a variety of facial and bodily features, and were asked how much they used this information to make the judgments.  Participants performed well in this study, suggesting that they had to be using information about the body, but their ratings suggested they believed that they were focused on the nose, face shape, ears, mouth, eye shape and eyebrows, but not on properties like the hair length, height, shoulders, and neck. 
These ratings suggest that people are mistaken.  That is, when people see just the face information, they are not able to distinguish between the pictures of the same people and pictures of different people.  The body shape information is important, yet people do not report using it.
A final study demonstrated that people really were reporting the information they used to make judgments incorrectly.  In this study, participants judged the full picture pairs while their eyes were being tracked.  Eye tracking enables researchers to monitor what people are looking at on a moment-by-moment basis.  The technique is effective, because you have clear vision for only a small area (about the size of your thumbnail at arm’s length).  So, your eyes are constantly in motion to create a clear image of what you are seeing.
In this eye tracking study, some of the pairs of pictures were ones in which face information could be used to make reasonably accurate judgments.  Other pairs required body information to be used.  When the face information was helpful for making judgments, people looked at it quite a bit.  When the face information was not helpful for making judgments, then people focused more on areas of the body that would help them to determine whether the two people were the same.
What does this mean?
First of all, your visual system is smart.  It does a good job of figuring out the information you need to make judgments. 
Second, you do not have complete access to all of what your visual system is doing. Even though you shift your attention from the face to the body when body information will help you to recognize a person, you still think that you are focused on the person’s face.  This is another great example of how your conscious experience of what you are doing is not an accurate portrayal of what you are actually doing.

Thursday, April 30, 2015

Can Video Games Make You Smart (Or At Least More Flexible)?

The potential ills of video game play have been broadcast all over the media.  Playing violent video games can prime aggressive behavior.  Kids who get video game systems perform worse in school after they get the system than they did before. 

Not all effects of video games are bad, though.  There is evidence that playing video games can make people faster at processing visual information like searching for an object among a set of other distracters. 

One hallmark of smart thinking is flexibility.  People who are able to see the same object in different ways and can keep lots of possibilities in mind at the same time are often able to develop novel and creative solutions to problems.  A paper by Brian Glass, Todd Maddox, and Brad Love in the August 2013 issue of PLoS One suggests that some kinds of video games can help to teach this skill.

They compared the effects of playing real-time strategy games to playing games that require no particular strategic thinking.  The participants in this study were all women, because the experimenters had trouble finding enough men who do not play video games regularly.  The women were assigned to one of three groups. 

One group played a simple version of the game StarCraft.  In this game, participants have to create, organize, and deploy armies to attack an enemy.  In the simple version of the game, the player had one base and the enemy had one base.  In the more complex version of the game, the player had two bases and the enemy had two bases.  The overall difficulty of the game was then set up so that the simple and complex versions of the game were about equally hard to win.  This way, the games differed primarily in how much information players needed to keep in mind while playing.  The control condition had people play a life simulation (The Sims), which does not require much strategy or memory.  Participants played their assigned game for 40 hours.

As a test, participants were given a pre-test and post-test of a series of tasks that tap cognitive abilities.  Some of the tests require cognitive flexibility.  For example, in the classic Stroop task, people name the color of a font for words that name colors.  The typical finding is that people are slow to name the color when the word names a different color than the font. 

In task switching procedures, people flip back and forth between the responses they make.  For example, in one task, people are shown a letter and a number (say e4).  On some trials, they are prompted to identify whether the letter is a vowel or consonant, while on other trials, they are prompted to identify whether the number is odd or even.  People generally slow down when asked to switch from one task (say identifying letters) on one trial to the other task (identifying numbers) on the next.  The faster you are able to switch between tasks, though, the more flexibly you are thinking.
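To make the switch-cost idea concrete, here is a minimal sketch of how such a cost might be computed from trial data.  The trial structure and reaction times below are made up for illustration; they are not from the paper.

```python
# Computing a task-switch cost from a hypothetical list of trials.
# Each trial records which task was cued and the reaction time (ms);
# all numbers here are illustrative, not data from the study.

def switch_cost(trials):
    """Mean RT on switch trials minus mean RT on repeat trials."""
    switch_rts, repeat_rts = [], []
    for prev, curr in zip(trials, trials[1:]):
        if curr["task"] == prev["task"]:
            repeat_rts.append(curr["rt"])   # same task as last trial
        else:
            switch_rts.append(curr["rt"])   # task changed
    return sum(switch_rts) / len(switch_rts) - sum(repeat_rts) / len(repeat_rts)

trials = [
    {"task": "letter", "rt": 520},
    {"task": "letter", "rt": 500},   # repeat trial
    {"task": "number", "rt": 640},   # switch trial
    {"task": "number", "rt": 510},   # repeat trial
    {"task": "letter", "rt": 650},   # switch trial
]

cost = switch_cost(trials)
# A smaller switch cost means faster switching, and so more flexible thinking.
```

In this made-up example the switch trials average 645 ms and the repeat trials 505 ms, so the switch cost is 140 ms; the flexibility measure in the study is a decrease in that kind of cost.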

Other tasks did not require flexibility.  For example, a visual search task requires finding a particular object among a set of distracters.  That task requires perceptual speed, but not flexibility.

The results of the study were striking.  Participants who played StarCraft showed significant improvement on the cognitive flexibility tasks, but not the other tasks compared to those who played The Sims.  The improvement was largest for those who played the complex version of the game, and smaller for those who played the simple version.

Additional analyses found that the people who played the complex version of the game had to keep more information in mind while playing than those who played the simple version.  Practice using all of this information may have been the root of the improvement on the flexibility tasks.

These results are intriguing.  It is hard to get people to work on difficult tasks for long in school settings, but much easier to get them to work for long hours while playing video games.  If games can be structured to promote skills that improve flexible thinking, then they can be a valuable tool in helping people to get smarter. 

That said, flexible thinking is only a part of being smarter.  In order to really do smart things, you also need to know a lot of information in order to be able to use that knowledge to solve problems.  As much fun as video games may be, they will not substitute for the hours you need to put in to become an expert in at least one domain.

Thursday, April 16, 2015

Having a Hot Hand Increases Confidence, But Not Success

One of the great things about doing research is that you can actually test the beliefs that people take for granted.  And sometimes, those beliefs are shown to be false.  A classic example of this approach comes in the belief in a hot hand in basketball.  When you watch a basketball game, a player will make a couple of shots, and the announcers will decide that player is “on fire” and that he ought to take the team’s next shot.

Back in 1985, though, Tom Gilovich, Robert Vallone, and Amos Tversky actually analyzed data from the Philadelphia 76ers.  They found no evidence for a hot hand.  The hot hand would say that if a player makes one shot, then they should be more likely to make a second.  Gilovich, Vallone, and Tversky found that the probability that a player would make a second shot was independent of whether they made the first one, suggesting that there is no hot hand.
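The independence check behind this analysis can be sketched in a few lines of code.  The shot sequence below is hypothetical, made up purely for illustration; it is not data from the 76ers.

```python
# Sketch of the hot-hand independence check on a hypothetical
# sequence of one player's shot outcomes (1 = made, 0 = missed).
# The data are made up for illustration, not from the 76ers dataset.

def conditional_hit_rates(shots):
    """Return P(make | previous make) and P(make | previous miss)."""
    after_make = [curr for prev, curr in zip(shots, shots[1:]) if prev == 1]
    after_miss = [curr for prev, curr in zip(shots, shots[1:]) if prev == 0]
    p_after_make = sum(after_make) / len(after_make)
    p_after_miss = sum(after_miss) / len(after_miss)
    return p_after_make, p_after_miss

shots = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
p_make, p_miss = conditional_hit_rates(shots)
# A "hot hand" would show p_make noticeably above p_miss;
# Gilovich, Vallone, and Tversky found the two rates were about equal.
```

If the two conditional rates come out about equal across many shots, each shot is statistically independent of the last, which is exactly what the 1985 analysis found.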

An interesting question, though, is whether the belief in the hot hand influences the behavior of the players themselves. That question was explored in analyses by Yigal Attali reported in the July, 2013 issue of Psychological Science.  He analyzed all of the data from every game in the 2010-2011 National Basketball Association season.  Modern transcripts for games include lots of information including who took each shot, whether it was made, and the distance of the shot. 

Attali found evidence that the belief in a hot hand did affect the behavior of players.  When a player made one shot, it affected whether they would take the team’s next shot.  When the shot was from a short distance (a dunk or layup), then players took about 20% of their team’s next shots regardless of whether they made or missed the shot.  However, when they made a shot that was longer than 4 feet, they were much more likely to take the team’s next shot than if they missed that shot.

That’s not all.  When players made a shot, the next shot they took was generally further from the basket than when players missed their last shot.  Because longer shots probably reflect that a player has more confidence in his ability, this suggests that making a shot increases a player’s confidence. 

Paradoxically, though, this confidence has a cost.  Longer shots are more likely to be missed than shorter shots.  So when a player makes a shot and then takes the team’s next one, he is more likely to miss that second shot than he would be following a miss, because the second shot is probably taken from further away.  (Indeed, Attali re-analyzed the data from the Philadelphia 76ers that Gilovich, Vallone, and Tversky used, and found a similar effect: when a player makes one shot, he is actually less likely to make the second shot than when he missed the previous shot.)

Finally, Attali explored the effect of making a shot on the behavior of coaches.  He found that players were much less likely to be taken out of a game following a made shot than following a missed shot.  So, coaches are also acting as though they believe in a hot hand.

What does all of this mean?

In lots of domains (including basketball), we have theories about the way the world works.  Those theories influence our actions.  However, it is important to know whether our theories about the way the world works are actually true.  Sometimes, as in the case of the hot hand in basketball, not only is the theory false, but acting based on the theory also makes people’s performance worse than it would be if they did not believe in the theory.

Tuesday, April 7, 2015

Rituals Make the World Taste Better

In the United States, we have a strange relationship with food.  Most of us eat on the go.  We drive through at fast food restaurants and then stuff our faces on the way to wherever we’re going.  We eat at our desks while working.  We grab dinner in between other tasks, sometimes standing at a counter in the kitchen.
Food is fuel, of course, so perhaps this approach makes sense.  We don’t make an elaborate ceremony of putting gas in the car, so why should mealtime be any different?
Yet, cultures have often created rituals around food.  In many countries, mealtime is an oasis from the troubles of the day.  Everyone sits down around a well-set table.  Dishes are placed in the center.  People may say a prayer before eating.  And then the meal and the conversation commence. 
What exactly do we get out of creating ceremonies around eating?
An interesting paper by Kathleen Vohs, Yajin Wang, Francesca Gino, and Michael Norton in the September, 2013 issue of Psychological Science examines whether rituals affect the taste of food. 
In one study, participants ate carrots three times over the course of an experimental session.  Carrots are an interesting food choice, because they taste good, but they are not high on most people’s lists of desirable foods (compared, say, to ice cream or chocolate).  One group was given a ritual to perform before eating each carrot.  They would bang their knuckles on the table, close their eyes, and take a deep breath.  A second group was given a different sequence of actions before each carrot.  So, they performed an action, but it was not a ritual, because the actions were always different. 
Before eating the last carrot, participants rated how much they thought they would enjoy it, and after eating it, they rated their enjoyment of the carrot.  Finally, some participants were able to eat the third carrot immediately after performing the ritual, but others had a delay before eating the carrot.  The participants with the delay performed an unrelated study before eating the carrot.
Overall, participants who performed the ritual anticipated enjoying the carrot more than those who performed random actions, and their ratings of actual enjoyment were also higher.  The delay actually enhanced the influence of the ritual.  When people knew there would be a delay, they believed they would enjoy the carrot more and they actually did enjoy it more.
Another study (this one involving lemonade) found that you have to perform the ritual yourself to get the benefit of it.  Participants who watched the experimenter perform the ritual enjoyed the lemonade less than those who performed the ritual themselves. 
One last study (this one involving chocolate) found that participants who performed a ritual were more interested in the food than those who did not perform a ritual.  So, the ritual seems to have affected people’s intrinsic interest in the activity of eating.
Rituals are a pervasive cultural invention.  Every culture asks people to perform actions that have no obvious value in and of themselves.  I have written before about studies demonstrating that rituals can increase people’s sense of closeness to a community.  These studies expand this influence to show that rituals can increase people’s sense of closeness to food as well.
If you find that you are not enjoying the food you eat and that you tend to treat your food as fuel, then consider creating some rituals around the way you eat.  Set your table.  Turn off the TV and the computer.  Close your eyes for a moment and prepare to eat.  And then…enjoy.

Monday, March 30, 2015

Self-Compassion and Health

A few times in this blog, I have written about self-compassion.  Self-compassion is the degree to which you treat yourself with kindness.  It differs from related concepts like self-esteem, which is how good you feel about yourself.  Self-compassion determines how well you come back from adversity.  If you get down on yourself when things go wrong, then it is hard to bounce back from a problem.  If you treat yourself with kindness, then it is easier to recover from a bad experience.

An interesting paper in the July, 2013 issue of Personality and Social Psychology Bulletin by Meredith Terry, Mark Leary, Sneha Mehta, and Kelly Henderson examined the relationship between self-compassion and health behaviors. 

A key question in health care is what factors lead people to seek help for medical problems.  Every year, many people avoid going to the doctor, even when they think they might be sick.  This avoidance can be dangerous if the delay leads a treatable condition to get worse.

In a series of studies, the authors examined the relationship between a measure of self-compassion and a variety of health-related behaviors.  To measure self-compassion, the authors used a scale that described a series of bad things that could happen in someone’s life like making a stupid mistake or having a hard time doing something that other people find easy.  They asked people to evaluate whether they would be likely to do self-compassionate things like cheering themselves up or uncompassionate things like judging themselves harshly.

One study found that people with health problems who have a high level of self-compassion are less depressed about those problems than people with a low level of self-compassion.  Another study found that people with a high level of self-compassion said they would see a doctor more quickly for health problems than people with a low level of self-compassion.  The authors found this relationship even after controlling for factors like how good people are at planning for the future.

A final study looked at why self-compassion influences health-related behaviors.  This study found that people with a high level of self-compassion also treat themselves kindly.  That is, they do not get down on themselves for having an illness.  They also frequently remind themselves that many people have health problems and that they do not deserve to be sick.  The combination of self-kindness and positive self-talk help to explain the influence of self-compassion on health behaviors.

This study adds to a growing body of work demonstrating the powerful effects of self-compassion.  Everyone is going to experience negative events in their lives. People try a new venture and fail.  They get sick or injured.  They get in relationships that ultimately break up.  They have loved-ones who get sick or die.  Nobody can escape the bad things that happen in life.

The key is to find ways to deal with those negative events in a positive way.  It is fine to experience the pain of a negative event.  But, after acknowledging the pain, it is also important to get up and try again—to remember that failures and illnesses and bad relationships are not a verdict on your worth as a person, but just another hurdle to be overcome.  Ultimately, you need to learn to treat yourself with the same kindness you would show to others in the same situation.