
We often underestimate the difference between our knowledge and someone else's.

When we know much more than someone else does about a topic, we might not accurately account for the other person's lack of knowledge. We're swayed heavily by how much we know, and to some extent, we have a tendency to think that the other person shares our knowledge. We often think he knows more than he does. Sales agents who know the most about their product often have the worst understanding of how little a client knows about it. In other words, a sales agent's knowledge about the topic makes him think the customer also knows about it. If a sales agent has 100,000 units of knowledge about his product, he might think the average customer has 5,000 units, when in fact the true number is 1,000. The sales agent's abundance of knowledge makes him overestimate his client's knowledge. Another sales agent, on the other hand, might have 35,000 units, and figure that his customers only have 1,000. He might not know exactly what the customer does and doesn't know. But all in all, he has a far more accurate idea of what his client knows. In cases like that, the first salesman's extra knowledge has that major drawback. And to eliminate the drawback, he must do a better job of accounting for the difference between what he knows and what his customers know. It's far easier for the second salesman (35,000 units of knowledge) to make that adjustment.

In an experiment, some participants were given privileged information about a company's future earnings, while others didn't get that information. The first group was then told to guess what the second group would predict about the company's earnings. In other words, the first group's instructions were something like, "That other group doesn't have the extra information we gave you. What do you think their predictions will be like without it?" The researchers found that people in the first group weren't able to completely disregard their privileged information when trying to figure out the second group's predictions. They knew that the second group didn't have the extra information--but they still couldn't completely account for the other group's ignorance on that matter. And later experiments showed that even when the experimenters told participants in the first group about the error that other participants had made, and even when they offered the participants money as a reward for not making the mistake, the first group still had a tough time with the task. They still allowed the extra information to affect their guesses of the other group's predictions. In other words, the study suggests that even when we know that others lack our knowledge, and even when that fact is emphasized and we're given an incentive to account for it, we still might not fully account for it. We "know" it in a certain sense. But in other ways, we only partially know it.

Studies like that don't necessarily prove much. But I do feel they suggest a basic idea that has a lot of validity: we often base our view of others too much on our own example. And in some cases, it can be very difficult to really escape that tendency and look very far outside our own example. According to Thomas Gilovich, Victoria Husted Medvec, and Kenneth Savitsky, "People are typically quite aware of their own internal states and tend to focus on them rather intently when they are strong. To be sure, people recognize that others are not privy to the same information as they are, and they attempt to adjust for this fact when trying to anticipate another's perspective. Nevertheless, it can be hard to get beyond one's own perspective even when one knows that it is necessary to do so: The 'adjustment' that one makes from the 'anchor' of one's own internal experience is likely to be insufficient."

Many other studies point in similar directions. When one person taps out a very well-known song and another person listens, the first person usually thinks it's somewhat obvious what song he's tapping, and he figures the second person has a decent chance of identifying it. And yet, about 98% of the time, the second person doesn't know what song is being tapped. In other words, when a person taps out a song like "Happy Birthday" to another person, the first person thinks there's a pretty good chance the second person will eventually hear "Happy Birthday" in the taps, even though he probably won't.

When someone watches a funny video and is afterwards asked what his facial expressions were like while he was watching it, he usually thinks those expressions really showed how he was feeling. But most of the time, that's far from the case. To outside observers, his face is far less expressive than he thinks it was. And when the person is shown a hidden-camera recording of himself watching the video, he also usually says that he showed less emotion than he thought he had. As in, "I thought I showed 10 units of emotion. But other people are saying I showed five. And now that I see myself on camera, it really does look like I only showed five."

When someone reads a vague statement that can be interpreted two different ways, and is then told that it means A and not B, he usually ends up thinking that it's obvious A is the correct meaning. It seems more obvious to him than it really is. And when he's asked what other people will think the statement means, he'll overestimate how many of them will think it means A and not B. In other words, if he's asked, "What percentage of people will think the statement means A?" he might guess 85%, even though the actual number is 50%. In reality, the statement is unclear. But when a person is told it means A and not B, he might not realize how unclear it is. All of a sudden, it seems somewhat obvious that meaning A is correct.

And in real life, it's common for people to underestimate how often they'll be misunderstood. They might make a somewhat vague comment and think the intended meaning is very apparent. People also sometimes make somewhat obscure references, but are under the impression that they're not so obscure. Or when people debate, they often have a tough time really understanding how and why someone else's views differ from their own; and when they present their arguments, they often overestimate how persuasive they'll be.

All of these cases--the sales agents, the privileged information about a company's earnings, the tapping, the facial expressions, the vague statements, the obscure references, and the debates--show how it can often be difficult for us to look outside of our own knowledge, perceptions, beliefs, ideas, states, and so on, and how frequently we underestimate the difficulty of doing so. In order to understand others, it's important to realize that we're like this. And it's also important to realize that others are the same way.
