
REFERENCE FOR SPEAKING ENGLISH

How to use:
- Memorize the talks and repeat them aloud over three months
- While studying, listen to the talks on TED so your intonation matches the standard
- Combine with the everyday conversational usage in "Nói tiếng Anh như người bản ngữ" (Speak English Like a Native)
Sherry Turkle

TED2012

Connected, but alone?


https://www.ted.com/talks/sherry_turkle_connected_but_alone/transcript?referrer=playlist-our_digital_lives

00:12
Just a moment ago, my daughter Rebecca texted me for good luck. Her text said,
"Mom, you will rock." I love this. Getting that text was like getting a hug. And so
there you have it. I embody the central paradox. I'm a woman who loves getting texts
who's going to tell you that too many of them can be a problem.

00:45
Actually that reminder of my daughter brings me to the beginning of my story. 1996,
when I gave my first TEDTalk, Rebecca was five years old and she was sitting right
there in the front row. I had just written a book that celebrated our life on the
internet and I was about to be on the cover of Wired magazine. In those heady days,
we were experimenting with chat rooms and online virtual communities. We were
exploring different aspects of ourselves. And then we unplugged. I was excited. And,
as a psychologist, what excited me most was the idea that we would use what we
learned in the virtual world about ourselves, about our identity, to live better lives in
the real world.

01:39
Now fast-forward to 2012. I'm back here on the TED stage again. My daughter's 20.
She's a college student. She sleeps with her cellphone, so do I. And I've just written a
new book, but this time it's not one that will get me on the cover of Wired magazine.
So what happened? I'm still excited by technology, but I believe, and I'm here to
make the case, that we're letting it take us places that we don't want to go.

02:18
Over the past 15 years, I've studied technologies of mobile communication and I've
interviewed hundreds and hundreds of people, young and old, about their plugged in
lives. And what I've found is that our little devices, those little devices in our pockets,
are so psychologically powerful that they don't only change what we do, they change
who we are. Some of the things we do now with our devices are things that, only a
few years ago, we would have found odd or disturbing, but they've quickly come to
seem familiar, just how we do things.

03:00
So just to take some quick examples: People text or do email during corporate board
meetings. They text and shop and go on Facebook during classes, during
presentations, actually during all meetings. People talk to me about the important
new skill of making eye contact while you're texting. (Laughter) People explain to me
that it's hard, but that it can be done. Parents text and do email at breakfast and at
dinner while their children complain about not having their parents' full attention.
But then these same children deny each other their full attention. This is a recent
shot of my daughter and her friends being together while not being together. And we
even text at funerals. I study this. We remove ourselves from our grief or from our
reverie and we go into our phones.

04:05
Why does this matter? It matters to me because I think we're setting ourselves up for
trouble -- trouble certainly in how we relate to each other, but also trouble in how we
relate to ourselves and our capacity for self-reflection. We're getting used to a new
way of being alone together. People want to be with each other, but also elsewhere --
connected to all the different places they want to be. People want to customize their
lives. They want to go in and out of all the places they are because the thing that
matters most to them is control over where they put their attention. So you want to
go to that board meeting, but you only want to pay attention to the bits that interest
you. And some people think that's a good thing. But you can end up hiding from each
other, even as we're all constantly connected to each other.

05:05
A 50-year-old businessman lamented to me that he feels he doesn't have colleagues
anymore at work. When he goes to work, he doesn't stop by to talk to anybody, he
doesn't call. And he says he doesn't want to interrupt his colleagues because, he says,
"They're too busy on their email." But then he stops himself and he says, "You know,
I'm not telling you the truth. I'm the one who doesn't want to be interrupted. I think I
should want to, but actually I'd rather just do things on my Blackberry."

05:36
Across the generations, I see that people can't get enough of each other, if and only if
they can have each other at a distance, in amounts they can control. I call it the
Goldilocks effect: not too close, not too far, just right. But what might feel just right
for that middle-aged executive can be a problem for an adolescent who needs to
develop face-to-face relationships. An 18-year-old boy who uses texting for almost
everything says to me wistfully, "Someday, someday, but certainly not now, I'd like to
learn how to have a conversation."

06:23
When I ask people, "What's wrong with having a conversation?" people say, "I'll tell
you what's wrong with having a conversation. It takes place in real time and you can't
control what you're going to say." So that's the bottom line. Texting, email, posting,
all of these things let us present the self as we want to be. We get to edit, and that
means we get to delete, and that means we get to retouch, the face, the voice, the
flesh, the body -- not too little, not too much, just right.

07:06
Human relationships are rich and they're messy and they're demanding. And we
clean them up with technology. And when we do, one of the things that can happen is
that we sacrifice conversation for mere connection. We short-change ourselves. And
over time, we seem to forget this, or we seem to stop caring.

07:33
I was caught off guard when Stephen Colbert asked me a profound question, a
profound question. He said, "Don't all those little tweets, don't all those little sips of
online communication, add up to one big gulp of real conversation?" My answer was
no, they don't add up. Connecting in sips may work for gathering discrete bits of
information, they may work for saying, "I'm thinking about you," or even for saying,
"I love you," -- I mean, look at how I felt when I got that text from my daughter -- but
they don't really work for learning about each other, for really coming to know and
understand each other. And we use conversations with each other to learn how to
have conversations with ourselves. So a flight from conversation can really matter
because it can compromise our capacity for self-reflection. For kids growing up, that
skill is the bedrock of development.

08:58
Over and over I hear, "I would rather text than talk." And what I'm seeing is that
people get so used to being short-changed out of real conversation, so used to getting
by with less, that they've become almost willing to dispense with people altogether.
So for example, many people share with me this wish, that some day a more
advanced version of Siri, the digital assistant on Apple's iPhone, will be more like a
best friend, someone who will listen when others won't. I believe this wish reflects a
painful truth that I've learned in the past 15 years. That feeling that no one is
listening to me is very important in our relationships with technology. That's why it's
so appealing to have a Facebook page or a Twitter feed -- so many automatic
listeners. And the feeling that no one is listening to me makes us want to spend time
with machines that seem to care about us.

10:04
We're developing robots, they call them sociable robots, that are specifically designed
to be companions -- to the elderly, to our children, to us. Have we so lost confidence
that we will be there for each other? During my research I worked in nursing homes,
and I brought in these sociable robots that were designed to give the elderly the
feeling that they were understood. And one day I came in and a woman who had lost
a child was talking to a robot in the shape of a baby seal. It seemed to be looking in
her eyes. It seemed to be following the conversation. It comforted her. And many
people found this amazing.

10:57
But that woman was trying to make sense of her life with a machine that had no
experience of the arc of a human life. That robot put on a great show. And we're
vulnerable. People experience pretend empathy as though it were the real thing. So
during that moment when that woman was experiencing that pretend empathy, I was
thinking, "That robot can't empathize. It doesn't face death. It doesn't know life."

11:34
And as that woman took comfort in her robot companion, I didn't find it amazing; I
found it one of the most wrenching, complicated moments in my 15 years of work.
But when I stepped back, I felt myself at the cold, hard center of a perfect storm. We
expect more from technology and less from each other. And I ask myself, "Why have
things come to this?"

12:08
And I believe it's because technology appeals to us most where we are most
vulnerable. And we are vulnerable. We're lonely, but we're afraid of intimacy. And so
from social networks to sociable robots, we're designing technologies that will give us
the illusion of companionship without the demands of friendship. We turn to
technology to help us feel connected in ways we can comfortably control. But we're
not so comfortable. We are not so much in control.

12:42
These days, those phones in our pockets are changing our minds and hearts because
they offer us three gratifying fantasies. One, that we can put our attention wherever
we want it to be; two, that we will always be heard; and three, that we will never have
to be alone. And that third idea, that we will never have to be alone, is central to
changing our psyches. Because the moment that people are alone, even for a few
seconds, they become anxious, they panic, they fidget, they reach for a device. Just
think of people at a checkout line or at a red light. Being alone feels like a problem
that needs to be solved. And so people try to solve it by connecting. But here,
connection is more like a symptom than a cure. It expresses, but it doesn't solve, an
underlying problem. But more than a symptom, constant connection is changing the
way people think of themselves. It's shaping a new way of being.

13:48
The best way to describe it is, I share therefore I am. We use technology to define
ourselves by sharing our thoughts and feelings even as we're having them. So before
it was: I have a feeling, I want to make a call. Now it's: I want to have a feeling, I need
to send a text. The problem with this new regime of "I share therefore I am" is that, if
we don't have connection, we don't feel like ourselves. We almost don't feel ourselves.
So what do we do? We connect more and more. But in the process, we set ourselves
up to be isolated.

14:30
How do you get from connection to isolation? You end up isolated if you don't
cultivate the capacity for solitude, the ability to be separate, to gather yourself.
Solitude is where you find yourself so that you can reach out to other people and
form real attachments. When we don't have the capacity for solitude, we turn to other
people in order to feel less anxious or in order to feel alive. When this happens, we're
not able to appreciate who they are. It's as though we're using them as spare parts to
support our fragile sense of self. We slip into thinking that always being connected is
going to make us feel less alone. But we're at risk, because actually it's the opposite
that's true. If we're not able to be alone, we're going to be more lonely. And if we
don't teach our children to be alone, they're only going to know how to be lonely.

15:34
When I spoke at TED in 1996, reporting on my studies of the early virtual
communities, I said, "Those who make the most of their lives on the screen come to it
in a spirit of self-reflection." And that's what I'm calling for here, now: reflection and,
more than that, a conversation about where our current use of technology may be
taking us, what it might be costing us. We're smitten with technology. And we're
afraid, like young lovers, that too much talking might spoil the romance. But it's time
to talk. We grew up with digital technology and so we see it as all grown up. But it's
not, it's early days. There's plenty of time for us to reconsider how we use it, how we
build it. I'm not suggesting that we turn away from our devices, just that we develop a
more self-aware relationship with them, with each other and with ourselves.

16:39
I see some first steps. Start thinking of solitude as a good thing. Make room for it.
Find ways to demonstrate this as a value to your children. Create sacred spaces at
home -- the kitchen, the dining room -- and reclaim them for conversation. Do the
same thing at work. At work, we're so busy communicating that we often don't have
time to think, we don't have time to talk, about the things that really matter. Change
that. Most important, we all really need to listen to each other, including to the
boring bits. Because it's when we stumble or hesitate or lose our words that we reveal
ourselves to each other.

17:30
Technology is making a bid to redefine human connection -- how we care for each
other, how we care for ourselves -- but it's also giving us the opportunity to affirm
our values and our direction. I'm optimistic. We have everything we need to start. We
have each other. And we have the greatest chance of success if we recognize our
vulnerability. That we listen when technology says it will take something complicated
and promises something simpler.

18:08
So in my work, I hear that life is hard, relationships are filled with risk. And then
there's technology -- simpler, hopeful, optimistic, ever-young. It's like calling in the
cavalry. An ad campaign promises that online and with avatars, you can "Finally, love
your friends, love your body, love your life, online and with avatars." We're drawn to
virtual romance, to computer games that seem like worlds, to the idea that robots,
robots, will someday be our true companions. We spend an evening on the social
network instead of going to the pub with friends.

18:56
But our fantasies of substitution have cost us. Now we all need to focus on the many,
many ways technology can lead us back to our real lives, our own bodies, our own
communities, our own politics, our own planet. They need us. Let's talk about how
we can use digital technology, the technology of our dreams, to make this life the life
we can love.

19:31
Thank you.

19:33
(Applause)

Kashmir Hill and Surya Mattu

TED2018

What your smart devices know (and share) about you
https://www.ted.com/talks/kashmir_hill_and_surya_mattu_what_your_smart_devices_know_and_share_about_you/transcript?referrer=playlist-our_digital_lives&language=en

00:12
Kashmir Hill: So for my birthday last year, my husband got me an Amazon Echo. I
was kind of shocked, actually, because we both work in privacy and security.

00:22
(Laughter)

00:24
And this was a device that would sit in the middle of our home with a microphone on,
constantly listening.

00:31
We're not alone, though. According to a survey by NPR and Edison Research, one in
six American adults now has a smart speaker, which means that they have a virtual
assistant at home. Like, that's wild. The future, or the future dystopia, is getting here
fast. Beyond that, companies are offering us all kinds of internet-connected devices.
There are smart lights, smart locks, smart toilets, smart toys, smart sex toys. Being
smart means the device can connect to the internet, it can gather data, and it can talk
to its owner.

01:06
But once your appliances can talk to you, who else are they going to be talking to? I
wanted to find out, so I went all-in and turned my one-bedroom apartment in San
Francisco into a smart home. I even connected our bed to the internet. As far as I
know, it was just measuring our sleeping habits. I can now tell you that the only thing
worse than getting a terrible night's sleep is to have your smart bed tell you the next
day that you "missed your goal and got a low sleep score."

01:35
(Laughter)

01:37
It's like, "Thanks, smart bed. As if I didn't already feel like shit today."

01:41
(Laughter)

01:42
All together, I installed 18 internet-connected devices in my home. I also installed a
Surya.

01:49
Surya Mattu: Hi, I'm Surya.

01:50
(Laughter)

01:51
I monitored everything the smart home did. I built a special router that let me look at
all the network activity. You can think of my router sort of like a security guard,
compulsively logging all the network packets as they entered and left the smart
home.

02:06
KH: Surya and I are both journalists, he's not my husband, we just work together at
Gizmodo.

02:10
SM: Thank you for clarifying. The devices Kashmir bought -- we were interested in
understanding what they were saying to their manufacturers. But we were also
interested in understanding what the home's digital emissions look like to the
internet service provider. We were seeing what the ISP could see, but more
importantly, what they could sell.

02:28
KH: We ran the experiment for two months. In that two months, there wasn't a
single hour of digital silence in the house -- not even when we went away for a week.

02:36
SM: Yeah, it's so true. Based on the data, I knew when you guys woke up and went to
bed. I even knew when Kashmir brushed her teeth. I'm not going to out your
brushing habits, but let's just say it was very clear to me when you were working from
home.

02:48
KH: Uh, I think you just outed them to, like, a lot of people here.

02:51
SM: Don't be embarrassed, it's just metadata.

02:54
I knew when you turned on your TV and how long you watched it for. Fun fact about
the Hill household: they don't watch a lot of television, but when they do, it's usually
in binge mode. Favorite shows include "Difficult People" and "Party Down."

03:06
KH: OK, you're right, I loved "Party Down." It's a great show, and you should
definitely watch it. But "Difficult People" was all my husband, Trevor. And Trevor
was actually a little upset that you knew about his binges, because even though he'd
been the one to connect the TV to the router, he forgot that the TV was watching us.

03:23
It's actually not the first time that our TV has spied on us. The company that made it,
VIZIO, paid a 2.2 million-dollar settlement to the government just last year, because
it had been collecting second-by-second information about what millions of people
were watching on TV, including us, and then it was selling that information to data
brokers and advertisers.

03:43
SM: Ah, classic surveillance economy move. The devices Kashmir bought almost all
pinged their servers daily. But do you know which device was especially chatty? The
Amazon Echo. It contacted its servers every three minutes, regardless of whether you
were using it or not.

03:59
KH: In general, it was disconcerting that all these devices were having ongoing
conversations that were invisible to me. I mean, I would have had no idea, without
your router. If you buy a smart device, you should probably know -- you're going to
own the device, but in general, the company is going to own your data. And you
know, I mean, maybe that's to be expected -- you buy an internet-connected device,
it's going to use the internet. But it's strange to have these devices moving into the
intimate space that is the home and allowing companies to track our really basic
behavior there.

04:32
SM: So true. Even the most banal-seeming data can be mined by the surveillance
economy. For example, who cares how often you brush your teeth? Well, as it turns
out, there's a dental insurance company called Beam. They've been monitoring their
customers' smart toothbrushes since 2015 -- for discounts on their premiums, of
course.

04:49
KH: We know what some of you are thinking: this is the contract of the modern
world. You give up a little privacy, and you get some convenience or some price
breaks in return. But that wasn't my experience in my smart home. It wasn't
convenient, it was infuriating. I'll admit, I love my smart vacuum, but many other
things in the house drove me insane: we ran out of electrical outlets, and I had to
download over a dozen apps to my phone to control everything. And then every
device had its own log-in, my toothbrush had a password ...

05:22
(Laughter)

05:23
And smart coffee, especially, was just a world of hell.

05:28
SM: Wait, really? Cloud-powered coffee wasn't really working for you?

05:32
KH: I mean, maybe I'm naive, but I thought it was going to be great. I thought we'd
just wake up in the morning and we'd say, "Alexa, make us coffee." But that's not how
it went down. We had to use this really particular, brand-specific phrase to make it
work. It was, "Alexa, ask the Behmor to run quick start." And this was just, like, really
hard to remember first thing in the morning, before you have had your caffeine.

05:57
(Laughter)

05:58
And apparently, it was hard to say, because the Echo Dot that was right next to our
bed just couldn't understand us. So we would basically start every day by screaming
this phrase at the Echo Dot.

06:10
(Laughter)

06:11
And Trevor hated this. He'd be like, "Please, Kashmir, just let me go to the kitchen
and push the button to make the coffee run." And I'd be like, "No, you can't! We have
to do it the smart way!"

06:23
(Laughter)

06:25
I'm happy to report that our marriage survived the experiment, but just barely.

06:30
SM: If you decide to make your home smart, hopefully, you'll find it less infuriating
than Kashmir did. But regardless, the smart things you buy can and probably are
used to target and profile you. Just the number of devices you have can be used to
predict how rich or poor you are. Facebook's made this tech, and they've also
patented it.

06:48
KH: All the anxiety you currently feel every time you go online, about being tracked,
is about to move into your living room. Or into your bedroom.

06:57
There's this sex toy called the We-Vibe. You might wonder why a sex toy connects to
the internet, but it's for two people who are in a long-distance relationship, so they
can share their love from afar. Some hackers took a close look at this toy and saw it
was sending a lot of information back to the company that made it -- when it was
used, how long it was used for, what the vibration settings were, how hot the toy got.
It was all going into a database. So I reached out to the company, and I said, "Why
are you collecting this really sensitive data?" And they said, "Well, it's great for
market research." But they were data-mining their customers' orgasms. And they
weren't telling them about it. I mean, even if you're cavalier about privacy, I hope
that you would admit that's a step too far.

07:46
SM: This is why I want to keep my sex toys dumb.

07:49
KH: That's great. We're all very glad to know that.

07:52
(Laughter)

07:53
SM: A data point I'm willing to share.

07:55
(Laughter)

07:57
The devices Kashmir bought range from useful to annoying. But the thing they all
had in common was sharing data with the companies that made them. With email
service providers and social media, we've long been told that if it's free, you're the
product. But with the internet of things, it seems, even if you pay, you're still the
product. So you really have to ask: Who's the true beneficiary of your smart home,
you or the company mining you?

08:19
KH: Look, we're a tech savvy crowd here. I think most of us know that these things
connect to the internet and send data out. And fine, maybe you're OK with living in
that commercial panopticon, but others aren't. We need the companies to rethink the
design of these devices with our privacy in mind, because we're not all willing to
participate in "market research," just because a device we bought has a Wi-Fi
connection. And I have to tell you, even when you're aware, generally, this is
happening, it's really easy to forget that normal household items are spying on you.
It's easy to forget these things are watching you, because they don't look like
cameras. They could look like ... well, they could look like a dildo.

08:59
Thank you.

09:00
(Applause)

Eli Pariser

TED2011

Beware online "filter bubbles"


https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles/transcript?referrer=playlist-our_digital_lives&language=en

00:12
Mark Zuckerberg, a journalist was asking him a question about the news feed. And
the journalist was asking him, "Why is this so important?" And Zuckerberg said, "A
squirrel dying in your front yard may be more relevant to your interests right now
than people dying in Africa." And I want to talk about what a Web based on that idea
of relevance might look like.

00:37
So when I was growing up in a really rural area in Maine, the Internet meant
something very different to me. It meant a connection to the world. It meant
something that would connect us all together. And I was sure that it was going to be
great for democracy and for our society. But there's this shift in how information is
flowing online, and it's invisible. And if we don't pay attention to it, it could be a real
problem. So I first noticed this in a place I spend a lot of time -- my Facebook page.
I'm progressive, politically -- big surprise -- but I've always gone out of my way to
meet conservatives. I like hearing what they're thinking about; I like seeing what they
link to; I like learning a thing or two. And so I was surprised when I noticed one day
that the conservatives had disappeared from my Facebook feed. And what it turned
out was going on was that Facebook was looking at which links I clicked on, and it
was noticing that, actually, I was clicking more on my liberal friends' links than on
my conservative friends' links. And without consulting me about it, it had edited
them out. They disappeared.

01:51
So Facebook isn't the only place that's doing this kind of invisible, algorithmic editing
of the Web. Google's doing it too. If I search for something, and you search for
something, even right now at the very same time, we may get very different search
results. Even if you're logged out, one engineer told me, there are 57 signals that
Google looks at -- everything from what kind of computer you're on to what kind of
browser you're using to where you're located -- that it uses to personally tailor your
query results. Think about it for a second: there is no standard Google anymore. And
you know, the funny thing about this is that it's hard to see. You can't see how
different your search results are from anyone else's.

02:39
But a couple of weeks ago, I asked a bunch of friends to Google "Egypt" and to send
me screen shots of what they got. So here's my friend Scott's screen shot. And here's
my friend Daniel's screen shot. When you put them side-by-side, you don't even have
to read the links to see how different these two pages are. But when you do read the
links, it's really quite remarkable. Daniel didn't get anything about the protests in
Egypt at all in his first page of Google results. Scott's results were full of them. And
this was the big story of the day at that time. That's how different these results are
becoming.

03:18
So it's not just Google and Facebook either. This is something that's sweeping the
Web. There are a whole host of companies that are doing this kind of personalization.
Yahoo News, the biggest news site on the Internet, is now personalized -- different
people get different things. Huffington Post, the Washington Post, the New York
Times -- all flirting with personalization in various ways. And this moves us very
quickly toward a world in which the Internet is showing us what it thinks we want to
see, but not necessarily what we need to see. As Eric Schmidt said, "It will be very
hard for people to watch or consume something that has not in some sense been
tailored for them."

04:02
So I do think this is a problem. And I think, if you take all of these filters together,
you take all these algorithms, you get what I call a filter bubble. And your filter
bubble is your own personal, unique universe of information that you live in online.
And what's in your filter bubble depends on who you are, and it depends on what you
do. But the thing is that you don't decide what gets in. And more importantly, you
don't actually see what gets edited out. So one of the problems with the filter bubble
was discovered by some researchers at Netflix. And they were looking at the Netflix
queues, and they noticed something kind of funny that a lot of us probably have
noticed, which is there are some movies that just sort of zip right up and out to our
houses. They enter the queue, they just zip right out. So "Iron Man" zips right out,
and "Waiting for Superman" can wait for a really long time.

04:59
What they discovered was that in our Netflix queues there's this epic struggle going
on between our future aspirational selves and our more impulsive present selves. You
know we all want to be someone who has watched "Rashomon," but right now we
want to watch "Ace Ventura" for the fourth time. (Laughter) So the best editing gives
us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It
gives us some information vegetables; it gives us some information dessert. And the
challenge with these kinds of algorithmic filters, these personalized filters, is that,
because they're mainly looking at what you click on first, it can throw off that
balance. And instead of a balanced information diet, you can end up surrounded by
information junk food.

05:56
What this suggests is actually that we may have the story about the Internet wrong.
In a broadcast society -- this is how the founding mythology goes -- in a broadcast
society, there were these gatekeepers, the editors, and they controlled the flows of
information. And along came the Internet and it swept them out of the way, and it
allowed all of us to connect together, and it was awesome. But that's not actually
what's happening right now. What we're seeing is more of a passing of the torch from
human gatekeepers to algorithmic ones. And the thing is that the algorithms don't
yet have the kind of embedded ethics that the editors did. So if algorithms are going
to curate the world for us, if they're going to decide what we get to see and what we
don't get to see, then we need to make sure that they're not just keyed to relevance.
We need to make sure that they also show us things that are uncomfortable or
challenging or important -- this is what TED does -- other points of view.

07:00
And the thing is, we've actually been here before as a society. In 1915, it's not like
newspapers were sweating a lot about their civic responsibilities. Then people
noticed that they were doing something really important. That, in fact, you couldn't
have a functioning democracy if citizens didn't get a good flow of information, that
the newspapers were critical because they were acting as the filter, and then
journalistic ethics developed. It wasn't perfect, but it got us through the last century.
And so now, we're kind of back in 1915 on the Web. And we need the new gatekeepers
to encode that kind of responsibility into the code that they're writing.

07:48
I know that there are a lot of people here from Facebook and from Google -- Larry
and Sergey -- people who have helped build the Web as it is, and I'm grateful for that.
But we really need you to make sure that these algorithms have encoded in them a
sense of the public life, a sense of civic responsibility. We need you to make sure that
they're transparent enough that we can see what the rules are that determine what
gets through our filters. And we need you to give us some control so that we can
decide what gets through and what doesn't. Because I think we really need the
Internet to be that thing that we all dreamed of it being. We need it to connect us all
together. We need it to introduce us to new ideas and new people and different
perspectives. And it's not going to do that if it leaves us all isolated in a Web of one.

08:42
Thank you.

08:44
(Applause)

Canwen Xu: I Am Not Your Asian Stereotype
My name is Canwen, and I play both the piano and the violin. I aspire to some day be
a doctor, and my favorite subject is calculus.

My mom and dad are tiger parents, who won’t let me go to sleepovers, but they make
up for it by serving my favorite meal every single day: Rice.

And I’m a really bad driver. So my question for you now is: How long did it take you
to figure out I was joking?

As you’ve probably guessed, today I am going to talk about race and I’ll start off by
sharing with you my story of growing up as Asian-American.

I moved to the United States when I was two years old, so almost my entire life has
been a blend of two cultures.
I eat pasta with chopsticks. I’m addicted to orange chicken, and my childhood hero
was Yao Ming. But having grown up in North Dakota, South Dakota, and Idaho, all
states with incredibly little racial diversity, it was difficult to reconcile my so-called
exotic Chinese heritage with my mainstream American self. Used to being the only
Asian in the room, I was self-conscious that the first thing people noticed about me
was that I wasn’t white. And as a child I quickly began to realize that I had two
options in front of me.

Conform to the stereotype that was expected of me, or conform to the whiteness
that surrounded me. There was no in between. For me, this meant that I always felt
self-conscious about being good at math, because people would just say it was
because I was Asian, not because I actually worked hard. It meant that whenever a
boy asked me out, it was because he had yellow fever, and not because he actually
liked me. It meant that for the longest time my identity had formed around the fact
that I was different.

And I thought that being Asian was the only special thing about me. These effects
were amplified by the places where I lived. Don’t get me wrong. Only a small
percentage of people were actually racist, or even borderline racist, but the vast
majority were just a little bit clueless. Now, I know you are probably thinking,
“What’s the difference?” Well, here is an example.

Not racist can sound like, “I’m white and you’re not”. Racist can sound like, “I’m
white, you’re not, and that makes me better than you.” But clueless sounds like, “I’m
white, you’re not, and I don’t know how to deal with that.” Now, I don’t doubt for a
second that these clueless people are still nice individuals with great intentions. But
they do ask some questions that become pretty annoying after a while.

Here are a few examples: “You’re Chinese, oh my goodness, I have a Chinese friend,
do you know him?”

“No, I don’t know him. Because contrary to your unrealistic expectations, I do not
know every single one of the 1.35 billion Chinese people who live on Planet Earth.”

People also tend to ask, “Where does your name come from?”, and I really don’t
know how to answer that, so I usually stick with the truth: “My parents gave it to me.
Where does your name come from?” Don’t even get me started on how many times
people have confused me with a different Asian person.

One time someone came up to me and said, “Angie, I love your art work!” And I was
super confused, so I just thanked them and walked away. But, out of all the questions
my favorite one is still the classic, “Where are you from?”, because I’ve lived in quite
a few places, so this is how the conversation usually goes.
“Where are you from?”

“Oh, I am from Boise, Idaho.”

“I see, but where are you really from?”

“I mean, I lived in South Dakota for a while.”

“Okay, what about before that?”

“I mean, I lived in North Dakota.”

“Okay, I’m just going to cut straight to the chase here, I guess what I’m saying is,
have you ever lived anywhere far away from here, where people talk a little
differently?”

“Oh, I know where you’re talking about, yes I have, I used to live in Texas.”

By then, they usually have just given up and wonder to themselves why I’m not one of
the cool Asians like Jeremy Lin or Jackie Chan, or they skip the needless banter and
go straight for the, “Where is your family from?” So, just an FYI for all of you out
there, that is the safest strategy.

But, as amusing as these interactions were, oftentimes they made me want to reject
my own culture, because I thought it helped me conform. I distanced myself from the
Asian stereotype as much as possible, by degrading my own race and pretending I
hated math. And the worst part was, it worked.

The more I rejected my Chinese identity, the more popular I became. My peers liked
me more, because I was more similar to them.

I became more confident, because I knew I was more similar to them. But as I
became more Americanized, I also began to lose bits and pieces of myself, parts of me
that I can never get back, and no matter how much I tried to pretend that I was the
same as my American classmates, I wasn’t.

Because for people who have lived in the places where I lived, white is the norm, and
for me, white became the norm too. For my fourteenth birthday, I received the video
game The Sims 3, which lets you create your own characters and control their lives.
My fourteen-year-old self created the perfect little mainstream family, complete with
a huge mansion and an enormous swimming pool.

I binge-played the game for about three months, then put it away and never really
thought about it again, until a few weeks ago, when I came to a sudden realization.
The family, that I had custom-designed, was white. The character that I had designed
for myself, was white. Everyone I had designed was white. And the worst part was,
this was by no means a conscious decision that I had made.

Never once did I think to myself that I could actually make the characters look like
me. Without even thinking, white had become my norm too. The truth is, Asian
Americans play a strange role in the American melting pot. We are the model
minority. Society uses our success to pit us against other people of color as
justification that racism doesn’t exist.

But what does that mean for us, Asian Americans? It means that we are not quite similar
enough to be accepted, but we aren’t different enough to be loathed. We are in a
perpetually grey zone, and society isn’t quite sure what to do with us. So they group
us by the color of our skin. They tell us that we must reject our own heritages, so we
can fit in with the crowd. They tell us that our foreignness is the only identifying
characteristic of us.

They strip away our identities one by one, until we are foreign but not quite foreign,
American but not quite American, individual but only when there are no other
people from our native country around. I wish that I had always had the courage to
speak out about these issues.

But coming from one culture that avoids confrontation, and another that is divided
over race, how do I overcome the pressure to keep the peace, while also staying true
to who I am? And as much as I hate to admit it, oftentimes I don’t speak out,
because, if I do, it’s at the risk of being told that I am too sensitive, or that I get
offended too easily, or that it’s just not worth it.

But here is my point: are people willing to admit that? Yes, race issues are controversial.
But that’s precisely the reason why we need to talk about them.

I just turned eighteen, and there are still so many things that I don’t know about the
world. But what I do know is that it’s hard to admit that you might be part of the
problem, that, all of us might be part of the problem. So, instead of giving you a
step-by-step guide on how to not be racist towards Asians, I will let you decide what
to take from this talk. All I can do is share my story. My name is Canwen, my favorite
color is purple.

And I play the piano, but not so much the violin. I have two incredibly supportive,
hardworking parents, and one very awesome ten-year-old brother. I love calculus
more than anything, despise eating rice, and I’m a horrendous driver. But most of all,
I am proud of who I am. A little bit American, a little bit Chinese, and a whole lot of
both.

Thank you.
