
1. Gap Year: How Different Societies Socialize Young Adults

Age transition points require socialization into new roles that can vary widely
between societies. Young adults in the United States are encouraged to enter college or
the workforce right away, students in England and India can take a year off as
British Princes William and Harry did, and young men in Singapore and
Switzerland must serve time in the military. Have you ever heard of a gap year? It’s
a common custom in British society. When teens finish their secondary schooling
(the equivalent of high school in the United States), they often take a year “off” before entering
college. They might take a job, travel, or find other ways to experience
another culture. Prince William, the Duke of Cambridge, spent his gap year
practicing survival skills in Belize, teaching English in Chile, and working on a dairy
farm in the United Kingdom (Prince of Wales 2012a). His brother, Prince Harry,
advocated for AIDS orphans in Africa and worked as a jackaroo (a novice ranch
hand) in Australia (Prince of Wales 2012b). In the United States, this life transition
point is socialized quite differently, and taking a year off is generally frowned
upon. Instead, U.S. youth are encouraged to pick career paths by their mid-teens,
to select a college and a major by their late teens, and to have completed all
collegiate schooling or technical training for their career by their early twenties. In
yet other nations, this phase of the life course is tied into conscription, a term
that describes compulsory military service. Egypt, Switzerland, Turkey, South
Korea, and Singapore all have this system in place. Youth in these nations (often
only the males) are expected to undergo a number of months or years of military
training and service. How might your life be different if you lived in one of these
other countries? Can you think of similar social norms, related to age-based life
transition points, that vary from country to country?
2. The Long Road to Adulthood for Millennials

2008 was a year of financial upheaval in the United States. Rampant foreclosures
and bank failures set off a chain of events sparking government distrust, loan
defaults, and large-scale unemployment. How has this affected the United
States’ young adults? Millennials, sometimes also called Gen Y, is a term
describing the generation born from the early 1980s to the early 1990s. While
the economic recession was in full swing, many were in the process of entering,
attending, or graduating from high school and college. With employment
prospects at historical lows, large numbers of graduates were unable to find
work, sometimes moving back in with their parents and struggling to pay back
student loans. According to the New York Times, this economic stall is causing the
Millennials to postpone what most Americans consider to be adulthood: “The
traditional cycle seems to have gone off course, as young people remain
untethered to romantic partners or to permanent homes, going back to school for
lack of better options, traveling, avoiding commitments, competing ferociously for
unpaid internships or temporary (and often grueling) Teach for America jobs,
forestalling the beginning of adult life” (Henig 2010). The term Boomerang
Generation describes recent college graduates for whom a lack of adequate
employment after graduation often leads to a return to the parental
home (Davidson 2014). The five milestones that define adulthood, Henig writes,
are “completing school, leaving home, becoming financially independent,
marrying, and having a child” (Henig 2010). These social milestones are taking
longer for Millennials to attain, if they’re attained at all. Sociologists wonder
what long-term impact this generation’s situation may have on society as a
whole.
3. Bullying and Cyberbullying: How Technology Has Changed the Game

Most of us know that the old rhyme
“sticks and stones may break my bones, but words will never hurt me” is
inaccurate. Words can hurt, and never is that more apparent than in instances of
bullying. Bullying has always existed and has often reached extreme levels of
cruelty in children and young adults. People at these stages of life are especially
vulnerable to others’ opinions of them, and they’re deeply invested in their peer
groups. Today, technology has ushered in a new era of this dynamic.
Cyberbullying is the use of interactive media by one person to torment another,
and it is on the rise. Cyberbullying can mean sending threatening texts, harassing
someone in a public forum (such as Facebook), hacking someone’s account and
pretending to be him or her, posting embarrassing images online, and so on. A
study by the Cyberbullying Research Center found that 20 percent of middle
school students admitted to “seriously thinking about committing suicide” as a
result of online bullying (Hinduja and Patchin 2010). Whereas bullying face-to-
face requires willingness to interact with your victim, cyberbullying allows
bullies to harass others from the privacy of their homes without witnessing the
damage firsthand. This form of bullying is particularly dangerous because it’s
widely accessible and therefore easier to accomplish.

Cyberbullying, and bullying in general, made international headlines in 2010 when
a fifteen-year-old girl, Phoebe Prince, in South Hadley, Massachusetts, committed
suicide after being relentlessly bullied by girls at her school. In the aftermath of
her death, the bullies were prosecuted in the legal system and the state passed
anti-bullying legislation. This marked a significant change in how bullying,
including cyberbullying, is viewed in the United States. Now there are numerous
resources for schools, families, and communities to provide education and
prevention on this issue. The White House hosted a Bullying Prevention Summit in
March 2011, and President and First Lady Obama have used Facebook and other
social media sites to discuss the importance of the issue. According to a report
released in 2013 by the National Center for Education Statistics, close to 1 in
every 3 (27.8 percent) students reported being bullied by their school peers.
Seventeen percent of students reported being the victims of cyberbullying. Will
legislation change the behavior of would-be cyberbullies? That remains to be
seen. But we can hope communities will work to protect victims before they feel
they must resort to extreme measures.

4. Does Day Care Create Unruly Brats?

Americans have fairly strong opinions concerning child care and ideal working
situations. According to a Gallup poll (May 4, 2001), 41% of Americans believe it is
best for one parent to stay at home solely to raise children while the other parent
works. Only 13% of Americans say the ideal situation is for both parents to work
full time outside the home. Of the 11.3 million children younger than five years
old whose mothers were employed, about 32% spent time in an organized care
facility such as a daycare center, nursery, or preschool. Thirty percent are cared
for by a grandparent, 25% receive care from their fathers, 3% from siblings, and
8% from other relatives. For more than two decades, the Study of Early Child Care
and Youth Development has tracked more than 1,300 children in various child
care arrangements, including staying home with a parent; being cared for by a
nanny or a relative; or attending a large day-care center. This is the largest and
longest-running study of American child care, and it found that the longer a child
spent in day care, the more aggressive and disobedient the child was in
elementary school, and the effect persisted through the sixth grade. The most
troubling aspect of this finding was that it held up regardless of the child’s sex,
family income, or the quality of the day-care center. With more than three million
American preschoolers attending day care, the increased disruptiveness makes
the elementary school classroom harder to manage. On the positive side, the
study also found that time spent in high-quality day-care centers was correlated
with higher vocabulary scores through elementary school. The debate about child
care started in the late 1980s, when social scientists questioned the impact of day
care. Day-care centers and working parents argued that it was the quality of the
care that mattered, not the setting. Jay Belsky, one of the study’s principal
authors, had suggested as early as 1986 that non-parental child care could cause
developmental problems even in good-quality day care. The later disruptive
behavior may stem from preschool peer groups that model disruptive behavior,
which limited supervision in day-care centers allows to continue.
