
A face for politics: New study shows we can tell Democrats from Republicans in head shots

You can tell a lot about a person by their face, even their political affiliation, new research claims. In a study published in the January 18 issue of PLoS One, subjects were able to accurately identify candidates from the 2004 and 2006 U.S. Senate elections as either Democrats or Republicans based on black-and-white photos of their faces. And subjects were even able to correctly identify college students as belonging to Democratic or Republican clubs based on their yearbook photos. To investigate the basis of these judgments, subjects were asked to rate photos of faces on a seven-point scale assessing personality traits such as assertiveness, maturity, likeability and trustworthiness. Subjects consistently associated Democrats with warmth (likeable and trustworthy) and Republicans with power (dominant and mature). These findings were independent of the gender of the person in the photo. The authors concluded that people possess "a general and imperfect" ability to infer political affiliation based on facial appearance, which is related to stereotypes about Democrat and Republican personalities. The ability to surmise other perceptually ambiguous traits, such as sexual orientation and religious group membership, has been reported in similar studies.

The consequences of this general and imperfect ability are worrisome, with career opportunities, court rulings and financial success potentially falling subject to prejudicial judgments, according to the authors. But surely these judgments don't interfere with election results, do they? In a study published in Science in February 2009, subjects were able to predict from a pair of photos of faces alone which political figure would win an election. Even children could pick the winner when asked who they would prefer to be captain of their boat. And in a study published in the Proceedings of the National Academy of Sciences in November 2007, researchers linked competence perceived from a candidate's face to his or her electoral success. The latest study adds to the growing body of evidence suggesting that certain traits once thought to be indistinguishable based on looks alone are in fact written all over our faces.

Running barefoot is better, researchers find

Mother Nature has outpaced science once again: the bare human foot is better for running than one cushioned by sneakers. What about those $125 high-tech running shoes with 648 custom combinations? Toss 'em, according to a new study published online January 27 in the journal Nature (Scientific American is part of Nature Publishing Group). "Most people today think barefoot running is dangerous and hurts," Daniel Lieberman, a professor of human evolutionary biology at Harvard University, said in a prepared statement. "But actually you can run barefoot on the world's hardest surfaces without the slightest discomfort and pain. It might be less injurious than the way some people run in shoes."

Lieberman and his group used 3-D infrared tracking to record and study the running and strike style of three groups of runners: people who had always run barefoot, people who had always run with shoes, and people who had switched from shoes to shoeless. They found that when runners lace up their fancy sneakers and take off, about 75 to 80 percent land heel-first. Barefoot runners, as Homo sapiens had evolved to be, usually land toward the middle or front of the foot. "People who don't wear shoes when they run have an astonishingly different strike," Lieberman said. Without shoes, landing on the heel is painful and can translate into a collision force some 1.5 to 3 times body weight. "Barefoot runners point their toes more at landing," which helps to lessen the impact by "decreasing the effective mass of the foot that comes to a sudden stop when you land," Madhusudhan Venkadesan, an applied mathematics and human evolutionary biology postdoctoral researcher at Harvard who also worked on the study, said in a prepared statement.
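The mechanics can be sketched with a back-of-envelope impulse calculation. The numbers below (runner mass, effective-mass fractions, landing speed and stopping time) are illustrative assumptions, not values reported in the study; the point is only to show how a smaller effective foot mass yields a smaller impact transient, of the same order as the 1.5 to 3 body weights quoted for shod heel strikers.

```python
# Back-of-envelope impulse estimate: average collision force ~ m_eff * dv / dt.
# All numbers are illustrative assumptions, not values from Lieberman's study.

def impact_in_body_weights(body_mass_kg, eff_mass_fraction, landing_speed_ms, stop_time_s):
    """Average force during the strike transient, expressed in body weights."""
    g = 9.81
    eff_mass = eff_mass_fraction * body_mass_kg          # portion of the body stopped abruptly
    force_n = eff_mass * landing_speed_ms / stop_time_s  # impulse-momentum: F = m * dv / dt
    return force_n / (body_mass_kg * g)

# Assumed 70 kg runner, foot descending at ~1 m/s, stopped over ~5 ms.
heel = impact_in_body_weights(70, 0.15, 1.0, 0.005)  # heel strike: larger effective mass
fore = impact_in_body_weights(70, 0.05, 1.0, 0.005)  # forefoot strike: smaller effective mass
print(f"heel-style strike: ~{heel:.1f} body weights; forefoot-style: ~{fore:.1f} body weights")
```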
But as cushioned kicks have hit the streets and treadmills, that initial pain has disappeared, and runners have changed their stride, leading to a style of high-impact running that human physiology wasn't evolved for, one that the researchers posit can lead to a host of foot and leg injuries. Perhaps it should come as no surprise that our bodies are still better engineered than new-fangled trainers. When taking into account our ancient ancestors, "humans have engaged in endurance running for millions of years," the researchers wrote in their study. "But the modern running shoe was not invented until the 1970s." Another recent study, by the American Academy of Physical Medicine and Rehabilitation and published last December in the academy's journal, PM&R, found that wearing running shoes "increased joint torques at the hip, knee and ankle" when compared to barefoot running. Even a jog in high heels was better for joints than specialized tennis shoes.

Despite the growing movement of barefoot (or more lightly shod) runners, many researchers are calling for more evaluation before all those sweaty sneakers are abandoned. "There is no hard proof that running in shoes causes injuries," William Jungers, a professor of anatomical sciences at Stony Brook University on Long Island, N.Y., wrote in a commentary that accompanies the new study. But, he asserted, "In my view there is no compelling evidence that it prevents them either." And as a boost to the barefoot argument, he added: "There are data that implicate shoes more generally as a plausible source of some types of chronic foot problems." So perhaps you can skip those sneaks, say the study authors. "All you need is a few calluses," Lieberman said.

Image comparing the footfall of two Kenyan runners from the study courtesy of Benton et al. The runner on the left has worn shoes most of his life and lands on his heel, whereas the runner on the right has primarily run barefoot and lands on the ball of her foot.

Bacteria Transformed into Biofuel Refineries

Synthetic biology has allowed scientists to tweak E. coli to produce fuels from sugar and, more sustainably, cellulose. The bacterium responsible for most cases of food poisoning in the U.S. has been turned into an efficient biological factory to make chemicals, medicines and, now, fuels. Chemical engineer Jay Keasling of the University of California, Berkeley, and his colleagues have manipulated the genetic code of Escherichia coli, a common gut bacterium, so that it can chew up plant-derived sugar to produce diesel and other hydrocarbons, according to results published in the January 28 issue of Nature. (Scientific American is part of Nature Publishing Group.) "We incorporated genes that enabled production of biodiesel, esters [organic compounds] of fatty acids and ethanol, directly," Keasling explains. "The fuel that is produced by our E. coli can be used directly as biodiesel. In contrast, fats or oils from plants must be chemically esterified before they can be used."
Perhaps more importantly, the researchers have also imported genes that allow E. coli to secrete enzymes that break down the tough material that makes up the bulk of plants (cellulose, specifically hemicellulose) and produce the sugar needed to fuel this process. "The organism can produce the fuel from a very inexpensive sugar supply, namely cellulosic biomass," Keasling adds.

The E. coli directly secretes the resulting biodiesel, which then floats to the top of a fermentation vat, so there is neither the necessity for distillation or other purification processes nor the need, as in biodiesel from algae, to break the cell to get the oil out. This new process for transforming E. coli into a cellulosic biodiesel refinery involves the tools of synthetic biology. For example, Keasling and his team cloned genes from Clostridium stercorarium and Bacteroides ovatus, bacteria that thrive in soil and the guts of plant-eating animals, respectively, which produce enzymes that break down cellulose. The team then added an extra bit of genetic code in the form of short amino acid sequences that instruct the altered E. coli cells to secrete the bacterial enzyme, which breaks down the plant cellulose, turning it into sugar; the E. coli in turn transforms that sugar into biodiesel.

The process is perfect for making hydrocarbons with at least 12 carbon atoms in them, ranging from diesel to chemical precursors, and even jet fuel, or kerosene. But it cannot, yet, make shorter chain hydrocarbons like gasoline. "Gasoline tends to contain short-chain hydrocarbons, say C8, with more branches, whereas diesel and jet fuel contain long-chain hydrocarbons with few branches," Keasling notes. "There are other ways to make gasoline. We are working on these technologies, as well." After all, the U.S. alone burns some 530 billion liters of gasoline a year, compared with just 7.5 billion liters of biodiesel. But Keasling has estimated in the past that a mere 40.5 million hectares of Miscanthus giganteus, a more than three-meter-tall Asian grass, chewed up by specially engineered microbes, like the E. coli here, could produce enough fuel to meet all U.S. transportation needs.* That's roughly one quarter of the current amount of land devoted to raising crops in the U.S. (a rough check of those figures appears below).

E. coli is the most likely candidate for such work, because it is an extremely well-studied organism as well as a hardy one. "E. coli tolerated the genetic changes quite well," Keasling says. "It was somewhat surprising. Because all organisms require fatty acids for their cell membrane to survive, if you rob them of some fatty acids, they turn up the fatty acid biosynthesis to make up for the depletion." E. coli "grows fast, three times faster than yeast, 50 times faster than Mycoplasma, 100 times faster than most agricultural microbes," explains geneticist and technology developer George Church at Harvard Medical School, who was not involved in this research. "It can survive in detergents or gasoline that will kill lesser creatures, like us. It's fairly easily manipulated." Plus, E. coli can be turned into a microbial factory for almost anything that is presently manufactured but organic, from electrical conductors to fuel. "If it's organic, then, immediately, it becomes plausible that you can make it with biological systems."

The idea in this case is to produce a batch of biofuel from a single colony through E. coli's natural ability to proliferate and, after producing the fuel, dispose of the E. coli and start anew with a fresh colony, according to Keasling. "This minimizes the mutations that might arise if one continually subcultured the microbe," he says. The idea is also to engineer the new organism, deleting key metabolic pathways, such that it would never survive in the wild, in order to prevent escapes with unintended environmental impacts, among other dangers.
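Using only the figures quoted above, a rough Python check of the scale of those numbers is possible (the litres-per-hectare figure is a derived, illustrative value, not one reported by Keasling):

```python
# Rough arithmetic using only the figures quoted in the article above.
gasoline_use_l = 530e9        # litres of gasoline the US burns each year
biodiesel_use_l = 7.5e9       # litres of biodiesel
miscanthus_area_ha = 40.5e6   # hectares of Miscanthus in Keasling's estimate

us_cropland_ha = 4 * miscanthus_area_ha             # article: the estimate is ~1/4 of US cropland
yield_needed = gasoline_use_l / miscanthus_area_ha  # derived figure, gasoline demand only

print(f"Implied US cropland: ~{us_cropland_ha / 1e6:.0f} million hectares")
print(f"Covering gasoline alone implies ~{yield_needed:,.0f} litres of fuel per hectare per year")
print(f"Current biodiesel volume is only {biodiesel_use_l / gasoline_use_l:.1%} of gasoline volume")
```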
But ranging outside of its natural processes, E. coli is not the most efficient producer of biofuel. "We are at about 10 percent of the theoretical maximum yield from sugar," Keasling notes. "We would like to be at 80 to 90 percent to make this commercially viable. Furthermore, we would need a large-scale production process," such as 100,000-liter tanks to allow mass production of microbial fuel. Nevertheless, several companies, including LS9, which helped with the research, as well as Gevo and Keasling-founded Amyris Biotechnologies, are working on making fuel from microbes a reality at the pump, not just at the beer tap.

Genetic testing may become a new weapon in the fight against chimpanzee smugglers

DNA testing could be used as a tool to help fight smuggling of endangered chimpanzees, according to a study published this week in the journal BMC Ecology. Although they are still the most common apes in Africa, chimpanzees (Pan troglodytes and their related subspecies) have experienced population drops of around 75 percent in the past 30 years, and are listed as "endangered" on the IUCN Red List of Threatened Species. Much of this decline is because of poaching and smuggling. A live chimpanzee can fetch $20,000 on the international black market. The animals are also often victims of the bushmeat trade. As wild chimpanzee populations plummet, more and more chimps end up in rescue and rehabilitation centers, such as Limbe Wildlife Center (LWC) in Cameroon. But the path taken before a chimpanzee ends up at a rescue center is long and convoluted, leaving few, if any, clues as to where in Africa the animal was first captured. The new study may help provide answers as to where these rescued chimpanzees came from, and therefore identify poacher hunting patterns and smuggling routes so they can be targeted and shut down by authorities.

The study, a collaboration between the LWC and the University at Albany, State University of New York, examined the mitochondrial DNA (mtDNA) and microsatellite (STRP, or short tandem repeat polymorphic marker) genotypes of 46 chimpanzees from LWC, then compared them with geo-referenced chimpanzee samples from 10 locations throughout Cameroon and Nigeria. The team found that the chimpanzees came from throughout Cameroon, suggesting that smuggling is rife throughout the country and not all done internationally as previously believed. The tests also found that most of the chimps at LWC were Nigeria-Cameroon chimpanzees (Pan troglodytes vellerosus), the most endangered of the four common chimpanzee subspecies. The results indicate that "international smuggling is less of a problem than local trade," lead scientist Mary Katherine Gonder of the University at Albany said in a prepared statement. "The problem seems to occur throughout Cameroon, with some rescued chimps even coming from protected areas." Within Cameroon, a live chimpanzee sells for $100, far below the international black market price, but still a tempting amount for residents of the economically challenged country. Gonder and her co-authors say this study is just the first step, and are now calling for "a broader sample of rescued chimpanzees compared against a more comprehensive grid of geo-referenced samples" in the hopes of revealing what they call "hot spots" of chimpanzee hunting and smuggling routes in Cameroon.

You'll Go Blind: Does Watching Television Close-Up Really Harm Eyesight?

It seems the worst effects are not on one's eyes, and may come from watching too much television, no matter what the distance to the screen.
Dear EarthTalk: Years ago I read that children should be kept at least two feet from the television because of harmful electronic emissions. Is this still relevant? Is there a difference regarding this between older and new flat-screen models? Horst E. Mehring, Oconomowoc, Wisc.

Luckily for many of us and our kids, sitting too close to the TV isn't known to cause any human health issues. This myth prevails because back in the 1960s General Electric sold some new-fangled color TV sets that emitted excessive amounts of radiation, as much as 100,000 times more than federal health officials considered safe. GE quickly recalled and repaired the faulty TVs, but the stigma lingers to this day.

But even though electronic emissions aren't an issue with TVs made any time after 1968 (including today's LCD and plasma flat screens), what about causing harm to one's vision? Dr. Lee Duffner of the American Academy of Ophthalmology isn't concerned, maintaining that watching television screens, close-up or otherwise, won't cause any physical damage to your eyes. He adds, however, that a lot of TV watching can surely cause eye strain and fatigue, particularly for those sitting very close and/or watching from odd angles. But there is an easy cure for eye strain and fatigue: turning off the TV and getting some rest. With a good night's sleep, tired eyes should quickly return to normal.

Debra Ronca, a contributor to the How Stuff Works website, argues that some parents might be putting the cart before the horse in blaming close-up TV watching for their child's vision issues. Sitting close to the television may not make a child nearsighted, but a child may sit close to the television because he or she is nearsighted and undiagnosed, she reports. If your child habitually sits too close to the television for comfort, get his or her eyes tested.

Of course, excessive TV viewing by kids can cause health problems indirectly. According to the Nemours Foundation's KidsHealth website, children who consistently watch TV more than four hours a day are more likely to be overweight, which in and of itself can bring about health problems later. Also, kids who watch a lot of TV are more likely to copy bad behavior they see on-screen and tend to fear that the world is scary and that something bad will happen to them. Nemours also finds that TV characters often depict risky behaviors (like smoking and drinking) and also tend to reinforce gender-role and racial stereotypes. There has also been much debate in recent years on the effects of TV viewing on infants. A 2007 Seattle Children's Research Institute study found that for every hour per day infants spent watching baby DVDs and videos, they learned six to eight fewer new vocabulary words than babies who never watched the videos. But a 2009 study by the Center on Media & Child Health at Children's Hospital Boston found no negative cognitive or other impacts on infants who were exposed to more television rather than less. While it may be inevitable that your kids will watch TV, the key, experts say, is moderation. Limit kids' exposure to screens of any kind, and monitor what they are allowed to watch. As KidsHealth points out, parents should teach their kids that the TV is for occasional entertainment, not for constant escapism.

Early Cometary Bombardment May Explain the Divergent Paths of Jupiter's Biggest Moons

Ganymede and Callisto, the two largest Jovian satellites, appear to have similar origins but have led very different lives. Ganymede and Callisto are the largest of Jupiter's so-called Galilean satellites, the four moons of the giant planet that were discovered 400 years ago, in January 1610, by Italian astronomer Galileo Galilei. Ganymede, the largest moon in the solar system, even bigger than the planet Mercury, boasts its own magnetic field and bears the marks of past tectonic activity. But Callisto, of roughly equal size and with a similar makeup of rock and ice, has neither a magnetic field nor an apparent history of tectonics; the moons' geologic histories have proceeded very differently. Ganymede seems more evolved, so to speak; its constituents appear to have differentiated further than those of Callisto.
Specifically, most of its rock and metal have migrated to the core, whereas those components are more widely distributed throughout Callisto, which appears to host a smaller core as a result. The circumstances that could have led Ganymede to differentiation without fully affecting its sibling moon have been debated for years. One suggestion is that Ganymede's orbital history included a phase in which the moon experienced strong gravitational tides that heated the icy body and allowed the rock and metal to coalesce at its center. In a paper published online Sunday in Nature Geoscience, planetary scientists Amy Barr and Robin Canup of the Southwest Research Institute in Boulder, Colo., propose an alternative scenario: heating by cometary impacts, which should have been plentiful several hundred million years after the moons formed, could have liberated the materials that now constitute Ganymede's core. (Scientific American is part of Nature Publishing Group.) Callisto orbits much farther from Jupiter and so would have endured less bombardment from comets drawn in close by the massive planet's gravitational pull.

Each time a comet strikes an icy satellite, Barr explains, a portion of the moon's surface melts from the heat of the impact; the heavier metallic and rocky constituents mixed in sink to the bottom of the melt pool. With enough impacts providing sufficient melting, the sinking rocks' gravitational potential energy is released as heat, producing more melting, and the separation of rock and ice becomes self-sustaining, a process known as "runaway differentiation." During the solar system's period of intense impacts about 3.8 billion years ago, known as the late heavy bombardment, tremendous amounts of cometary material would have been flying around Jupiter and the outer gas-giant planets. Barr and Canup estimate that because Jupiter acts as something of a gravitational sink, Ganymede's proximity to the planet led to its experiencing double the impacts of Callisto, and at higher velocities, to boot. "Ganymede gets 3.5 times as much energy in the late heavy bombardment as Callisto," Barr says. That energy differential, Barr and Canup realized, could account for Ganymede's much more complete state of differentiation, the so-called Ganymede-Callisto dichotomy. By their calculations, a broad range of starting conditions for the source population of comets could produce Ganymede's full differentiation but stop short of runaway differentiation at Callisto. Importantly, the debris disk described by the so-called Nice model, a popular dynamical simulation for the solar system's evolution, would do the job. "There is a huge range of masses of planetesimal disks that lead to the formation of the dichotomy," Barr says, noting that prior hypotheses for the divergent histories of Ganymede and Callisto required fine-tuning of parameters or worked for only a very narrow set of circumstances. "This fits in with what is already known about dynamical sculpting in the outer solar system, and it works for a broad range of parameters," she says.
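The 3.5-fold energy figure can be unpacked with a line of arithmetic: kinetic energy scales with the square of impact speed, so if Ganymede takes roughly twice as many hits, the remainder of the factor must come from faster impacts. A minimal sketch (the velocity ratio below is back-calculated from the quoted numbers, not a value taken from Barr and Canup's paper):

```python
import math

energy_ratio = 3.5   # Ganymede receives ~3.5x Callisto's bombardment energy (quoted above)
impact_ratio = 2.0   # ...from roughly double the number of impacts (quoted above)

# Energy per impactor scales with velocity squared, so the leftover factor
# implies a mean impact-velocity ratio (a derived, illustrative number).
velocity_ratio = math.sqrt(energy_ratio / impact_ratio)
print(f"Implied mean impact-velocity ratio, Ganymede vs Callisto: ~{velocity_ratio:.2f}")
```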
"And if that's true then there is no later special time for Ganymede to be tidally heated," McKinnon says. The fact that Barr and Canup's model dovetails with a primordial development of the moons' orbits makes it attractive, he adds. The new hypothesis is "a completely plausible explanation," McKinnon says. "What they've shown is that the effect of a strong late heavy bombardment might be the answer." Debt crisis threatens UK science Gabriel Aeppli has spent decades probing the nanostructure of materials, but today it is financial woes that are on his mind. As director of the London Centre for Nanotechnology, Aeppli is responsible for making sure the laboratory continues to bring in funding at the current level of about 10 million (US$16 million) each year. "I am worried, obviously," he says in his soft American accent. "I worry all the time." Like many scientists in Britain, Aeppli has grown more anxious in recent months. Faced with a rapidly growing debt that stood at 800 billion in March 2009, according to government figures, the United Kingdom is poised to make deep cuts in public spending. It is probably not alone: governments across the globe have run up high deficits in recent years (see 'Debt threat'), and economists expect that many will soon be faced with the stark choice of raising taxes or lowering budgets.

In Britain, policy experts say that funding for science will probably decline no matter who takes power in this year's general election, which must be held by 3 June. The cuts will mark the first time in more than a decade that research spending has not grown. It's a familiar worry for an older generation of British scientists who can still remember when a Conservative government led by Margaret Thatcher slashed research spending in the 1980s. Denis Noble, a cardiologist at the University of Oxford, recalls serving on grant committees that had to choose just one out of dozens of applications. "The meetings became pointless really," he says. "At that level of cut you really can't cope."

But British science has flourished under a science-friendly Labour government. Since the late 1990s, UK research has enjoyed strong growth in funding (see 'The British science boom'). The money has supported endeavours such as the Centre for Nanotechnology, a joint venture between Imperial College London and University College London (UCL). The centre was kicked off in 2002 with £14 million from a government grant and the Wellcome Trust, Britain's largest charity, which funds biomedical research. The commitment helped to lure Aeppli, then a senior scientist at the research campus of Japanese electronics manufacturer NEC in Princeton, New Jersey, to head the centre. At the time, other countries were making larger sums of money available for nanotechnology, but Aeppli says he was promised a great deal of independence in setting up his new lab. "That was quite unique," he recalls. "I was just handed [the money] and told to go figure it out." When the centre opened in 2006, it had drawn in additional funding and constructed an eight-storey laboratory in central London. Today the lab houses state-of-the-art scanning tunnelling microscopes and fabrication facilities, which are used to study a wide variety of basic and applied problems in materials science.

The government could afford to finance such projects because the British economy was booming. Throughout the late 1990s and early 2000s, a strong property market and financial sector swelled the treasury's coffers. But the collapse of those areas in 2007 and 2008 has left the UK government short of revenue at a time when it needs it the most. The government has run up a heavy deficit bailing out banks and stimulating consumer spending, and public-spending cuts are the only way to balance its books.

A global problem

Britain's predicament is hardly unique. The United States, Japan and several European nations, including Ireland and Greece, have seen deficits rise sharply as they attempt to jump-start their economies. In December 2009, the newly elected Japanese government proposed cutting back or closing down several large science projects as part of a ¥3-trillion (US$33.7-billion) budget-trimming process (see Nature 462, 835; 2009), sparking outrage in the scientific community; the government subsequently revoked most of the proposed cuts. In the United States, a US$787-billion stimulus package has boosted research in the short term, but few expect to see further increases in this year's budget, and spending will drop after the stimulus runs out. Moreover, falling state budgets and university endowments are taking a heavy toll, says Matt Owens, an associate vice-president for federal relations at the Association of American Universities, a university advocacy group in Washington DC.
"At US research universities, they're looking at significant budget shortfalls in the near future," he warns. The 'near future' may yet be a year or two away for American scientists. But in Britain, cuts will bite within months. The current Labour government has delayed drawing up its next three-year budget until after this summer's election, but a pre-budget report released on 9 December 2009 called for "efficiency savings" in several sectors, including research. In late December, Peter Mandelson, the UK minister for business, innovation and skills, warned that universities would lose 950 million in government support between 2010 and 2013 but he emphasized that the reduction was less than 5% of the total expected over that period. I don't think Watson and Crick could have existed under the current regime. Many universities, including leading research institutions such as Imperial and UCL, are already tightening their belts in preparation. Between July and September 2009, Imperial cut 48 jobs from its faculty of medicine; roughly half were academic posts. And on 14 January, UCL announced that it hopes to trim 3 million 6% of staff costs from its faculty of life sciences. Because much of Britain's research funding is distributed according to a formula based on a university's size and quality, smaller and less-research-intensive universities are expected to be hit harder. Research councils the government funding bodies that provide some 3 billion annually in grants are also likely to feel the pinch. For the past decade, research-council funding has increased steadily. But at a recent debate on science in central London, none of the research ministers of the nation's three major political parties could promise that funding would continue at present levels. One council in particular is already acutely aware of the recession. Since its creation in 2007, the Science and Technology Facilities Council (STFC), which distributes most of Britain's physics and astronomy grants, has been chronically short of cash. Battered by a falling pound, which raises the cost of overseas projects, and a flat budget that has never met its needs, the STFC is facing a 40-million spending deficit, which has forced it to make cuts to research grants. The cuts have also caused the council to rethink its international commitments: it is now planning to withdraw from the Gemini project, which operates twin telescopes in Chile and Hawaii (see Nature 462, 396; 2009). John Womersley, the STFC's director for scientific programmes, sees little relief in sight. The only sure thing, Womersley says, is that after the election ministers will be fighting over spending on big government items such as defence, health and education. "There's likely to be a tough scramble over how the budget will be allocated," he says. "The question is: is science special?" If research funding is to be protected, Womersley believes, scientists will have to present a unified case to all political parties. Some politicians are already talking about cutting back on fundamental research that lacks an obvious application, he says. It will be up to scientists to show the relevance of their work, either by providing a potential link to an area of economic activity or by showing its educational benefits. This emphasis on the broader impact of research is also expected to be a key part of the Research Excellence Framework, a system that will be used to assess university research quality to determine funding levels (see Nature 463, 291; 2010). 
But some researchers are uneasy about making such justifications. Too much emphasis is already being placed on publication rates and economic benefit, says Robert May, a zoologist at the University of Oxford and a former government chief scientific adviser. "I don't think Watson and Crick could have existed under the current regime," he adds.

Back in London, Aeppli is preparing the nanotechnology centre as best he can for the difficult times ahead. He is encouraging his staff to bring in grants from non-governmental sources, including charities and the European Union. He is also steering the centre's research agenda towards issues such as climate and health, areas he believes the government will fund to help improve the economy in the long term. Balancing those projects against fundamental science will be tricky, he admits, but he believes the centre will survive and could even prosper in the difficult years ahead. "We've been planning for this downturn for a while," he says. "I think that if we're working in the right areas, there will be growth."

Translational research: Talking up translation

A short, laminated article has fallen off the wall of press cuttings outside Alan Ashworth's London office. It is an editorial published in 2000 by Britain's widely read and notoriously opinionated tabloid newspaper, The Sun, and it is praising geneticists who study cancer. "They are, without doubt, the most important people alive," it crows. "People like Professor Alan Ashworth and his team at the Institute of Cancer Research are dedicated to the cause." Last year, Ashworth's work was lauded, somewhat more quietly, in an editorial in the New England Journal of Medicine (NEJM)1. "Readers may be surprised by the editors' decision to publish a small early-stage trial," the journal wrote of a study largely based on Ashworth's discoveries, "but this trial not only reports important results, it also points to a new direction in the development of anticancer drugs".

That Ashworth has won praise from these diverse camps is a testament to his record in 'translational' medicine: moving developments in basic science into the doctor's surgery. In just under 15 years, he has gone from early work on the major cancer-risk gene BRCA2 to involvement in the development of a promising cancer drug based on knowledge about the gene's biology. His work "is really the shining example of successful translation of a basic biology idea into successful clinical application", says Julian Downward, from Cancer Research UK's London Research Institute. Ashworth, who prefers the term 'integration' to translation, plans to make his approach a central tenet of the Centre for Molecular Pathology, the London facility he will lead, which is scheduled to break ground early this year and should be completed in 2012 at a cost of £20 million (US$33 million). He has even become something of an integration evangelist, chastising those who do basic cancer research without considering the work's future application. "This integrated approach is something I demand of everybody," he says.

Yet what Ashworth makes look easy, others find extraordinarily hard. He says that the key to his success has been a thorough understanding of basic biology and a commitment to seeing it through to application, plus a sprinkle of serendipity. But others say that Ashworth is also set apart by his drive, his charm and his ability to win over others to build the networks of expertise needed for integration. "One of his big achievements," Downward says, "has been to actually hold together this grouping of people who aren't usually very good at talking to each other."

In the 1980s, when Ashworth started a PhD in biochemistry at University College London, molecular biology was in the "really early phase of gene cloning", he says. It was his flair in this field that led him in the 1990s to Mike Stratton, then at the Institute for Cancer Research in Sutton, UK, who had recently shown2 that BRCA2, which, along with BRCA1, was known to be important for determining breast cancer risk, was localized to a small region of chromosome 13. Ashworth says that he was brought in "as kind of a hired gun" to help clone the gene. And he was confident that he would. Over a bawdy meal in April 1995, Ashworth predicted that the gene would be in hand within a year and scrawled his prediction on a napkin, witnessed by several colleagues. The group didn't need the whole year. In December 1995, Nature reported3 the cloning of BRCA2. Ashworth recalls when the first woman was tested for BRCA2 mutations; she had been scheduled for prophylactic mastectomy because of her family's cancer history. The tests were negative, and the surgery was called off. "It was only then that I realized I could apply what I was good at to patient benefits," he says.

By the end of the 1990s, Ashworth was directing a team at the Institute of Cancer Research in London and had developed a mouse with a mutated Brca2 gene that was highly susceptible to cancer4. Work by several groups showed that the gene is involved in repairing DNA damage by a process called homologous recombination (see graphic). When it is mutated, DNA breaks start to accumulate, increasing the risk that a cell will turn cancerous. "We had to retool essentially and start to understand DNA repair," says Ashworth. In 1999, he was chosen to lead the Breakthrough Breast Cancer Research Centre at the Institute of Cancer Research, where he came into close contact with patients and physicians.

Double jeopardy

Ashworth became interested in the idea that a mutated BRCA gene could not only render cells susceptible to cancer but could also be exploited to target them. The team turned to a concept in genetics called 'synthetic lethality', in which mutations are harmless on their own but together will kill a cell. They theorized that cancer cells bearing a mutated BRCA2 were now reliant on another leg of their DNA repair machinery. Taking out a second repair pathway should bring them to the floor. The opportunity to test the theory presented itself when, as Ashworth puts it: "I met a bloke in a pub and he offered me some drugs." That bloke was Steve Jackson, a DNA-repair researcher from the University of Cambridge, UK. The company he had started, KuDOS Pharmaceuticals, had developed the drug olaparib, which inhibits an enzyme vital for the repair of DNA breaks: poly(ADP-ribose) polymerase, or PARP.
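The synthetic-lethality logic can be written down as a toy rule: a cell survives as long as at least one of its two DNA-repair routes still works. This is a deliberately simplified sketch (real cells have more repair pathways than the two modelled here), not a description of the actual experiments:

```python
# Toy model of synthetic lethality: survival requires at least one working repair route.
# Deliberately simplified; real DNA-repair biology involves many more pathways.

def cell_survives(brca2_functional: bool, parp_active: bool) -> bool:
    homologous_recombination = brca2_functional  # BRCA2-dependent repair of DNA breaks
    parp_mediated_repair = parp_active           # the route olaparib blocks
    return homologous_recombination or parp_mediated_repair

# Healthy cell plus a PARP inhibitor: survives, because the BRCA2 route is intact.
print(cell_survives(brca2_functional=True, parp_active=False))   # True
# BRCA2-mutant tumour cell plus a PARP inhibitor: both routes lost, the cell dies.
print(cell_survives(brca2_functional=False, parp_active=False))  # False
```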
Over a drink, Jackson and Ashworth hit on the idea of testing whether Jackson's 'PARP inhibitor' would take out the second leg in BRCA2 mutant cells (see graphic). "It wasn't months; it was days or weeks after the compounds went down to his lab that they told us about these absolutely stonking results," says Jackson. The team showed in 2005 that BRCA2 mutant cells died when they were hit by Jackson's drug5, and a back-to-back report from Thomas Helleday, then at the University of Sheffield, UK, echoed the finding6.

When it came to testing the drug in people, Ashworth had a head start: olaparib had already passed many of the early safety and regulatory hurdles required for a new drug. But human trials bring other challenges: intellectual property, financing and mountains of regulatory bureaucracy can be enough to smother anyone's translational ambitions. Ashworth, who had access to clinical expertise via the cancer centre and nearby hospital, is sanguine, and says that the key has been to listen to those from different disciplines "so I can understand what they do". Part of Ashworth's success may lie in his air of being a regular, amiable scientist who can seem slightly uncomfortable in a suit. "It may be because he was slightly outside the normal operation of all those things," says Downward. "He's not a clinician, he's not a pharma person."

Killer instinct

Of the 60 participants recruited into the eventual phase I clinical trial, 23 had mutations in one of the BRCA genes, and after treatment, 12 of these people showed a 'clinical benefit'7, such as no progression of their disease for 4 months or longer. The synthetic lethality approach, the new direction in anticancer drugs referred to in the NEJM editorial, looked highly promising, and the cancer community was abuzz. AstraZeneca, which acquired KuDOS in 2006, is now developing the drug, and the results of a phase II trial, presented at the American Society of Clinical Oncology meeting in 2009, showed that more than a third of patients taking the maximum dose saw some improvement in their tumours8. "I've no doubt this approach will work," says Ashworth.

The drug might also work against cancers with other DNA-repair defects. Last year, his team showed that PARP inhibitors were also lethal in cancer cells with mutations in PTEN, one of the genes most commonly disrupted in cancers9. Ashworth cites another example of the integration he seeks between basic and applied research. His team knew that BRCA-mutant tumours develop resistance to the platinum-based drugs such as cisplatin that are a mainstay of treatment; the group went back to the lab to work out whether the PARP inhibitors would run up against the same problem. The resulting paper in Nature10, along with one from another group11, showed that drug resistance arises when the mutant BRCA2 undergoes a deletion, restoring DNA repair. This means that PARP inhibitors might sometimes need to be used with other therapies.

Ashworth's evangelism is persuasive. "It is a fantastic feeling to think that the work you've done in the lab can actually have an output in patients," he says, "and I think many other people want to feel like that." He imagines a future in which all tumours are targeted according to their precise genetic characteristics, a vision that many researchers are now working towards.
At the new Centre for Molecular Pathology, he plans to collect genetic and molecular profiles of patients as they enter trials to work out which people are most likely to benefit from the therapies being tested. He is also heavily involved in running Breakthrough Generations, a study into the genetic and environmental causes of breast cancer that has recruited 100,000 British women and has received £12 million in start-up funding from the Institute of Cancer Research and the Breakthrough Breast Cancer charity. The plan is to collect detailed health information over the next 40 years to improve understanding of the causes and prevention of cancer. Today, he finds some escape in the basic biology. "It's a relief to look at data rather than at higher-level political things," he says. "What really drives me is looking at experiments. I still get very, very excited, some would say too excited, when there's a hint of a good result."

Planetary science: A whiff of mystery on Mars

For the past decade, NASA's mantra for exploring Mars has been to 'follow the water'. The agency based its aqueous obsession on the idea that finding evidence of past or present water would yield clues to whether life once graced the planet or still exists there. Now, some scientists are chanting a new slogan: 'follow the methane'. This small hydrocarbon is tantalizing because much of the methane in Earth's atmosphere is produced by microbes, within soils and inside the guts of cows and other mammals, including humans. So the fact that methane has been discovered on Mars could signal the presence of life somewhere on the red planet. Or not. Methane can also form through geological processes. In either case, the methane findings signal that unknown processes are happening. "Methane is one molecule that really tells us that Mars is a coupled system of the interior, the surface and the atmosphere," says Sushil Atreya, director of the Planetary Science Laboratory at the University of Michigan in Ann Arbor. In November, fans of Martian methane gathered in Frascati, Italy, to puzzle over the new data. They hope to pinpoint where the methane is coming from and why it gets scrubbed from the atmosphere so quickly. Solutions to these conundrums will require better data, so researchers are working out how to sniff out the gas in future missions to Mars.

The excitement over methane started to build in 2003 and 2004, when three groups1, 2, 3 spotted methane in the atmosphere of Mars using spectroscopic measurements from telescopes on Earth and data from the European Space Agency's Mars Express spacecraft. The amounts of methane detected by the teams differed, but the planetary average was estimated to be about 10 parts per billion by volume3, an extremely low level. And the amount changed over time, suggesting that the gas is still being released, even though it might have been produced at some point in the past. In January 2009, the leader of one of the teams, Michael Mumma from NASA's Goddard Space Flight Center in Greenbelt, Maryland, published a paper4 that reanalysed his 2003 and 2006 observations from the Keck telescope and the Infrared Telescope Facility in Hawaii. Mumma's team showed that three neighbouring areas on Mars, Nili Fossae, Terra Sabae and Syrtis Major, were 'hot spots' of methane production during 2003. But by 2006, methane levels had dropped at those sites, signalling that some active process was venting the gas. The quick drop also suggests that something is rapidly destroying the methane.

"For us atmospheric chemists, it is still very difficult to understand how methane can vary so rapidly in time and space," says Franck Lefèvre at the Laboratory of Atmospheres, Environments and Space Observations in Paris. Lefèvre calculated that the atmospheric lifetime of methane is less than 200 days5, hundreds of times shorter than prevailing models of Mars's atmosphere predict. "If the measurements are correct, this means that we are missing something really important," says Lefèvre. Raina Gough, a PhD student at the University of Colorado at Boulder, tried to fill that gap by testing how methane reacts with samples made up to resemble Martian soil. But none of the soil samples removed methane that quickly, she reported at the meeting. Gough's results were among the most important presented there because they "tend to rule out what was thought to be the most likely explanation for the observed methane variations on Mars", Lefèvre says. "The mystery deepens."

Where the gas is going is not the only problem. "The big question really is what is producing the methane," says Atreya. Is it biological or geological, past or present? Although conditions on the surface of Mars are extremely harsh, there may be spots below the surface where microbes could survive. In research in the high Arctic, microbiologist Lyle Whyte from McGill University in Montreal, Canada, has found methane-making microbes within extremely salty pools in the permafrost, the permanently frozen ground. Martian microbes, if they exist, could be making methane in conditions somewhat like this on Mars, he suggests. Alternatively, the methane could be forming geologically as a by-product of a process called serpentinization, which happens on Earth when water reacts with olivine, a mineral that is present on Mars6.
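For a gas destroyed at a constant fractional rate, the abundance decays exponentially, which is why the sub-200-day lifetime Lefèvre derives is so striking: it would erase a plume almost completely between observing seasons, whereas a conventional centuries-long photochemical lifetime would barely touch it. A quick illustration (the three-year gap simply spans the 2003 and 2006 observations; the long comparison lifetime is an illustrative stand-in for the standard models mentioned above):

```python
import math

def fraction_remaining(elapsed_days, lifetime_days):
    """First-order loss: N(t)/N(0) = exp(-t / lifetime)."""
    return math.exp(-elapsed_days / lifetime_days)

gap_days = 3 * 365        # roughly the span between the 2003 and 2006 observations
short_tau = 200           # Lefevre's upper bound on the lifetime, in days
long_tau = 300 * 365      # illustrative centuries-long photochemical lifetime

print(f"lifetime of 200 days:   {fraction_remaining(gap_days, short_tau):.4f} of the methane remains")
print(f"lifetime of ~300 years: {fraction_remaining(gap_days, long_tau):.4f} of the methane remains")
```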
Another debate that dominated the meeting revolved around the location of the methane sources. Vittorio Formisano and Anna Geminale from the Institute of Physics of Interplanetary Space in Rome have tried to pinpoint the seasonal bursts using the Planetary Fourier Spectrometer on the Mars Express spacecraft. Their findings suggest that trace amounts of methane are released from the northern polar cap when its surface melts during summer. The Keck data argue otherwise, say Mumma and Geronimo Villanueva, also at Goddard. The water released from the north pole in summer does not contain any methane, they say. And although the pair found their first three methane hot spots in the northern hemisphere, they have also identified methane releases in the southern part of the planet during that hemisphere's spring. The team is now using the Keck telescope to pinpoint the sources to regions as small as 80 kilometres across, down from the original best resolution of 500 kilometres. The reason for the different conclusions could stem from the methods the two teams used. The Mars Express data are averaged over many thousands of spectra, and over time, whereas the Keck maps are snapshots.

The disagreement, and the failure of atmospheric models to explain the methane measurements, has meant that some researchers remain sceptical about the interpretation of the data. Todd Clancy at the Space Science Institute in Boulder is particularly troubled by the fluctuations in time and space that Mumma reports. "These variations are unphysical for any plausible photochemical processes active in the Mars atmosphere," says Clancy. Yet, he says, if the observations are right, "the existence of such methane in the current Mars atmosphere would be profound on so many levels as to constitute the highest priority for planetary mission studies". With so many questions swirling around the issue of methane, researchers agree that only new missions to Mars will provide the answers (see graphic).

Chasing a gas

First up is NASA's Mars Science Laboratory, which is due to launch next year. This car-sized rover will carry a tunable laser spectrometer, which should be able to answer one of the burning questions about Martian methane: what is the isotopic make-up of the carbon? If the carbon is mostly the lightest stable isotope, carbon-12, this could hint at a biological origin. In 2016, NASA and the European Space Agency are scheduled to send an orbiter to monitor trace gases. At the Frascati meeting, Richard Zurek, chief scientist for the Mars programme at NASA's Jet Propulsion Laboratory in Pasadena, California, described this newly selected mission, which would be the first to look expressly for methane and other trace gases in the Martian atmosphere. The mission has methane enthusiasts salivating, and the call for instrument proposals has just gone out. For researchers with questions about the methane data, the 2016 mission should provide some firm answers. "It will measure the level of methane on Mars unambiguously and determine whether it really is seasonally and latitudinally variable," says David Catling from the University of Washington in Seattle.

But some researchers are looking beyond the planned missions and are drawing up proposals for other ways to study the gas. Mumma would like to find active vents and watch them closely to see how methane releases change over time. He has proposed the Mars Organics Observer, which would sit at Mars's first Lagrange point, a special orbit around the Sun that is locked to the movement of Mars by the gravity of both bodies. The telescope could monitor the red planet every day and locate methane bursts with a resolution of 10 kilometres. Others have placed drilling operations at the top of their wish lists. "Most astrobiologists believe that the best hope for detecting microbial life will be in the subsurface," says Whyte. Researchers are also dreaming up other schemes for tracing methane sources. Atreya and Paul Mahaffy from Goddard propose using a balloon to measure gases in the Martian atmosphere.
Joel Levine, at NASA Langley Research Center in Hampton, Virginia, is developing a Mars plane that could soar over the planet and monitor methane releases from the surface. But those not directly involved in methane research caution about leaping on to the gas bandwagon. "We know that methane can be made by processes that have nothing to do with biology," says David Stevenson from the California Institute of Technology in Pasadena. Water should remain the research priority, he says. Lefèvre, however, says he is happy that mission planners are hearing the wishes of methane enthusiasts. "Atmospheric chemistry has rarely been the trendiest topic in Mars science, but this has completely changed since the discovery of methane," he says. "We are now designing space missions entirely devoted to the detection of trace species, which was impossible to imagine a few years ago."

Literature mining: Speed reading

Scientists are struggling to make sense of the expanding scientific literature. Corie Lok asks whether computational tools can do the hard work for them. In 2002, when he began to make the transition from basic cell biology to research into Alzheimer's disease, Virgil Muresan found himself all but overwhelmed by the sheer volume of literature on the disease. He and his wife, Zoia, both now at the University of Medicine and Dentistry of New Jersey in Newark, were hoping to test an idea that they had developed about the formation of the protein plaques in the brains of people with Alzheimer's disease. But, as newcomers to the field, they were finding it almost impossible to figure out whether their hypothesis was consistent with existing publications. "It's really difficult to be up to date with so much being published," says Virgil Muresan. And it's a challenge that is increasingly facing researchers in every field. The 19 million citations and abstracts covered by the US National Library of Medicine's PubMed search engine include nearly 830,000 articles published in 2009, up from some 814,000 in 2008 and around 772,000 in 2007. That growth rate shows no signs of abating, especially as emerging countries such as China and Brazil continue to ratchet up their research.

The Muresans, however, were able to make use of Semantic Web Applications in Neuromedicine (SWAN), one of a new generation of online tools designed to help researchers zero in on the papers most relevant to their interests, uncover connections and gaps that might not otherwise be obvious, and test and generate new hypotheses. "If you think about how much effort and money we put into just Alzheimer's disease research, it is surprising that people don't put more effort into harvesting the published knowledge," says Elizabeth Wu, SWAN's project manager. SWAN attempts to help researchers harvest that knowledge by providing a curated, browseable online repository of hypotheses in Alzheimer's disease research. The hypothesis that the Muresans put into SWAN, for example, was that plaque formation begins when amyloid-β, the major component of brain plaques, forms seeds in the terminal regions of cells in the brainstem that then nucleate the plaques in the other parts of the brain into which the terminals reach. SWAN provides a visual, colour-coded display of the relationships between the hypotheses, as derived from the published literature, and shows where they may agree or conflict. The connections revealed by SWAN led the Muresans to new mouse-model experiments designed to strengthen their hypothesis. "SWAN has advanced our research, and focused it in a certain direction but also broadened it to other directions," says Virgil Muresan.

The use of computers to help researchers drink from the literature firehose dates back to the early 1960s and the first experiments with techniques such as keyword searching. More recent efforts include the striking 'maps of science' that cluster papers together on the basis of how often they cite one another, or by similarities in the frequencies of certain keywords. As fascinating as these maps can be, however, they don't get at the semantics of the papers: the fact that they are talking about specific entities such as genes and proteins, and making assertions about those entities (such as 'gene X regulates gene Y'). The extraction of this kind of information is much harder to automate, because computers are notoriously poor at understanding what they are reading. Even so, informaticians and biologists are working together more and making considerable progress, says Maryann Martone, the chairwoman of the Society for Neuroscience's neuroinformatics committee. Recently, a number of companies and academic researchers have begun to create tools that are useful for scientists, using various mixtures of automated analysis and manual curation (see Table 1, 'Power tools').

Deeper meaning

The goal of these tools is to help researchers analyse and integrate the literature more efficiently than they can do through their own reading, to home in on the most fruitful experiments to do and to make new predictions of gene functions, say, or drug side effects. The first step towards that goal is for the text- or semantic-mining tool to recognize key terms, or entities, such as genes and proteins.
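In its simplest form, entity recognition is a dictionary lookup over the text. The sketch below is purely illustrative (the tiny dictionary and the sample sentence are made up; real tools such as Reflect, described next, rely on curated dictionaries of millions of terms and handle synonyms and ambiguity):

```python
import re

# Toy gene/protein dictionary; real mining tools use curated lists of millions of terms.
dictionary = {"BRCA2", "PARP", "calpain 3", "parvalbumin B"}

def tag_entities(text, terms):
    """Return (term, start, end) for every dictionary term found in the text."""
    hits = []
    for term in terms:
        for match in re.finditer(re.escape(term), text, flags=re.IGNORECASE):
            hits.append((term, match.start(), match.end()))
    return sorted(hits, key=lambda hit: hit[1])

abstract = "We show that calpain 3 interacts with parvalbumin B in skeletal muscle."
print(tag_entities(abstract, dictionary))
# [('calpain 3', 13, 22), ('parvalbumin B', 38, 51)]
```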
For example, academic publisher Elsevier, headquartered in Amsterdam, has piloted Reflect in two recent online issues of its journal Cell. The technology was developed at the European Molecular Biology Laboratory in Heidelberg, Germany, and won Elsevier's Grand Challenge 2009 competition for new tools that improve the communication and use of scientific information. Reflect automatically recognizes and highlights the names of genes, proteins and small molecules in the Cell articles. Users clicking on a highlighted term will see a pop-up box containing information related to that term, such as sequence data and molecular structures, along with links to the sources of the data. Reflect obtains this information from its dictionary of millions of proteins and small molecules. Such 'entity recognition' can be done fairly accurately by many mining tools today. But other tools take on the tougher challenge of recognizing relationships between the entities. Researchers from Leiden University and Erasmus University in Rotterdam, both in the Netherlands, have developed software called Peregrine, and used it to predict an undocumented interaction between two proteins: calpain 3, which when mutated causes a type of muscular dystrophy, and parvalbumin B, which is found mainly in skeletal muscle. Their analysis found that these proteins frequently co-occurred in the literature with other key terms. Experiments then validated that the two proteins do indeed physically interact (H. H. van Haagen et al. PLoS One 4, e7894; 2009).

Development role

At the University of Colorado in Denver, bioinformatician Lawrence Hunter and his research group have developed a tool called the Hanalyzer (short for 'high-throughput analyser'), and have used it to predict the role of four genes in mouse craniofacial development. They gathered gene-expression data from three facial tissues in developing mice and generated a 'data network' showing which genes were active together at what stage of development, and in which tissues. The team also mined relevant abstracts and molecular databases for information about those genes and used this to create a 'knowledge network'. Using both networks, the researchers homed in on a group of 20 genes that were upregulated at the same time, first in the mandible (lower jaw area) and then, about 36 hours later, in the maxilla (upper jaw). A closer look at the knowledge network suggested that these genes were involved in tongue development, because the tongue is the largest muscle group in the head and is in the mandible. Further analysis led them to four other genes that had not been previously linked to craniofacial muscle development but that were active in the same area at the same time. Subsequent experiments confirmed that these genes were also involved in tongue development (S. M. Leach et al. PLoS Comput. Biol. 5, e1000215; 2009). "I don't see that there is any way that somebody staring at the data or using existing tools would have ever come up with this hypothesis," says Hunter.

Although extracting entities and the relationships between them is a common approach for literature-mining tools, it is not enough to pull out the full meaning of research papers, says Anita de Waard, a researcher of disruptive technologies at Elsevier Labs in Amsterdam. Scientific articles typically lay out a set of core claims, together with the empirical evidence that supports them, and then use those claims to argue for a conclusion or hypothesis. "Generally that's where the real, interesting science is," de Waard says. Capturing the higher-level argument is an even more difficult task for a computer, but a small number of groups, such as the SWAN group, are trying to do so.
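A toy version of the kind of claim network a system like SWAN curates makes the idea concrete: each hypothesis is linked to literature-derived claims flagged as consistent or conflicting, and a quick scan of the flags shows which hypotheses are controversial. The hypothesis and claims below are placeholders, not entries from the real SWAN repository:

```python
# Toy claim network in the spirit of SWAN: hypotheses linked to claims, each claim
# flagged as consistent or conflicting. Placeholder content, not real SWAN data.
network = {
    "Hypothesis: plaque seeding begins in brainstem terminals": [
        {"claim": "Amyloid seeds are reported in brainstem cell terminals", "relation": "consistent"},
        {"claim": "Earliest plaques are reported in cortex, not brainstem", "relation": "conflicting"},
    ],
}

for hypothesis, claims in network.items():
    conflicts = sum(c["relation"] == "conflicting" for c in claims)
    status = "controversial" if conflicts else "well supported"
    print(f"{hypothesis} -> {status} ({len(claims)} linked claims, {conflicts} conflicting)")
```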
The SWAN website, which opened to the public in May 2009, was developed by two Boston-based groups, the Massachusetts General Hospital and the Alzheimer Research Forum, a community and news website for Alzheimer's researchers. For each hypothesis in the system, SWAN shows the factual claims that support it, plus links to the papers supporting each claim. Because claims from the various hypotheses are linked together in a network, a user can browse from one to the next and see the connections between them. The visualization tool uses a red icon to show when two claims conflict and a green icon to show when they're consistent, allowing the user to see at a glance which hypotheses are controversial and which are well supported by the literature (see graphics, above). At the moment, this information is unlikely to surprise experts in Alzheimer's disease. In its current stage of development, SWAN may be more useful for newcomers trying to get up to speed on the subject. Beneficiaries could include more established scientists such as the Muresans who want to move into a different field, or researchers at a pharmaceutical or biotech company who have just been put on an Alzheimer's disease project.
Building up
SWAN also has scalability issues. The vast majority of the hypotheses, claims and literature links in SWAN have been annotated and entered by the site's curator, Gwen Wong, with the help of authors. This curation is a painstaking process that has so far produced only 1,933 claims and 47 fully annotated hypotheses. But the intent is for these early hand-curation efforts to set a 'gold standard' for how the SWAN knowledge base should be built by the community as a whole. The SWAN developers plan to improve the user interface to encourage scientists to submit their own hypotheses, post comments and even do some of the curation themselves. The need for some level of manual curation is common to the various literature tools, and limits their scalability. The SWAN team is working to automate parts of the curation process, such as extracting gene names. Elsewhere, de Waard and other researchers are investigating ways of automatically recognizing hypotheses, for example by looking for specific word patterns. For most of these tools, however, curation is unlikely to become fully automated. "Literature mining is hard to do in a way that is both high scale and high accuracy," says John Wilbanks, director of Science Commons, a data-sharing initiative in Cambridge, Massachusetts. Developers say a more likely solution, at
least in the short term, is that papers will have to be curated and annotated through some combination of automated tools, professional curators and the papers' authors, who might, for example, be prevailed on to write their abstracts in a more structured, machine-readable form.
The right people
Are authors willing to add to the already arduous task of writing an article? And are authors even the best people to do this job? The journal FEBS Letters experimented in 2009 with structured digital abstracts to see how authors would respond and perform in shaping their own machine-readable abstracts. The results were not encouraging. Authors presented their abstracts about protein-protein interactions as structured paragraphs describing entities, the relationships between the entities and their methods using specific, simple vocabularies (for example, 'protein A interacts with protein B'). But the curators of a protein database didn't accept them, says de Waard. "Authors are not the right people to validate their own claims," she says. The community (referees, editors, curators, readers at large) is still needed. This could be a business opportunity for the publishers, says Wilbanks: they could curate and mark up their publications for text and semantic mining and provide that as a value-added service. "There's a lot of business out there for the publishers, but it's not the same business," says Allen Renear, associate dean for research at the Graduate School of Library and Information Science at the University of Illinois at Urbana-Champaign. "If they keep making PDFs, that's not going to work for them. They have to get into more of the semantic side of this." Perhaps the largest challenge is getting scientists to use these tools. It will be up to the developers to demonstrate the benefits and make their wares easy to use. That's going to be difficult, says Hunter. Academic informaticians are rewarded more for coming up with new algorithms, and less for making their programs usable and widely adoptable by biomedical scientists, he says. Only a few tools are being developed by companies for more widespread use. Major issues that all technology developers will need to tackle are transparency, provenance and trust. Scientists won't trust what a computer is suggesting in terms of new connections or hypotheses if they don't know how the results were generated and what the primary sources were. "We as informaticians are going to have to take on these more user-driven and less technology-driven problems," says Hunter. Even if researchers do start to trust the new tools, it's not clear how much of their reading they will delegate. "As reading becomes more effective," says Renear, "some people have speculated that we won't do as much because we'll get done what we need to do sooner." Or, he says, "it may be that we'll do more reading because it's more valuable. Which one is true is actually an empirical question." Analysing articles in new ways leads to the larger question of whether the articles themselves should change in structure. If an article is to be boiled down into machine-readable bits, why bother writing whole articles in the first place? Why don't researchers just deal with statements and facts, and distribute and mash them up to generate hypotheses and knowledge? "Human commentary and insight are still extraordinarily valuable," says Martone. "Those insights don't immediately fall out of data without human ingenuity.
So you need to be able to communicate that and that generally means building an argument and a set of supporting claims. These things are not going to go away any time soon."
Colorizing Dinosaurs: Feather Pigments Reveal Appearance of Extinct Animals
Long left to the imagination, the coloration (and origin) of feathered dinosaurs and ancient birds has begun to be revealed through fossilized organelles. For nearly two centuries, people have struggled to imagine what the great extinct dinosaurs looked like. Thanks to modern paleontology and physiology, their shapes, masses and even how they might have moved and interacted have been deduced. But one of the most basic questions about their appearance, their coloring, seemed unanswerable. A new study, however, proposes some of the first cellular hints. Extrapolating from primitive pigment-giving organelles known as melanosomes (which contain the coloring compound melanin and are still prevalent in modern animals) that have been found in fossilized dinosaur feathers from the Cretaceous period, a research team paints a picture of dark wings and brightly striped reddish tails. The findings will be detailed in the January 28 issue of Nature. (Scientific American is part of Nature Publishing Group.) Using high-powered scanning electron microscopy, the researchers examined 125-million-year-old feathers found in the Jehol group, a geologic formation in northeastern China. One of the animals analyzed, the Sinosauropteryx, a small, meat-eating dinosaur, appears to have had alternating bands of dark and light along its tail. "In this case at least, the dark band was [a] russet, gingery color," Michael Benton, a professor of vertebrate paleontology at the University of Bristol in England and co-author of the study, said in a press conference Tuesday in London. "It's the first time anybody has had evidence of original color."
A prevalent pigment
The discovery of this pigment in ancient dinosaur and bird feathers did not begin at a dinosaur dig, but rather in a lab studying fossilized squid. Jakob Vinther, now a PhD student in paleontology at Yale University who was not involved with the new study, was examining preserved ink sacs from Jurassic-period squid when he found that the fossilized melanosomes appeared identical to those in modern-day squid ink. The similarity led him to propose that "there must be melanosomes preserved in other kinds of structures" in other animals. "So I said, 'It would be interesting to look at bird feathers,'" he said in a phone conversation. In these fossilized feathers he saw the very same structures. After bringing his findings to the attention of his graduate supervisor, Vinther was told that he had only found a common bacterium that had long been known to exist in these feather fossils. But he was not convinced. After studying feathers with a known color pattern, he found that the dark-pigment particles appeared only where the black bands did, and in the fossils they aligned perfectly with the individual feather filaments, an unlikely arrangement for bacteria. In 2008 Vinther and colleagues published a paper in Biology Letters describing the find, in which they proposed, "The discovery of preserved melanosomes opens up the possibility of interpreting the color of extinct birds and other dinosaurs." It was no great stroke of luck that these pigment particles were still left in the fossilized feathers.
In fact, they are what had been fossilized from the feather, says Vinther: "The reason that you have a fossil feather is because there are pigments. So if you have a white feather, it would not leave a fossil." What does seem like good fortune, at least to the researchers studying these ancient particles, are their shapes. "We're extraordinarily lucky that each of the pigments of the melanosomes are contained in a different-shaped organelle," Benton said at the press briefing. The melanosomes that have been found no longer exhibit their original colors because their chemical properties have changed in the intervening millions of years. But researchers can still ascertain the extinguished hues by looking at the particles' sizes and shapes, which correspond to those in living animals. Those that are sausage-shaped are black; rounder, smaller ones are responsible for red, rusty tints.
Cautious coloring
Although Benton and his colleagues have proposed some specific coloration patterns, Vinther is not convinced that paleontologists are quite ready to pick up a paintbrush. It is not yet known how different coloring particles work in combination, and in their original structure in feathers, to create a perceivable hue. "We are getting much closer to putting colors on dinosaurs," he says, but we are not quite there yet. And although more fossil sampling will be necessary, he says, the real missing data has been right in front of us the whole time: "We need to know more about modern birds" because birds are living dinosaurs. Benton and his team admitted that despite the artistic representations published alongside their paper, "We can't say for sure [these are] all the colors, because there are other coloring agents other than melanosomes that may not be preserved with the fossils," he said. "Melanin is quite a tough protein," Benton pointed out at the press conference. But other coloring particles might not have survived these millions of years. Artistic representations of extinct animals often borrow
from extant animals, taking into account their habitat camouflaging as well as possible display characteristics, so, as Benton noted, artistic representations are "not completely fatuous." Vinther and his colleagues continue to search for more clues about coloration. By studying the structures of melanosomes' arrangement in more recent feathers, they were able to deduce when feathers would have iridescence, lending another level of sheen, and even purples, blues and greens to some surfaces. In a 2009 paper in Biology Letters, the team described evidence of iridescence they found on fossilized middle Eocene epoch feathers based on these structures, but older bird and dinosaur feathers have yet to yield any shiny clues. Given these organelles' prevalence, their discovery in fossilized feathers will likely open the door to a rush of new coloration studies. "We can definitely take this to look at skin, as well," Vinther says. As Benton noted, however, although the pigmentation is widespread in reptile and fish scales today, it is unusual to find remnants of organic material in dinosaur fossils. What we know about their skin is generally deduced from imprints left in surrounding material, which would not contain the organelles needed to predict coloration.
Colorful clues
In the meantime, even if only the wings and tails of certain feathered dinosaurs and early birds can be colored in, it will help researchers deduce much more about extinct animals' ecology and behavior, and also about other physiological details such as vision. As Vinther notes, "If you do find spectacular colors in these animals, then you know they had an ability to see more colors than just black and white." Beyond coloration, the presence of these melanosomes in early Cretaceous dinosaurs helps to confirm the presence of protofeathers in some dinosaurs, which some have argued were simply connective tissue. The Sinosauropteryx that they studied, one of the earliest dinosaurs to have featherlike structures, had what Benton described as "a very clear rim of feathers running down the head, down the back and along the tail." These adornments were not feathers in the modern sense, such as those found on modern turkeys and peacocks. "These are very simple structures," Benton said. "They're sort of bristles," which were four to 10 millimeters long. But these unassuming bristles "really are feathers," he said. "In terms of the sequence of evolution of feathers, we can now say they start as simple bristles," Benton noted. If this is the case, it would support the idea that the structures that were to become feathers originally developed not for flight but rather for display purposes. "That's a display function," Benton said. "It's clearly not for flight."
Apple introduces the iPad and iBooks
What do you know? McGraw-Hill CEO Harold McGraw was on the money yesterday when he said Apple would announce a tablet on Wednesday. The iPad now has officially arrived, weighing in at less than a kilogram and just over a centimeter thick, with a 25-centimeter LED-backlit display. It will be available by the end of March with a price tag starting at $499. The iPad, which marries a tablet computer with an electronic book reader, is less about cutting-edge technology and more about clever ways to package that technology to deliver a variety of content.
Tablets have been available for nearly a decade (the concept of an electric clipboard goes back even further) and the e-reader market already has several success stories, most notably the Amazon Kindle. From Apple's description, the iPad is essentially a continuation of the multimedia and communications technology the company has been making since it introduced the iPod in 2001. Now, in addition to having a mobile, multi-touch screen device for listening to music, playing video games, e-mailing and surfing the Web, Apple fans will also be able to read electronic books with the help of the new iBook application (books bought through the new iBookstore, of course). The book that Apple CEO Steve Jobs demonstrated during his presentation at the iPad launch event on Wednesday costs $15. Jobs made sure to acknowledge Amazon's role in popularizing the e-reader, adding that Apple would now "stand on their shoulders" and go farther, TechCrunch reported. Tablet PCs have been a hard sell, which is likely why Apple chose not to use "tablet" in its new product's name. Tablets introduced by Acer, Compaq and others in the 2001 timeframe had some interesting features in their time, including handwriting recognition that could (usually) convert stylus scribbles into readable text or could display digital doodles and diagrams. But that generation of tablets (which sold for $1,500 or more) was primarily designed to use the Windows operating system and function as a next-generation laptop rather than a multimedia device for reading books, playing video games and watching video. Microsoft has taken up with Apple's competition in the emerging market for e-reader/mobile multimedia devices. During his keynote at the Consumer Electronics Show, Microsoft CEO Steve Ballmer demonstrated a number of new "Slate PC" devices running Windows 7, which will go on sale later this year. Ballmer called these Slate PCs (there were devices from Hewlett-Packard, Archos and Pegatron) "almost as portable as a phone and as powerful as a PC running Windows 7." The Archos 9 PC tablet sells for $550, but the HP and Pegatron devices exist only as prototypes at this time. Apple also appears to continue to respond to critics such as Greenpeace, which has cautioned consumer electronics vendors against the use of toxic materials when making their gadgets. Jobs also pointed out that the iPad is free of arsenic, BFR, mercury and PVC, and is "highly recyclable." Every iPad has Wi-Fi and some models will also offer 3G connectivity to the Web (those models start shipping by the end of April and will cost as much as $829). Battery life is expected to be up to 10 hours. Under the hood, the iPad runs on a 1-gigahertz processor called the A4 and can have up to 64 gigabytes of storage. Data plans are available through AT&T without iPad users having to sign a contract.
Icy hunt for old air
"We're checking out history books made of ice," says Kendrick Taylor. A palaeoclimatologist at the Desert Research Institute in Reno, Nevada, Taylor is the chief scientist of the West Antarctic Ice Sheet (WAIS) Divide drilling project, which is now three-quarters of the way towards pulling up the most temporally precise record of carbon dioxide for the past 100,000 years. The highly anticipated ice core promises to improve climatologists' understanding of the dynamic global climate system, and has already begun to illuminate how humans can affect it. On 25 January, drillers finished the season at a depth of 2,561 metres, about a kilometre off the project's final goal.
Relentless winds and poor visibility at the WAIS Divide camp, about 1,170 kilometres from the South Pole, had permitted only 35 days of drilling this year. But bad weather is precisely why researchers chose this desolate stretch of ice. Snowfall accumulation here, about half a metre per year, is an order of magnitude greater than at the sites of other Antarctic cores covering the same time span. The heavy precipitation produces annual layers of ice 22 centimetres thick near the surface, allowing palaeoclimatologists to collect season-by-season data further back in time than any other Antarctic core. The drill site is perched atop an ice divide that sends ice flowing in opposite directions. There is little lateral flow on the divide itself, ensuring that ice sampled here has not travelled in from elsewhere on the ice sheet and scrambled the climate record. "This is simply the best spot on the planet to get the record we're looking for," says Taylor. "This is where the library is." That record should yield annual data for the past 40,000 years. It will extend back another 60,000 years, but annual layers cannot be reliably identified for that period because the weight of overlying ice has compressed them, making them too thin. Thirty-seven investigators lead 27 projects at the WAIS Divide, funded by the US National Science Foundation, studying everything from trace elements in the ice to the potential for life deep in the core. The bubbles in the ice are a special prize. Although temperatures at the camp are well below freezing, four
refrigeration units in the drilling facility keep the cores below -20 °C, the temperature above which the bubbles can leak out. Their contents are precious: they trap air from the time of snowfall, offering snapshots of ancient atmospheric conditions. Last November, for instance, palaeoclimatologist Richard Alley and his colleagues at Pennsylvania State University in University Park reported the isotopic composition of methane extracted from the top layers of the WAIS core, representing the past 1,000 years [1]. They found an increase in heavy methane, a byproduct of biomass burning and other human activity, around the sixteenth century, which Alley attributes to Native Americans razing forests to expand their territory as the Americas experienced a population boom. This peak recedes as quickly as it rises, corresponding to widespread deaths as Europeans arrived with their diseases and weapons. The imprint shows up in the ice record because methane circulates around the globe in a matter of years. Yet of all the gases within the WAIS ice, Alley sees the most promise in CO2. "We're going to get the highest-resolution, best-dated CO2 record ever," he says. "That's what gets me really excited." High-resolution ice cores have been extracted before: Greenland's GRIP and GISP2 cores, drilled in the early 1990s, have the same season-by-season detail as those from the WAIS Divide, and have been the gold standard for palaeoclimatology in the Northern Hemisphere. But dust blown in from exposed land nearby interacted with acids in Greenland's ice to produce extra CO2, which has stymied attempts to establish a reliable CO2 record there. Most of what is known about past CO2 levels thus comes from the dust-free Antarctic ice. "WAIS will combine the high time resolution we see in Greenland with a record only retrievable in Antarctica," Taylor says.
A question of timing
The gas record may help researchers to better understand the precise timing of past increases in CO2 and temperature. Palaeoclimatologists already know that these changes have taken place roughly in step in the past, but which rises first, the thermometer or the greenhouse gas? "All the analysis everyone has done suggests that CO2 lags temperature on timescales of several hundred years," says Ed Brook, a palaeoclimatologist at Oregon State University in Corvallis. The rising CO2 presumably acts as an amplifier to drive the temperature up further, but the margin of error in deep Antarctic cores is too large to nail down the timing of that relationship precisely. "The uncertainty is of the same order as the actual lag," says Brook. "WAIS Divide should help us solve that problem." The CO2 record could also help to solve a recently discovered climate puzzle. Other cores have shown that past temperature changes in the Arctic are inversely coupled to changes in the Antarctic, such that hot and cold periods 'seesaw' between the poles [2, 3]. One driver is the oceanic 'Atlantic conveyor belt', in which cool, salty water sinks in the North Atlantic and flows southward. Abrupt changes in heat or salinity can disrupt the flow; warm periods in the Arctic, for example, are thought to have led icebergs to dump fresh water into the North Atlantic, decreasing the salinity and density of northern seawater and preventing it from sinking. The cold water stays in the Arctic, cooling it down, while Antarctica does not receive the cold water it normally would, shifting southern temperatures up. That, at least, is the theory.
But temperature can be exchanged between the poles through other channels, including the atmosphere, which moves heat faster than ocean currents do, says Bo Vinther, a postdoc at the University of Copenhagen, who has worked on coring projects in Greenland and Antarctica. If palaeoclimatologists see long lags between temperature changes at the poles when they stack the WAIS cores up against those from Greenland, that evidence would suggest a lead role for the ocean in the bipolar seesaw. But a shorter lag would suggest a more prominent role for the atmosphere. The CO2 record will help to answer this question, too: CO2 from deep in the ocean, where large masses of organic matter have decomposed, has a different isotopic signature from terrestrial or atmospheric CO2. Heavier CO2 in the WAIS record would suggest that ocean circulation bringing up deep water was largely responsible for the bipolar seesaw. With about a kilometre of ice left to pull up, Taylor is hopeful that the core will be complete by the end of the next drilling season. But the final metres can be the toughest to dig up because the drill's descent grows longer with each segment of extracted core. "We can get it all done next season if everything goes perfectly," he says. "But this is Antarctica. Every day is a surprise."
Mixed Impressions: How We Judge Others on Multiple Levels
Researchers are developing a new understanding of how we judge people. We've all heard that people favor their own kind and discriminate against outgroups, but that's a simplistic view of prejudice, says Amy Cuddy, a professor at Harvard Business School who studies how we judge others. In recent years she and psychologists Susan Fiske of Princeton University and Peter Glick of Lawrence University have developed a powerful new model. All over the world, it turns out, people judge others on two main qualities: warmth (whether they are friendly and well intentioned) and competence (whether they have the ability to deliver on those intentions). A growing number of psychological researchers are turning their focus to this rubric, refining it and looking for ways in which we can put this new understanding of first impressions to use. When we meet a person, we immediately and often unconsciously assess him or her for both warmth and competence. Whereas we obviously admire and help people who are both warm and competent, and feel and act contemptuously toward the cold and incompetent, we respond ambivalently toward the other blends. People who are judged as competent but cold, including those in stereotyped groups such as Jews, Asians and the wealthy, provoke envy and a desire to harm, as violence against these groups has often shown. And people usually seen as warm but incompetent, such as mothers and the elderly, elicit pity and benign neglect. New research is revealing that these split-second judgments are often wrong, however, because they rely on crude stereotypes and other mental shortcuts. Last year psychologist Nicolas Kervyn and his colleagues published studies showing how we jump to conclusions about people's competence based on their warmth, and vice versa. When the researchers showed participants facts about two groups of people, one warm and one cold, the participants tended to assume that the warm group was less competent than the cold group; likewise, if participants knew one group to be competent and the other not, they asked questions whose answers confirmed their hunch that the first group was cold and the second warm.
The upshot: "Your gain on one [trait] can be your loss on the other," says Kervyn, now a postdoctoral researcher at Princeton. This compensation effect, which occurs when we compare people rather than evaluating each one separately, runs counter to the well-known halo effect, in which someone scoring high on one quality gets higher ratings on other traits. But both effects are among several mistakes people often make in inferring warmth and competence. We see high-status individuals as competent even if their status was an accident of birth. And when we judge warmth, rivalry plays a role: "If someone is competing with you, you assume they're a bad person," Cuddy says. The good news is that if you belong to a stereotyped group or otherwise know how people see you, you can try changing your image. A competent politician who strikes the public as cold, for example, can draw on his warmth reserves to better connect with voters. After all, Cuddy points out, "Everybody comes across as warm or competent in some area of their lives."
A softer ride for barefoot runners
Barefoot endurance runners may have a more cushioned ride than most people who run in shoes, according to a biomechanical analysis. Barefoot long-distance running has been a small but growing trend in the athletics community, driven in part by popular books and articles that maintain that runners might get fewer injuries if they ran the way humans evolved to run, without supportive, cushioned running shoes. But although the idea is seductive, with running shoe companies recently marketing 'minimal' footwear, there has been little evidence to support the claims.
Now, evolutionary biologist Daniel Lieberman at Harvard University in Cambridge, Massachusetts, has found crucial differences in the way that barefoot and shod runners land, with dramatic consequences for how the body takes the impact (see Nature's video). "The first time we had a long-distance barefoot runner come to the lab and run across the force-plate, I was amazed," says Lieberman. Whereas most runners hit the ground with a sudden collision force as the body abruptly decelerates, says Lieberman, this barefoot runner made no such impact.
Hard impact
Lieberman found that long-distance runners who usually wear shoes, in both the United States and in Kenya, tend to land directly on their heels, abruptly bearing the full force of the impact. The force of the collision, even with a cushioned sole, was equivalent to up to three times their bodyweight. The force could be linked to common running injuries such as stress fractures and plantar fasciitis, although this has yet to be demonstrated. Americans and Kenyans accustomed to running barefoot, however, tend to strike the ground with the ball of their feet before touching down the heel (a forefoot strike), allowing the tendons and muscles in the foot and lower leg to act as shock absorbers, bringing the impact force down to 60% of their bodyweight. The team's research is published in Nature [1]. "The ankle is a very compliant, springy joint, and barefoot runners use it a lot," says Lieberman. "It isn't available to you when you rear-foot strike. Then you're relying solely on the spring on the heel of the shoe." By not taking advantage of the structure and function of the foot, says biologist Dennis Bramble at the University of Utah in Salt Lake City, runners may be risking injury. "Ignoring how we evolved and what our bodies were made to do is risky business," he adds. Together with Bramble, Lieberman has proposed that features of the human body such as longer legs, shorter toes and a highly arched foot are linked to long-distance running and enabled early hominins to chase and eventually exhaust prey [2]. But just because human ancestors landed on the balls of their feet when they ran, that doesn't mean it's ideal for today's runners who grew up with shoes. There's no evidence showing that running shoes prevent injuries. But nor is there much evidence that people who run barefoot get fewer injuries than those who run in shoes.
Altered microbe makes biofuel
In a bid to overcome the drawbacks of existing biofuels, researchers have engineered a bacterium that can convert a form of raw plant biomass directly into clean, road-ready diesel. So far, biofuels have largely been limited to ethanol, which is harder to transport than petrol and is made from crop plants such as maize (corn) and sugarcane, putting vehicles in competition with hungry mouths. In this week's Nature, researchers from the University of California, Berkeley, and the biotech firm LS9 of South San Francisco, California, among others, describe a potential solution: a modified Escherichia coli bacterium that can make biodiesel directly from sugars or hemicellulose, a component of plant fibre (see page 559). The method can be tailored to produce a host of high-value chemicals, including molecules that mimic standard petrol, and could be expanded to work on tougher cellulosic materials, the researchers say.
The work identifies a potentially cost-effective way of converting grass or crop waste directly into fuel, filling gas tanks without raising global food prices or increasing hunger and deforestation in far-flung locales. Moreover, the process is much more climate friendly than manufacturing ethanol from maize, and produces higher-energy fuels that are interchangeable with current petroleum products. The next step is to scale the process up and adapt it to cellulose, which makes up the bulk of plant material. "It's a nice milestone in the field of biofuels, and it has a lot of promise for actually being commercialized," says James Liao, a metabolic engineer and synthetic biologist at the University of California, Los Angeles. LS9's calculations, performed with the help of the Argonne National Laboratory in Illinois, show that the biodiesel that it is preparing to market reduces greenhouse-gas emissions by 85% compared with standard diesel. That calculation is based on using Brazilian sugarcane, which is a much more efficient feedstock than maize; LS9 says that the shift from sugars to biomass as a feedstock would reduce greenhouse gases even further. The company has been working to convert sugars into tailored molecules for several years, says co-author Stephen del Cardayre, LS9's vice-president for research and development. However, their university collaborators went two steps further, eliminating the need for additives and then folding in the ability to use hemicellulose as a feedstock. "This paper is a representation of the types of efforts that are going to move us to biomass," he says. The researchers basically amplified and then short-circuited E. coli's internal machinery for producing large fatty-acid molecules, enabling them to convert precursor molecules directly into fuels and other chemicals. The team then inserted genes from other bacteria to produce enzymes able to break down hemicellulose. In all, the authors report more than a dozen genetic modifications. The results could buoy LS9, says Mark Bünger, a research director at business consultancy Lux Research in San Francisco. Like its competitors, including Amyris of Emeryville, California, and South San Francisco-based Solazyme, LS9 struggled for funding in 2008 and early 2009 because of the drop in oil prices and the economic downturn, Bünger says. But LS9 made it through, securing US$25 million in new funding from various sources, including a strategic partnership with oil giant Chevron last September. The company plans to open a commercial-scale demonstration plant later this year.
Fossil feathers reveal dinosaurs' true colours
Pristine fossils of dinosaur feathers from China have yielded the first clues about their colour. A team of palaeontologists led by Michael Benton of the University of Bristol, UK, and Zhonghe Zhou of the Institute of Vertebrate Paleontology and Paleoanthropology in Beijing, has discovered ancient colour-producing sacs in fossilized feathers from the Jehol site in northeastern China that are more than 100 million years old. These pigment-packed organelles, called melanosomes, have only been found in fossilized bird feathers before now. The team discovered the melanosomes in fossils of the suborder Theropoda, the branch of the dinosaur family tree to which the flesh-eating species Velociraptor and Tyrannosaurus belong.
However, it was not in these two iconic dinosaurs that the organelles were found, but in smaller species that ran around low to the ground with tiny feathers or bristles distributed across their bodies. The team discovered two types of melanosome buried within the structure of the fossil feathers: sausage-shaped organelles called eumelanosomes that are seen today in the black stripes of zebras and the black masks of cardinal birds, and spherical organelles called phaeomelanosomes, which make and store the pigment that creates the rusty reds of red-tailed hawks and red human hair. The team didn't find cell structures or machinery responsible for other colours, such as yellows, purples and blues. They suggest that dinosaur cells might have produced coloured pigments such as carotenoids and porphyrins, but that the proteins that make them degrade more rapidly than organelles, so do not leave a trace in fossils. Melanosomes, by contrast, are an integral part of a feather's tough protein structure so can survive for longer. "We always tell introductory palaeontology students that things like sound and colour are never going to be detected in the fossil record," says Benton. "Obviously that message needs to be reconsidered." Fossils of one theropod dinosaur, Sinosauropteryx, reveal that it had light and dark feathered stripes along the length of its tail. The team found that feathers from the darker regions of the tail were packed with phaeomelanosomes, indicating they were russet-orange in colour. The lighter stripes could have been white, says Benton, but because some pigments degrade and don't leave a fossil signature, it is difficult to be sure. Sinosauropteryx was not the only colourful feathered species. Another small theropod species, Sinornithosaurus, had feathery filaments that were dominated by either eumelanosomes or phaeomelanosomes, hinting that its individual feathers varied in colour between black and russet-orange.
Birds of a feather
A lot of questions have been raised about the structures that are found on the earliest of the feathered dinosaurs, such as Sinosauropteryx. Some palaeontologists argue that these bristle-like structures are actually fossilized connective tissue rather than early feathers. However, says Benton, in modern birds, which evolved from the theropod dinosaurs, melanosomes are found only in the developing feathers and not in the connective tissue. The fact that the melanosomes in the dinosaur feathers are also found inside the bristles themselves resolves the debate, he says. "This paper puts the nails in the coffin of arguments countering the feather nature of these structures," agrees Luis Chiappe, a palaeontologist at the Natural History Museum of Los Angeles County, California. "It is deeply gratifying that this colour discovery is allowing us to finally agree that the structures on Sinosauropteryx were actually early feathers," says evolutionary ornithologist Richard Prum of Yale University in New Haven, Connecticut. "Now we can get on with studying their evolution." In addition, the discovery of colour in the earliest feathers may also sway the biggest dinosaur debate of them all: what the feathers were actually being used for. The tiny bristles on early feather-bearers could not have been used for flight, as some have suggested, because they would have provided no lift. But they could have served as insulation, or for display. "It is looking increasingly likely to me that these dinosaurs were making a visual statement," says Benton. "What that statement was, we don't know, but you don't have an orange-and-white striped tail for nothing."
Coughs Fool Patients into Unnecessary Requests for Antibiotics
No one wants a hacking cough for days or weeks on end. But research shows that it generally takes about 18 days to get over a standard cough-based illness. Most of us grow impatient after a week or so and head to the doctor to get a prescription. The problem with that recourse, however, is that antibiotics are usually useless against typical respiratory infections that cause coughs. A new analysis shows that even though antibiotics might be ineffective against a lingering cough, the timing of their prescription might be fooling people into thinking that the medication worked. This pattern might increase the frequency of these unnecessary prescriptions, a hazardous practice that can increase drug resistance across many bacterial strains. The findings were published online January 14 in Annals of Family Medicine. A cough is one of the most common reasons patients go to the doctor. One quick fix, patients might assume, is a round of antibiotics. Not exactly, according to a randomized trial described last month in The Lancet Infectious Diseases. The trial showed little difference in the duration of lower-respiratory infections in people who got antibiotics and those who received placebos. Why? Like the full-blown flu, coughs are usually triggered by viruses, not bacteria, and thus are unaffected by antibiotics. Most often coughs and associated infections get better on their own. This happy outcome, however, can cause some confusion about the efficacy of antibiotics for treating cough-based sicknesses. The researchers for the new study, led by Mark Ebell of the Department of Epidemiology and Biostatistics in the College of Public Health at the University of Georgia, combed through data on acute coughs.
They found that the average duration of symptoms reported in the medical literature was 17.8 days. They then compared these findings to results from a poll of 500 adults, who were asked to estimate how long they would expect to be sick if their main symptom was a cough and they were not taking any medicine (under various scenarios with or without fever and with or without mucus). The expected duration was about seven to nine days (on the longer side if the cough was accompanied by a fever or green mucus). In other words, far less than the mean duration of these types of ailments. This mismatch between patients' expectations and reality for the natural history of acute cough illness has important implications for antibiotic prescribing, the authors noted in their paper. As they explained, if a patient has not started getting better after about a week, when they expect the cough should be tapering off, they might head to the doctor to get antibiotics. This timing, however, is troublesome. If they begin taking an antibiotic seven days after the onset of symptoms, they may begin to feel better three or four days later, with the episode fully resolving 10 days later, the researchers wrote. Although this outcome may reinforce the mistaken idea that the antibiotic worked, it is merely a reflection of the natural history of the illness. Helping patients understand this common coincidence, and the actual expected duration of their cough, could help reduce the amount of antibiotics needlessly prescribed for such ailments. Unnecessary antibiotic prescriptions can contribute to the growing trend of antibiotic resistance, which reduces the efficacy of these drugs in situations in which they really are needed. Doctors often give in to pressure from patients to prescribe something for their illness. Patients should be told that it is normal to still be coughing two or even three weeks after onset, and that they should only seek care if they are worsening or if an alarm symptom, such as high fever, bloody or rusty sputum, or shortness of breath, occurs, Ebell and his colleagues wrote. Otherwise, a thoughtless quick-fix Rx is likely to just increase the belief in their efficacy, creating the potential for a cycle of expectation and prescription, the researchers noted. And that is not good for anyone's health.
Food versus Fuel: Native Plants Make Better Ethanol
New research reveals that native grasses and flowers grown on land not currently used for crops could make for a sustainable biofuel. A mix of perennial grasses and herbs might offer the best chance for the U.S. to produce a sustainable biofuel, according to the results of a new study. But making that dream a reality could harm local environments and would require developing new technology to harvest, process and convert such plant material into biofuels such as ethanol. Biofuels have become controversial for their impact on food production. The ethanol used in the U.S. is currently brewed from the starch in corn kernels, which has brought ethanol producers (and government ethanol mandates) into conflict with other uses for corn, such as food or animal feed. Already, corn ethanol in the U.S. has contributed to a hike in food costs of 15 percent, according to the Congressional Budget Office, and the U.N. Food and Agriculture Organization blames corn diverted to biofuels for a global increase in food prices.
To see if nonfood plants could be a source of a biofuel the way corn is, researchers followed six alternative crops and farming systems on so-called marginal lands over 20 years, including poplar trees and alfalfa. Such marginal lands face challenges such as poor soil fertility and susceptibility to erosion. The new analysis found that conventional crops such as corn had the highest yield of biomass that can be turned into biofuel on marginal lands, although their ability to reduce CO2 is harmed by tilling, fertilizing and other CO2-producing activities necessary to turn them into fuel. (Such factors have caused considerable scientific disagreement over whether ethanol from corn delivers any useful greenhouse gas reductions, although the researchers find that even corn provides some climate benefits as long as oil production and combustion are included in the comparison.) In contrast, the grasses and other flowers and plants that grow naturally when such lands are left fallow (species such as goldenrod, frost aster and couch grass, among others) can deliver roughly the same amount of biofuel energy per hectare per year if fertilized, yet also reduce CO2 by more than twice as much as corn. "When biofuel is produced from such vegetation, the overall climatic impact is very positive," says lead researcher Ilya Gelfand of Michigan State University. The research was published in Nature on January 17. (Scientific American is part of Nature Publishing Group.) By taking those field results and feeding them into a computer model that calculated how much such marginal land was available within 80 kilometers of a