
Fall 2011 Issue 21

sciencereview.berkeley.edu
Digitizing the Drawers

20 QB3 Garage: Start-ups get a space of their own
28 Innate altruism: The psychology of good behavior
36 There's a map for that: Algorithms to ease your commute

from the editor


Editor in Chief
Allison Berke

Editors
Crystal Chaw, Mary Grace Lin, Chris Holdgraf, Sebastien Lounis, Anna Schneider, Josh Shiode

Art Director
Amy Orsborn

Layout Staff
Leah Anderson, Marek Jakubowski, Asako Miyakawa, Helene Moorman, Valerie O'Shea, Gregory Thomas

Copy Editor
Greg Alushin

Managing Editor
Mary Grace Lin

Web Editor
Brian Lambson

Web Director
Chris Holdgraf

Printer
Sundance Press

Dear readers,
Welcome to the 21st issue of the Berkeley Science Review. As we saw when this year's Nobel prizes were awarded last month, UC Berkeley research is perpetually interesting and relevant. (UC Berkeley graduate students are also interesting to Nobel laureates, as Greg Alushin describes (page 6) in his travelogue from the Lindau Meeting.) What we'd like to highlight in this issue is the driving force making science relevant and interesting to the public: mathematics and statistics. In our cover story, "Digitizing the drawers" (page 46), Joan Ball relates the efforts of programmers and archivists working with Berkeley's natural history collections to contextualize and coordinate massive numbers of specimens. In UC Berkeley's herbaria, there are 360,000 specimens, 14 per Cal undergraduate. The number of cacti in our Botanical Garden alone is equal to the number of professors at UC Berkeley.
Tracking down every minuscule insect specimen in a museum can be a challenge, but at least pinned insects stay in one spot; some researchers on campus are trying to track all of us, every time we leave our houses. Ginger Jui, in "There's a map for that" (page 36), gives us a broad view of the algorithmic and statistical analysis behind how we travel, from traffic data on a Google map to cell-phone tracking of road usage and transit time. Robert Gibboni's Toolbox (page 56) delves into the calculations behind routing algorithms, and explains why simply choosing the faster road is sometimes a poor decision. Researchers hope that, along with taking convenience into account, understanding the costs of commuting in time and energy can drive us to make better choices. Driving more efficiently can make a tangible difference: the typical US household releases 50 tons of carbon dioxide into the atmosphere each year. A 50-ton carbon offset costs $750 on the Chicago Climate Exchange, a kind of stock exchange for greenhouse gases, but of course Berkeley puts a higher value on the environment; to purchase the same amount of carbon dioxide for a campus lab is $1,532.

Statistics can also help us analyze the behavior of individuals. In "The brain is half full" (page 28), Azeen Ghorayshi investigates the Greater Good Science Center, an initiative to quantify our better nature, from altruism to collaboration. Even if our capability for good behavior may seem inextricably linked to factors outside of our control, Audrey Chang and Kristina Garfinkel report that anxiety and sleep deprivation are chemically predictable; even itchiness is attributable to a few proteins, described on page 11. Further afield from the Greater Good Science Center, UC Berkeley students are still behaving altruistically, bringing hepatitis B vaccination to underserved communities in Alameda County, as Sharmistha Majumdar describes (page 4).

Each issue of the BSR requires the coordination of six editors, eighteen authors, seven layout editors, and a statistically significant amount of time and care. I'm overwhelmingly grateful to have Amy Orsborn as my counterpart leading the Layout team, and Mary Grace Lin keeping our resources on track, from finances to enthusiasm. We're also lucky to have six regular authors (including one combination author and editor) on our blog team, who will keep you updated this semester on posts that tie in with some of the print articles and our second Readers' Choice Award, where you can vote for the best feature article or brief.

Enjoy the issue,


Allison Berke
Editor in Chief



contents

features

20 A lab space of one's own
The QB3 Garage: an incubator for innovation
by Susanne Kassube

28 The brain is half full
The science behind positive psychology
by Azeen Ghorayshi

36 There's a map for that
Cell phones for a better commute
by Ginger Jui

44 Digitizing the drawers
Moving natural history collections online
by Joan Ball
2011 Berkeley Science Review. No part of this publication may be reproduced, stored, or transmitted in any form without the express permission of the publishers. Financial assistance for the 2011-2012 academic year was generously provided by the Office of the Vice Chancellor of Research, the UC Berkeley Graduate Assembly (GA), the Associated Students of the University of California (ASUC), and the Eran Karmon Memorial Fund. Berkeley Science Review is not an official publication of the University of California, Berkeley, the ASUC, the GA, or Lawrence Berkeley National Laboratory. The views expressed herein are the views of the writers and not necessarily the views of the aforementioned organizations. All events sponsored by the BSR are wheelchair accessible. For more information email sciencereview@gmail.com. Letters to the editor and story proposals are encouraged and should be emailed to sciencereview@gmail.com or posted to the Berkeley Science Review, 10 Eshleman Hall #4500, Berkeley, CA 94720. Advertisers: contact sciencereview@gmail.com or visit sciencereview.berkeley.edu.


departments

1 From the Editor

4 Labscopes
Can that thing fly? by Mica Smith
The frightened brain by Audrey Chang
Hyena gender roles by Erin Jarvis
Think globally, treat locally by Sharmistha Majumdar

6 From the field
by Greg Alushin

52 Book review
The Instant Physicist
Professor Richard A. Muller, illustrated by Joey Manfre
by Erin Jarvis

54 Faculty Profile
Karen De Valois: True visionary by Amanda Alvarez

56 Toolbox
Probability and statistics by Robert Gibboni

current briefs

8 Mind over matter
Brain-machine interfaces come online by Samantha Cheung

10 Red eye science
Sleep now, learn later by Kristina Garfinkel

11 Core issues
Modeling the thermodynamic instability of planetary cores by Keith Cheveralls

12 A real headscratcher
The molecular basis of itch by Nikki Kong

14 A perennial problem
Developmental deficits stemming from pesticides by Molly Sharlach

16 One small step for Cal, one giant leap for mankind
Advances in quantum computing by Zlatko Minev

17 Rising above Plateau's problem
250 years of mathematical exploration by Alireza Moharrer


COVER: This pinned beetle is one of 6 million invertebrate specimens in UC Berkeley's Essig Museum of Entomology. The Calbug team is working to build databases to digitize the museum's collection.


labscopes

Can that thing fly?

The giant hummingbird, Patagona gigas, native to the Andes, is an oversized cousin of the Anna's hummingbird beloved in North American backyards. It weighs about 20 grams, as much as a toothbrush, which is twice as much as the next largest of more than 300 known hummingbird species. Professor Robert Dudley and María José Fernández of UC Berkeley's integrative biology department analyzed the flight mechanics and metabolism of Patagona gigas in search of an explanation for its extreme body size. "Muscle efficiency in general tends to be greater for larger things," said Dudley, who wondered whether Patagona gigas evolved to take advantage of more effective energy use in the high elevation of the Chilean Altiplano. However, the giant hummingbird's metabolic rate relative to its size correlates with data Dudley's group collected for a number of smaller species. So if the giant hummingbird doesn't expend more energy to sustain itself in flight, why haven't hummingbirds evolved to grow even larger? Dudley cites the example of the nectar-feeding bat, another vertebrate species that hovers, and can weigh up to 40 grams. "The wing design is very different," says Dudley. "Bat wings connect to the hind legs, so they've got about twice the wing area. Depending on the aerodynamic mechanisms involved, the motor may be irrelevant: it's your ability to convert that to aerodynamic force." He speculates that the counterbalance between increasing wingspan and decreasing wingbeat frequency may place an upper limit on body size, which is represented by Patagona gigas. Though the results show that the hummingbird's characteristically high metabolism is unaffected by size, it'll be a while on the evolutionary timeline before we see birds as big as falcons hovering at our feeders.

-Mica Smith



The frightened brain

Do you tend to sweat the small stuff? Chronic anxiety and similar neurological disorders affect over 25 million Americans, and researchers at UC Berkeley have identified two distinct pathways in the brain that can predict an individual's susceptibility to anxiety. In this collaborative study between the Bishop lab at UC Berkeley and colleagues at Cambridge University, changes in blood flow to the brain were measured while subjects viewed a computer-generated image of a person in a room just before a loud scream is heard. For some trials, the virtual figure in the room moves to cover its ears just before the scream, while in other trials the gesture does not predict the sound, keeping subjects in a perpetual state of anticipation. Some participants with abnormal fear responses displayed a stronger reaction to the virtual figure covering its ears: the anticipation of the loud scream led to overactivity in the amygdala region of the brain, which is known to process emotional memories. Others showed less activation in the ventral prefrontal cortex, a region responsible for decreasing the fear response. These are two separate mechanisms, but failure in either one results in a heightened fear response. This model of anxiety (activation instead of regulation) can better guide targeted therapies for anxiety disorders in the future. By understanding which mechanism is the source of an increased fear response, the success of different treatments can be better predicted.

-Audrey Chang

Hyena gender roles

Spotted hyenas are a curious species. They are one of the very few mammals that maintain complicated social hierarchies within a female-dominated society, and are one of the only non-primate species that can recognize animal relationships in which they are not directly involved. And, to the surprise of many, there is a clan of 26 hyenas living in the Berkeley hills. Before you start questioning the safety of your evening jogs above campus, rest assured that the hyenas are part of a captive breeding colony housed at the Field Station for the Study of Behavior, Ecology, and Reproduction (FSSBER), maintained by the University of California, Berkeley. UC Berkeley researchers Frederic Theunissen (Professor of Psychology), Mary Weldele (psychology and integrative biology), Aaron Koralek (graduate student at the Helen Wills Neuroscience Institute), and Stephen Glickman (Professor of Psychology and Integrative Biology) observed the hyena colony to decipher a distinctive yet cryptic hyena trait: the giggle, and its meaning in hyena society. Hyena social status is maintained by a complicated social network of coalitions and alliances, which requires an intricate system of communication. Hyenas communicate visually, chemically, and, as a major backdrop to the nightly chorus of the African savannah, acoustically. The quintessential hyena call is the giggle: a high-pitched sound emitted in bouts that sounds like laughter. "The function of the giggle call is actually very poorly understood," says Aaron Koralek. "This was some of the first work looking at if [the hyena giggle] could potentially serve a social function." The hyena giggle is most frequently emitted by a subordinate competing over a carcass with a dominant member, and previous observations suggested that the giggle was a subordinate call. Research further suggests that the giggle is a frustrated response to wanting something and not getting it: another good reason not to tease a hyena.

-Erin Jarvis


Think globally, treat locally

Approximately 1.25 million Americans are chronically infected with hepatitis B, a blood-borne, sexually transmitted virus that causes severe liver diseases like cancer and cirrhosis. Of this group, over 50% are Asian and Pacific Islanders (API), many of whom live in California. However, such high numbers aren't due to a lack of medical technology. "What is ironic is that a vaccine has been available for over twenty years!" muses Adele Feng, a UC Berkeley undergraduate and the Director of the campus-based Hep B Project, a volunteer organization founded in June 2009 to address the lack of hepatitis B services in Alameda County, California. Working with other community partners, Feng's group has been involved in organizing free hepatitis B screenings and vaccinations as well as spreading awareness among low-income Asian immigrant communities, one of the most at-risk populations in Alameda County. The county lacks a central database with reliable information about the statuses of local hepatitis B patients, resulting in an inefficient allocation of resources, outreach, and services. After two years in the community, the Hep B Project realized that they needed to expand beyond their motto of "Educate, Screen, Vaccinate" and start addressing this problem. The organization recently embarked on a project to integrate existing databases of non-sensitive patient data from local organizations and the Alameda County Public Health Department. With this more comprehensive dataset, they hope to create maps of hepatitis B prevalence and at-risk populations in the Bay Area using Geographic Information System (GIS) technology. This effort was a recent winner at the Big Ideas contest conducted by CITRIS (Center for Information Technology Research in the Interest of Society) at UC Berkeley. The group's proposal included building an easy-to-navigate, user-friendly interface that allows users to look at overlapping demographic factors such as ethnicity, age, and infection distribution on one map. Eventually, public health officials and community organizations should be able to identify and predict areas of greatest need for preventative and disease-management services, as well as target at-risk and affected populations in a more efficient and cost-effective manner.

-Sharmistha Majumdar

Chart: Chronic hepatitis B cases in the SF area, by ethnicity (Asian/Pacific Islander, Black, White, Other). Data: Communicable Disease Control and Prevention, SF Dept.

from the field


My advisor, molecular and cell biology professor Eva Nogales, burst into the room: "Greg, I need to speak with you right now." Spurred from my mild Friday afternoon torpor, I felt a momentary panic. Once we were out in the hallway, though, instead of harsh words she offered me an opportunity: the chance to apply to the Lindau Meeting, a unique conference where Nobel laureates and young scientists from around the world come together for a week of scientific discourse. There was, however, a catch. Berkeley was forwarding the first two applications it received to the reviewing committee, and one had already been submitted. I had one hour to write an essay and prepare my C.V. so the materials and a recommendation could be submitted by the end of the day. Confident that such frantic last-minute efforts would be sub-par, I cranked out an application and promptly forgot about the whole thing. To my amazement, a few weeks later I was informed that I had made it to the next stage in the selection process. Then the final word came: I was in, one of 567 winners from a pool of 20,000 applicants heading to Lindau.

Only many months later did I come to appreciate the gravity of the event. Instead of a summer prize for lucky graduate students, the meetings began as an effort to reintegrate Germany with the international scientific community after World War II. Two doctors from Lindau, a picturesque medieval town on Lake Constance in southern Germany, persuaded the local government to host an international conference and boost its visibility by specifically inviting Nobel laureates. They also won the backing of a local nobleman, Count Lennart Bernadotte, who became the driving organizational force. The first meeting, held in 1951, was small but successful. It focused on reestablishing ties between expatriate German scientists and those who had remained during the war, including Nobel laureates from both categories. In 1954, young researchers were invited for the first time, beginning a shift in focus toward laureates mentoring students. An increasing international presence steadily gathered, and today the participants hail from more than 70 countries. Nevertheless, the event maintains strong ties to its origins: Count Bernadotte's daughter, Countess Bettina Bernadotte, is the reigning president of the Lindau council.

Living up to its varied history, this year's conference see-sawed between prestigious international discourse and truly local flavor. The theme of the conference was global health, and the opening ceremony featured Bill Gates engaging in a panel discussion with two young scientists and Nobel laureate Ada Yonath about how best to confront the health challenges of the developing world. In contrast, the Elite Bavarian Network hosted an evening session with local officials who tried to sell the audience on the joys of living and working in the vicinity and, bizarrely enough, lederhosen-clad dancers.


Most days featured plenary lectures by several Nobel laureates in the morning, which varied widely, from the extremely technical, such as Ei-ichi Negishi's detailed explanation of the reaction mechanisms of palladium-catalyzed carbon-carbon bond formation, to personal history and anecdotes, such as Oliver Smithies's declaration that, "Out of laziness, I invented gel electrophoresis." Each of the speakers hosted a smaller session in the afternoon, which were closest to what I had imagined I would experience: an intimate, intergenerational conversation between some of science's leading lights and those who hope to follow in their footsteps. As I navigated the whirlwind of talks, discussion panels, and round-the-clock networking with some of the most intense young scientists I have encountered, it occurred to me that the meetings were, once again, evolving into a new stage. With high-powered VIPs and international delegations, including government officials, vying for the attention of the laureates, the Lindau Meetings have gone big. Of course, that means there are power lunches, invitation only. At one such event, laureate Bert Sakmann told me that when he was a medical student in Germany in the late sixties and early seventies he and his friends used to simply drive over to the meeting every year to hang out and chat with the laureates. Now, laureates are whisked from event to event in private cars provided by Audi, one of many corporate sponsors. Journalists and bloggers cover every lecture, and Nature even sends a team to film a documentary. The scope is changing, from strictly promoting dialogue between the best scientists and the best students, a concept that, while wonderful for the participants, is surely elitist, to displaying that dialogue, and all the hope and excitement that it generates, to a world that is hungry for inspiration.

This shift in character implicitly raises the question, a central thread running through the conference, of what scientists should actually be doing. With their choice of global health as a theme, the organizers are clearly on the side of "science for society," that is, scientists creating and disseminating knowledge for a better world. Sessions were geared towards getting young researchers interested in working on neglected diseases and problems whose solutions would benefit all of humanity, not just rich, technologically advanced nations. Many of the laureates, however, did not jibe with this message, making it clear that they had simply worked on a problem they found personally interesting without thinking about what impacts it might have, or had made their prize-winning discoveries by accident, following up on an unexpected observation. A few laureates, particularly Sir Harold Kroto of bucky-ball fame, advanced a similar argument about technological developments; many follow basic science discoveries made by scientists in seemingly unrelated fields, performing research for its own sake. Throughout my own scientific journey thus far, I have been partial to the trickle-down model of societal benefits from pure science, echoed by many of the laureates. However, my experience at Lindau did get me thinking about working on a problem with a more explicit goal of improving the world while also following my curiosity. And, if I get extraordinarily lucky, it might earn me a second trip to Lindau.

Greg Alushin (center) talking with Chemistry Nobel laureate Robert Huber (right) along with a student from Indonesia (left). Photo: Sam Held.

Greg Alushin is a graduate student in biophysics.

current briefs
Mind over matter
Brain-machine interfaces come online
Possessing the power to control artificial devices with thoughts alone seems straight out of a science-fiction movie. However, with the help of a team of neural prosthetics researchers at UC Berkeley, this idea is moving closer to reality. Although common sci-fi abilities such as the superhuman strength to destroy buildings or the ability to download martial art skills from a computer are still the stuff of fiction, there is one fantastic ability that the newly formed Center for Neural Engineering and Prostheses (CNEP) would like to make a reality: giving paralyzed patients control of prosthetic devices using only their thoughts, allowing them to perform simple daily tasks independently.

The CNEP, co-directed by Professor Jose Carmena from UC Berkeley and Professor Edward Chang from UC San Francisco, is a collaborative effort to develop neural prosthetic devices, drawing from the efforts of neuroscientists, neurosurgeons, and engineers from both universities (see "Plugging back in," BSR Fall 2009). These devices will prove particularly useful for patients with neurological disorders that impair sensorimotor and linguistic ability, such as stroke, spinal cord injury, and amputations. The center hopes to develop technologies that can restore sensations, motor movement, and speech by relaying neural communication between the brain and external devices. The concept of such devices is not novel; for decades neural prosthetics have performed a range of services such as helping the deaf hear or treating symptoms of neurological disorders like Parkinson's disease. Available technologies work by stimulating relevant sensory nerves or regions of the brain and have paved the path for more neural prosthetic research. However, they are not sophisticated enough to interpret and enact the intentions of their users.

Brain-Machine Interface (BMI) technology consists of components that allow robotic devices to directly communicate with the user's brain. These include a sensor that reads neural signals signifying the patient's intentions, a processor that decodes the user's intention into an understandable task for the prosthetic device, and the prosthetic device that carries out the intended task. The goal, according to Carmena, is to allow a patient to have naturalistic control of a prosthetic device: to feel the prosthetic arm as part of the body. "In addition to the engineering challenges of developing extremely small, high bandwidth, ultra-low power implantable devices," says Carmena, there is a challenge involving movement replication. At the core of this problem is the fact that motor actions in our everyday life are far more complicated than we assume. Even the seemingly simple task of grabbing an object involves calculated and continuously refined movement. To properly navigate the hand toward an object in three-dimensional space, the correct muscles must be activated to move the hand toward its goal. Meanwhile, off-course movement must be detected and corrected through cross-talk between motor regions of the brain with visual and spatial feedback to allow trajectory correction toward the object. Moreover, there are many possible ways for us to move our limbs to get to our endpoint. For us, the task appears simple because of neural networks that were fine-tuned as we learned how to utilize our limbs to achieve our intended task. "Achieving the same naturalistic capability with a BMI becomes much more difficult," says Carmena.
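To make the decoding step concrete, here is a minimal sketch of a linear decoder of the kind commonly used in BMI research. Everything in it is an assumption for illustration (synthetic firing rates, made-up dimensions, a plain ridge-regression fit); it is not CNEP's actual pipeline.

```python
# Illustrative toy BMI decoder: map neural firing rates to intended 2-D velocity.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "training" data: firing rates of 32 neurons over 500 time bins,
# paired with the 2-D cursor velocity observed at the same times.
rates = rng.poisson(lam=5.0, size=(500, 32)).astype(float)
true_map = rng.normal(size=(32, 2))
velocity = rates @ true_map + rng.normal(scale=0.5, size=(500, 2))

# Fit a ridge-regularized linear decoder W so that velocity ~ rates @ W.
lam = 1.0
W = np.linalg.solve(rates.T @ rates + lam * np.eye(32), rates.T @ velocity)

# One "closed-loop" step: decode intended velocity from a fresh bin of
# activity and update the prosthetic (here, just a cursor position).
cursor = np.zeros(2)
new_rates = rng.poisson(lam=5.0, size=32).astype(float)
decoded_velocity = new_rates @ W
cursor += decoded_velocity * 0.05  # assume 50 ms time bins
print("decoded velocity:", decoded_velocity)
print("cursor position:", cursor)
```

Real systems replace the synthetic data with recorded spike counts and often use recursive estimators such as Kalman filters, but the overall structure (neural features in, intended movement out, device update) is the same as in this sketch.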
Regardless of the challenges, the neural prosthetic field is flourishing as technological barriers are broken. Carmena's previous work in macaque monkeys revealed the feasibility of neural prosthetics by demonstrating that the monkeys could learn how to use a robotic arm with brain signals that are normally involved in motor control. Another important step for BMI research came from a study at Brown University where devices similar to those used in macaques were implanted in the brain of a quadriplegic human patient. Impressively, BMI implantation in the region of the brain that controls motor movement gave the patient the ability to move a computer cursor around a screen.

In addition to enabling people to control computers, CNEP researchers also hope that their neural interfaces will help us understand how brains adapt to these devices. "You might not need a full understanding of how we control our body to get the patient's brain to control the prosthetic device," mentions Carmena, "but it will facilitate us in getting there sooner." By training primates with implanted BMIs to do behavioral movement tasks and simultaneously observing properties of neurons in the motor region of the brain, CNEP researchers were able to determine that there are reversible, large-scale changes in these motor areas. Interestingly, the amount of change, both structurally and functionally, was correlated with involvement in controlling the BMI. The study also suggested the formation of a prosthetic motor memory, because the monkeys tested do not have to relearn how to use the BMI
on a daily basis. According to Carmena, the brain establishes, "in neural space, a network to control this device as if it were part of its own body." In the short period since its launch in December 2010, CNEP has made impressive progress. Nevertheless, many challenges remain. For example, controlling a computer cursor is not the same as controlling a robotic arm to do an array of daily tasks. "Controlling a multi degree-of-freedom (DOF) robotic arm to interact with the real world for tasks of daily living is significantly more complicated than a 2-DOF computer cursor," says Carmena. Still, CNEP is moving closer to developing patient-ready BMI technology. Although the primary goal is to help paralyzed individuals perform daily tasks independently, relieving their families of the cost and time of aided living, CNEP also hopes to obtain a better understanding of neural tasks such as processing sensory information, executing movements, and learning to adapt to novel external systems. Although CNEP scientists still have a way to go before restoring or providing thought-controlled motor tasks to patients with neural deficits, they are beginning to understand more of the complex properties of the brain, one step at a time.

Samantha Cheung is a graduate student in molecular and cell biology.



Red eye science


Sleep now, learn later
Why are we alert at some times of the day and not others? Why are we hungry in the morning? Why do we (and other diurnal creatures) become sleepy when darkness encroaches? It is not possible to be awake and asleep at the same time: something favors specific behaviors at specific times. Psychologists and biologists have long been attempting to define what exactly drives the timing of animal behavior and, more recently, what happens when these adaptive rhythms are disrupted. Over the past 300 years, scientists have discovered that the mechanisms controlling our behaviors are internal; we all have daily (circadian) and annual biological clocks. The circadian clock is generated at the single-cell level through control of the expression of specific genes and their protein products. Biological rhythms use chemical signals to ensure that throughout the day our bodily processes and behavior follow a specific pattern that is coordinated with the local time. For example, a surge of cortisol peaks in the morning and encourages us to wake up, and a rise of melatonin later follows to induce sleepiness. In mammals, these daily signals are coordinated and sustained by a specific region of the brain, the suprachiasmatic nucleus (SCN). Information about the time of day is relayed from the light-sensitive cells in the retina of the eye and travels down the retinohypothalamic tract to reach the SCN. The SCN uses the input of light and darkness to synchronize every cell in our bodies with the external environment, influencing our behavior. This results in rhythms in gene expression that form approximately 24-hour cycles that are ubiquitous among living organisms.

Yet humans have developed modern rituals that interfere with our internal systems. Frequent traveling, alternating work schedules, fluctuating sleeping patterns, and other exogenous factors that humans face can disrupt both the phase relationships between different hormones and the relationship between hormones and time. By mimicking jet lag in hamsters, UC Berkeley neuroendocrinologists Lance Kriegsfeld and Erin Gibson, a graduate student in Kriegsfeld's lab, have recently conducted the first controlled study on how circadian disruption affects brain function in general, and memory in particular. Twice a week for one month, researchers subjected female hamsters to six-hour time shifts in their light/dark cycles, equivalent to the shift one would experience from a New York-to-Paris flight. The hamsters were given a learning and memory task before and immediately after the time shifts to assess the effects of jet lag on cognitive function. As anticipated, the hamsters performed poorly at learning new tasks during the jet lag weeks. Surprisingly, when the hamsters were re-tested 30 days later, after they had returned to their regular day-night cycles, the hamsters continued to display prolonged memory deficits. "Most people assume the reason they cannot learn new information at the time of feeling jet-lagged is because they are just not feeling well. This study suggests that this may not be due to simply feeling tired, but because you are disrupting specific bodily processes that are necessary for the brain to form new memories," says Kriegsfeld.

Those that have traveled to a new time zone are familiar with the feelings associated with jet lag: the feeling of trying to stay awake while your body thinks it's time to sleep, the feeling of not being hungry and then suddenly having a ferocious appetite. The hormones we release when our bodily states become incongruent with environmental cues cause most of these symptoms. Previous studies have shown that animals often respond to circadian disruption by releasing stress hormones such as cortisol in an attempt to regain their allostatic balance (stability achieved by altering behavior or physiology). To ensure that the findings of Kriegsfeld's study were not caused by changes in hormone levels, one group of the hamsters had their adrenal glands and ovaries removed and received supplements to maintain normal levels of cortisol and estrogen. Although the amount of hormones circulating through their bodies was controlled, the hamsters performed just as poorly on their tasks. If poor performance cannot be blamed on feeling fatigued or jet-lagged, something
independent of hormonal response caused the hamsters to be less efficient at forming new memories. It appears that circadian disruption directly impairs cognitive function without being mediated by stress, and that impairment persists long after returning to a regular cycle.

What caused the hamsters to experience cognitive deficits? Human studies have shown that jet lag increases both atrophy in the temporal lobe of the brain and deficits in learning and memory. Kriegsfeld and Gibson were the first to look at the direct relationship between jet lag and cognitive dysfunction, and did so by examining whether neurogenesis, the birth and maturation of neurons, in the brain is associated with the learning and memory impairments seen in the hamsters. "The past studies are all correlational. To find out what is really going on, the first step is looking to see if fewer neurons are being born and incorporated in the hippocampus, the memory center of the brain, because this is critical for learning and retaining new information," says Kriegsfeld. Neurogenesis was monitored and, sure enough, the findings were in agreement with performance on the learning and memory tasks: neurogenesis in the jet-lagged hamsters decreased by 50%. Jet lag seems to impair cognitive function, likely by affecting neurogenesis.

While Kriegsfeld's study shows that circadian disruption has pronounced negative effects on the brain, the root cause behind the reduction of neurogenesis has yet to be elucidated. Future studies are planned to investigate the mechanism of this reduction, and to see if neuronal death (as opposed to birth and maturation) is impacted as well. The findings of this study emphasize the long-term detrimental consequences that arise when individuals partake in fluctuating schedules or poor sleeping habits. Unfortunately, these fluctuations are often inevitable, and we are forced to disrupt our circadian rhythms. Kriegsfeld advises the use of melatonin pills to help properly adjust to phase changes; allowing a one-day recovery for every hour of phase shift may also help avoid problems associated with jet lag.

Kristina Garfinkel is an undergraduate student in psychology.

Core issues

Modeling the thermodynamic instability of planetary cores

The interior of a planet is an unimaginably inhospitable place. Pressures reach millions of times those on the surface of the earth, and temperatures far exceed any which we experience. At these extreme conditions, strange things happen. Rocks flow like fluids, water freezes into exotic kinds of ice-like solids, and hydrogen begins to conduct electricity. Although the interior of the Earth is relatively well understood, the interiors of the gas giants in our solar system, Jupiter and Saturn, are nearly complete enigmas. Unlike Earth, these planets lack a well-defined solid surface. Instead, the atmosphere of a gas giant simply becomes denser and more liquid-like towards the planet's center until, in effect, it becomes an ocean. Deep beneath this liquid ocean, or mantle, scientists believe that gas giants have cores of rock and ice. Little is known about these obscured cores, and understanding their basic properties, how large they are, how they form, or how they interact with the surrounding mantle, remains one of the greatest mysteries of the gas giants.

In a new paper, Burkhard Militzer and his group in Berkeley's earth and planetary sciences department have provided evidence for a startling new possibility about the core of our solar system's largest gas giant, Jupiter. Using computer simulations and information about the composition of Jupiter's interior, they show that Jupiter's core is likely unstable and is dissolving into the planet's mantle of liquid hydrogen. Eventually, their results suggest, it may disappear entirely. While others have proposed the possibility that the cores of Jupiter and other gas giants are unstable, the plausibility of this hypothesis has remained untested because replicating Jupiter's intense temperatures and pressures here on Earth is extremely difficult. "It's not obvious whether the core will dissolve," says Militzer. "We ask the best high-pressure physicists, and they don't know. The pressures are obscenely high, and there are no experiments that address this question."

An illustration of the basic structure of Jupiter's interior. A mantle of liquid hydrogen, in both metallic and molecular forms, surrounds the core. Militzer's work suggests that the icy outer layer of the core (not shown) is dissolving into the surrounding liquid hydrogen.


Whether the core is stable or whether it dissolves depends upon a competition between pressure and temperature. On the one hand, the extreme pressures inside Jupiter should generate forces that stabilize the core and combat erosion. On the other hand, the high temperatures in the core tend to favor disordered states in which the core dissolves and mixes with the mantle. "If temperature wins, then it's like sugar in coffee, and the core erodes," Militzer explains, referring to the fact that higher temperatures make it easier for solids to dissolve into their liquid surroundings. Because experiments cannot replicate the conditions within Jupiter, Militzer and a postdoctoral fellow, Hugh Wilson, turned to computer simulations to test the core erosion hypothesis. While Jupiter's core consists of both rock and ice, the pressures are so great that the ice, which is less dense than the rock, is squeezed from the rock and forms a layer surrounding it, much as the oil and vinegar in salad dressing separate into layers. If Jupiter's mantle of liquid hydrogen could dissolve this layer of ice, then the icy outer layer of the core would be unstable.
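In thermodynamic terms, the sugar-in-coffee picture is a competition between energy and entropy in the Gibbs free energy. The schematic criterion below is a standard textbook statement of the idea, added here for illustration; it is not the specific quantity reported in the Berkeley simulations.

$$
\Delta G_{\mathrm{dissolve}} \;=\; \Delta H \;-\; T\,\Delta S \;<\; 0
\quad\Longrightarrow\quad \text{the dissolved (mixed) state is favored.}
$$

At the enormous temperatures of Jupiter's deep interior, the entropy term $-T\,\Delta S$ can outweigh even a large positive $\Delta H$, which is why heat favors erosion of the ice layer while pressure works against it.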

Using techniques from statistical mechanics, Wilson and Militzer wrote simulations to calculate whether the ice would be more thermodynamically stable if it were dissolved in the liquid hydrogen. They found that the dissolved state was more favorable, just as sugar in coffee is more thermodynamically stable dissolved than as solid crystals. This result implies that the interface between the hydrogen and the icy core is unstable and that the mantle of liquid hydrogen is, in effect, melting the ice. Militzer is particularly excited about this result because a new NASA mission to Jupiter, launched this August, will offer an unprecedented opportunity to directly test his hypothesis. The satellite, called Juno, will reach Jupiter in the year 2016 and will study the planet in greater detail than ever before. Of interest to Militzer are the measurements that Juno will make of the planet's gravitational field, because they will allow scientists to precisely calculate the size of the core. "If they find a giant core, then this process will either not exist or be really slow," says Militzer. But a smaller core, one smaller than predicted by theories of planet formation, would be strong evidence in favor of core erosion. Indeed, a very small core would indicate that even the rock beneath the ice has eroded. Militzer is currently testing that possibility by running simulations to test whether the hydrogen mantle would also dissolve the rock at the center of Jupiter's core.

For all the exotic forms of matter and complex simulations involved, the effects of the core instability predicted by Militzer's work are subtle and only observable by the kind of precise gravitational measurements the Juno mission will generate. However, Militzer believes that if core erosion is occurring in Jupiter, then it probably also occurs in many other Jupiter-sized gas giants and, indeed, is likely a fundamental feature of planetary dynamics.

An artist's conception of Juno in orbit around Jupiter. Launched in August, the satellite will reach Jupiter in 2016 and will make precise measurements of Jupiter's gravitational field, allowing scientists to calculate the size of Jupiter's core. These calculations will rigorously test Militzer's hypothesis that the core of the planet is eroding.

Keith Cheveralls is a graduate student in biophysics.

A real headscratcher

The molecular basis of itch

Why do you itch? Yes, yes, that rash, those hives, sure. But what is that sensation? Why does scratching sometimes increase pain but reduce itch? And why are drugs that treat the itch from a mosquito bite powerless against the itch that accompanies an effective and widely-used malaria treatment? The first step in answering these questions is to map the molecular route between your brain and the stimuli causing pain or itch. Discoveries at this molecular level are crucial for drug design and medical therapy. UC Berkeley molecular and cell biology professor Diana Bautista and her colleagues are exploring the molecules that function in the neural pathways behind itch and pain. Her research may someday mean that we can put all that nasty scratching behind us.

The sensation of itch, or pruritus, has traditionally been viewed as a milder form of pain, suggesting that both sensations are mediated by common chemical signals and pathways. Recent evidence challenges this long-standing model by supporting the idea that pain and itch are distinct sensations mediated by separate groups of neurons, or lines of communication to the brain. This theory better explains why vigorous scratching, which produces mild pain, can have an inhibitory effect on itch; chemicals released by the pain line can mask the effects of the itch line. In a recent paper published in Nature Neuroscience, the Bautista lab reports that a particular neural ion channel called TRPA1 may bridge these two theories and provide a long-sought-after target for treating a variety of pains and itches. TRPA1 is something of a gatekeeper in pain and itch signaling, acting in response to both pain and itch stimuli while residing in a subset of the neurons associated with itch. As an ion channel, TRPA1 operates at the cell membrane to activate neuronal firing in response to different signals. In biological signaling, a stimulus activates a receptor (molecule A) that results in a signal to molecule B, which in turn activates C and so on. Molecules B and C are defined as downstream of A. Receptors are often very specific, binding their activating molecules
like a lock and key. Often, a single event like a mosquito bite results in a barrage of different molecules that activate a variety of signaling pathways. "Most itch conditions involve more than one activating molecule," says Sarah Wilson, a graduate student in the Department of Molecular and Cell Biology at UC Berkeley and the lead author on the paper. "Blocking one receptor [upstream] might inhibit some of the mosquito-bite itch, for example, but other itch signals can still get through." So, targeting a single receptor is ineffective at stopping the wide variety of stimuli that can cause itch and pain.

This is where TRPA1 comes in. In the pain and itch-signaling pathway, TRPA1 is found downstream of the receptors that initially signal irritation to the body. So when a mosquito bites, the sensation is not immediately relayed by TRPA1. This positioning is important because it means that many different itch and pain-causing chemicals all go through one gatekeeper, TRPA1. "Inhibiting TRPA1 will block many more of the signals relayed by a single bug bite," says Wilson.

The Bautista lab team and other neurobiologists initially identified TRPA1 as a pain sensor, but they noticed that it is expressed in the dorsal root ganglion (DRG), a fraction of neurons that also sense itch. They tested whether the TRPA1 ion channel plays a role in mediating itch using multiple techniques. First, they genetically engineered mice to lack TRPA1. Then, they isolated their DRG neurons and treated them with itch-inducing compounds: the malaria drug chloroquine (CQ) and an endogenous pruritogen (something produced naturally in your body to induce itch), BAM peptide. As compared to normal DRG neurons, those that lacked TRPA1 failed to activate in response to the compounds. To confirm this result, itch-inducing compounds were injected into mice lacking TRPA1 and their behavior was compared with normal mice. The normal animals responded to the injection by distinct, quantifiable scratching behaviors. However, mice lacking TRPA1 outwardly showed decreased responses to both compounds; when injected with them, the engineered mice did not scratch as furiously as their normal counterparts.

The Bautista lab's discovery that TRPA1 acts not only as a pain sensor but also as an itch relay has far-reaching implications for drug design. Although treatments for the histamine-mediated mosquito-bite itch already exist in the form of anti-histamines, these drugs are like fighting only one platoon of an enemy's army while being attacked on multiple fronts. TRPA1 might be the key to a strategy for cutting off the enemy's supply route and thus crippling all of its forces. "That's why anti-histamines don't always work," says Bautista. The same cells that release histamine also release BAM peptides, among many other compounds. Furthermore, itch sensations associated with chronic illnesses such as liver disease, atopic dermatitis, or side effects of CQ are all unaffected by anti-histamines. The Bautista lab's discovery has the potential to make finding the specific pathways behind these chronic itch conditions unnecessary. TRPA1 appears to be a specific and important target for drug design. Currently, the Bautista lab is working with Hydra Biosciences to test TRP channel inhibitors in mouse models that exhibit chronic itch. Encouragingly, inhibiting TRPA1 activity reduces both CQ- and BAM-induced itch responses in mice, according to Bautista. In the future, they hope to have effective and specific treatments against chronic, intractable itch.

Nikki Kong is a graduate student in molecular and cell biology.


A perennial problem

Developmental defects stemming from pesticides

Anyone who has spent time around small children knows that they are constantly experimenting: they are compelled to taste, smell, touch, and sometimes break, almost every object they come across in order to find out what it is and how it works. These are all normal behaviors, and it has long been understood that children's natural curiosity is vital to their cognitive, social, and emotional development. However, the same curiosity that nurtures young minds can also put them in danger, especially when it comes to potentially toxic chemicals in the environment. Moreover, the undeveloped nature of fetuses and infants leaves them particularly vulnerable to the outside world.

By the 1990s, researchers in the field of environmental health had become more aware of the unique effects that chemical exposures can have on children, prompting authorities to create programs and rules designed to protect our carefree youngsters. Spurred by an executive order from President Clinton in 1997, the Environmental Protection Agency (EPA) and the Department of Health & Human Services launched eight Children's Environmental Health Research Centers around the country. The goals of these research centers are to broaden our understanding of the relationship between environmental exposures and child health, and to facilitate the translation of basic research into new strategies for intervention and prevention. Focusing largely on the effects of pollutants such as pesticides, each center is a community-university partnership that fosters collaborations among academic researchers, health professionals, community leaders, and policy makers.

One of these centers was established by UC Berkeley in the Salinas Valley, an agricultural area south of San Francisco whose population is mainly low-income and Mexican-American. The first major research study on this population was led by Brenda Eskenazi, a professor at the School of Public Health, and was titled The Center for the Health Assessment of Mothers and Children of Salinas, or CHAMACOS (which means "little children" in Mexican Spanish). The center enrolled a cohort of six hundred pregnant women with the goal of observing the development of their children from before birth through age twelve. To date, the group has collected data on everything from
the children's environmental exposures to their growth and development, even detailing their genetic backgrounds. Of particular interest to CHAMACOS is a harmful class of chemicals known as organophosphate (OP) pesticides. OPs are widely used in agriculture, especially on commonly eaten fruits like strawberries and grapes; until 2003, they were also authorized for both indoor and outdoor home use. Their effects on the human body at low levels are not well understood, but at high doses, organophosphates inhibit acetylcholinesterase, an enzyme that breaks down the neurotransmitter acetylcholine (ACh). ACh is a chemical that stimulates brain activity, and without acetylcholinesterase to break it down, excess levels of ACh cause uncontrollable excitation in the brain. Bradley Voytek, a post-doctoral researcher at the University of California, San Francisco, explains the problems caused by chemicals like OPs: "Acetylcholinesterase inhibitors are a class of drugs that are found naturally in snake venoms and plant poisons; they increase acetylcholine available to neurons, which slows the heart rate and contracts muscles. Too much of it can obviously be a bad thing." Thus, increased levels of OPs can lead to a wide variety of neurological problems. In fact, they are so effective at harming human beings that they were developed as chemical weapons (such as sarin) by the Nazis during World War II, and were infamously used in a 1995 terrorist attack on the Tokyo subway.

While most earlier studies of OP exposure have focused on agricultural workers exposed to high doses, the CHAMACOS study is concerned with the lifelong, lower doses that the general population receives, particularly from pre-birth to age seven. As Professor Eskenazi explained, "Before we did these studies it was well known that pesticide poisonings occur in children; that's high-dose exposure. Our question was, what happens with low-dose [exposure] during the course of pregnancy, during fetal development, when the fetus is probably more susceptible than the adult?" These children could have been exposed to the pesticides in several ways: through farm workers in their families, drift from nearby farms, and home use. However, the most common source of exposure is residue on fruits and vegetables.

To study the effects of OPs on children's development, CHAMACOS researchers collected urine samples from the mothers (before birth), as well as their children throughout the first few years of life, looking for remnants of OP pesticides. Studies in 2007 and 2010 revealed associations between prenatal OP exposure and lower mental development scores and attention skills at two and five years of age, respectively. The seven-year visit helped to determine whether these effects were more long-term, with significant implications for the children's future. Dr. Maryse Bouchard, Dr. Brenda Eskenazi, and colleagues reported the results of the seven-year visit in a recent paper in the journal Environmental Health Perspectives. To assess the mental development of the children, CHAMACOS employed the Wechsler Intelligence Scale for Children, a test that measures four different areas of intelligence: working memory, processing speed, verbal comprehension, and perceptual reasoning. The test also measures children's full-scale IQ, and does not require reading or writing skills. The researchers asked whether there was a correlation between intelligence and OP exposure. In addition, they attempted to determine whether children were particularly vulnerable either before or after birth. They found that prenatal, but not postnatal, OP exposure had a negative effect on children's intelligence, particularly when it came to verbal comprehension. Children whose mothers had the highest levels of exposure scored, on average, seven points lower in IQ than those whose mothers had the lowest levels. The deficit was apparent even when the researchers controlled for other factors, such as maternal education and exposure to other chemicals. This may not seem like a large difference, but small disadvantages early on in life can often lead to large disparities in development as children grow older.

Alarmingly, these effects may not be confined to rural agricultural communities. While OP exposure levels among women in the Salinas Valley were higher than average for the American population, they were still within the overall range of concentrations found in the US. Furthermore, the CHAMACOS study was complemented by those of two Children's Environmental Health Research Centers in New York City (at Columbia University and Mount Sinai Medical Center), where OP exposure likely occurred only through home use and food residues. The results suggest that overexposure to OPs is not limited to agricultural communities, but might be a serious concern in the common American household.

While the use of these pesticides is decreasing in California and nationwide (the EPA reports a 52% decline in use on foods most frequently consumed by children between 1993 and 2004), much of our agricultural production is still heavily dependent on the use of OPs. To deal with these potentially harmful chemicals in our food, Dr. Eskenazi stressed that pregnant women should eat lots of fruits and vegetables, "because it's very important for the fetus, for the woman, for her children, but at the same time, make sure to wash those fruits and vegetables well." She also urged that keeping chemicals out of one's life "is always a good thing when one is pregnant, and that means everything from using toxic-free makeup and personal care products to fewer plastics, as well as fewer pesticides."

These results from the CHAMACOS study are only one example of the progress made by the Children's Environmental Health Research Centers over the past decade; other interesting research investigates how children may not be able to metabolize certain chemicals before a certain age, as well as effects that traffic pollution may have on newborns. While there is still much to learn in the world of public health, research organizations such as CHAMACOS continue to uncover interesting (and potentially alarming) facts about the effects our environment has on our health. New centers have been established to examine environmental contributions to autism, and new initiatives have been directed toward better prevention and diagnosis at both the clinical and public health levels. With any luck, the coming years will yield a greater understanding of the interaction between the environment and children, as well as more effective ways to keep them both healthy.

A common sight in the Salinas Valley: strawberries being sprayed with pesticides. Dr. Brenda Eskenazi and colleagues found that pesticides such as organophosphates can be harmful to developing fetuses even at low exposure levels.


Molly Sharlach is a graduate student in plant and microbial biology.



One small step for Cal, a quantum leap for mankind

Advances in quantum computing

How does one make a computer faster? Shrink its building blocks and pack in more of them. But there is a fundamental limit to this process: each building block cannot be smaller than a single atom. That limit is not as far-off as it may seem; today's chip manufacturers are quickly approaching it. In fact, computer manufacturers routinely construct transistors, computer chip building blocks, with features assembled from only a few thousand atoms. At this scale, physicists have found that matter behaves in ways that are unpredictable from the perspective of standard mechanics. For example, electrons stop behaving like individual particles and start acting more like waves that can interfere and leak from one wire to another. This is one example of what physicists refer to as quantum effects, phenomena governed by the principles of quantum mechanics. Quantum behavior poses a serious challenge for chip manufacturers, who have been sidestepping certain quantum effects for decades.

Irfan Siddiqi's Quantum NanoElectronics Lab (QNL) at UC Berkeley aims to harness the potential benefit of the very quantum effects that plague conventional computers. The lab is hoping to develop the first generation of quantum computers. Recently, the team took a big step toward this goal when they directly observed the quantum behavior of a small system, called a qubit, in real time. Qubits are the potential building blocks of a future generation of powerful quantum computers. A quantum computer employs the quantum behaviors of atoms to speedily perform complex calculations by parallel processing. Whereas each processor in a conventional computer must do computations one-by-one, or serially, quantum effects known as entanglement and superposition would allow a quantum computer to do multiple computations simultaneously, or in parallel. In fact, such a computer could rapidly find the factors of a given large integer, in effect testing many candidate divisors at once, a feat that would undermine many modern encryption techniques. In a matter of seconds, a quantum computer could use this advantage to perform a factorization that might take a classical computer the entire age of the universe to compute, and use those prime factors to break a code.

How can a quantum computer do more things simultaneously than a classical one? The difference lies in how each system stores information. While a modern laptop stores information in bits, binary pieces of data that are either 1s or 0s, a quantum computer would store information in qubits, short for quantum-bits. These qubits can be 1s or 0s just like bits, but they can also be a combination of the two states. This latter combination is known as a coherent superposition, and it holds the key to a quantum computers potentially massive advantage. A quarter on a table can be thought of as a bit because it can be in one of two states: heads or tails. But what if the quarter is spinning on the table? As it spins, it is in neither individual state, but rather something like a superposition of both states. It is potentially both heads and tails, acting like a qubit. If we insist on discovering what state the quarter is in, say by touching it and knocking it down, we collapse this superposition to one of two states: either heads or tails. Just as a quarter cannot spin forever, a qubit cannot maintain a superposition state forever. It is eventually knocked down due to the quantum equivalent of friction. This so-called decoherence scrambles the information stored in the superposition and can introduce insurmountable errors


The inch-long copper box pictured at left holds a tiny quantum circuit on a silicon chip. The chip is connected to the circuit board by aluminum wires thinner than the tip of an eyelash. The qubit, at the top center of the circuit, can jump between two quantum states. As shown on the right, an incoming signal interacts with the qubit, and the wave properties are changed depending on the qubit state. The outgoing signal is very weak, about one-millionth the strength of a typical Wi-Fi signal, but the very sensitive JPA amplifier allows it to be measured cleanly.

BRIEFS Soap films

250 years of mathematical exploration

Rising above Plateau's problem


Kids love to blow soap bubbles: the bubbles can merge and retain their shape, sparkle, and seemingly defy gravity. Yet these thinly stretched films are more than just fragile physical beauties: they embody on their delicate surfaces truly perplexing structures that have kept mathematicians fascinated for the past three centuries. Jenny Harrison, a Professor of Mathematics at UC Berkeley, has recently worked out the solution to a centuries-old mathematical problem that is best portrayed by soap films and, in doing so, uncovered deep insights into the foundation of mathematical structures. The question answered by Harrison's work is known today as Plateau's problem, after Joseph Plateau, a 19th-century Belgian physicist who methodically researched the physics of soap bubbles and films. He hypothesized that when you dip a loop of metallic wire into a soapy solution, the surface of the soap film formed on the wire represents the minimum mathematically possible area for the loop, no matter what shape the loop is. This theorem is deeply related to a question first posed in 1760 by the great French mathematician Joseph-Louis Lagrange: if a simple closed loop is drawn in three-dimensional space, is there an area-minimizing surface enclosed by the loop? Area-minimizing surfaces, or simply minimal surfaces, are all around you. A flat disc has the smallest area with the boundary of a circle; a sheet of paper has the smallest area with the boundary of a rectangle. Other minimal surfaces are less common, like the Möbius strip and the catenoid. The catenoid shows that minimal surfaces need not be flat. On the contrary, one mathematical definition of a minimal surface requires that the steepest-uphill and steepest-downhill curvatures be equally steep at every point on the surface, making every point look like a mountain pass or a saddle. Minimal surfaces are even found inside your body: the lipid membranes that enclose your cells are most stable if the hydrophobic middle layer of the sheet is kept isolated from the surrounding water,
At the heart of the qubit are two Josephson junctions (circled in red), formed by placing a thin insulating barrier between two superconducting aluminum wires. The qubit state is determined by whole groups of electrons moving back and forth across the insulating layers of the Josephson junctions. The wires in this scanning electron micrograph image are about 500 atoms wide.

in a quantum computation. As Dr. Rajamani Vijay, a postdoctoral researcher at QNL, puts it, "Decoherence is our number one enemy." Currently, quantum computers exist only in theory, but the physicists of QNL are hoping to change all that. They began by building their own fundamental building blocks: qubits of their own design. Those made at QNL are "essentially electrical circuits made with similar techniques to [those used to make] computer chips," according to graduate student Dan Slichter. They are unique in that they are fast, tunable, easy to manipulate, and mass producible with current technology. While the circuit's speed is important, it comes at the cost of coherence time (how long before the superposition, containing the qubit's information, collapses). "This is OK," says Vijay, "so long as each decoherence event can be detected." Then, quantum error correction techniques can compensate for the information loss. But, as Slichter points out, individual decoherence event measurements are notoriously difficult, and scientists have been trying to do this for a long time. Enter the recent breakthroughs of Siddiqi, Vijay, and Slichter. Their work used qubits made from super-cooled circuits about the size of a human cell, which are too big to display quantum effects at room temperature. At just 0.03 Kelvin, barely above absolute zero, their qubit circuit becomes superconducting, meaning it offers no resistance to currents of electrons trying to flow through it. Inside the qubit, the electrons can behave

in quantum-mechanical unison. When one of these qubits undergoes a quantum jump from one state to another (like switching from heads to tails), which may be a signal of a decoherence event, it introduces a tiny shift in the electromagnetic (EM) waves in a nearby sensor. Traditionally, a chain of amplifiers would amplify this signal at the cost of adding noise. This added noise has been so large that it drowns out the quantum jump signal being amplified in the first place. Here, QNL has made its mark. Using their unique new piece of quantum electronics called a Josephson Parametric Amplifier (JPA), the researchers have found a way to maintain the integrity of the quantum jump signal. The JPA mixes the weak jump signal with a pump tone, a strong EM wave at the same frequency as the weak signal. This frequency-matched carrier signal amplifies the signal from a single qubit quantum jump above the noise introduced by the later amplification. By enabling the detection of individual decoherence events in qubits both directly and in real time, QNL cements a key step on the way to the correction of information loss due to decoherence. Their work puts a functional quantum computer, a means of making atom-sized computer building blocks, significantly closer.
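For readers who want the coherent superposition described above in symbols, the standard textbook notation (generic quantum mechanics, not taken from the QNL work) writes a qubit as a weighted blend of its two classical states; a measurement then returns each outcome with a probability set by those weights, and the superposition is lost, much as touching the spinning quarter knocks it flat.

```latex
% Generic two-level qubit state (illustrative textbook form, not specific to the QNL circuits).
% The complex amplitudes alpha and beta weight the two classical outcomes.
\[
  \lvert\psi\rangle \;=\; \alpha\,\lvert 0\rangle \;+\; \beta\,\lvert 1\rangle ,
  \qquad \lvert\alpha\rvert^{2} + \lvert\beta\rvert^{2} \;=\; 1
\]
% Measurement returns 0 with probability |alpha|^2 and 1 with probability |beta|^2;
% uncontrolled decoherence scrambles alpha and beta in a similarly irreversible way.
\[
  P(0) \;=\; \lvert\alpha\rvert^{2},
  \qquad
  P(1) \;=\; \lvert\beta\rvert^{2}
\]
```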

Zlatko Minev completed a physics major at UC Berkeley and is now a graduate student at Yale.


BRIEFS Soap films

a configuration that is best achieved by minimizing the membranes' area. According to Plateau's many experiments with soap films and bubbles, their structures must not only be minimal but also mathematically smooth (i.e., having derivatives of all orders) and exhibit the lowest possible energy that can be associated with the enclosed surface. Following his discoveries in the late 1800s, the study of soap films garnered much attention in the early 20th century, as many brilliant mathematicians strove to find a general mathematical description of their properties. In 1936, Jesse Douglas shared the first Fields Medal (akin to a Nobel Prize of mathematics) for providing a particular class of solutions to Plateau's problem, but while his special proof was a major step forward, it was not the end of the road. Another 75 years would pass before a fresh look and a novel approach would bring the full, general solution. In early 2011, Jenny Harrison submitted a paper for publication that describes a new approach to Plateau's problem: one that generalizes all previously studied special cases, including Douglas's. Her proof applies to all dimensions and surfaces, from garden-variety surfaces like planes to more exotic ones like Möbius strips, a feat that eluded all previous attempts at a solution and represents a truly astonishing leap in mathematical reasoning. Harrison began her mathematical exploration of Plateau's problem in 1986 by trying to find new, simpler, and more general analysis methods for the universal treatment of a wide variety of mathematical surfaces such as fractals, smooth manifolds, and soap

films, as well as physical phenomena like the distribution of electrons on surfaces. In particular, she sought frameworks within which the classical methods of calculus could apply to all of these systems equally. "I spent a good part of the last two decades working in relative isolation on this fascinatingly beautiful problem," Harrison says. "My goal was to simplify the analytical methods of mathematics and at the same time extend the scope of applications, and isolation was absolutely necessary in order to maintain a state of creative flexibility." Harrison's four-decade-long study of the foundations of geometric properties equipped her with a deepened insight that allowed her to finally take up the formidable challenge of Plateau's problem when she learned ten years ago that it was still lacking a general solution. (Professor Frederick Almgren of Princeton University had claimed a general solution in 1966, but it was refuted by Professor Frank Morgan of Williams College in 2001.) Driven by the problem's elegance, she spent the last decade converging on the most general framework within which the problem could be solved. "As I drew closer to the source, I could make out more of what was drawing me in," she explains. "The problem had evolved into something more beautiful and more powerful than anything I could ever have hoped to imagine." Harrison's solution, which successfully takes into account all of Joseph Plateau's foundational experimental observations about soap film surface properties and their unique geometric characteristics, relies fundamentally on a new mathematical framework that she refers to as Quantum Integration Theory. Central to this theory are four primitive operators (extrusion, retraction, pre-derivative, and reduction) that act on infinitesimal mathematical objects known as Dirac chains. The operators can be used to move between dimensions, to find relationships between surfaces and their boundaries, and ultimately to prove the existence of a minimal surface enclosed by a boundary, precisely the solution to Plateau's problem. The true beauty of Harrison's solution lies in its generality: because of the efficacy of her Quantum Integration Theory, the

A catenoid is formed when two parallel circular rings are slowly separated after being dipped in a soapy solution, producing a curved minimal surface. Proven to be minimal in 1744 by the great mathematician Leonhard Euler, the catenoid soap bubble adjusts its shape to minimize the area created in three dimensional space.
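The mountain-pass picture used earlier has a compact standard statement in differential geometry (a textbook definition, not taken from Harrison's paper): at every point of a minimal surface the steepest-uphill and steepest-downhill bending, the two principal curvatures, cancel exactly, so the mean curvature is zero.

```latex
% Textbook definition of a minimal surface (illustrative; not drawn from Harrison's proof).
% kappa_1 and kappa_2 are the principal curvatures at a point; H is the mean curvature.
\[
  H \;=\; \tfrac{1}{2}\left(\kappa_{1} + \kappa_{2}\right) \;=\; 0
  \qquad\Longleftrightarrow\qquad
  \kappa_{1} \;=\; -\,\kappa_{2}
  \quad\text{at every point of the surface.}
\]
```

A flat disc satisfies this trivially (both curvatures are zero), while the catenoid pictured here satisfies it with equal and opposite curvatures at every point.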

Alireza Moharrer is a solar power engineer in the Bay Area.


proof spans many applications beyond the easily visualized physical systems of soap films and cellular membranes. Perhaps the most fascinating application introduced in Harrison's paper is based on a class of novel mathematical operators known as quantum chain complexes that could potentially expand our understanding of the mathematical foundations of the physical universe. She was very surprised to find out that the four primitive operators she uncovered correspond to the creation and annihilation operators that are fundamental to quantum field theory, a discovery that could provide new insight into important unsolved questions in quantum physics. "I am not suggesting that we can solve any of these problems, and certainly not without more work," Harrison cautions, "but only that the theory provides new tools which may turn out to be very helpful." Professor Harrison's modesty should not diminish the importance of her discovery. As the history of scientific discovery demonstrates, creativity can emerge from the intersection of deep explorations and ingenious inventions, a trend embodied by Harrison's inspiring work. More than just providing a proof of Plateau's problem, the delineation of her new Quantum Integration Theory could open up entirely unexplored territories in mathematical physics and beyond.


A lab space of one's own


The QB3 Garage: an incubator for innovation

by Susanne Kassube


California Historical Landmark No. 976 is a mythical place for entrepreneurs. Located at 367 Addison Avenue, Palo Alto, CA, it is home to the garage in which Bill Hewlett and David Packard developed HP's first product, the Model 200A audio oscillator. The garage is not only the birthplace of Silicon Valley, but also a symbol for innovation and the Californian entrepreneurial spirit. For entrepreneurs in the life sciences, developing ideas into innovative products requires more than what can typically be found in a backyard garage: they need lab space that's suitable for performing experiments in compliance with environmental health and safety regulations. For emerging companies with limited financial means, this space is hard to find in commercial real estate: the minimum unit of lab space that can be rented is around 2,500 square feet, which is too expensive for most beginning entrepreneurs to put on their credit cards. When Regis Kelly and Douglas Crawford joined the California Institute for Quantitative Biosciences, or QB3, they identified the space issue as one of the main barriers between great scientific discoveries and innovative products that reach the marketplace. They decided to start a tiny incubator of ~2,500 square feet at UCSF, which they called the Garage to commemorate HP's humble place of origin. The smallest unit that can be rented at the Garage is a single lab bench, equivalent to ~120 square feet, which in many cases is sufficient for carrying out proof-of-principle experiments to get the company off the ground. The idea was initially met with skepticism by venture capitalists. "Some of them said don't bother, this is a recipe for mediocrity, an intensive care unit for small companies that will not amount to anything," recalls Crawford. "But we proceeded because it was QB3's strategic goal to promote great science and to help enrich our society. We believe that basic research will lead to economic growth, but

if we don't help it move through the final mile, to get the discovery to the marketplace, we are not meeting our social contract." The success story of the QB3 Garage's first tenant, Fluxion Biosciences, supports Crawford's point. Founded in 2006, the company moved to South San Francisco in 2008 and now has 30 employees. "When you go there, it's exactly what people hope for from the science in our universities. Now there's a small factory in South City, hiring high school graduates to manufacture microfluidics devices. It's the full impact: it's jobs, it's cool research tools that will drive future discoveries, and it is the realization of the potential of laboratory research. It showed us that it is possible to start with very little, a tiny amount of space, and produce a company of great value," says Crawford.

From postdoc to entrepreneur


The idea for Fluxion Biosciences was born in Luke Lee's lab in the Department of Bioengineering at UC Berkeley when Cristian Ionescu-Zanetti, a postdoc at the time, became interested in working outside of academia. He enjoyed his research, but felt that in the academic environment he was "taking things maybe a fifth of the way towards something that really works, a product that's better than the status quo." Together with a graduate student in the lab, he applied for a Small Business Innovation Research (SBIR) grant, entered business-plan competitions, and eventually became the first company to move into the Garage at UCSF. "They came and knocked on our door when we were still planning; we had dedicated the space, but we hadn't even started to get the approvals from the university," Crawford says. "In the end, their inquiry for space precipitated it all; it was a nice synergy between us and Fluxion." While starting the company at the Garage, Ionescu-Zanetti continued working as a postdoc half-time, but soon devoted all his efforts to the company. Fluxion currently markets two products, called IonFlux and BioFlux. "Technology-wise our focus has always been to take labor-intensive processes, such as drug screening, and parallelize and automate them, to make them faster, better, and cheaper," explains

Bill Hewlett and David Packard's garage in Palo Alto, now designated as California Historical Landmark 976.



QB3 was established in 2000 by Governor Gray Davis as one of four Institutes for Science and Innovation in California and comprises the three UC campuses at Berkeley, San Francisco, and Santa Cruz. The aim of QB3 is to accelerate discoveries that will benefit society. Through its Innovation Toolkit, QB3 provides lab space in its incubators as well as mentoring, networking, and funding opportunities for nascent entrepreneurs, and connects researchers at its universities with the private sector. QB3 is led by Regis Kelly (left), a neuroscientist and former Vice Chancellor of UCSF. Douglas Crawford (right) joined QB3 as Associate Director after completing his PhD in biochemistry at UCSF.

Ionescu-Zanetti. The IonFlux automatically records the flux of ions through membrane channels without the need for intermediate user intervention. The machine uses standard-format multi-well plates commonly used in high-throughput screening, and can be readily adapted to existing screening platforms. Pharma companies and academic labs use the IonFlux for screening the effect of drugs on membrane channels, as well as for characterizing the consequences of mutations on ion channel currents. Conceptually, the IonFlux was Fluxions first product, but after talking to potential customers, the company soon started developing its second product, the Bioflux, which allows researchers to perform live-cell assays under shear flow. For a variety of applications, such as studies of platelet adhesion that naturally

occurs in the bloodstream, the BioFlux mimics the physiological environment much better than traditional assays. The BioFlux instrument is used by scientists in both academia and industry for a number of cell-based assays, ranging from wound-healing research to studies of bacterial biofilms. "The rewarding moments came after our first instruments went into pharma companies, and their people came back and said, 'This is much better than what we were doing before.' They were really excited and even wanted to publish papers with us," says Ionescu-Zanetti. Once Fluxion had moved into the Garage, the remaining space filled up quickly. "It's been full ever since it opened, and we get one to four inquiries a week from nascent companies looking for space," says Crawford. The increasing demand prompted

Crawford to expand capacity, which led to the creation of the QB3 Garage/Innovation network that now comprises four incubators: the original Garage at UCSF, the Garage at the UC Berkeley campus in Stanley Hall, the QB3 Mission Bay Innovation Center, and finally the QB3 East Bay Innovation Center, which opened in July 2011, and there are already plans for adding the next incubator to the network.

Getting funding in tough times


Allopartis is one of the companies that started out in the QB3 Garage at UCSF and then moved into the Mission Bay Innovation Center, which is now home to more than 20 start-ups. It was cofounded by three former students from Richard Mathies's lab in the Department of Chemistry at UC Berkeley: Robert Blazej and Nick Toriello, who graduated from the joint UCSF/UC Berkeley Bioengineering graduate program, and Charlie Emrich, a biophysics graduate. "One of the hardest things about starting the company was getting funding. We were founder-financed at the beginning for about 8 months, which meant that all three of us went almost totally broke before we got it funded," says Emrich. Meeting venture capitalists could sometimes be a surreal experience for someone who had just gotten


Fluxion Biosciences founder, Cristian Ionescu-Zanetti, at the UCSF Garage laboratory.




Robert Blazej, co-founder of Allopartis, working in his laboratory at the UCSF Garage.

out of graduate school: "Here we were in our beaten-down cars, driving down to Menlo Park where all the venture capitalists work, parking in between a Maserati and a brand new BMW," says Emrich. Allopartis eventually got funded right around the time when the market crashed, which forced them to be creative with the resources they had. They used the money to prove their core technology, the AlloScreen, and have attracted further investments, including grants from the Department of Energy, ever since. The AlloScreen employs the principles of directed evolution and a unique selection system to generate enzymes with optimized properties, such as activity or stability. While natural evolution happens on a timescale of millions of years and relies on spontaneous mutations to create proteins with altered properties, directed evolution in the laboratory accelerates this process by artificially creating a library of many variants of the original DNA gene that encodes the enzyme. In the AlloScreen, each of the DNA variants is then attached to a substrate particle, and emulsified with the contents of a cell-free expression system in order to produce the many different protein variants that are encoded in each DNA variant. If a particular protein variant is active, it will be able to digest its substrate particle and release its coding DNA, which can subsequently be separated by centrifugation from all the inactive variants, which will stay attached to the bead. The information obtained by sequencing the released DNA molecules is the basis for the further characterization of the altered proteins they encode. Emrich and his colleagues have used the AlloScreen to improve the activity of cellulases, enzymes that digest cellulose. "Cellulose is the most abundant biopolymer on earth," explains Emrich, "it is a linear chain of glucose molecules, and these chains are magically very crystalline, not very soluble, and very recalcitrant, so they do not break down easily." Because of these properties, cellulosic enzymes are the holy grail in the making of biofuels. While glucose can relatively easily be fermented into ethanol, breaking down cellulose into glucose is the rate-limiting step. With improved cellulases, abundant and renewable resources such as agricultural waste and non-food crops could be used for the production of low-emission biofuels that could substitute fossil fuels and lower greenhouse gas emissions. Although getting there will be a long journey, scientists at Allopartis have already created cellulase variants with improved activity and are optimistic. "We're now getting some commercial traction for those variants," says Emrich. In collaboration with Louise Glass in the Department of Plant and Microbial Biology at UC Berkeley, they are now working on co-evolving cellulases with engineered strains of the cellulolytic fungus Neurospora crassa to better understand the activity profiles of different types of cellulases.

Crossing the valley of death

Allopartis has successfully crossed what is known among entrepreneurs as the first valley of death, a gap in funding opportunities for projects that go beyond the academic research that the NIH will fund, but are not yet at a stage of maturity where they can attract commercial funding. Once a start-up has obtained minimal funding to bridge this gap, the next big challenge usually lies in finding affordable equipment. Omniox, a current start-up at the Garage at UCSF, was lucky in that the economic crisis worked in their favor. Many biotech companies were going out of business at the time and sold their equipment, often at bargain prices. "We had the deal of a century; we easily got a million worth of equipment for $30,000," recalls Stephen Cary, co-founder of Omniox. The company is developing a molecular carrier that will deliver oxygen to hypoxic tissues, areas of the body that are starved



Pathway of success for QB3 Garage start-ups (diagram)

Initial idea, incorporate company (~$1,200) → QB3 support (space, mentoring): obtain equipment and develop technology → [First Valley of Death] → Rogers Bridging-the-Gap Award, SBIR grants: further develop technology / pre-clinical studies ($1-3 million) → [Second Valley of Death] → Phase III SBIR funding and additional venture capital: commercialization / clinical studies ($15-100 million)

Funding opportunities
Business plan competitions are a great opportunity for budding entrepreneurs to raise funding, network, and get in touch with venture capitalists. The UC Berkeley Business Plan Competition (BPlan) is organized by MBA students and held annually at the Haas School of Business.

The Rogers Bridging-the-Gap Award enables teams of researchers led by a QB3 faculty member to develop ideas with the potential to benefit society. Desired outcomes are filing of intellectual property patents and incorporation of a company. The award is administered by QB3 and supported by the Rogers Family Foundation. Three projects are funded with up to $100,000 each per year over a two-year period.

Small business innovation research (SBIR) grants are federal research funds that support projects that have the potential for commercialization. A Phase I grant is worth $150,000 and allows nascent companies to conduct proof-of-principle experiments to establish feasibility of their concept. Successful Phase I awardees can apply for a $1 million Phase II grant to develop their projects further. Crawford recommends filing an SBIR grant early, while still at the university as a postdoc or graduate student: "It's a very productive way of starting a company. It forces you to focus on what your most important milestone is, and if the grant is funded, you don't have to go through the extreme poverty in the beginning stage of the company."

Mission Bay Capital (MBC) is a seed-stage venture fund managed by QB3's Regis Kelly and Douglas Crawford on a pro bono basis and supported by venture capitalist experts Brook Byers and John Wadsworth. MBC funds four companies per year with ~$500,000 each.

of oxygen such as tumors. The first protein that comes to mind for this purpose is hemoglobin, the protein that transports oxygen from the lungs to all other tissues in the body. However, government agencies and companies have tried for decades to develop hemoglobin into an oxygen transporter that could be used as a blood substitute, with no success: taken out of red blood cells, hemoglobin scavenges nitric oxide, with devastating effects for the body. It was during his final days as a graduate student in Michael Marletta's lab in the Department of Molecular and Cell Biology at UC Berkeley when Cary had an idea that seemed too good to be true: "I was reading up on all the efforts around developing

hemoglobin as an oxygen delivery therapeutic, reading paper after paper about how the FDA was rejecting it for trauma and surgery because of the toxicities, when I suddenly remembered a group meeting from a few weeks before where Elizabeth Boon, a postdoc in the lab, had presented on a protein that didn't have very much nitric oxide reactivity, and I thought wait a minute, maybe this is a much better platform, because it's a stable gas sensor, rather than being a gas reactor." The protein is part of the heme nitric oxide/oxygen binding family, or H-NOX. Like hemoglobin, it uses heme as a cofactor, but subtle differences in the coordination geometry result in very different oxygen-binding properties. Cary presented his idea

to Jonathan Winger, a postdoc in the lab, and together with Marletta they successfully applied for a Rogers Bridging-the-Gap Award for translational research at QB3. To develop their idea further, they founded Omniox and Emily Weinert, a postdoc in the Marletta lab, created and characterized more variants of the protein with a range of oxygen binding affinities. Under Carys leadership and supported by a National Cancer Institute SBIR grant, the company then moved into the Garage at UCSF. The H-NOX protein could potentially be used as a therapeutic in many different diseases that are associated with hypoxia. Potential applications include treating stroke, managing sickle cell pain, and wound


healing. Although Cary considers branching out, the company currently focuses on overcoming hypoxia in tumors. "Hypoxia is a huge driver of tumorigenesis and metastasis," explains Cary. If the growth of blood vessels can't keep up with the growth of the tumor, large regions are starved of oxygen. Those regions are hard to target using conventional therapies such as radiation, which relies on the damaging effects of reactive oxygen species. In addition, cells in the hypoxic regions usually become more aggressive as a consequence of being starved of nutrients, energy, and oxygen, and tend to form metastases in other parts of the body. The goal of Omniox is to improve existing cancer therapies by bringing oxygen to the tumor. Initial studies using a mouse model show that the protein very efficiently travels from blood vessels into the tumor tissue to deliver oxygen to previously hypoxic areas, a phenomenal success that was rewarded with a $3 million SBIR Phase II Award from the National Cancer Institute. Omniox is now eligible to apply for an SBIR Bridge Award if they can secure matching funds from private investors, which would add another $6 million to their budget. These funds will pay the costs of optimizing a lead candidate that will then be used for pre-clinical studies to determine its efficacy and its toxicity profile in animals. If all goes well, H-NOX will be ready for clinical studies in about two to

three years. The company will then face what Crawford calls the second valley of death. QB3 has worked hard toward bridging the first valley of death by providing mentoring, funding, and lab space to start-ups, but bridging the second valley of death, the gap between pre-clinical and clinical studies, poses further difficulties: "The enterprise back at the discovery end is expensive, but the cost in the clinic dwarfs that," explains Crawford. Cary estimates the costs for phase I and II studies at around $15 million; getting H-NOX to the market through phase III studies will add another $50 to $100 million. But, given Omniox's latest results, it seems likely that they will find investors willing to pitch in and help Omniox get to work helping cancer patients.

From bench to business


One of Omniox's neighbors at the Garage at UCSF is Refactored Materials, a start-up that works towards the synthetic production of spider silk. Spider silk is a material of phenomenal strength, lightness, and flexibility that outperforms all man-made materials and could potentially be used in applications ranging from lightweight and durable clothing to artificial tendons. A big challenge for commercialization is the production of spider silk: "Spiders can't be farmed, they're territorial, they will attack each other, eat each other, and no one has

been able to make spider silk recombinantly on a commercial scale," explains David Breslauer, a graduate from the joint UCSF/UC Berkeley Bioengineering graduate program. Together with two other graduates from the same program, Dan Widmaier and Ethan Mirsky, Breslauer co-founded Refactored Materials and has been a tenant at the Garage at UCSF since May 2010. They decided to use yeast cells for the recombinant production of the large silk proteins, and have already produced enough silk protein to try to make fibers. "Fibers are generally either melt-spun, meaning that you melt a polymer, extrude, and cool it, or wet-spun, meaning that you dissolve a polymer and extrude it into a non-solvent that coagulates it," explains Breslauer. In contrast to many other emerging companies, Refactored Materials was funded from the beginning: they got their first grant right when Breslauer graduated. Although the company is now minimally funded through federal and state grants for several years, they're still looking for additional sources of money: "We're moving faster than those grants can support," explains Breslauer. "It's nice to have them, but it's not necessarily something to rely on." While the first six months at the Garage felt very similar to working in an academic environment for Breslauer, the mindset changed once they started to work harder

Refactored Materials co-founders, Dan Widmaier (left) and David Breslauer (right), examine spider silk fibers at the UCSF Garage.



on developing the business aspect. "You suddenly stop caring as much about publications, you're just trying to make something that really works, rather than understand every little detail about it," Breslauer remembers.

Strategic partners to finance growth


Silicon BioDevices was the second company to move into the Garage at Berkeley after it opened its doors in summer 2010. The company is developing diagnostic devices that are based on digital microchips and can detect tiny amounts of specific proteins in a liquid sample such as blood. The ease of use combined with high sensitivity and low costs (the single-use device will be available for $1.50) sets them apart from the bulky and expensive analyzers that are currently available on the diagnostics market. Once a drop of whole blood is applied, a membrane at the top of the device separates red blood cells from the plasma. The plasma then solubilizes antibody-coated magnetic particles on the back of the membrane, allowing them to bind the protein to be detected and a secondary antibody. Nonspecifically bound particles are removed magnetically, and the remaining particles are detected by the chip. After the signal is read out and processed, the test result can be sent directly to the physician's cell phone by a wireless transceiver that is integrated into the device. A significant advantage with regard


to safety is the self-testing capability of the device: "The sensor can control the assay by making sure it's run correctly, in a timely way, and you can disable it if it has been compromised in any way," explains Silicon BioDevices co-founder Octavian Florescu, a graduate from Bernhard Boser's lab in the Department of Electrical Engineering and Computer Science at UC Berkeley. The user-friendly design might eventually allow for diagnostics at home, but for the near future Florescu hopes that the device will find its way into emergency departments and physicians' offices, rendering time-consuming laboratory testing obsolete. "The number one reason doctors don't perform in-office testing is because it requires extra time and extra staff," says Florescu. The device would be the first highly sensitive diagnostic tool that could be integrated seamlessly into a physician's workflow. Although initial results are promising, it might still be another three years until you encounter one of Florescu's chips at your local doctor's office. Developing the final prototype and making it manufacturable will take approximately two years before the device is ready for approval by the FDA, which might then take another year. The company has raised money from

business plan competitions, but is financed out of Florescu's pocket for the main part. In order to finance further development and production, the company is now approaching life science investors and strategic partners. "It's a very slow process; even if you have a great technology, you have to add another 12 to 18 months to strike a good deal," explains Florescu. Judging from the history of its predecessors at the Garage, it seems likely that Silicon BioDevices will be able to close a deal: out of the first six companies that started at the Garage at UCSF, four closed venture financing rounds and a fifth was acquired by Affymetrix for $25 million. "There's now one very wealthy 28- or 29-year-old after starting a biotech company at the Garage," says Crawford, adding: "We're not promising that for everyone, but it's nice to know that there is at least that possibility."

Starting a start-up
Contrary to what one might think, the high success rate is not based on an evaluation of the commercial potential of the companies by QB3. "We don't want to be rigorous in the evaluation of the market opportunity of what they brought to us, we want to be rigorous in our evaluation of the people. Most real innovations are diamonds in the rough: over and over again, we don't see it when it comes. If you have good people, you get to the right conclusions most of the time, and we want to help them grow as quickly as possible," says Crawford. Cary's advice for nascent entrepreneurs? "Just do it! You can be a postdoc with an idea, and you can start a company. For $1,200 you incorporate your company, then you take your science idea and submit it as a six-page SBIR grant, and eventually you get half a million dollars." Although the process might not always be so smooth, starting a company is a rewarding experience, says Crawford: "I do not know of a single case where an individual regretted their decision. All admit that it's the hardest thing they've ever done, stressful, but satisfying in a way that then exceeds their expectations." So, what are you waiting for?


Spools of Refactored recombinant spider silk.

Susanne Kassube is a graduate student in biophysics.



The science behind positive psychology

The brain is half full

by Azeen Ghorayshi

As the story goes, University of Pennsylvania psychologist Martin Seligman, a self-described pessimist, was weeding his garden when his five-year-old daughter Nikki began playfully shrieking and tossing weeds in the air. As Seligman scolded her harshly for being disruptive, his daughter spun around, looked Seligman in the eye, and said: "Daddy, stop being such a grouch!" Seligman says that he took this as a wake-up call. Being a pessimist was something he could deal with, even something to be proud of in his academic circles, yet being called a grouch made him cringe. His research up to this point centered on the roots of depression, but the gardening incident made Seligman realize that he, and perhaps the field of psychology as a whole, had focused solely on negativity for too long. So, shortly after his appointment as president of the American Psychological Association in 1998, Seligman charted out a new approach for the field. Dubbed positive psychology, this branch of research would focus on human thriving over human pathology, studying function over dysfunction. The movement quickly developed a following, including at UC Berkeley. At around the same time, an entirely separate journey brought two Cal alumni, Tom and Ruth Hornaday, back to their alma mater. The Hornadays had recently dealt with a tragic family loss, and came to Berkeley in 2001 with the idea of funding multidisciplinary research on social and emotional wellbeing. After speaking with several Berkeley professors already studying the positive psychology topics that they wanted to help promote, the Hornadays created the Greater Good Science Center (GGSC). Housed within the UC Berkeley Child Study Center on the south side of campus, the GGSC offers undergraduate and graduate research fellowships, holds community lectures, and publishes the online Greater Good magazine to highlight current research in the field. With the belief that positive human traits are innate and strongly tied to individual thriving, the GGSC and its positive psychology peers hope to promote the elusive holy grail of personal achievement: true happiness.

Our better halves


Before the field of positive psychology could really get off the ground, it needed a manifesto of sorts, a clearly paved vision for its new focus on positive human behaviors. To create the common language and standardized protocols necessary for a rigorous scientific discipline, Seligman and his cohorts wrote the Character Strengths and Virtues (CSV) manual, equivalent in purpose, but opposite in focus, to the Diagnostic and Statistical Manual of Mental Disorders (DSM) used to characterize psychological conditions for over 50 years. Psychology up to that point, said Seligman, had studied only half of the landscape of the human condition, and the CSV would thus serve as the DSM's natural counterpart. The manual lays out the central tenets of the positive psychology field. The main idea is that virtues such as compassion, courage,


and wisdom are as much a part of our human nature as selfishness, weakness, or ignorance. Therefore, just as psychological illnesses need to be identified, treated, and prevented, an academic study of human strengths is needed to help understand and therefore better cultivate good character. Perhaps most importantly, Seligman explains that tuning in to our collective positive natures will not only make us better people, it can also make us happier. In keeping with the tenets of positive psychologists, the professors at Berkeley's Greater Good Science Center strongly support the idea that humans are hard-wired to care, and in turn that caring creates happier individuals. One of their primary research focuses has therefore been on altruism, the sometimes puzzling phenomenon where people put the well-being of others, occasionally even strangers, before their own. The question of whether or not truly altruistic altruism exists has divided social psychologists for years. On one side of this so-called altruism question is the idea that cooperation and kindness mask deeper, sometimes unconscious, self-interest. On the side of the GGSC and the positive psychologists is the belief that there is a truly evolved altruism woven into our emotional makeup. "One of the biggest challenges to the claim of sincere altruism is that what you're seeing is really a strategic pursuit of prestige or reputational gain," says sociology professor and GGSC affiliate Robb Willer. "Well, we wanted to show that not everyone

is driven by that." To do this, Willer and GGSC co-founder Dacher Keltner teamed up at Keltner's Social Interaction Laboratory to separate genuine altruism from its less virtuous counterparts based on a variety of measures. First, they separated the public and the private spheres of social interaction; in a measure of genuine altruism, individuals concerned with public opinion would presumably be more selfish when given the opportunity to act anonymously. As a test of this model, a group of 94 undergraduate research participants were first given a written evaluation to determine their self-perceived generosity. They then acted as the subjects in a series of economic scenarios designed to test how likely they were to allocate a pool of resources, to be exchanged at the end of the experiment for real money, to another individual. In the private condition, there was no third party viewing the transactions, and the subject was only identified by a letter. In the public condition, a third party was present to see how much, if any, of the resources the subject decided to share. But here was the catch: the third party could potentially return money to them as well, giving the subjects a reputational incentive to act more prosocially than they otherwise might. What Keltner and Willer saw emerge was a sincerely altruistic group that sustained stably high levels of generosity regardless of the public or private conditions. In contrast, more reputationally altruistic people acted

After observing domesticated family pets and wild animals like swans, chimpanzees, and wolves, Darwin carefully recorded animal responses such as purring, snarling, and tail-wagging.

prosocially in the public condition, but their generosity dropped off significantly when given the opportunity to act unwatched. Further testing showed that the sincerely altruistic group both placed less personal value on status and sustained their levels of generosity after going through a classic experimental construct designed to disable their ability to pretend. "A lot of people

In 1872, Charles Darwin published The Expression of the Emotions in Man and Animals, a hereditary study of behavior. At the time, the dominant belief was that humans possessed unique and divinely created muscles to express emotions. In Expression, however, Darwin claims that our emotional capabilities are subject to natural selection. Using detailed illustrations and close analyses of physiological responses to different emotions, such as hair raising, vocal emissions, perspiration, and the precise movement of facial muscles, Darwin traces purposeful links between expressions of emotion in animals and their human equivalents. He concludes, "the young and the old of widely different races, both with man and animals, express the same state of mind by the same movements." Notably, the book was also the first scientific text to make use of the new medium of photography.



think that we are not generally good as a species, that we are bloody, violent and genocidal," says Keltner. "These negative aspects have a clear evolutionary story, but we want to show that our prosocial side as a species is equally important in really thinking about who we are and how to cultivate mental health." This idea is key to both the positive psychologists' and the GGSC's approach; they do not disregard the weaknesses inherent in human nature. Rather, they argue that by working to better understand what can make us good, we could work to make ourselves better. Alongside their goal of studying what makes an individual inherently good is the concept that virtue and happiness are heavily intertwined, a concept called eudaemonia that dates back to the ancient Greeks. Showing that there is such a thing as truly innate altruism is thus crucial to their idea that tapping in to our shared inner generosity can increase our personal well-being. Help others, they say, and you can help yourself.

An empathetic gene

Keltner is something of a celebrity in the popular psychology world. He has shoulder-length silvery blonde hair and exudes the sort of calm eloquence one might expect from a surfer-turned-academic. His research is regularly featured in major media outlets like The New York Times, Nightline, CNN, and Oprah. And so it is no surprise that his undergraduate psychology course, Human Happiness, is one of the most popular classes at Berkeley. In the class, Keltner often asks his students the following question: Where do our individual senses of morality come from? "They will usually give me one of several true, but only partially true, answers: from their parents, from their culture, from their religion, from the books that they read,"

says Keltner. But in the last fifteen years psychologists have really started to think about how morality is also rooted in evolution and genetics, raising the more specific question: Where is something like morality located inside of us? In 1872, thirteen years after the publication of On the Origin of Species, Charles Darwin attempted to find the evolutionary roots of what was until then thought to be a distinctly human characteristic: emotion. In The Expression of the Emotions in Man and Animals, Darwin took a plethora of human feelings, such as anger, grief, shame, sympathy, and joy, and attempted to find their animal equivalents. "Until that point, Western culture believed that our incredibly complex emotional capacity was associated with a feeling of the sacred, and therefore had to be God-given," says Keltner. "What Darwin said is that these emotions are really as much a part of our evolutionary heritage as the other traits he had studied." Working with the idea that our emotional capabilities are coded somewhere within our genomes, Keltner teamed up with Sarina Rodrigues, a post-doctoral candidate in the GGSC's research fellowship program. Rodrigues was interested in a hormone and neuromodulator called oxytocin, sometimes referred to as the love hormone. Oxytocin exploded in popularity in the early 1990s for its well-established roles in emotional behaviors like parenting and pair-bond formation. In animals such as the



female prairie vole, for example, oxytocin released during physical contact with a male has been strongly implicated in establishing the vole's lifelong monogamy. In humans, the hormone was initially only known to have effects on pregnancy and labor, but in the past ten years increasing evidence has pointed to its involvement in complex emotional brain processes such as trust, generosity, and even love. Oxytocin can act both as a hormone, traveling long distances through the bloodstream to carry out effects far from its origin in the brain, and as a neurotransmitter, binding to small proteins called receptors on the surface of neurons to cause changes in firing. Since Keltner and Rodrigues were interested in empathy, they focused their project on oxytocin's effect at the neuronal level, looking at the two known versions of the oxytocin receptor (see "Can you second my emotion?", BSR Fall 2010). Located on the third chromosome of the human genome, the two variants differ by only a single nucleotide: a guanine (G) in one version

is switched to an adenine (A) in the other. Although the difference seems trivial, it had previously been implicated in some interesting behavioral differences, so they decided to take a closer look. After genotyping a group of 192 participants to determine which of the two variants each person expressed, Rodrigues put the subjects through a battery of behavioral tests and self-reports to identify their potential differences in empathy. Rodrigues and Keltner relied on the well-established fact that empathy is closely tied to an understanding of other people's emotions, a capability often lacking in individuals with genetic disorders such as autism. Using a standardized empathy measure called Reading the Mind in the Eyes, participants were shown 36 black-and-white pictures of people's eyes and then asked to choose the word that best described the subject's apparent mood. They also tested all of the participants for their relative stress reactivity, in keeping with oxytocin's known calming effects. What they found was that subjects with the G variant consistently scored higher on the eye-reading task and lower on the stress test than subjects with the A variant. Although the differences were not always hugely significant, the results indicated that a single nucleotide difference in the genetic code could potentially be tied to something as complex as empathy. "I came into this research as a big skeptic," said Rodrigues in a New York Times article about oxytocin published shortly after her paper came out, "but the results had me floored." Her findings, published in 2009, were added to the slew of recent papers regarding oxytocin's role in many of the emotional processes that we most associate with being human. And yet, according to both Keltner and Rodrigues, the oxytocin study does not suggest that there are inherently empathetic or unempathetic people; a genetic correlate rarely indicates fixed, unchanging characteristics. Rather, they say, it's just one element, and perhaps a strong jumping-off point, for something much more complex. For example, take a child's height. "This is a trait that is 100% heritable, and it's still very susceptible to environmental intervention: if you feed kids better, they grow taller," says Philip Cowan, an executive faculty member of the

GGSC who studies applied developmental psychology. "So you can't even think about genetics without environmental context, and you certainly can't think about environmental context without genetics." Their goal is that with an increased understanding of how positive behaviors are rooted in our genes, we can work to better cultivate them environmentally.

A nurturing nature
"We know there are empathetic and prosocial children, and as a scientist you want to figure out the physiological underpinnings of that profile," says Keltner. "But also, and this is where the GGSC comes in a lot, how do you make kids more empathetic?" Much of this branch of research has to do with positive interventions, positive psychology's preventative, prosocial counterpart to classical psychology's interventions implemented when antisocial behaviors reach a breaking point. Cowan, who has worked for 40 years with his wife Carolyn on family systems and child development, focuses primarily on positive interventions in the earliest stages of life: the prenatal and early childhood periods. In a longitudinal study published in 2011, the Cowans showed that of 100 two-parent couples raising their first child, those who attended regular couples therapy before their child's first year of kindergarten were able to sustain a much more stable family system for ten years after the sessions were complete. Furthermore, the children of parents who underwent therapy showed increased academic and social competence compared to the no-therapy controls. "The goal is to get in very early and focus on the systems of family relationship, not just an ideal of good parenting. You have to look at all of the relationships in the system," says Cowan. And this is where the real impact of the GGSC lies: highlighting the ways in which new research can concretely improve people's lives. Though Cowan looks at the application of principles out in the field, a lot of potentially applicable psychology research never makes it too far out of the lab. When the Hornadays established the GGSC, they were looking to establish a pipeline from the social science labs directly to the people it


could benefit. This pipeline has largely come in the form of the GGSCs popular online magazine, Greater Good. There was a growing body of evidence supporting a different way of viewing our human nature, says Jason Marsh, Editor in Chief of the magazine. We saw that there was a great need for a publication with the mission of featuring this research and putting it into terms that are more accessible to the general public. The magazine provides in-depth articles, guest columns from researchers, and even a popular parenting blog called Raising Happiness. From slightly more tongue-incheek quizzes such as Is she flirting with you? (an emotional intelligence quiz), to a Fathers Day article explaining how to get dads more involved in child-rearing (written by Cowan and his wife), the magazines content ranges from light and playful to serious and contemplative. It also advertises their community seminar series, started in 2009, where people in the science, education, and public policy communities speak on topics ranging from increasing altruism in kids to sustaining thriving romantic relationships. The study of human behavior is what were all always talking about anywayits what people talk about in bars, what people gossip about in barbershops, says Willer. But Greater Goods focus on the cutting-edge research about this stuff is what makes it much more rigorous than your typical scientific public outreach project.

The market of happiness


But what about Martin Seligmans new campaign for positivity? Shortly after his official launch of positive psychology in 1998, something strange started happeningthe American public got hooked. In 2000, less than 250 books were published on happiness, most in the way of self-help manuals. In 2010, over 2,300 books were published on the topic. Browse the self-help aisle of your nearest bookstore and you will inevitably find the following titles: Happiness: The Science Behind Your Smile, What Happy People Know: How the New Science of Happiness Can Change Your Life for the Better, and The Happiness Project: Or, Why I Spent a Year Trying to Sing in the Morning, Clean My Closets, Fight Right, Read Aristotle, and

Generally Have More Fun. Now, with websites and smartphone applications devoted to a continuously more personalized view of user experience, people can even track their own happiness in real-time and correlate it with where they are located and what they are doing. We want to know exactly what this happiness thing is and how to get more of it. So it happened that the positive psychology movement came with its own particular form of backlasha popular marketplace promising scientific understanding of happiness in exchange for money, money, and more money. Seligman, as the figurehead of the field that spawned a cultural movement, received a fair amount of criticism. In 2004, he published his own bestseller, Authentic Happiness: Using the New Positive Psychology to Realize Your Potential for Lasting Fulfillment. But, many critics argued, how could such a young field of scientific research already have the answer to such an age-old, elusive question? The problem, critics said, stemmed from two things: first, happiness is difficult to objectively measure, and second, by focusing so heavily on happiness, Seligman was missing the point of a more broadly fulfilling life, and perhaps selling out to the lucrative popular marketplace of happiness. The core of the measurement issue is that if happiness is necessarily subjective, then selfreports are unavoidable. In psychology, we like to be able to use more concrete measures than just self-reports, says Willer. Ideally, we would like to have behavioral or physiological signatures for happiness, but that is very difficult to doand that is a challenge for the field. That said, I think these are healthy criticisms that positive psychologists are really working hard to listen and respond to. On the point of the over-pursuit of happiness,

however, Seligman himself seems to have backtracked slightly in recent years. His new book, Flourish, even lengthily derides the so-called happiology that many critics would argue he helped to create, saying that he never meant for positive psychology to be construed as a prescription for a pleasant life. Happiness is a diffuse term, says Keltner. By solely asking, Am I happy? we miss out on the many nuances of a meaningful life. Nevertheless, positive psychology as a field is thriving. Classes like Keltners are now taught on over 200 college campuses nationwide, and a steadily increasing number of psychologists are turning to positive research topics. Regardless of whether the secret to a happy life will ever be definitively understood, institutions like the GGSC see their roles as solidconnecting people to the research that defines their lives. Happiness is very important, says Marsh. But its only one part of the puzzle. Weve got a lot to figure out along the way.

Azeen Ghorayshi is a lab technician in molecular and cell biology.

The brain is half full
The science behind positive psychology
by Azeen Ghorayshi

As the story goes, University of Pennsylvania psychologist Martin Seligman, a self-described pessimist, was weeding his garden when his five-year-old daughter Nikki began playfully shrieking and tossing weeds in the air. As Seligman scolded her harshly for being disruptive, his daughter spun around, looked Seligman in the eye, and said: "Daddy, stop being such a grouch!" Seligman says that he took this as a wake-up call. Being a pessimist was something he could deal with, even something to be proud of in his academic circles, yet being called a grouch made him cringe. His research up to this point centered on the roots of depression, but the gardening incident made Seligman realize that he, and perhaps the field of psychology as a whole, had focused solely on negativity for too long. So, shortly after his appointment as president of the American Psychological Association in 1998, Seligman charted out a new approach for the field. Dubbed positive psychology, this branch of research would focus on human thriving over human pathology, studying function over dysfunction. The movement quickly developed a following, including at UC Berkeley.

At around the same time, an entirely separate journey brought two Cal alumni, Tom and Ruth Hornaday, back to their alma mater. The Hornadays had recently dealt with a tragic family loss, and came to Berkeley in 2001 with the idea of funding multidisciplinary research on social and emotional well-being. After speaking with several Berkeley professors already studying the positive psychology topics that they wanted to help promote, the Hornadays created the Greater Good Science Center (GGSC). Housed within the UC Berkeley Child Study Center on the south side of campus, the GGSC offers undergraduate and graduate research fellowships, holds community lectures, and publishes the online Greater Good magazine to highlight current research in the field. With the belief that positive human traits are innate and strongly tied to individual thriving, the GGSC and its positive psychology peers hope to promote the elusive holy grail of personal achievement: true happiness.


Our better halves


Before the field of positive psychology could really get off the ground, it needed a manifesto of sorts: a clearly paved vision for its new focus on positive human behaviors. To create the common language and standardized protocols necessary for a rigorous scientific discipline, Seligman and his cohorts wrote the Character Strengths and Virtues (CSV) manual, equivalent in purpose, but opposite in focus, to the Diagnostic and Statistical Manual of Mental Disorders (DSM) used to characterize psychological conditions for over 50 years. Psychology up to that point, said Seligman, had studied only half of the landscape of the human condition, and the CSV would thus serve as the DSM's natural counterpart. The manual lays out the central tenets of the positive psychology field. The main idea is that virtues such as compassion, courage, and wisdom are as much a part of our human nature as selfishness, weakness, or ignorance. Therefore, just as psychological illnesses need to be identified, treated, and prevented,




There's a map for that


by Ginger Jui


Cell phones for a better commute

Looking down and across the San Francisco Bay from the Berkeley hills during the evening rush hour, one sees an umbilicus of traffic, amber headlights on the left, ruby taillights on the right, stretching across the Bay Bridge, connecting the gridded East Bay streets to the skyline of San Francisco. A traffic engineer admiring this scene might ask: "Where are all these people going?" "Is traffic this bad every day?" or "How can we make it flow faster and more efficiently?" Increasingly, the need for real-world solutions to traffic problems requires transcribing this bird's-eye view of traffic information into actual hard data on a computer. This task requires ubiquitous technology allowing individuals to transmit information about their environment, as well as the computational power to amass and analyze this data at break-neck speeds. Today, UC Berkeley researchers are merging the fields of civil engineering and information technology to bridge these information gaps between the traffic models on their computers and the drivers on the ground. Real-time traffic information is a valuable commodity. For over 40 years, the major source of traffic information for the California Department of Transportation (Caltrans) has been loop detectors. These are the thick cables embedded as a circle or hexagon in the asphalt of highway lanes that are also commonly used to detect whether cars are stopped at stoplights. Historically, Caltrans invested in loop detectors because they report the three main pieces of information (traffic speed, volume, and occupancy) necessary for generating a complete picture of highway conditions. However, loop detectors are expensive to build, highly sensitive, and difficult to maintain, requiring Caltrans to shut down whole lanes of traffic if an array of detectors goes haywire. As California state budgets continue to shrink, Caltrans is looking for cheaper and more effective alternatives for gathering traffic information. One such alternative is now available in the form of Global Positioning System (GPS) data from mobile devices. GPS data from the smartphone in your pocket and


navigation devices on your dashboard may revolutionize how Caltrans and UC Berkeley researchers track travel behavior and design transportation infrastructure. Integration of mobile technologies into traffic monitoring and planning is a problem that sits at the intersection of academic research and the public and private sectors. The California Center for Innovative Transportation (CCIT) attacks precisely this kind of problem by taking an innovative approach to the significant scientific, business, and deployment challenges it presents. A non-profit affiliate of the UC Berkeley Institute of Transportation Studies, CCIT works to bring cutting edge research from UC Berkeley to make transportation systems safer, cleaner and more efficient: in a word, more sustainable.

The mobile solution


Jump to February 2008. One hundred cars driven by UC Berkeley students roll onto a 10-mile stretch of I-880 between Hayward and Fremont, California. Each car is identified by a tarp taped to its hood numbering between 00 and 99 and carries in it a GPSequipped Nokia cell phone. For the next eight hours, these intrepid students drive in loops on the I-880 while their cell phones transmit GPS coordinates back to a server at UC Berkeley.

Dubbed Mobile Century, this mass joyride was sponsored by Caltrans and involved CCIT, Nokia, and the UC Berkeley Civil and Environmental Engineering department. Mobile Century sought to demonstrate that cell phone-based GPS data could be used to accurately estimate traffic speed and trip duration in real time. While other researchers had deployed this technology in highly controlled experiments, the Mobile Century experiment tested this concept in real-world traffic conditions for the first time. Traffic engineers are giddy about the rise of GPS-equipped mobile devices. As Professor Alex Bayen, who is jointly appointed in the Department of Electrical Engineering and Computer Science and the Department of Civil Engineering, explained enthusiastically in a recent interview, The big novelty in 2009 was that every cell phone, waving at the iPhone Im using to record our interview and the Android phone he pulls out of his pocket, suddenly had a GPS, and that created an explosion of data. That opened a big opportunity for transportation because suddenly you could do monitoring in places where you dont have dedicated infrastructure. That explosion of data could potentially be harnessed as a cheap and widely available data source for use in traffic monitoring, control, and planning. For Caltrans and traffic

engineers everywhere, Mobile Century was a scientific leap forward. It demonstrated that it was possible to build algorithms and data infrastructure to process cell phone GPS data in real time, and that these estimates of traffic conditions were accurate and reliable. Following Mobile Century, the next crucial problem to solve was capturing more of the valuable GPS data riding around in everyones pocket. As it turns out, theres an app for that. Mobile Millenniumthe next generation real-world experiment that emerged from Mobile Centuryrolled out in November 2008. Rather than providing drivers with a particular cell phone, UC Berkeley researcherswith the help of Navteq, a location-based services companyrolled out a traffic application that drivers could download from the Mobile Millennium website. Mobile Millennium had two goals. The first was to incorporate GPS data reported by cell phones, as well as historical data, GPS from San Franciscos taxi cabs, and data from existing radar and loop detector infrastructure into a complex traffic model to monitor traffic conditions on both highways and arterial streets. The second was to report this traffic information back to users. This app, the first traffic app deployed by Nokia in North America, was downloaded from the Mobile Millennium website by over


5000 users during the twelve months of the experiment.
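To make the idea concrete, here is a minimal sketch, in Python, of how raw GPS probe reports might be boiled down into per-segment speed estimates. It is purely illustrative: the field names and the median-based rule are invented for this example, and the actual Mobile Century and Mobile Millennium estimators rely on traffic-flow models and statistical filtering rather than simple aggregation.

    # Toy illustration of probe-based traffic monitoring.
    # Field names and the filtering rule are hypothetical.
    from collections import defaultdict
    from statistics import median

    def segment_speeds(probe_points):
        """probe_points: dicts like
        {"segment_id": "I-880_N_12", "speed_mph": 47.0, "timestamp": 1318000000}
        Returns a robust (median) speed estimate for each road segment."""
        by_segment = defaultdict(list)
        for p in probe_points:
            if 0 < p["speed_mph"] < 100:      # discard obvious GPS glitches
                by_segment[p["segment_id"]].append(p["speed_mph"])
        return {seg: median(speeds) for seg, speeds in by_segment.items()}

    probes = [
        {"segment_id": "I-880_N_12", "speed_mph": 62.0, "timestamp": 0},
        {"segment_id": "I-880_N_12", "speed_mph": 58.5, "timestamp": 30},
        {"segment_id": "I-880_N_13", "speed_mph": 14.0, "timestamp": 45},
    ]
    print(segment_speeds(probes))  # {'I-880_N_12': 60.25, 'I-880_N_13': 14.0}

Even this toy version shows why probe density matters: a segment with no reporting phones simply vanishes from the estimate, which is exactly the gap the Mobile Millennium model is designed to fill.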

Developing a new market


Mobile Millennium, in principle, served as a proof of concept for a traffic information product to help consumers plan their commutes using real-time traffic information. This idea has now been replicated, implemented, and marketed by a number of companies, including Navteq, Traffic.com, and Google Traffic. The success of Google, however, has driven the consumer-product based business model to extinction. It is hard to sell a free commodity, Professor Bayen quipped. Because some websites give it for free, the market for travel information has dramatically shrunk in recent years. Yet, in the face of Googles grip on the traffic monitoring market, an innovative new business model is emerging from the Mobile Millennium research thanks to a continued collaboration between Caltrans and CCIT. The CCIT office sits on the third floor of the former Masonic Temple in Berkeley, at the corner of Bancroft and Shattuck. The day I met Ali Mortazavi, program manager of the deployment and innovation team at CCIT, there was a blue Corvette convertible parked out front: a gas guzzling eight-cylinder sports car with leather bucket seats. Surprised, having assumed that the CCIT staff would be a more eco-friendly bunch, I asked Mortazavi whether it belonged to someone inside. He reassured me that it didnt, and

that he himself commuted to work in a fuel efficient four-cylinder Hyundai. Caltrans and researchers at CCIT are very interested in using mobile probe data collected from GPS-enabled phones and other dashboard GPS units. Currently, however, loop detectors are the only technology that provide all three crucial pieces of traffic dataoccupancy, volume, and speed needed by CalTrans traffic models. Right now Caltrans installs sensors every half a mile, says Mortazavi. Now imagine you install sensors every mile or two miles, and fill in the gaps with [GPS] data. That would be a huge cost reduction. Furthermore, rather than kicking out all of the existing traffic monitoring infrastructure, he says, CCIT and Caltrans are taking an incremental approach to assessing the added value of GPS data purchased from third party vendors. The available GPS data is a heterogeneous mix, coming from mobile consumer devices, GPS in commercial fleets (such as buses, trucks, and taxicabs), the San Francisco Bay areas electronic toll collection system, FasTrak, and radar. CCIT and Caltrans want to figure out whether the information they are collecting might allow them to decrease the number of loop detectors to install and maintain. In addition to the scientific work of building traffic models, Mortazavi and CCIT are also working on each step on the business side of deploying this cutting-edge technology, from designing the data specs, writing the terms and

conditions of data contracts and procuring the data. Were trying to create a win-win situation for both Caltrans, who would like to provide something beneficial for the public, and the private sector, who are looking for more profit, says Mortazavi. Significant scientific challenges are associated with the acquisition of more traffic data. There are different errors associated with various measurement devices, and orthogonal types of information available from each technology. For example, one task for the algorithms being developed in Professor Alex Bayens lab is to fuse count and volume data from loop detectors with speed data from cell phone GPS. The end goal is to leverage both types of information to get a better estimate of traffic density. The problem is, Mortazavi explains, its really easy to blend speed from different sources, but its really difficult to mix different data types, for example, volume and speed. This challenge is now being tackled in Bayens lab, using what Mortazavi calls fusion algorithms for the different data types and in the traffic models that use this data to generate a real-time traffic map. So far, it seems like the algorithms and models work pretty well. I dropped by post-doc Anthony Patires cubicle at CCIT to have a closer look at the traffic models. He has been at CCIT for less than a year, after getting his PhD in Civil and Environmental Engineering at UC Berkeley. Patire clearly keeps busy: the walls of his cubicle are lined

Meet Mobile Millennium


Mobile Millennium is a groundbreaking project that asks a unique research question: can we use consumer technology to improve the real-time modeling and distribution of traffic information? The first phase of this Caltrans-sponsored project sought to use the Global Positioning System (GPS) receivers in cellular phones to gather traffic information, model that traffic in real time, and broadcast the information back to users. The initial phase involved a public-private partnership between the California Center for Innovative Transportation (CCIT) at UC Berkeley, the Nokia Research Center, and NAVTEQ, a location-based services company. In November 2008, CCIT launched the Mobile Millennium traffic app, which was downloaded by over 5,000 users onto their Nokia cell phones. This 12-month pilot program provided a proof of principle that GPS data harvested from consumer devices could be used to successfully model traffic information in the Bay Area. Today, the Mobile Millennium traffic model continues to run from a server housed at CCIT. It gathers traffic data from a large variety of sources, including cellular phones, GPS in commercial fleets (such as buses, trucks, and taxicabs), the San Francisco Bay Area's electronic toll collection system, FasTrak, and radar. Students and researchers at CCIT and across the UC Berkeley campus continue to work on refining this data-modeling infrastructure. Because Mobile Millennium is a unique public-private partnership, this work includes both hard scientific questions and business-side development and implementation. CCIT is currently working to define and test business models for acquiring and distributing traffic information from private companies. Meanwhile, research work continues to improve the server-side hardware infrastructure to handle the increasing amounts of traffic information that need to be processed and distributed in real time. In addition, the modeling outputs are being used to generate unique traffic information products, such as graduate student Paul Borokhov's Reliable Router traffic app.

with cryptic, albeit neatly organized, labels headlining different projects, such as MM (that would be Mobile Millennium), FHWA, T01702, as well as other titles like Random and Fun Stuff. His workstation is composed of two computer screens aglow with computer code and a groaning bookshelf that carries hefty titles such as Stochastic Processes and Non-linear Programming. Patire pulls up what looks like a Google Maps-based web application to show me the Mobile Millennium traffic models running in real time. The purpose here is to take real time data, run it through a model, and use the model to fill in gaps where we have no data, he explains. The models not perfect. Sometimes, it will predict something thats just a little bit off. For those places that we have measurements, we can get it back on track. In other words, the model is able to reflect traffic conditions in real time using a combination of real data and computer modeling. As Patire reports, You would see a traffic jam form [in the Mobile Millenium app], and it could be 15 minutes until it is posted on other traffic monitoring websites. Noon on Wednesday looks like a good time to head into San Francisco: the real-time traffic map he pulls up shows all the major highways and arterial streets highlighted

in a happy traffic green. In fact, it looks very much like the map you would see if you clicked the current traffic conditions button on Google Maps; the Mobile Millennium map looks equally well connected and the coverage is quite extensive.
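The "fill in the gaps" behavior Patire describes can be sketched very simply. In this toy version, a segment with no measurement falls back on the model, while a measured segment pulls the estimate back toward the data; the fixed weight is an assumption made for illustration, not how the real estimator works.

    # Illustrative gap-filling: prefer measurements, fall back to the model.
    def fused_speed(measured, modeled, weight=0.7):
        """Return the modeled speed where no measurement exists; otherwise
        nudge the model back toward the measurement (weight = trust in data)."""
        if measured is None:
            return modeled
        return weight * measured + (1 - weight) * modeled

    segments = {"University_Ave": (22.0, 28.0),   # (measured, modeled) mph
                "Ashby_Ave":      (None, 31.0)}   # no probe data on this link
    estimates = {seg: fused_speed(m, mod) for seg, (m, mod) in segments.items()}
    print(estimates)  # University_Ave blends to ~23.8; Ashby_Ave falls back to 31.0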

Moving forward
The Mobile Millennium real-time maps are not yet available online to the public, but can be seen on display in the lobby of Sutardja Dai Hall on UC Berkeleys campus. This building houses CITRIS, the Center for Information Technology Research in the Interest of Society, whose mission is to shorten the pipeline between world-class laboratory research and the creation of startups, larger companies, and whole industries. Inside the lobby of the CITRIS building, there is a touch-screen TV displaying the Mobile Millennium traffic map. Patire suggests Friday afternoons are the best time for watching the Bay Area traffic mayhem. The data algorithms and infrastructure developed in Mobile Millennium are now being used in Professor Bayens lab at CITRIS to develop next-generation traffic routers. Bayen describes how traffic routers currently fall into four categories of ascending sophistication: zero notion of traffic, historical

traffic, now traffic, and future traffic. Bayen explains: Lets say you want to go from Berkeley to Mountain View. The simplest router is going to give you a route based on shortest travel time, assuming the posted speed limit. Because its smart, its not going to route you through the tiny roads, even though it might be faster on the map. Its not, because of stop signs and pedestrians. In practice its going to route you by the shortest time through the freeway system, when it can. That doesnt take into account traffic. The [second generation] routers account for historical traffic data; [it tells you,] dont count on 20 minutes between San Mateo and the Dumbarton Bridge, you should count on 30 minutes. But what if there is an accident on the Dumbarton Bridge? Third generation routers will incorporate real-time traffic information to optimize your route. Bayen says, You look at traffic nowand actually there is no traffic today, even though its 6 oclock. Since there is no traffic, youll still use the freeway. Bayens masters student Paul Borokhov is working precisely on this problem of the third-generation router. Borokhov is developing a smartphone app, tentatively titled the Reliable Router, that will use real-time traffic data to suggest routes that


would maximize the probability of reaching a destination on time. For drivers, getting to a destination on time may be more important than getting there along a shorter or nominally faster route that also comes with a high probability of lengthy delays. The Reliable Router algorithm works by creating a network of alternative routes between point A and point B. It then uses real-time traffic information from the Mobile Millenium traffic monitoring server to estimate the probability of arriving at your destination on time for each alternative route. In order for the Reliable Router algorithm to produce the best possible route, the algorithm cannot give the full set of directions to your destination at the outset. Instead, the app gives a set of guidelines that update themselves as you drive. As you come to

a node leading to alternative routes in the network, the algorithm chooses the best next segment for your route based on the time remaining to travel to your destination and the updated traffic conditions along the alternative routes. The instantaneously updated traffic directions in Reliable Router led Borokhov to develop a method for giving audio directions on the fly. Borokhov explains, When youre driving, looking at street names is difficult. You have to figure out where the sign is, read the sign, remember which street youre supposed to turn on, and then youve passed the street because youre driving 45mph. The innovative thing is that we can tell people, go right on the second street, and it takes care of the problem of needing to know street names. You also minimize

driver distractions because these are audio directions. I asked Borokhov whether this app could be used to optimize for routes that minimized fuel consumption. He speculated that it could be done with a few modifications. For each link in the network, one could estimate the amount of fuel consumed along a given route based on the current traffic conditions and then optimize the route for fuel efficiency. However, Borokhov cautioned that to generate an accurate environmental impact estimate, one would have to avoid making overly simplistic assumptions about the vehicle and its rate of fuel consumption. By integrating robust models of vehicle emissions and real-time travel data, future travel behavior models developed at CCIT and CITRIS may enable unprecedented minimization of environmental impact for regional transportation networks. Building this integrated network of multimodal transportation information poses the next significant challenge for traffic modeling algorithms. The use of data from mobile technologies, which can give granular information down to the level of individual people and their movements, will provide both the scaffolding and the substance for these algorithms as they enable the next level of interaction with our transportation system.
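In spirit, the choice the Reliable Router makes at each decision point can be sketched as follows. The travel-time distributions, the normal model, and the Monte Carlo estimate are invented here for illustration; the real algorithm works over a full road network fed by the Mobile Millennium server rather than two hand-picked options.

    # Sketch of reliability-based route choice: pick the next segment that
    # maximizes the chance of arriving within the remaining time budget.
    import random

    def on_time_probability(mean, stdev, budget, trials=10000):
        """Monte Carlo estimate of P(travel time <= budget) under a normal model."""
        hits = sum(random.gauss(mean, stdev) <= budget for _ in range(trials))
        return hits / trials

    def choose_next_segment(options, minutes_remaining):
        """options: {segment: (mean_minutes, stdev_minutes) for the rest of the
        trip via that segment}. Returns the most reliable choice."""
        return max(options,
                   key=lambda seg: on_time_probability(*options[seg], minutes_remaining))

    options = {"I-880 south":     (34.0, 12.0),  # nominally faster, highly variable
               "surface streets": (41.0, 3.0)}   # slower but predictable
    print(choose_next_segment(options, minutes_remaining=45))  # surface streets

With 45 minutes to spare, the sketch picks the slower but steadier option, which is precisely the trade-off between nominal speed and reliability that the Reliable Router is meant to expose.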

From linear highways to transportation networks


Researchers like Patire and Bayen can no longer regard highways as linear systems largely disconnected from smaller streets. While the earliest models, like the one used in Mobile Century, focused on highways, the next generation of traffic models must also address improving mobility in the arterial streets that feed these highways. As Ali Mortazavi at CCIT says, You cannot draw a boundary here on the highway and not care about whats happening in the arterials. Everything is connected. If you mess up something in one corridor, youll affect everything in the network. In urban areas, the congestion is high and if you shut down one thing youll affect everything. This network approach will be a significant scientific challenge. The highway

is easily mapped [using GPS] because it's easy to see when someone is driving 60 miles an hour," explains Mortazavi. "Now you have a lot more parameters. You have people walking outside and inside buildings, and people stopping at traffic signals. That's another challenge and it's exciting."
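Mortazavi's point that 60 miles per hour is unmistakably a highway hints at how probe data might be sorted by travel mode before it ever reaches a traffic model. The following sketch is illustrative only: the thresholds and labels are invented, and a real classifier would use far richer features than a single speed percentile.

    # Toy classifier for GPS traces based on a high speed percentile.
    def classify_trace(speeds_mph):
        """speeds_mph: list of speed samples from one device."""
        if not speeds_mph:
            return "unknown"
        top = sorted(speeds_mph)[int(0.9 * (len(speeds_mph) - 1))]  # ~90th percentile
        if top >= 50:
            return "highway driving"
        if top >= 12:
            return "arterial driving (stop-and-go possible)"
        if top >= 2:
            return "walking or cycling"
        return "stationary"

    print(classify_trace([58, 61, 63, 12, 0]))   # highway driving
    print(classify_trace([2.5, 3.0, 2.8, 0.0]))  # walking or cycling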

Future directions
Today's transportation systems are complex networks populated by multimodal users. Everyone I interviewed for this article had a different way of getting to work (biking, driving, taking the commuter rail), showing just how many parameters the traffic engineer of the near future will have to account for in models of transportation network control and planning. Real-time traffic information and mobile technologies could potentially nudge commuters towards more sustainable modes of transportation. CCIT is cognizant that traffic congestion cannot be solved by merely better monitoring and upgrading roadway infrastructure. Travel behavior itself must change.

Caltrans is also increasingly in the business of influencing driver behavior, by giving real-time information on traffic congestion to help drivers predict travel time, plan alternate routes, or even choose to take public transit instead of using their vehicles. Caltrans and CCIT are already active in pursuing this line of thought in a project you have probably seen on highways around the Bay Area: the deployment of the black and orange Changeable Message Signs (CMS) that give, for example, rush hour travel times from Berkeley to downtown San Francisco. CCIT has recently partnered with Caltrain, the commuter rail between San Jose and San Francisco, to add public transit information to these signs. During rush hour, signs along Highway 101 now compare drive times with riding Caltrain to work, hoping to nudge drivers to choose public transit for their commute. In addition, CCIT is currently developing another mobile app in collaboration with IBM, called Smarter Traveler, that will push commuters to take transit, walk, or bike even before they get into their autos. Combined with the vertically integrated,

palm-of-your-hand transportation data in applications like Mobile Millennium and Google Maps, this top-down approach to transportation communication aims to empower commuters with an incredible range of options to optimize their preferred mode of transportation. With these systems in development, the focus now shifts to the way commuters will actually interact with the information with which they are presented. Will commuters optimize travel time or greenhouse gas emissions? Are drivers worried about fuel consumption or more subjective concerns like how relaxing, safe, or scenic their commute can be? By intersecting sustainability, technology, and transportation and information, UC Berkeley researchers are giving us, and the statewide agencies that set transportation policy, the tools and knowledge to start answering these questions to improve our daily lives.
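That question of what commuters will actually optimize can be framed as a small piece of code. The travel times and emissions figures below are invented, and the weighted score is only one of many ways a future app might trade off the objectives; it is meant to illustrate the choice, not any system CCIT has built.

    # Toy multi-objective commute choice: minutes vs. CO2, with invented numbers.
    options = {
        "drive I-880":            {"minutes": 42, "kg_co2": 9.5},
        "drive + casual carpool": {"minutes": 50, "kg_co2": 3.2},
        "BART + bike":            {"minutes": 61, "kg_co2": 0.4},
    }

    def best_option(options, minutes_weight=1.0, co2_weight=0.0):
        score = lambda o: minutes_weight * o["minutes"] + co2_weight * 60 * o["kg_co2"]
        return min(options, key=lambda name: score(options[name]))

    print(best_option(options))                  # fastest option wins: drive I-880
    print(best_option(options, co2_weight=0.5))  # greener option wins: BART + bike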

Ginger Jui is a graduate student in integrative biology.


Digitizing the Drawers


by Joan Ball

Moving natural history collections online

Natural history collections around the world contain over one billion specimens, and could reveal important changes in biological systems that have occurred over the past 100-200 years. In the internet age, you might expect that specimen data would exist in online databases, but this is not the case for most museums. The painstaking endeavor to make natural history collections digitally accessible requires huge data entry efforts and the coordination of interdisciplinary teams of scientists. Biologists of diverse disciplines are increasingly collaborating with computer programmers to create efficient data management and dissemination systems. John Wieczorek is a programmer whose domestic partner, Dr. Eileen Lacey, happens to be curator of mammals in the Museum of Vertebrate Zoology (MVZ) at Berkeley. One day in 1997 Eileen came home from work and said to John, "Hey, they have this picture of a database on the wall in the museum, and I don't think anybody there understands it. Why don't you go in and see if you can help them out?" John was initially doubtful, until she came home two weeks later to say the same thing. He recalls thinking, "Well, she's going to do this until I go look at that damn database picture on the wall," so he agreed to go in for a meeting a few days later. To his surprise, that meeting turned out to be an interview where he accepted a position to develop a modern relational database that would handle all of the collection data in the museum. He has proceeded to become a leader in the global effort to make information from natural history museums accessible to anyone with an internet connection. Wieczorek is but one important player in the ongoing, interdisciplinary efforts necessary to get collections data online and in a central location. The standardization and centralization of specimen databases, documenting everything from large mammals to tiny insects and plants, is essential for understanding historical biodiversity and how it has changed over the past century

or more. Dr. Stan Blum, a biodiversity informatics specialist at the California Academy of Sciences, says that, Weve been beating the drum on digitization for a long time and it seems to be steadily gathering momentum. But you have to keep beating the drum, to attract the investments needed for much larger collections.

The need to digitize


With the advent of increasingly sophisticated technologies, the use of natural history collections data has expanded from its traditional uses in taxonomy to studies in ecology, biogeography, pest management, disease transmission, conservation, and more. Scientists are using methods in chemical and molecular analysis to measure contaminant levels in preserved specimens, determine food items consumed, and clarify evolutionary relationships derived from genetic data. Researchers can associate specimen occurrences and their attributes with geographic data on climate and land use. In this way, museums serve as repositories of valuable resources for understanding the biological effects of climate change and habitat modifications.

A partial illustration of Arctos, the open source database that the MVZ uses to capture and store specimen information.

Collections are increasingly responding to the demand for data by creating open-access databases, available for easy search and download. Vertebrate collections in particular have become highly accessible through online databases. The MVZ at Berkeley is a driving force behind this trend. Not only was it one of the first museums to completely digitize and geographically reference its entire collection of around 677,000 specimens, but it has also been part of several multi-institutional collaborations that continue to digitize collections from around the world. The MVZ has gone so far as to digitize historical field notes, photographs, annotated maps, gene sequences, and vocal recordings of frogs and birds. They are in the process of tagging and linking all of this auxiliary information to individual specimens.

On the other end of the data management spectrum, many collections have absolutely no means of searching specimen information from a central location. Most of these collections have simply been unable to muster the considerable resources required to digitize. Insects have particularly high diversity and very large numbers of specimens exist in collections, so the task of digitizing has seemed impossible. "By and large there has been no concerted, coordinated effort to get that huge amount of data digitized," says Dr. Rosemary Gillespie, director of the Essig Museum of Entomology at Berkeley. "People in entomology collections have just thrown up their hands, because it's such a huge quantity of data." Digitizing collections requires significant investments of time and money, and entomological collections have experienced dwindling personnel and funding.

The Essig Museum has recently launched a large effort to digitize 1.2 million arthropod specimens from eight institutions across California through a project called Calbug. Although it may seem late in the game compared to the MVZ, this is the largest effort so far attempted for insect collections in terms of specimen number, species number, and geographic area covered. If all goes well, the bees will catch up with the birds, bringing collections management and global change studies into the 21st century.

Some historical context

The MVZ first managed their data on paper when vertebrate collections were established at Berkeley in 1908. From that time until the late 1970s they used ledger folios and card


catalogues to keep track of holdings, similar to the old catalogue system at public libraries. These cards and ledgers were used to locate specimens and find associated information such as date collected, locality, collector, and taxon name. Such lists provided a central place to look up information. The catalogues then became a valuable resource for efficiently digitizing the collections, eliminating the need to physically sort through specimen drawers and cabinets across the museum. Digitization began in 1979 at the MVZ, when Berkeley ran on mainframe computers. Faculty members and staff had access to keyboards and monitors, but no personal computer, and hundreds of individuals accessed a single mainframe. As Stan Blum, a former programmer with the MVZ, explains, They just plugged you in over the wire to the mainframe computer on campus. The MVZ started digitizing mammals, reptiles, and amphibian records using the card catalogues, and then digitized bird collections directly from specimen labels. Concerted digitization continued until 1983, and included everything from the earliest specimen (a bird egg from 1843) to the most recent. Now, new specimens are digitized before they enter the collection. Changes in computing technology became an impetus to revamp data management at the MVZ, when in the early 90s the MVZ learned that the universitys mainframe computer, where they had created and managed their database, would soon be decommissioned. This system had already exceeded its capacity, and they had 32 different databases for the collection. They needed a better way to manage a growing body of dataa database that could handle all the information typical of natural history collections. This database should be capable of relating a variety of data associated with specimens, including the basics (taxon name, collector, date, location), and extras like field notes and photographs. Blum originally entered the field of biodiversity informatics after receiving his PhD in Zoology. Blum says, As a post-doc, I could see that the next curator position in Ichthyology wasnt going to open up for another five years, and they come few and far between. In contrast, I knew the field of informatics was only going to grow. For over

Search results:
Specimen type: egg
Species: Gymnogyps californianus
Common name: California condor
Continent: North America
Country: United States
Specific locality: 5.5 mi NE of Pasadena
Coordinates: 34.201057, -118.092905
Collecting date: 10 February, 1907
Collector: Joseph Grinnell

Each individual specimen at the Museum of Vertebrate Zoology is linked to other types of media in the Arctos online database, improving the overall value and accessibility of specimen information. A search for this California condor egg, collected in 1907, turns up images associated with the specimen in addition to the specimen record. Clockwise from top left: the California condor egg specimen; the egg's original hand-written specimen label; nest site of the condor; landscape around the study site.


twenty years he has been working full-time on projects that apply information technology to biodiversity sciencehelping museum scientists capture, manage, and efficiently analyze data. In 1995 Blum came to Berkeley and created the mysterious database picture on the wall of the MVZ. He worked very closely with MVZs Staff Curator of Mammals, Barbara Stein, to create a roadmap for what would become the collection database. He did this using a methodology known as object role modeling. Blum needed to understand the ins and outs of all collection information and workflows to design an effective database. This methodology explored the combinatorics of data in detail, that is, how many entries can go into each field and how each field can and cannot be related to others. They spent a lot of time doing structured interviews, where he would ask about each different type of data at the museum and what concepts should be included. For example, they would discuss tissue samples by creating lists of all the possible tissue types, all the different vials tissues could be stored in, all the species with which they could be associated, specifying the number of possible entries for each field, and mapping

out the potential relationships among the fields. The final conceptual model consisted of multiple figures and a companion document. A computer algorithm then converted the model to a logical data structure for the relational database. The result was a very complicated set of tables. "There is no way to parse something that big and complicated in a hierarchical way, you have to drill down in bits and chunks," he explains. He left the museum after his design work was finished, but his model remains the basis for databases that most natural history collections use today, and is loosely called the Berkeley Model.

When Wieczorek came to the MVZ in 1997, the abstract work of model design was over and his job was to implement the data plan. He first had to extract 32 MVZ databases from the mainframe and standardize all the different fields and data types. He integrated the information from these files into a modern database system for all of the specimen data, along with auxiliary data such as sound files and field notes. Imagine organizing the house of a compulsive hoarder, sorting through junk to find anything of value, creating piles, and putting it all into a logical place. According to Joyce Gross, a programmer for Berkeley's Natural History Museums consortium, this kind of work gives a programmer nightmares, but it is very satisfying in the end when the database is cleaned up. Gross put the MVZ database online in 1999.

In 2005, Blum's model was incorporated into the Arctos data-management system created by Dusty McDonald and Gordon Jarrell. With Arctos, users can search for and add information online through a web interface, which automatically populates the database. This is the open source software that the MVZ uses to manage its database today, along with collaborators such as the Museum of Southwestern Biology in New Mexico, the Museum of Comparative Zoology at Harvard and many others. The Berkeley model was also the basis for Specify, another open source program for biological collections data management, supported by the University of Kansas.

Mammal specimens at the Museum of Vertebrate Zoology. The collection also includes birds, reptiles, and amphibians.

Increasing data sharing, accessibility, and value

The critical step after digitization is what Wieczorek calls data mobilization: making the data readily available and increasing their value. Specimens of any particular species are generally dispersed among numerous museums, so it is important to make multiple institutions' data available simultaneously. For the vertebrate collections, data would not come from a single warehouse, but directly from each institution using a Distributed Generic Information Retrieval (DiGIR)


however, all of the big institutions were on the waiting list to participate, and all have since become valuable contributors. MaNIS was wildly successful in creating new tools for data mobilization. They developed a method to standardize the process of geographically referencing text descriptions of specimen locations, and accounting for potential error. The MaNIS institutions then implemented a bulk geographic referencing effort after all collections data were entered into a database. Each institution claimed specific regions to georeference, generally those for which they had good geographic knowledge. UC Berkeley claimed California, for example. Then the institutions georeferenced all collection points from that region no matter which museum the specimens came from. Wieczorek's eyes lit up when he explained how people liked this way of doing things. "They loved it. Collaborative georeferencing created a community, because people from different museums had to talk to each other when issues came up. The collaboration and the sense of community might have been the best thing that ever came out of MaNIS." Having georeferenced localities

A researcher at the Essig Museum of Entomology displays one of hundreds of specimen drawers waiting to be digitized. The drawers are tightly packed with insects, averaging 200 specimens each.

Georeferencing methods are now applicable to collections in general. After MaNIS, the Moore Foundation awarded Berkeley a collaborative grant of $1.6 million to further advance georeferencing and to establish best practices. Under this grant Wieczorek created BioGeomancer, a workbench and a web service to automatically georeference localities online. The last of the multi-institution digitization projects that the MVZ participated in was ORNIS, for bird specimens. ORNIS used BioGeomancer to double the speed of georeferencing. The MVZ now gives extended international workshops on georeferencing to train students, researchers, and collection staff, on campus and elsewhere, on the standard concepts and procedures.

The MVZ is now working to tackle the problem of sustainability. There are not currently enough resources to keep up with the demand of institutions that want to get involved in MaNIS, HerpNET, ORNIS, and their sister vertebrate network, FishNet. The MVZ, under principal investigator Carla Cicero, just received a National Science Foundation grant to create and coordinate VertNet, which would combine data from all the vertebrate disciplines. The goal is to streamline the process of data publishing, remove the need for
Pinned beetle (opposite page) and wasp (right) specimens. The pin or angle of the label can potentially obstruct important information, which is sometimes hand-written. The imaging process for specimens like this takes considerable time and effort.


The MVZ is now working to tackle the problem of sustainability. There are not currently enough resources to keep up with the demand from institutions that want to get involved in MaNIS, HerpNET, ORNIS, and their sister vertebrate network, FishNet. The MVZ, under principal investigator Carla Cicero, just received a National Science Foundation grant to create and coordinate VertNet, which will combine data from all the vertebrate disciplines. The goal is to streamline the process of data publishing, remove the need for participating institutions to maintain their own servers, and increase the performance and capabilities of the networks under one cloud-based platform. Though the project is still in its early stages, Wieczorek and colleague Aaron Steele are confident that the 176 participating collections and the waiting list of 61 more can all be transitioned into the new VertNet within the first year and a half of the three-year project. The economic impact will be an estimated 20-fold reduction in the cost of maintaining the network.

Potential new approaches for large entomology collections

The question now is whether the models provided by past and ongoing efforts on the smaller vertebrate collections are transferable to the enormous amount of data associated with invertebrate collections. Digitization projects for entomology collections face incredible challenges, despite the helpful precedents from the MVZ and other vertebrate collections. The Essig Museum at Berkeley alone has more than six million specimens in its collection. Calbug has an NSF grant to digitize 1.2 million specimens from collections across California using methods similar to those from the MVZ's digitization efforts. While this is one of the largest attempts yet to digitize arthropod collections, it is only a small fraction of the specimens in California collections. Unique challenges associated with insect specimens compound the problem of huge collection sizes: specimens are small and delicate, and the labels are tiny and difficult to read. The labels and specimens also have pins sticking through them, obscuring information. For these collections, the Essig team must find a way to mass-process data from specimens and automate portions of the current workflow.

Essig Museum of Entomology versus Museum of Vertebrate Zoology
                              Essig        MVZ
founding year                 1939         1908
full-time staff               1            13
oldest specimen age (years)   142          175
number of species             56,000       18,200
number of specimens           6,000,000    677,000
amount digitized              2%           100%

One approach that Calbug has been experimenting with is taking photographs of individual specimens in the collection, so that students and volunteers can enter data from the images. It takes a significant amount of time to arrange specimens and labels such that all of the labels stacked on a pin beneath the specimen are legible. Peter Oboyski, a postdoc working with Calbug, envisions a Ford-style assembly line with a designated station set up for efficient image capture.

[Photo: Pinned beetle and wasp specimens. The pin or the angle of the label can obstruct important information, which is sometimes hand-written. The imaging process for specimens like these takes considerable time and effort.]

Essig staff are experimenting with new imaging techniques and methods to simplify workflow during the image-capture process. One option is to capture high-resolution images of entire drawers. Drawer-scanning technology takes multiple images from various angles across the drawer and stitches them together into a single high-resolution mosaic. With the final mosaic image, one can zoom from a large picture of an entire drawer of 200 insects down to the tiny hairs on the leg of a single specimen. So far, drawer-scanning technology has been applied to high-resolution images of the specimens themselves rather than to insect labels; obtaining data from labels is more difficult. Multiple stacked labels rest below the specimens, containing useful information on the time and place of collection, and Essig staff still must arrange specimens so that all labels are visible, a process that is extremely time consuming when working with large numbers of samples.

One potential benefit of scanning drawers would be to simplify the workflow by eliminating the need to shoot single photos of specimens and save individual files manually. The British Museum of Natural History has partnered with Smart Drive Ltd. to create the SatScan tray scanner and to develop software that can crop images of individual specimens and save the files automatically using standard filenames. GigaPan is a similar technology, developed by the NASA Ames Research Center, Google, and Carnegie Mellon University, that uses a camera mount suitable for high-resolution cameras to take photos automatically from different angles throughout a drawer, along with software to stitch the images together.
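Neither SatScan's nor GigaPan's own software is reproduced here, but the stitching step that both approaches depend on can be sketched with OpenCV's high-level Stitcher interface (an assumption of this sketch, as is the photo directory):

    import glob
    import cv2  # opencv-python

    # Overlapping photos taken across one drawer; the path is a placeholder.
    images = [cv2.imread(path) for path in sorted(glob.glob("drawer_photos/*.jpg"))]
    images = [img for img in images if img is not None]  # skip unreadable files

    # SCANS mode suits flat subjects photographed from a moving camera mount,
    # as opposed to PANORAMA mode's rotating-camera assumption.
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
    status, mosaic = stitcher.stitch(images)

    if status == cv2.Stitcher_OK:
        cv2.imwrite("drawer_mosaic.jpg", mosaic)  # one zoomable image of ~200 specimens
    else:
        print("Stitching failed with status", status)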


The goal for all of these efforts is to increase the current production rate of 5,500 images every two weeks to at least 12,000, and Essig staff are testing the different methods to identify the most efficient one. Undergraduate students are essential to carrying out this work; as Oboyski put it, "Most projects on this campus would never get done without undergraduate help. They are the unsung heroes of research on campus."

The Calbug team plans to use specimen images together with large-scale approaches, such as crowd-sourcing and automatic text recognition, to put the data into a useful structure. They are planning to collaborate with the Citizen Science Alliance, an organization that has been very successful in developing web interfaces for citizen science projects. One of the Alliance's projects involves digitizing weather logs from Royal Navy ships from around the time of World War I; on the launch day of that project they had 100,000 pages digitized. The crowd-sourcing approach provides a mechanism for hundreds or thousands of people to do data entry work and to learn about collections and global research at the same time.
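Crowd-sourced transcription only works if several independent answers can be reconciled into a single record. Below is a minimal sketch of one common reconciliation rule, a per-field majority vote; the field names, example data, and vote threshold are illustrative assumptions, not the Citizen Science Alliance's or Calbug's actual pipeline.

    from collections import Counter

    # Three volunteers transcribe the same pinned-specimen label (invented data).
    transcriptions = [
        {"locality": "Strawberry Canyon", "date": "12 Jun 1952"},
        {"locality": "Strawberry Canyon", "date": "12 Jun 1952"},
        {"locality": "Strawbery Canyon",  "date": "12 Jun 1952"},
    ]

    def reconcile(entries):
        """Keep the most common answer for each field; flag fields with no clear majority."""
        record = {}
        for field in entries[0]:
            votes = Counter(entry[field] for entry in entries)
            value, count = votes.most_common(1)[0]
            record[field] = value if count > len(entries) / 2 else "NEEDS REVIEW"
        return record

    print(reconcile(transcriptions))  # -> {'locality': 'Strawberry Canyon', 'date': '12 Jun 1952'}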

It may also be possible to develop optical character recognition (OCR) software to interpret information from insect labels automatically. Current OCR software does not cope well with the kind of information found on insect labels, including handwriting, unusual typefaces, and the various abbreviations used for the same word (California might appear as CA, Cal, or Calif). Calbug would have to build a dictionary of all the possible abbreviations for words that may be found on the labels. They are working on this dictionary to further explore the possibility of creating such OCR software and to create lookup tables for citizen science data entry.
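Such an abbreviation dictionary could plug into OCR output, or into volunteer-entered text, as a simple lookup pass. The entries and the normalization rule below are hypothetical; Calbug's actual tables would be far larger.

    # Map label variants to one canonical form (illustrative entries only).
    ABBREVIATIONS = {
        "ca": "California", "cal": "California", "calif": "California",
        "co": "County", "mtn": "Mountain",
    }

    def normalize(label_text):
        """Expand known abbreviations in a raw label string, leaving other words alone."""
        words = []
        for word in label_text.split():
            key = word.strip(".,").lower()
            words.append(ABBREVIATIONS.get(key, word))
        return " ".join(words)

    print(normalize("Alameda Co., Calif."))  # -> "Alameda County California"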

Following in the footsteps of the MVZ, and benefitting from the technologies already developed, the Essig Museum may become a leader in the digitization of large invertebrate collections. If successful, the team will increase the scale of data capture by orders of magnitude, paving the way for collections throughout the world to digitize massive numbers of specimens. Each specimen, each drawer, each taxonomic group, and each of the natural history collections has the potential to add something important to our understanding of the flux of biodiversity. The combination of collections from around the world will hopefully lead to profound new insights.

Joan Ball is a graduate student in environmental science, policy, and management.

Eran Karmon Editor's Award


In memory of Eran Karmon, co-founder and first Editor in Chief of the Berkeley Science Review. This award is given annually to the Editor in Chief of the BSR thanks to a generous donation from the Karmon family.


book review
The Instant Physicist
by Richard A. Muller, illustrated by Joey Manfre
W.W. Norton & Company
139 pages, $16.95

I was eager to open my copy of The Instant Physicist when it arrived. It was hardbound, colorful, and best of all, skinny. With time always at a premium, a short volume that promises to help you "learn astonishing facts, win arguments with friends and relatives, make crazy-sounding bets and win money" in less than 150 pages sounds good to me. The book is written by Berkeley physics professor Richard Muller, whose popular class Physics for Presidents won best class two years in a row, and humorously illustrated by Joey Manfre. Each two-page entry consists of a colorful cartoon on one side and less than a page of writing on the other, making this an ideal book to pick up for only a moment or two at a time.

After thumbing through the first few pages, I already felt smarter for my newly-gained knowledge of the easy-to-digest tidbits Professor Muller presents. After a few days of sporadic reading, a curious thing happened: I began to reference the information I learned from the book, often. It is amazing just how easily arcane physics knowledge can be woven into day-to-day conversation. And when one of my pedantic friends started to read the book, he took the information and ran with it, holding multiple campfire audiences captive.

Some entries were fascinating and stood by themselves; some entries, however, left me hanging. I wanted more than the short entry and corresponding cartoon provided, and feared that the (sometimes shocking) snippets of information could be misinterpreted if extrapolated upon without additional influencing factors. One that hit me particularly hard was the entry about organic foods being more toxic than their pesticide-controlled counterparts, due to the fact that the naturally occurring insect resistance selected for in organic plants is thousands of times more carcinogenic than the pesticides used in conventional agriculture. While I understand this logic, I am not quite ready to give up my eating habits based on a picture of a fly slapping a poison sticker on a barrel full of organic apples. But it did get me thinking, and perhaps that is what The Instant Physicist does best: providing food for thought and presenting a slew of quick facts that causes an inquisitive mind to question conventionally held notions or seek further data on arcane snippets.

Some entries also felt more silly than fascinating. It is interesting that a gram of chocolate chip cookies contains more energy than a gram of TNT, but the comment about giving chocolate chip cookies and sledgehammers to teenagers being more effective at destroying a car than blowing it up with TNT provides more silliness than knowledge. Then again, the book is about showiness as much as it is about the physics, and that is partly what makes the book enjoyable. There are plenty of dry volumes without cartoons and silliness to go around; why not allow this book to be over the top? I particularly enjoyed the entry about Pluto's revoked planetary status. Who does give the International Astronomical Union (IAU) the right to decide on a status that is older than the organization itself? Congratulations, Dr. Muller, for reinstating Pluto's planet-ness with a 512 to 0 classroom vote. Despite the occasional groan or roll of the eyes, this book gives exactly what it promises: instant physicist status.

Erin Jarvis is a graduate student in integrative biology.

physWannabe: Did you know that alcohol must be radioactive to be sold?
plebius: What?
physWannabe: Yes... [explanation from The Instant Physicist and a few other random facts]
plebius: Wow, that's amazing! You should start a blog or something.






faculty profile
Karen De Valois
True visionary

Karen De Valois knows the benefits (and pitfalls) of scientific collaboration. During her three-decade career as a Berkeley professor of psychology and vision science, questions about her research were often directed to her husband, collaborator, and fellow professor, the late Russell De Valois. Though she worked for a decade as an independent principal investigator and adjunct professor through Russell's lab, she eventually became a professor in her own right in the early 1980s, establishing a research program that investigated human and animal color vision. While she may be best known for Spatial Vision, a cornerstone text in the perception field co-authored with her husband, she estimates that they collaborated on only two experimental publications over the first 20 years of her career.

Originally from Georgia, De Valois completed her psychology PhD at Indiana University in 1973. A highlight of her graduate experience was receiving a key to the department's calculator room, which housed the expensive and large devices of the day. At Berkeley, De Valois was the second female faculty member in physiological optics (as vision science was then known), and served as chair of the psychology department from 1998 to 2003. At the 2010 vision science group retreat, De Valois gave an eye-opening keynote on the history of women in vision science, inspiring the creation of cross-departmental Women in Vision Research meetings to discuss barriers, mentorship, and career issues.

AA: What got you interested in color vision? Are you an artist?

KDV: A very bad one, only for fun! There are a number of things about color that I think make it particularly interesting. One, a trivial point, is that it's beautiful; it makes the world more interesting to look at. More importantly, although color vision certainly is not necessary for life, or for survival, it is, of all things in vision, or any sensory system, or indeed any behavior, the one area in which we may be likely first to understand the relationship between behavior and the underlying physiology. It's that question that really drives me. Once we really understand that, then that opens up a world of possibilities. If we truly can understand the biology as it's related to the behavior, then we have done something really remarkable.

AA: How would you advise young researchers to approach scientific collaboration?

KDV: There are many virtues to collaborative research, particularly as things become more complex; we need to collaborate with people whose expertise is in areas other than our own. There is, however, a danger in doing so. If you are a young faculty member trying to get tenure and your name comes out on lots of papers, but
they're all collaborative papers, particularly if you published with other scientists who are senior to you, then no matter where your name is on the list of authors, you will not get the same kind of credit for it. So collaboration is a great thing, but you have to make sure that you as an individual are given adequate credit. And I don't think there's any evil intent: it's a legitimate question, but it's going to be very hard to get the credit that you need when everything you do is done in collaboration with someone more senior. This is why I encourage young faculty members not to continue publishing with their advisor for very long. You can go back and do it again later. You have to develop your own independent strain of research, which is very hard if you are still publishing with your advisor after you have your own lab.

AA: Did you face these problems in trying to distinguish your research from your husband's?

KDV: Very much so. We made a decision early on, although he did both physiology and psychophysics, and I did too. We decided to split the two: I would primarily do psychophysics, he would primarily do physiology. This way, we could establish a degree of independence. It was a practical choice, too. During those years I had very young children, and physiological experiments don't stop at certain times. Even so, there were still people who would ask Russ in my presence about work that I had done, that he was not involved in. He would always respond, "I don't know, I wasn't a part of that; ask Karen." I established an independent research program, and although our interests were similar, we worked on unrelated questions. Much later we got to the point where we felt like it was OK for us to collaborate again.

AA: Has teaching also been a big part of your career?

KDV: I taught a proseminar for graduate students in vision, advanced undergraduate classes in vision, undergraduate biological psychology, and visual sensitivity for optometry students. I always enjoyed teaching about color or spatial vision. People can relate immediately, because we've all experienced it. For one of my last lectures, I colored my hair hot pink! It surprised the class; I got a round of applause at the end.

AA: How has the practice of science changed since you were a graduate student?

KDV: I've been fortunate to be in science through a period of just extraordinary technological innovation. In my first year as a graduate student, one of the required courses was electrical design and wiring. You could put individual logic gates in a rack, wire it up, and have something like a primitive computer. Before that we had to punch paper tape. By the time that I graduated there was a mainframe computer on campus. A few years later, Russ and I got our first tabletop calculator, a Hewlett-Packard. It was so expensive that we had it in a steel case chained to a desk. That year we also got our first lab computer. It had eight kilobytes of memory. We programmed in machine language. Gradually, as computers came online it became possible to display more complex imagery [for visual experiments]. The whole field that I worked in (color and pattern, spatial vision) would not have been possible before the advent of computer displays.

AA: In your talk on the history of women in vision science, you mentioned a pioneer in the field, Christine Ladd-Franklin. How have attitudes to women in science changed?

KDV: When I was a graduate student, there were about 40 faculty members in my department. There were, by formal design, no women faculty. It was shocking to me to be told that the women admitted to graduate school had to have substantially better records than the men did. Female graduate students were not recommended for jobs at major research universities; they were expected to teach at small colleges. Both the law and cultural attitudes have made these issues less of a problem. Now we have time off the tenure clock and child care, but sometimes institutions tend not to be particularly supportive. I really was fortunate to have a husband who understood and was very supportive, but you shouldn't have to have that. The story of Christine Ladd-Franklin, who was denied her PhD by Johns Hopkins for 43 years, is very instructive in a lot of ways. She was never given the recognition that someone else in that situation might have received. It really does show how far we've come, not that we don't still have ways to go.


[Photo: A pseudoisochromatic plate for testing color vision in patients.]

Amanda Alvarez is a graduate student in vision sciences.



toolbox
Mathematics has a reputation for being exact. Consider the Pythagorean Theorem, which exactly relates the lengths of the three sides of a right triangle. I didn't know Pythagoras (c. 500 B.C.E.), but I like to imagine him scurrying back and forth around a giant marble triangle, measuring the sides with a piece of string and chiseling the values into his granite lab notebook. When he sat down to ponder over his results, what did he make of his measurements? Specifically, what did he make of the fact that each time he went to measure side number one, he got a slightly different number?

As it turns out, Pythagoras was something of an idealist. By ignoring the imperfections in his measurements, he gave us the beautiful and precise formula we love. However, another story lies in exactly that which Pythagoras overlooked: uncertainty. Probability and statistics are the branches of mathematics that embrace uncertainty and the fact that, in the real world, we acquire all of our knowledge through imperfect measuring devices. Historically speaking, probability and statistics were a bit late to the party. It wasn't until 1654 (for those keeping track, that's a mind-blowing two millennia after Pythagoras) that all-star mathematicians Blaise Pascal and Pierre de Fermat teamed up to give uncertainty a mathematical treatment. This lag is reflected in the types of mathematics that are taught last and remembered least in American schools (a recent survey by the National Science Foundation showed that 43% of Americans lack an understanding of probability, a figure that only 57% of Americans grasp, unfortunately).

Most current-generation traffic routing algorithms are Pythagoras-like in that they ignore the fact that measurements have variability: each time you drive down a road, it takes a different amount of time. The figure below shows what the full dataset might look like for two fictional roads, with the height of the curve at each point showing how likely it is for any one trip to last that particular amount of time. This chart is known as a probability distribution. A Pythagoras-like algorithm would assume that the road has one travel time: this distribution's expected value, which is simply the average for all the trips observed (the dotted green and black vertical lines). This simplification might speed up the calculations, but we throw away any information we had about how dependable (that is, how uncertain) the estimates are.

To better appreciate the devastating repercussions this might have, imagine you have to be home in 10 minutes to take a pie out of the oven. You can choose one of two routes, shown in the figure below. Option one is Bieber Speedway, a high-speed expressway, which happens to also cut through the Justin Bieber Wax Sculpture Park, so that the occasional walking tour crosses the street and holds up traffic. Option two is the tranquil and secluded Country Road, which has a lower speed limit, but no Bieber Park. A current-generation router would only know the expected values (dotted lines) of both distributions: on average, Bieber Speedway gets you home in 6 minutes, which is faster than Country Road, which takes 8 minutes. Accordingly, the algorithm would direct you to take Bieber Speedway. It turns out that today just wasn't your day, and a mob of fans crosses the street, adding 5 minutes to your trip. You get home to find that your pie is a sooty heap. Closer inspection of the two probability distributions reveals how understanding uncertainty could have saved your dessert.

By collapsing the entire travel time distribution to one number, traditional routing algorithms lose sight of the fact that a considerable proportion of trips down Bieber Speedway take longer than 10 minutes. Based on historical traffic data, there is a zero percent chance of a trip down Country Road taking 10 minutes or more, so while Country Road is slower on average, it is more reliable. Next-generation algorithms like those employed in the Reliable Router ("There's a map for that," page 36 of this issue) keep track of the full travel time probability distribution, allowing the reliability of each road to factor into your route selection. Humans intuitively evaluate the uncertainty in life without much prompting, which is why, when you ask for directions from a local, they'll probably send you through low-risk routes. Computers are blind to these subtleties until we teach them. By endowing computers with the ability to appreciate uncertainty via the mathematics of probability and statistics, we come one step closer to automating good advice. Maybe a future generation of algorithms will also favor routes that take us right by the best slice of pie in town.
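The difference between the two kinds of router fits in a few lines of code. In the sketch below, the travel-time samples are invented stand-ins for the figure's two roads (not real traffic data), and the comparison is between the mean travel time and the probability of arriving within the 10-minute pie deadline.

    import random
    random.seed(0)

    # Invented historical travel times (minutes) echoing the figure: the Speedway
    # is fast on average but occasionally jammed by a Bieber crowd; Country Road
    # is slower but tightly clustered.
    speedway = [random.gauss(5.0, 0.5) + (7 if random.random() < 0.15 else 0)
                for _ in range(10000)]
    country_road = [random.gauss(8.0, 0.5) for _ in range(10000)]

    def expected_value(times):
        return sum(times) / len(times)

    def p_on_time(times, deadline=10.0):
        return sum(t <= deadline for t in times) / len(times)

    for name, times in [("Bieber Speedway", speedway), ("Country Road", country_road)]:
        print(f"{name}: mean {expected_value(times):.1f} min, "
              f"P(arrive within 10 min) = {p_on_time(times):.2f}")

    # A mean-only router picks the Speedway; a reliability-aware router picks
    # whichever road maximizes the probability of saving the pie.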

[Figure: A map shows two routes from Start to Destination: the Expressway (a 6-minute trip, or 10 minutes with a traffic jam) and Country Road (an 8- to 8.5-minute trip). Below the map, the probability of occurrence of each trip time (0 to 16 minutes) is plotted for the Expressway (mean 6 min) and Country Road (mean 8 min), with trips past the 10-minute deadline marked "Too late!"]

Robert Gibboni is a graduate student in neuroscience.


