Thursday, December 18, 2014

A clear, molecular view of the evolution of human color vision

By around 30 million years ago, our ancestors had evolved the ability to see the full-color spectrum of visible light, except for UV light.

By Carol Clark

Many genetic mutations in visual pigments, spread over millions of years, were required for humans to evolve from a primitive mammal with a dim, shadowy view of the world into a great ape able to see all the colors in a rainbow.

Now, after more than two decades of painstaking research, scientists have completed a detailed picture of the evolution of human color vision. PLOS Genetics published the final piece of this picture: how humans switched from ultraviolet (UV) vision to violet vision, the ability to see blue light.

“We have now traced all of the evolutionary pathways, going back 90 million years, that led to human color vision,” says lead author Shozo Yokoyama, a biologist at Emory University. “We’ve clarified these molecular pathways at the chemical level, the genetic level and the functional level.”

Co-authors of the PLOS Genetics paper include Emory biologists Jinyi Xing, Yang Liu and Davide Faggionato; Syracuse University biologist William Starmer; and Ahmet Altun, a chemist and former post-doc at Emory who is now at Fatih University in Istanbul, Turkey.

Yokoyama and various collaborators over the years have teased out secrets of the adaptive evolution of vision in humans and other vertebrates by studying ancestral molecules. The lengthy process involves first estimating and synthesizing ancestral proteins and pigments of a species, then conducting experiments on them. The technique combines microbiology with theoretical computation, biophysics, quantum chemistry and genetic engineering.

Five classes of opsin genes encode visual pigments for dim-light and color vision. Bits and pieces of the opsin genes change and vision adapts as the environment of a species changes.

Around 90 million years ago, our primitive mammalian ancestors were nocturnal and had UV-sensitive and red-sensitive pigments, giving them a bi-chromatic view of the world. By around 30 million years ago, our ancestors had evolved four classes of opsin genes, giving them the ability to see the full-color spectrum of visible light, except for UV.

“Gorillas and chimpanzees have human color vision,” Yokoyama says. “Or perhaps we should say that humans have gorilla and chimpanzee vision.”

For the PLOS Genetics paper, the researchers focused on the seven genetic mutations involved in losing UV vision and achieving the current function of a blue-sensitive pigment. They traced this progression from 90 to 30 million years ago.

The researchers identified 5,040 possible pathways for the amino acid changes required to bring about the genetic changes. “We did experiments for every one of these 5,040 possibilities,” Yokoyama says. “We found that of the seven genetic changes required, each of them individually has no effect. It is only when several of the changes combine in a particular order that the evolutionary pathway can be completed.”
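
For readers who want to check the arithmetic, the 5,040 figure is simply the number of possible orderings of seven changes, that is, 7 factorial. A minimal Python sketch (hypothetical mutation labels, not the study's code) illustrates the count:

```python
# Quick arithmetic check (not the study's code): the 5,040 pathways are just
# the possible orderings of the seven amino acid changes, i.e. 7 factorial.
from itertools import permutations
from math import factorial

changes = [f"mutation_{i}" for i in range(1, 8)]   # hypothetical labels
n_orderings = sum(1 for _ in permutations(changes))

assert n_orderings == factorial(7) == 5040
print(n_orderings)   # 5040
```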

In other words, just as an animal’s external environment drives natural selection, so do changes in the animal’s molecular environment.

Mice are nocturnal and, like the primitive human ancestor of 90 million years ago, have UV vision and limited ability to see colors.

In previous research, Yokoyama showed how the scabbardfish, which today spends much of its life at depths of 25 to 100 meters, needed just one genetic mutation to switch from UV to blue-light vision. Human ancestors, however, needed seven changes and these changes were spread over millions of years. “The evolution for our ancestors’ vision was very slow, compared to this fish, probably because their environment changed much more slowly,” Yokoyama says.

About 80 percent of the 5,040 pathways the researchers traced stopped in the middle, because a protein became non-functional. Chemist Ahmet Altun solved the mystery of why the protein got knocked out. It needs water to function, and if one mutation occurs before the other, it blocks the two water channels extending through the visual pigment’s membrane.

“The remaining 20 percent of the pathways remained possible pathways, but our ancestors used only one,” Yokoyama says. “We identified that path.”

In 1990, Yokoyama identified the three specific amino acid changes that led to human ancestors developing a green-sensitive pigment. In 2008, he led an effort to construct the most extensive evolutionary tree for dim-light vision, including animals from eels to humans. At key branches of the tree, Yokoyama’s lab engineered ancestral gene functions, in order to connect changes in the living environment to the molecular changes.

The PLOS Genetics paper completes the project for the evolution of human color vision. “We have no more ambiguities, down to the level of the expression of amino acids, for the mechanisms involved in this evolutionary pathway,” Yokoyama says.

Images: Thinkstock

Related:
Evolutionary biologists urged to adapt their research methods
Fish vision makes waves in natural selection

Emory math in finals for Discover Magazine's "People's Choice" award

Much of the work of number theorist Ken Ono, above, involves solving long-standing mysteries stemming from the work of Indian math genius Srinivasa Ramanujan.

By Carol Clark

An Emory math breakthrough, “Mother Lode of Mathematical Identities,” is down to the final two in voting for Discover Magazine’s “People’s Choice” for top science story of 2014. The final round will continue through December 24, and you can cast your vote by clicking here.

The editors of Discover Magazine sifted through all their science stories of the year and selected the 100 most important ones for 2014. They ranked the find by Emory mathematician Ken Ono and collaborators 15th.

Since the magazine opened up these stories for “People’s Choice” voting in November, the math breakthrough has kept moving up in the rankings.

Last summer, Ono and his collaborators Michael Griffin and Ole Warnaar found a framework for the celebrated Rogers-Ramanujan identities and their arithmetic properties, yielding a treasure trove of algebraic numbers and formulas to access them.

“Ole found this huge vein of gold, and we then figured out a way to mine the gold,” Ono said of the discovery. “We went to work and showed how to come full circle and make use of the formulas. Now we can extract infinitely many functions whose values are these beautiful algebraic numbers.”

And Ono’s newest discovery, “Mathematicians prove the Umbral Moonshine Conjecture,” will be generating buzz in 2015. Ono will be presenting the proof of the conjecture, including the work of collaborators, on January 11 at the Joint Mathematics Meeting in San Antonio, the largest mathematics meeting in the world.

Related:
Mathematicians find algebraic gold
Mathematicians prove the Umbral Moonshine Conjecture

Monday, December 15, 2014

Mathematicians prove the Umbral Moonshine Conjecture

In theoretical math, the term "moonshine" refers to an idea so seemingly impossible that it sounds like lunacy.

By Carol Clark

Monstrous moonshine, a quirky pattern of the monster group in theoretical math, has a shadow – umbral moonshine. Mathematicians have now proved this insight, known as the Umbral Moonshine Conjecture, offering a formula with potential applications for everything from number theory to geometry to quantum physics.

“We’ve transformed the statement of the conjecture into something you could test, a finite calculation, and the conjecture proved to be true,” says Ken Ono, a mathematician at Emory University. “Umbral moonshine has created a lot of excitement in the world of math and physics.”

Co-authors of the proof include mathematicians John Duncan from Case Western Reserve University and Michael Griffin, an Emory graduate student.

“Sometimes a result is so stunningly beautiful that your mind does get blown a little,” Duncan says.

Duncan co-wrote the statement for the Umbral Moonshine Conjecture with Miranda Cheng, a mathematician and physicist at the University of Amsterdam, and Jeff Harvey, a physicist at the University of Chicago.

Ono will present their work on January 11, 2015 at the Joint Mathematics Meetings in San Antonio, the largest mathematics meeting in the world. Ono is delivering one of the highlighted invited addresses.

Ono gave a colloquium on the topic at the University of Michigan, Ann Arbor, in November, and has also been invited to speak on the umbral moonshine proof at upcoming conferences around the world, including Brazil, Canada, England, India, and Germany.

The number of elements in the monster group is larger than the number of atoms in 1,000 Earths.

It sounds like science fiction, but the monster group (also known as the friendly giant) is a real and influential concept in theoretical math.

Abstract algebra is built out of groups: sets of objects, together with an operation, that satisfy certain relationships. One of the biggest achievements in math during the 20th century was classifying all of the finite simple groups. They are now collected in the ATLAS of Finite Groups, published in 1985.

“This ATLAS is to mathematicians what the periodic table is to chemists,” Ono says. “It’s our fundamental guide.”

And yet, the last and largest sporadic finite simple group, the monster group, was not constructed until 1980. “It is absolutely huge, so classifying it was a heroic effort for mathematicians,” Ono says.

In fact, the number of elements in the monster group is larger than the number of atoms in 1,000 Earths. Something that massive defies description.
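
That comparison is easy to sanity-check. Using the published order of the monster group and a commonly cited rough estimate of about 1.3 × 10^50 atoms in the Earth (both figures come from outside this article), a few lines of Python confirm the claim:

```python
# Order-of-magnitude check of the "1,000 Earths" comparison. The prime
# factorization below is the standard published order of the monster group;
# the atom count is a rough textbook estimate, so treat this as illustrative.
monster_order = (2**46 * 3**20 * 5**9 * 7**6 * 11**2 * 13**3
                 * 17 * 19 * 23 * 29 * 31 * 41 * 47 * 59 * 71)
atoms_in_1000_earths = 1000 * 1.3e50

print(f"Order of the monster group: {monster_order:.2e}")   # ~8.08e+53
print(f"Atoms in 1,000 Earths:      {atoms_in_1000_earths:.2e}")
print("Monster is larger:", monster_order > atoms_in_1000_earths)
```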

“Think of a 24-dimensional doughnut,” Duncan says. “And then imagine physical particles zooming through this space, and one particle sometimes hitting another. What happens when they collide depends on a lot of different factors, like the angles at which they meet. There is a particular way of making this 24-dimensional system precise such that the monster is its symmetry. The monster is incredibly symmetric.”

“The monster group is not just a freak,” Ono adds. “It’s actually important to many areas of math.”

It’s too immense, however, to use directly as a tool for calculations. That’s where representation theory comes in.

The shadow technique is a valuable tool in theoretical math.

Shortly after evidence for the monster was discovered, mathematicians John McKay and John Thompson noticed some odd numerical accidents. They found that a series of numbers that can be extracted from a modular function and a series extracted from the monster group seemed to be related. (One example is the strange and simple arithmetic equation 196884 = 196883 + 1, which connects a coefficient of the modular j-function to the dimension of the monster’s smallest non-trivial representation, plus one.)

John Conway and Simon Norton continued to investigate and found that this peculiar pattern was not just a coincidence. “Evidence kept accumulating that there was a special modular function for every element in the monster group,” Ono says. “In other words, the main characteristics of the monster group could be read off from modular functions. That opened the door to representation theory to capture and manipulate the monster.”

The idea that modular functions could tame something as unruly as the monster sounded impossible – like lunacy. It was soon dubbed the Monstrous Moonshine Conjecture.

(The moonshine reference has the same meaning famously used by Ernest Rutherford, known as the father of nuclear physics. In a 1933 speech, Rutherford said that anyone who considered deriving energy from splitting atoms was "talking moonshine.”)

In 1998, Richard Borcherds won math’s highest honor, the Fields Medal, for proving the Monstrous Moonshine Conjecture. His proof turned this representation theory for the monster group into something computable.

Fast-forward 16 years. Three Japanese physicists (Tohru Eguchi, Hirosi Ooguri and Yuji Tachikawa) were investigating a particular kind of string theory involving four-dimensional spaces when numbers from the Mathieu group M24, another important finite simple group, unexpectedly appeared.

“They conjectured a new way to extract numbers from the Mathieu Group,” Duncan says, “and they noticed that the numbers they extracted were similar to those of the monster group, just not as large.” Mathematician Terry Gannon proved that their observations were true.

It was a new, unexpected analogue that hinted at a pattern similar to monstrous moonshine.

Duncan started investigating this idea with physicists Cheng and Harvey. “We realized that the Mathieu group pattern was part of a much bigger picture involving mock modular forms and more moonshine,” Duncan says. “A beautiful mathematical structure was controlling it.”

They dubbed this insight the Umbral Moonshine Conjecture. Since the final version of the more than 100-page conjecture was published online last June, it has been downloaded more than 2,500 times.

The conjecture caught the eye of Ono, an expert in mock modular forms, and he began pondering the problem along with Griffin and Duncan.

“Things came together quickly after the statement of the Umbral Moonshine Conjecture was published,” Ono says. “We have been able to prove it and it is no longer a guess. We can now use the proof as a completely new and different tool to do calculations.”

Just as modular forms are “shadowed” by mock modular forms, monstrous moonshine is shadowed by umbral moonshine. (Umbra is Latin for the innermost and darkest part of a shadow.)

“The job of a theoretical mathematician is to take impossible problems and make them tractable,” Duncan says. “The shadow device is one valuable tool that lets us do that. It allows you to throw away information while still keeping enough to make some valuable observations.”

He compares it to a paleontologist using fossilized bones to piece together a dinosaur.

The jury is out on what role, if any, umbral moonshine could play in helping to unravel mysteries of the universe. Aspects of it, however, hint that it could be related to problems ranging from geometry to black holes and quantum gravity theory.

“What I hope is that we will eventually see that everything is unified, that monstrous moonshine and umbral moonshine have a common origin,” Duncan says. “And part of my optimistic vision is that umbral moonshine may be a piece in one of the most important puzzles of modern physics: The problem of unifying quantum mechanics with Einstein’s general relativity.”

Images: NASA and Thinkstock.

Related:
Mathematicians trace source of Rogers-Ramanujan identities
New theories reveal the nature of numbers

Tuesday, December 9, 2014

Birdsong study reveals how brain uses timing during motor activity

Songbirds are one of the best systems for understanding how the brain controls complex behavior.  Image credit: Sam Sober.

By Carol Clark

Timing is key for brain cells controlling a complex motor activity like the singing of a bird, finds a new study published by PLOS Biology.

“You can learn much more about what a bird is singing by looking at the timing of neurons firing in its brain than by looking at the rate that they fire,” says Sam Sober, a biologist at Emory University whose lab led the study. “Just a millisecond difference in the timing of a neuron’s activity makes a difference in the sound that comes out of the bird’s beak.”

The findings are the first to suggest that fine-scale timing of neurons is at least as important in motor systems as in sensory systems, and perhaps more critical.

“The brain takes in information and figures out how to interact with the world through electrical events called action potentials, or spikes in the activity of neurons,” Sober says. “A big goal in neuroscience is to decode the brain by better understanding this process. We’ve taken another step towards that goal.”

Sober’s lab uses Bengalese finches, also known as society finches, as a model system. The way birds control their song has a lot in common with human speech, both in how it’s learned early in life and how it’s vocalized in adults. The neural pathways for birdsong are also well known, and restricted to that one activity.

“Songbirds are the best system for understanding how the brain controls complex vocal behavior, and one of the best systems for understanding control of motor behavior in general,” Sober says.



Researchers have long known that for an organism to interpret sensory information – such as sight, sound and taste – the timing of spikes in brain cells can matter more than the rate, or the total number of times they fire. Studies on flies, for instance, have shown that their visual systems are highly sensitive to the movement of shadows. By looking at the timing of spikes in the fly’s neurons you can tell the velocity of a shadow that the fly is seeing.

An animal’s physical response to a stimulus, however, is much slower than the millisecond timescale on which spikes are produced. “There was an assumption that because muscles have a relatively slow response time, a timing code in neurons could not make a difference in controlling movement of the body,” Sober says.

An Emory undergraduate in the Sober lab, Claire Tang, got the idea of testing that assumption. She proposed an experiment involving mathematical methods that she was learning in a Physical Biology class. The class was taught by Emory biophysicist Ilya Nemenman, an expert in the use of computational techniques to study biological systems.

“Claire is a gifted mathematician and programmer and biologist,” Sober says of Tang, now a graduate student at the University of California, San Francisco. “She made a major contribution to the design of the study and in the analysis of the results.”

Co-authors also include Nemenman; laboratory technician Diala Chehayeb; and Kyle Srivastava, a graduate student in the Emory/Georgia Tech graduate program in biomedical engineering.

The researchers used an array of electrodes, each thinner than a human hair, to record the activity of single neurons of adult finches as they were singing.

“The birds repeat themselves, singing the same sequence of ‘syllables’ multiple times,” Sober says. “A particular sequence of syllables matches a particular firing of neurons. And each time a bird sings a sequence, it sings it a little bit differently, with a slightly higher or lower pitch. The firing of the neurons is also slightly different.”

The acoustic signals of the birdsong were recorded alongside the timing and the rate that single neurons fired. The researchers applied information theory, a discipline originally designed to analyze communications systems such as the Internet or cellular phones, to analyze how much one could learn about the behavior of the bird singing by looking at the precise timing of the spikes versus their number.

The result showed that for the duration of one song signal, or 40 milliseconds, the timing of the spikes contained 10 times more information than the rate of the spikes.
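
The timing-versus-rate comparison can be made concrete with a toy calculation. The sketch below uses simulated trials and a simple plug-in entropy estimator; the variable names, binning and numbers are illustrative assumptions, not the study's analysis pipeline:

```python
# Hypothetical sketch: comparing how much information spike timing vs. spike
# count carries about a behavioral variable (e.g. syllable pitch), in the
# spirit of the information-theoretic analysis described above.
import numpy as np

rng = np.random.default_rng(0)

def discrete_mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits for two discrete-valued arrays."""
    joint = {}
    for xi, yi in zip(x, y):
        joint[(xi, yi)] = joint.get((xi, yi), 0) + 1
    n = len(x)
    px, py = {}, {}
    for (xi, yi), c in joint.items():
        px[xi] = px.get(xi, 0) + c
        py[yi] = py.get(yi, 0) + c
    mi = 0.0
    for (xi, yi), c in joint.items():
        p_xy = c / n
        mi += p_xy * np.log2(p_xy / ((px[xi] / n) * (py[yi] / n)))
    return mi

# Simulated trials: the behavioral variable shifts the precise spike time
# (millisecond scale) more strongly than it modulates the total spike count.
behavior = rng.integers(0, 3, size=2000)
spike_time_ms = behavior + rng.integers(0, 2, size=2000)   # ~1 ms precision
spike_count = rng.poisson(5 + 0.2 * behavior)              # weakly modulated rate

print("I(timing; behavior) =", discrete_mutual_information(spike_time_ms, behavior))
print("I(count;  behavior) =", discrete_mutual_information(spike_count, behavior))
```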

“Our findings make it pretty clear that you may be missing a lot of the information in the neural code unless you consider the timing,” Sober says.

Such improvements in our understanding of how the brain controls physical movement hold many potential health applications, he adds.

“For example,” he says, “one area of research is focused on how to record neural signals from the brains of paralyzed people and then using the signals to control prosthetic limbs. Currently, this area of research tends to focus on the firing rate of the neurons rather than taking the precise timing of the spikes into account. Our work shows that, in songbirds at least, you can learn much more about behavior by looking at spike timing than spike rate. If this turns out to be true in humans as well, timing information could be analyzed to improve a patient’s ability to control a prosthesis.”

The research was supported by grants from the National Institutes of Health, the National Science Foundation, the James S. McDonnell Foundation and Emory’s Computational Neuroscience Training Program.

Bird graphic courtesy of Sam Sober.

Related:
Doing the math for how songbirds learn to sing
Birdsong study pecks theory that music is uniquely human

Thursday, December 4, 2014

Scientists zeroing in on psychosis risk factors

The onset of schizophrenia and other psychotic disorders typically occurs at about 21 years of age, with warning signs beginning around age 17, on average.

By Carol Clark

During the first phase of a major national study, scientists have uncovered a new cluster of preclinical symptoms linked to a significant increase in the risk that a young person will go on to develop a psychotic illness, including schizophrenia. The consortium of researchers, from Emory and seven other universities, has also discovered several biological processes tied to the transition from subtle symptoms to clinical psychosis.

The onset of schizophrenia and other psychotic disorders typically occurs at about 21 years of age, with warning signs, known as a prodromal syndrome, beginning around age 17, on average. About 30 to 40 percent of youth who meet current criteria for a prodromal syndrome will develop schizophrenia or another psychotic disorder.

“We are moving at an unprecedented pace towards identifying more precise predictors,” says Elaine Walker, an Emory professor of psychology and neuroscience. “By increasing our understanding of the factors that give rise to psychosis, we hope to ultimately improve the ability to provide preventive intervention.”

Walker is one of the principal investigators in the North American Prodrome Longitudinal Study (NAPLS). The National Institute of Mental Health (NIMH) funded the ongoing study, which unites the efforts of Emory, the University of North Carolina, Yale, Harvard, the University of Calgary, UCLA and UC San Diego, and the Feinstein Institute at Hillside Hospital.

“The only way we can do this research is by having a large consortium, combining a range of expertise, from genetics to neuro-endocrinology, psychology and psychiatry,” Walker says. “It is also difficult to identify individuals who are at risk for psychosis, and in order to have enough statistical power we need a large sample of study subjects.”

The consortium has published a flurry of 60 papers during the past four years, involving more than 800 adolescents and young adults who were showing clinical warning signs of impending psychosis and a group of 200 healthy youth.

Among the key findings: Prodromal youth with elevated levels of the stress hormone cortisol and indicators of neuro-inflammation are more likely to become psychotic within a year.

“We’ve developed a risk-prediction algorithm, including measures of symptoms as well as biomarkers, that we have made available for clinicians,” Walker says. “In the future, they can take saliva samples from at-risk patients to check cortisol levels, and to monitor those levels over time. As we get more information, we keep adding to the algorithm to improve the sensitivity and specificity of prediction. It’s important because anti-psychotic medications have a lot of side effects. You don’t want to give them to young people unless you are fairly confident that they are on the way to a psychotic disorder.”
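
As a purely illustrative sketch of what such a risk calculator does, the snippet below combines a symptom score and a salivary cortisol value into a single probability using a logistic model. The predictor names, weights, units and threshold behavior are hypothetical; this is not the NAPLS consortium's published algorithm:

```python
# Illustrative only: a logistic-style risk score combining a clinical symptom
# measure with a cortisol biomarker. Weights and scales are made up.
import math

def psychosis_risk(symptom_score, cortisol_nmol_l,
                   w_symptom=0.8, w_cortisol=0.05, bias=-4.0):
    """Return a 0-1 risk estimate from two inputs (hypothetical weights)."""
    z = bias + w_symptom * symptom_score + w_cortisol * cortisol_nmol_l
    return 1.0 / (1.0 + math.exp(-z))

# Example: moderate symptom severity, elevated salivary cortisol
print(f"Estimated risk: {psychosis_risk(symptom_score=3.5, cortisol_nmol_l=18.0):.2f}")
```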

The researchers are now working on refining a blood-biomarker algorithm that clinicians could use to monitor at-risk patients for signs of neuro-inflammation, oxidative stress, hormones and metabolism.

In addition to medication, cognitive therapy and other interventions that reduce stress may be used to help an individual safely make it through the high-risk period, Walker says. “We’ve found that youth with the highest risk tend to be both exposed to more stress and more reactive to stress.”

The consortium of researchers also discovered that the brains of at-risk patients who later develop psychosis show a dramatic decline of gray matter in the year leading up to the diagnosis. And they found that the elevation in a patient’s cortisol level predicts the magnitude of the decline in brain volume.

In 2008, the NIMH awarded $25 million to the consortium for the first phase of the NAPLS project, which lasted five years.

The NAPLS project is now entering its second five-year phase through additional NIMH funding, including a $2.4 million award to Emory.

“We have better technology than ever before for studying the human brain and changes in the brain over time,” Walker says. “We’re at an especially fruitful time in terms of discoveries we can make.” 

During phase two, “we will be looking more closely at hormones, especially stress hormones and indicators of neuro-inflammatory processes,” Walker says. “And we’re going to be looking much more closely at changes in brain structure and function over time. We’re hoping to identify in real-time, with much greater clarity, what is causing what. In other words, the chain of neural mechanisms.” 

Schizophrenia, the most extreme psychosis, affects about 1 percent of the population and can have devastating consequences. Most people diagnosed with schizophrenia are unable to hold a job or live independently for most of their lives. Preventing the onset of schizophrenia and other psychoses has become a major area of emphasis at the NIMH.

“Psychosis is extremely complex, there is no doubt about it, and we’re learning that it’s even more complex than we previously realized,” Walker says. “But if we’re ever going to make progress in prevention and treatment, we’re going to have to come to grips with that complexity and fully understand it.”

For more information about the project, contact the Mental Health and Development Program at Emory: 404-727-7547.

Thinkstock photos by Brian McEntire (top) and Michael Blann (bottom).

Related:
Schizophrenia: What we know now
Study of psychosis risk and brain to track effects of Omega-3 pills
Daily pot smoking may hasten psychosis onset

Tuesday, December 2, 2014

'A beautiful find' by Emory mathematician vies for top science story of 2014

Much of the work of number theorist Ken Ono, above, involves solving long-standing mysteries stemming from the work of Indian math genius Srinivasa Ramanujan. (Emory Photo/Video)

A find by Emory mathematician Ken Ono and collaborators ranked 15th in Discover Magazine’s top 100 science stories for 2014.

That makes the discovery of the “Mother Lode of Mathematical Identities” eligible for the magazine’s “people’s choice” awards for the top science story of the year. You can cast your vote for the “Math Breakthrough” by clicking here. The Emory math discovery made it through the first two rounds of voting and is now among the four finalists.

Last summer, Ono and his collaborators Michael Griffin and Ole Warnaar found a framework for the celebrated Rogers-Ramanujan identities and their arithmetic properties, yielding a treasure trove of algebraic numbers and formulas to access them.

“Ole found this huge vein of gold, and we then figured out a way to mine the gold,” Ono said of the discovery. “We went to work and showed how to come full circle and make use of the formulas. Now we can extract infinitely many functions whose values are these beautiful algebraic numbers.”

In the people’s choice awards, the math discovery is vying against stories of cosmic inflation, cybersecurity leaks, the collapse of the West Antarctic ice sheet, the battle against the Ebola outbreak, the genomes of the first Americans, entangled photons and the Rosetta spacecraft’s rendezvous with a comet. It was another big year for science news, showcasing a wide range of disciplines. In fact, one of the only things all of these science advances have in common is their reliance on math.

Related:
Mathematicians find algebraic gold

Friday, November 21, 2014

Athletes' testosterone surges not tied to winning, study finds

Kathleen Casto, number 1931 in the center, shown competing in cross country as an undergraduate in North Carolina. She is now a graduate student in psychology at Emory, studying the hormonal correlates of competition in women.

By Carol Clark

A higher surge of testosterone in competition, the so-called “winner effect,” is not actually related to winning, suggests a new study of intercollegiate cross country runners.

The International Journal of Exercise Science published the research, led by David Edwards, a professor of psychology at Emory University, and his graduate student Kathleen Casto.

“Many people in the scientific literature and in popular culture link testosterone increases to winning,” Casto says. “In this study, however, we found an increase in testosterone during a race regardless of the athletes’ finish time. In fact, one of the runners with the highest increases in testosterone finished with one of the slowest times.”

The study, which analyzed saliva samples of participants, also showed that testosterone levels rise in athletes during the warm-up period. “It’s surprising that not only does competition itself, irrespective of outcome, substantially increase testosterone, but also that testosterone begins to increase before the competition even begins, long before the status of winner or loser is determined,” Casto says.

Cross country is "an intense experience."

Casto was a Division I cross country runner as an undergraduate at the University of North Carolina, Wilmington. She majored in psychology and chemistry and became interested in the hormonal correlates of competition in women. She applied as a graduate student in psychology at Edwards’ lab when she learned about his work.

Edwards has been collecting data since 1999 on hormone levels of Emory sports teams that have volunteered to participate. The research has primarily involved women athletes. Edwards’ lab also developed a questionnaire to measure the status of an athlete. Members of the team rate the leadership ability of other individuals on the team, to provide a combined rating score for each of the participating athletes.

Many of the lab’s previous studies involved sports such as volleyball and soccer that require team coordination, intermittent physical exertion and only overall team outcomes of win or loss. Casto wanted to investigate how hormones relate to individual performance outcomes in cross country racing.

Cross country racing is both a team and individual sport. Teams are evaluated through a points-scoring system, but runners are also judged on their individual times, clearly ranking their success in an event.

“Cross country running is a unique sport. It’s associated with a drive to compete and perseverance against pain over a relatively long period of time,” Casto says. “It’s an intense experience.”

Participants in the study were consenting members of the 2010 and 2011 Emory varsity men’s and women’s cross country teams. Each participant provided three saliva samples: One before warming up (to serve as a baseline), one after warming up, and a third immediately after crossing the finish line.

Testosterone went up from the baseline for both men and women during the warm-up, while levels of cortisol – a hormone related to stress – did not.

At the end of the race, both men and women participants showed the expected increases in cortisol and surges in testosterone. Neither hormone, however, was related to finish time.

This research follows on the heels of a 2013 study of women athletes in a variety of sports by Edwards and Casto, published in Hormones and Behavior. They found that, provided levels of the stress hormone cortisol were low, the higher a woman’s testosterone, the higher her status with teammates.

The body uses cortisol for vital functions like metabolizing glucose. “Over short periods, an increase in cortisol can be a good thing, but over long periods of chronic stress, it is maladaptive,” Casto says. “Among groups of women athletes, achieving status may require a delicate balance between stress and the actions or behaviors carried out as a team leader.”

Higher baseline levels of testosterone have been linked to long-term status and power, such as holding higher-status positions in companies.

“Although short-term surges of testosterone in competition have been associated with winning, they may instead be indicators of a psychological strength for competition, the drive to win,” Casto says.

Photos courtesy Kathleen Casto.

Tuesday, November 18, 2014

Climate change will slow China's reduction in infectious diseases

Shanghai depends on water from the Huangpu River, which is connected to the heavily polluted Tai Lake. Photo by Jakub Halun.

From Woodruff Health Sciences Center

China has made significant progress increasing access to tap water and sanitation services, and has sharply reduced the burden of waterborne and water-related infectious diseases over the past two decades. Climate change, however, will blunt China’s efforts at further reducing these diseases, finds a study in the latest edition of Nature Climate Change.

By 2030, changes to the global climate could delay China’s progress reducing diarrheal and vector-borne diseases by up to seven years, the study shows. That is, even as China continues to invest in water and sanitation infrastructure and to experience rapid urbanization and social development, the benefits of these advances will be slowed in the presence of climate change.

The study, led by Justin Remais, associate professor of environmental health at Emory’s Rollins School of Public Health, provides the first estimates of the burden of disease due to unsafe water, sanitation and hygiene in a rapidly developing society that is subjected to a changing climate.

“Our results demonstrate how climate change can lead to a significant health burden, even in settings where the total burden of disease is falling owing to social and economic development,” says Remais. “Delays in development are especially concerning for China, which is investing heavily in improving health even as the impact of those investments is being countered by the effect of climate change.” 

Read more.

Related:
Creating an atmosphere for change

Friday, November 7, 2014

Interstellar: Starting over on a new 'Earth'



The movie Interstellar opens in theaters at a time when Earth is facing major losses of biodiversity and ecosystems, says David Lynn, an Emory professor of biomolecular chemistry.

While humanity is challenged to find out what’s happening to Earth and how to make adjustments, we have also begun to realize that billions of Earth-like planets likely exist in habitable zones around the stars of our galaxy.

“In as little as 10 years, we could know whether we’re alone in the universe, whether there are other living systems,” Lynn says. “That’s an exciting prospect. It’s not clear necessarily that we’ll find out that there is intelligent life or not. That may be a lower probability, but that’s also possible.”

Much of the science in Interstellar is not accurate, and its vision of the future may not come true. And yet, it is still an important film, Lynn says, since its themes resonate today, during a critical time in our history.

Related:
Chemists boldly go in search of 'little green molecules'
Prometheus: Seeding wonder and science

Tuesday, November 4, 2014

Having a Y chromosome doesn't affect women's response to sexual images, brain study shows

The study provides "further evidence that we need to revamp our thinking about what we mean by 'man' and 'woman,'" says psychologist Kim Wallen.

By Carol Clark

Women born with a rare condition that gives them a Y chromosome don’t only look like women physically; they also have the same brain responses to visual sexual stimuli, a new study shows.

The journal Hormones and Behavior published the results of the first brain imaging study of women with complete androgen insensitivity, or CAIS, led by psychologists at Emory.

“Our findings clearly rule out a direct effect of the Y chromosome in producing masculine patterns of response,” says Kim Wallen, an Emory professor of psychology and behavioral neuroendocrinology. “It’s further evidence that we need to revamp our thinking about what we mean by ‘man’ and ‘woman.’”

Wallen conducted the research with Stephan Hamann, Emory professor of psychology, and graduate students in their labs. Researchers from Pennsylvania State University and Indiana University also contributed to the study.

The Y chromosome was identified as the sex-determining chromosome in 1905. Females normally have an XX chromosome pair and males have an XY chromosome pair.

By the 1920s, biochemists also began intensively studying androgens and estrogens, chemical substances commonly referred to as “sex hormones.” During pregnancy, the presence of a Y chromosome leads the fetus to produce testes. The testes then secrete androgens that stimulate the formation of a penis, scrotum and other male characteristics.

Women with CAIS are born with an XY chromosome pair. Because of the Y chromosome, the women have testes that remain hidden within their groins, but they lack functional androgen receptors, so they cannot respond to the androgens that their testes produce. They can, however, respond to the estrogens that their testes produce, so they develop physically as women and undergo a feminizing puberty. Since they do not have ovaries or a uterus and do not menstruate, they cannot have children.

“Women with CAIS have androgen floating around in their brains but no receptors for it to connect to,” Wallen says. “Essentially, they have this default female pattern and it’s as though they were never exposed to androgen at all.”

Wallen and Hamann are focused on teasing out neural differences between men and women. In a 2004 study, they used functional magnetic resonance imaging (fMRI) to study the neural activity of typical men and typical women while they were viewing photos of people engaged in sexual activity.

The patterns were distinctively clear, Hamann says. “Men showed a lot more activity than women in two areas of the brain – the amygdala, which is involved in emotion and motivation, and the hypothalamus, which is involved in the regulation of hormones and possibly sexual behavior.”

For the most recent study, the researchers repeated the experiment, this time including 13 women with CAIS along with women without CAIS and men.

“We didn’t find any difference between the neural responses of women with CAIS and typical women, although they were both very different from those of the men in the study,” Hamann says. “This result supports the theory that androgen is the key to a masculine response. And it further confirms that women with CAIS are typical women psychologically, as well as in their physical phenotype, despite having a Y chromosome.”

A limitation of the study is that it did not measure environmental effects on women with CAIS. “These women look the same as other women,” Wallen explains. “They’re reared as girls and they’re treated as girls, so their whole developmental experience is feminized. We can’t rule out that experience as a factor in their brain responses.”

The findings may have broader applications in cognition and health. “Anything that we can learn about sex differences in the brain,” Wallen says, “may help answer important questions such as why autism is more common in males and depression more common in females.”

Related:
Intersex: A lesson in biology, identity and culture

Images: Thinkstock

Monday, November 3, 2014

Creepy crawlies and the science of fear

 
Tarantulas don't eat people and even try to avoid them. So chill out.

Why are we afraid of spiders, snakes and roaches? WXIA reporter Julie Wolfe explores that question through a new exhibit at the Fernbank Museum of Natural History called "Goose Bumps! The Science of Fear." Below is an excerpt from a report by Wolfe:

"It was my nightmare inside a glass box: A dozen cockroaches hissing and wiggling and waiting to crawl up my nose. Okay, maybe not that last part.

"When Emory Assistant Psychology Professor Seth Norrholm suggested I slip my hand into a box that may lead to that creepy, crawly nightmare, I hesitated. It's a response that was programmed into me stretching back to my caveman ancestors.

"All fears can fit into three categories: Innate fears, learned fears and preparatory fear.

"'An innate fear is something that you're born with, and it's a survival instinct type of fear,' Norrholm explained. Fear of animals and insects fall into that category. Among the most common fears: Spiders, cockroaches and snakes."

Watch a video of her report on the WXIA web site.

Related:
The psychology of screams
Psychologists closing in on claustrophobia
How fear skews our spatial perception
The anatomy of fear and memory formation

Wednesday, October 29, 2014

Teeth, sex and testosterone reveal secrets of aging in wild mouse lemurs

A brown mouse lemur in the wild. Mouse lemurs, weighing a mere 30 to 80 grams, are the world's smallest primates. Photos, above and below, by Jukka Jernvall.

By Carol Clark

Mouse lemurs can live at least eight years in the wild – twice as long as some previous estimates, a long-term longitudinal study finds.

PLOS ONE published the research on brown mouse lemurs (Microcebus rufus) led in Madagascar by biologist Sarah Zohdy, a post-doctoral fellow in Emory's Department of Environmental Sciences and the Rollins School of Public Health. Zohdy conducted the research while she was a doctoral student at the University of Helsinki.

“It’s surprising that these tiny, mouse-sized primates, living in a jungle full of predators that probably consider them a bite-sized snack, can live so long,” Zohdy says. “And we found individuals up to eight years of age in the wild with no physical symptoms of senescence like some captive mouse lemurs start getting by the age of four.”

It is likely that starvation, predation, disease and other environmental stressors reduce the observed rate of senescence in the wild, Zohdy notes, but a growing body of evidence also suggests that captive conditions may affect mental and physical function.

“We focused on wild mouse lemurs because we want to know what happens naturally when a primitive primate is exposed to all of the extrinsic and intrinsic mortality factors that shaped them as a species,” Zohdy says. “Comparing longevity data of captive and wild mouse lemurs may help us understand how the physiological and behavioral demands of different environments affect the aging process in other primates, including humans.”

The study determined ages of wild mouse lemurs in Madagascar’s Ranomafana National Park through a dental mold method that had not previously been used with small mammals. In addition to the high-resolution tooth-wear analysis for aging, fecal samples underwent hormone analysis.



The researchers found no difference between the longevity of male and female mouse lemurs, unlike most vertebrates, in which males tend to die earlier.

“And even more interestingly, we found no difference in testosterone levels between males and females,” Zohdy says. Mouse lemurs are female dominant, which may explain why their testosterone levels are on a par with males.

“While elevated male testosterone levels have been implicated in shorter lifespans in several species, this is one of the first studies to show equivalent testosterone levels accompanying equivalent lifespans,” Zohdy says.

A co-author of the study is primatologist Patricia Wright of the Centre ValBio Research Station in Madagascar and Stony Brook University. Other institutions involved in the study include Colorado State University, Duke University and the University of Arizona, Tucson.

Mouse lemurs, found only on the island of Madagascar, are the world’s smallest primates. They are among nearly 100 species of lemurs that arrived in Madagascar some 65 million years ago, perhaps floating over from mainland Africa on mats of vegetation.

Mouse lemurs weigh a mere 30 to 80 grams but in captivity they live six times longer than mammals of similar body size, such as mice or shrews. Captive gray mouse lemurs (Microcebus murinus) can live beyond age 12. By age four, however, they can start exhibiting behavioral and neurologic degeneration. In addition to slowing of motor skills and activity levels, reduced memory capacity and sense of smell, the captive four-year-olds can start developing gray hair and cataracts, Zohdy says.


Brown mouse lemurs evolved in isolation, along with nearly 100 other species of lemurs.

The wild brown mouse lemurs in the study were trapped, marked and released during the years 2003 to 2010. A total of 420 dental impressions were taken from the lower-right mandibular tooth rows of 189 unique individuals. Over the course of seven years, 270 age estimates were calculated. For the 23 individuals captured three or more times during the study, the regression slopes of wear rates were calculated, and the mean slope was used to estimate ages for all individuals.
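
The aging logic can be sketched in a few lines: fit tooth wear against capture date for the recaptured individuals to get a wear rate, then use the mean rate to convert any single wear measurement into an age. The data values, units and the assumption of zero wear at birth are made up for illustration; this is not the study's code:

```python
# Minimal sketch of dental-wear aging: estimate a mean wear rate (slope) from
# repeatedly captured individuals, then convert wear scores into ages.
import numpy as np

# (days since first capture, tooth-wear score) for three recaptured individuals
recaptures = {
    "A": [(0, 1.0), (400, 1.6), (900, 2.3)],
    "B": [(0, 0.8), (550, 1.5)],
    "C": [(0, 1.2), (300, 1.6), (800, 2.1), (1200, 2.7)],
}

slopes = []
for obs in recaptures.values():
    t = np.array([d for d, _ in obs], dtype=float)
    w = np.array([s for _, s in obs], dtype=float)
    slope, _ = np.polyfit(t, w, 1)        # wear units per day for this animal
    slopes.append(slope)

mean_rate = np.mean(slopes)               # mean wear rate across individuals

def estimated_age_years(wear_score, wear_at_birth=0.0):
    """Convert a single wear measurement into an age, assuming linear wear."""
    return (wear_score - wear_at_birth) / mean_rate / 365.0

print(f"Estimated age for wear score 2.0: {estimated_age_years(2.0):.1f} years")
```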

“We found that wild brown mouse lemurs can live at least eight years,” Zohdy says. “In the population that we studied, 16 percent lived beyond four years of age. And we found no physical signs of senescence, such as graying hair or cataracts, in any wild individual.”

Limitations of the study include the inability to document gradual physiological symptoms of senescence in the wild. “Our results do not provide information about wild brown mouse lemurs that can be directly compared to senescence in captive gray mouse lemurs,” Zohdy says. “Further research, using identical measures of senescence, will help to reveal whether patterns of physiological senescence occur consistently across the genus and in both captive and wild conditions.”

Another confounding factor Zohdy cites is “the Sleeping Beauty effect,” the fact that wild mouse lemurs hibernate for half the year, possibly boosting their life span.

“We now know that mouse lemurs can live a relatively long time in the wild,” she says, “but we don’t know the exact mechanisms behind why they live so long.”

Related:
For the love of lemurs and Madagascar
In Madagascar: A health crisis of people and their ecosystem

Friday, October 24, 2014

Molecular beacons shine light on how cells 'crawl'

"Our premise is that mechanics play a role in almost all biological processes, and with these DNA-based tension probes we’re going to uncover, measure and map those forces,” says biomolecular chemist Khalid Salaita. Graphic by Victor Ma.

By Carol Clark

Adherent cells, the kind that form the architecture of all multi-cellular organisms, are mechanically engineered with precise forces that allow them to move around and stick to things. Proteins called integrin receptors act like little hands and feet to pull these cells across a surface or to anchor them in place. When groups of these cells are put into a petri dish with a variety of substrates, they can sense the differences in the surfaces and will “crawl” toward the stiffest one they can find.

Now chemists have devised a method using DNA-based tension probes to zoom in at the molecular level and measure and map these phenomena: How cells mechanically sense their environments, migrate and adhere to things.

Nature Communications published the research, led by the lab of Khalid Salaita, assistant professor of biomolecular chemistry at Emory University. Co-authors include mechanical and biological engineers from Georgia Tech.

Using their new method, the researchers showed how the forces applied by fibroblast cells are actually distributed at the individual molecule level. “We found that each of the integrin receptors on the perimeter of cells is basically ‘feeling’ the mechanics of its environment,” Salaita says. “If the surface they feel is softer, they will unbind from it and if it’s more rigid, they will bind. They like to plant their stakes in firm ground.”

The integrin receptors on fibroblast cells, above, "are kind of beasts," Salaita says. "They apply relatively high forces in order to adhere to the extracellular matrix." NIH photo.

Each cell has thousands of these integrin receptors that span the cellular membrane. Cell biologists have long been focused on the chemical aspects of how integrin receptors sense the environment and interact with it, while the understanding of the mechanical aspects lagged. Cellular mechanics is a relatively new but growing field, which also involves biophysicists, engineers, chemists and other specialists.

“Lots of good and bad things that happen in the body are mediated by these integrin receptors, everything from wound healing to metastatic cancer, so it’s important to get a more complete picture of how these mechanisms work,” Salaita says.

The Salaita lab previously developed a fluorescent-sensor technique to visualize and measure mechanical forces on the surface of a cell using flexible polymers that act like tiny springs. These springs are chemically modified at both ends. One end gets a fluorescence-based turn-on sensor that will bind to an integrin receptor on the cell surface. The other end is chemically anchored to a microscope slide and carries a molecule that quenches fluorescence. As force is applied to the polymer spring, it extends. The distance from the quencher increases and the fluorescent signal turns on and grows brighter. Measuring the amount of fluorescent light emitted determines the amount of force being exerted. (Watch a video of the flexible polymer technique.)
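
To make the "spring plus quencher" idea concrete, here is a toy model showing how more force yields a brighter signal. It assumes a Hookean spring and a FRET-like distance-dependent quencher; the spring constant, quenching radius and resting separation are invented values, not the lab's calibration:

```python
# Toy model of a polymer tension sensor: more force -> more extension ->
# fluorophore farther from quencher -> brighter signal. Parameter values are
# illustrative assumptions only.
k = 2e-3          # assumed spring constant, N/m (2 pN/nm)
R0 = 6e-9         # assumed 50%-quenching distance, m
d0 = 2e-9         # assumed fluorophore-quencher separation at zero force, m

def relative_fluorescence(force_N):
    d = d0 + force_N / k                       # extension under load
    quenching = 1.0 / (1.0 + (d / R0) ** 6)    # FRET-like distance dependence
    return 1.0 - quenching                     # brighter as quenching drops

for f_pN in (0, 5, 10, 20):
    print(f"{f_pN:2d} pN -> relative signal {relative_fluorescence(f_pN * 1e-12):.2f}")
```

The graded output (dim at rest, steadily brighter under load) matches the "dimmer switch" behavior of the polymer probes described later in the article.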

Yun Zhang, a co-author of the Nature Communications paper and a graduate student in the Salaita lab, had the idea of using DNA molecular beacons instead of flexible polymers. “She was new to the lab and brought a fresh perspective,” Salaita says.

The molecular beacons are short pieces of lab-synthesized DNA, each consisting of about 20 base pairs, used in clinical diagnostics and research. The beacons are called DNA hairpins because of their shape.

The thermodynamics of DNA, its double-strand helix structure and the energy needed for it to fold are well understood, making the DNA hairpins more refined instruments for measuring force. Another key advantage is the fact that their ends are consistently the same distance apart, Salaita says, unlike the random coils of flexible polymers.

T cells are white blood cells whose receptors are focused not on adhesion, but on activities like identifying various peptides. Electron micrograph of a human T cell by NIAID/NIH.

In experiments, the DNA hairpins turned out to operate more like a toggle switch than a dimmer switch. “The polymer-based tension probes gradually unwind and become brighter as more force is applied,” Salaita says. “In contrast, DNA hairpins don’t budge until you apply a certain amount of force. And once that force is applied, they start unzipping and just keep unraveling.”

In addition, the researchers were able to calibrate the force constant of the DNA hairpins, making them highly tunable, digital instruments for calculating the amount of force applied by a molecule, down to the piconewton level.

“The force of gravity on an apple is about one newton, so we’re talking about a million-millionth of that,” Salaita says. “It’s sort of mind-boggling that that’s how little force you need to unfold a piece of DNA.”
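
A rough back-of-the-envelope estimate (not the paper's calibration) shows why hairpin unfolding forces land in the piconewton range: near the unfolding midpoint, the mechanical work done over the gained extension roughly balances the hairpin's folding free energy, F ≈ ΔG/Δx. The free-energy and extension values below are assumed, order-of-magnitude inputs:

```python
# Back-of-the-envelope estimate of a DNA hairpin unfolding force, F ~ dG / dx.
# Inputs are assumed, order-of-magnitude values, not measured parameters.
KCAL_PER_MOL_TO_J = 4184.0 / 6.022e23    # joules per molecule

dG_fold = 8.0 * KCAL_PER_MOL_TO_J        # assumed hairpin folding free energy, J
dx = 14e-9                               # assumed extension gained on unfolding, m

force_N = dG_fold / dx
print(f"Approximate unfolding force: {force_N * 1e12:.1f} pN")
print(f"Fraction of one newton (apple-weight scale): {force_N:.1e}")
```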

The result is a tension probe that is three times more sensitive than the polymer probes.

In a separate paper, published in Nano Letters, the Salaita lab used the DNA-based probes to experiment with how the density of a substrate affects the force applied.

“Intuitively you might think that a less dense environment, offering fewer anchoring points, would result in more force per anchor,” Salaita says. “We found that it’s actually the opposite: You’re going to see less force per anchor.” The mechanism of sensing ligand spacing and adhering to a substrate appears to be force-mediated, he says. “The integrin receptors need to be closely spaced in order for the engine in the cell that generates force to engage with them and commit the force.”

Now the researchers are using the DNA-based tools they’ve developed to study the forces of more sensitive cellular pathways and receptors.

“Integrin receptors are kind of beasts, they apply relatively high forces in order to adhere to the extracellular matrix,” Salaita says. “There are lots of different cell receptors that apply much weaker forces.”

T cells, for example, are white blood cells whose receptors are focused not on adhesion but on activities like distinguishing a friendly self-peptide from a foreign bacterial peptide.

The Salaita lab is collaborating with medical researchers across Emory to understand the role of cellular mechanics in the immune system, blood clotting and neural patterning of axons. “Basically, our premise is that mechanics play a role in almost all biological processes, and with these DNA-based tension probes we’re going to uncover, measure and map those forces,” Salaita says.

Related:
Chemists reveal the force within you
Biochemical cell signals quantified for first time

Thursday, October 16, 2014

Ebola's backstory: How germs jump species

Fruit bats are associated with an array of deadly viruses, including Nipah, Ebola and Marburg. As the bats' habitat shrinks, the odds increase that bats will cross paths with humans, wild primates and other animals.

By Carol Clark
From Emory Medicine

While virologists study pathogens like Ebola by zooming in on them with an electron microscope, primate disease ecologist Thomas Gillespie climbs 100-foot trees in the tropical forests of Africa to get the big picture view. He tracks pathogens in the wild to learn how they adapt to changing environments and jump between species.

It is physically challenging work that often takes him into remote forests where the wildlife has not yet learned to fear people. A chimpanzee turned Gillespie into a human yo-yo while he was ascending a tree with a rope and harness. “Chimpanzees have 10 times the strength of a man and they can be curious and playful,” he says. “I once had an adult male chimpanzee grab my rope and bounce me up and down.”

Wild primates pose a special risk for zoonotic diseases—those transmissible from animals to humans—due to our close genetic relationship. HIV/AIDS and Ebola are the two most dramatic examples of diseases linked to wild primates, but many other viral, bacterial, fungal, and parasitic pathogens found in apes and monkeys are readily passed to humans.

“The bottom line is that the majority of emerging infectious diseases are coming from wildlife and most of that wildlife is in tropical forests,” says Gillespie, a professor in Emory’s Department of Environmental Sciences and the Rollins School of Public Health. “We can’t afford to just focus on one pathogen or one animal. It’s really important to get a general understanding of the interactions of different species, and how changes in the environment are driving zoonotic disease transmission.”



Gillespie is investigating undisturbed forests, as well as sites where logging and other human activity is under way. He gathers fecal and blood samples from people and animals for analysis while also scouring the forest floor and treetops to learn about the diversity of pathogens in the environment. The data can then be mapped spatially and over time to connect the dots of disease ecology.

For one ongoing study in Uganda, Gillespie and his collaborators are following primates in and around fig trees. The researchers hang out near these ancient forest giants, observing the tableau of life feeding amid the branches and on the ground below.

Fig trees are a keystone species of rainforest ecosystems. Climate change is playing havoc with the seasonal fruiting of other types of trees. But fig trees have co-evolved with specific pollinators—fig wasps—and due to their complex interaction, there is always a fig tree fruiting somewhere in the forest, providing a critical, consistent food source for fruit bats, primates, and ground dwellers like the bush duiker, a shy, dainty antelope that darts amid the forest shrubbery.

Fruit bats, associated with an array of deadly viruses including Nipah, Ebola, and Marburg, are highly selective in their diet. “They’re looking for ripe fruit,” Gillespie says, “and that’s a rare resource in the environment.”

And it’s becoming even rarer. Logging companies are cutting down huge swaths of African forests. Mining operations are moving into new terrain. Villages are expanding, and homes and food crops are eating into the wilderness. All these factors bump up the odds that fruit bats will be living near people, and that the bats will be joined by a variety of other animals while they are feeding from a tree.

“Most viruses can only last outside of a host for minutes or hours, not days,” Gillespie says, “but now we have this changing landscape of food availability. That raises the probability that a gorilla or chimpanzee will eat a piece of fruit that a bat has just defecated on, or has bitten into and discarded.”

Diseases and parasites could be transmitted in this manner. Ebola is one of the rare ones, extremely difficult to find, much less study, in the wild. But Ebola looms large in the public imagination because it is hemorrhagic, capable of causing massive bleeding, and because of its high fatality rate. It is also frightening because it is so mysterious, popping up out of the forest to kill voraciously and then disappearing again for years.

The virus was first identified in 1976, following an outbreak in a remote hamlet of Zaire (now the Democratic Republic of the Congo) near the Ebola River. Subsequent outbreaks have also been associated with forested backwaters and have quickly burned themselves out. That is, until the current outbreak in West Africa. Ebola has now made the leap from rural, forested regions to Africa’s urban areas, where many people live in crowded conditions with poor sanitation.

One of the biggest mysteries is where the virus has hidden between these outbreaks. Antibodies to Ebola, and remnants of Ebola RNA, have been found in the blood of three species of fruit bats, making them prime suspects as an Ebola reservoir: An organism that can carry the pathogen without dying or even becoming sick from it.

“Fruit bats are the best guess as to the reservoir, but until a live virus is found in their blood, we cannot be sure,” Gillespie says. “What we do know is that bats are an important part of the equation. And gorillas, chimpanzees, and some other animals, like the bush duiker, can get infected with Ebola.”

During the past decade, human Ebola outbreaks in Gabon and Congo have been accompanied by reports of gorilla and chimpanzee carcasses in surrounding forests, and epidemiological studies have connected encounters with dead gorillas, chimpanzees, and bush duikers to human cases.

“A hunter might find a dead gorilla in the forest,” Gillespie says, “and instead of saying, ‘I shouldn’t butcher this animal and eat it, it may have died of an infectious disease,’ he throws up his hands and says, ‘Thank you, God, for this gift!’ ”

Fruit bats are also hunted for food in many parts of Africa.

But you don’t have to be a hunter going deep into a forest to catch Ebola. Now that fruit bats are feeling the squeeze of fewer food sources, they may choose to roost under the eaves of a home, feasting on trees in the village orchard as children play below.

Widespread education about what is safe to eat and what is not, and how to identify animals that may have died from an illness, is becoming a vital part of preventing the spread of these diseases.

Just as people are encroaching on wilderness, pathogens are expanding their range into human habitats.

“We’re changing the environment in ways that may be promoting Ebola,” Gillespie says. “As the human population grows and the demand for resources pushes us into new areas, we’re going to see more diseases emerge. Anytime we alter a pristine natural system there are going to be unintended consequences.”

Photos: Thinkstock

Related:
Gorilla vet tracks microbes for global health
Mountain gorillas: People in their midst
Sanctuary chimps show high rates of drug-resistant staph

Thursday, October 9, 2014

Chemists uncover new role of a key base in organic synthesis

The collaboration of chemists from across three continents is a result of the Center for Selective C-H Functionalization (CCHF), an NSF National Center for Chemical Innovation headquartered at Emory. 

By Carol Clark

An international team of chemists has discovered a new piece to the puzzle of how a powerful base used in organic synthesis, cesium carbonate, plays a pivotal role during a catalytic reaction.

The research, published by the Journal of the American Chemical Society, was led by Jamal Musaev, a theoretical chemist at Emory University, and Ken Itami, an experimental chemist from Nagoya University in Japan. Sun Yat-Sen University in Guangzhou, China, also contributed to the findings.

Many organic chemistry reactions are acid/base reactions, involving the transfer of positively charged hydrogen ions, or protons. Acids donate the proton and bases accept it.

The current research focused on the use of cesium carbonate as a base. Cesium carbonate has recently been observed to accelerate a particular class of catalytic reactions, a phenomenon termed the “cesium effect.”

The use of a cesium carbonate base together with carboxylic acid co-catalysts has been shown to be critical in a number of recent carbon-hydrogen (C-H) bond functionalization reactions, but the full story behind the base's impact was previously unclear. It was known that the cesium base removed protons from the solution, scavenging the acid, and that it took part in the exchange of ligands during a reaction, but these two factors did not explain the acceleration observed.
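
To picture the proton-scavenging step, a schematic sketch (ordinary acid-base chemistry, not a reaction scheme taken from the paper; "R" is simply a placeholder for the rest of a generic carboxylic acid) is:

Cs2CO3 + RCOOH → RCOOCs + CsHCO3

That is, the carbonate strips the acid's proton, leaving a cesium carboxylate and cesium bicarbonate in solution.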

This recent work offers a new explanation. The researchers found that the cesium base can generate an aggregate state: The molecules come together to create a cluster that is the actual starting point for the catalytic reaction, rather than the discrete carboxylic acid and carbonate complexes previously thought to initiate it.

“One by one, we are identifying key components of catalytic reactions and then putting them all together,” Musaev says. “It’s difficult work, but important, because the more you understand the reaction, the better you can predict ways to modify it and control it.”

The findings about how the base acts in these reactions have the potential to impact not just the development of new C-H functionalization reactions, but the way that catalytic reactions in general are considered.

The collaboration of chemists from across three continents is a result of the Center for Selective C-H Functionalization (CCHF), an NSF National Center for Chemical Innovation headquartered at Emory. C-H functionalization holds the potential to make organic synthesis faster, simpler and greener, and could open up new ways to develop drugs and other fine-chemical products.

The CCHF encompasses 15 top research universities from across the United States, and recently expanded to include institutes in Asia and Europe. The global network forged by the CCHF brings together leading players from around the world, representing the range of specialties that will be required to make the critical breakthroughs needed to bring C-H functionalization into the mainstream of chemical synthesis.

Related:
Organic chemists now forming global bonds
NSF chemistry center opens new era in organic synthesis

Tuesday, October 7, 2014

Top 10 reasons to learn to make Stone Age tools

The Late Acheulean hand axe, going back about 500,000 years, "is the oldest technology that pretty much everyone agrees is unique to humans," says experimental archeologist Dietrich Stout.

By Carol Clark

Are you between the ages of 18 and 50, right-handed, and available for six hours per week? Emory experimental archeologists are looking for at least 20 healthy individuals willing to devote 100 hours over about four months to learn the art of making a Stone Age hand axe.

“We need novices who will really commit to learning this prehistoric craft,” says Dietrich Stout, an assistant professor of anthropology and head of the Paleolithic Technology Laboratory.

Nada Khreisheh will train participants.
Nada Khreisheh, a post-doctoral researcher in the lab, will train the participants to break and shape flint, a skill known as knapping, as part of a major, three-year archeology experiment to investigate the role of stone tools in human brain evolution, especially key areas of the brain related to language. For more details on how to apply, send her an email: nada.n.khreisheh@emory.edu.

In addition to attending tool-making training sessions, participants will undergo three magnetic resonance imaging (MRI) scans and eye-tracking experiments. And they will need to provide brief written feedback about their experiences following training sessions.

The ideal candidates to join the experiment would likely be curious about who we are as humans and where we came from. “If you’re not interested in trying to answer those questions, it might be hard to justify all the time and effort that will be involved,” Stout says.

If you meet the qualifying characteristics, here are 10 reasons you may want to apply to learn to make stone tools:

1. You will be making history.
“This is the first controlled, neuro-scientific study of real-world craft skill acquisition over time,” Stout says. “Our hypothesis is that the brain systems involved in putting together a sequence of words to make a meaningful sentence in spoken language overlap with systems involved in putting together a series of physical actions to reach a meaningful goal.”

Earlier studies, by Stout and others, have compared the brains of experienced knappers with novices. The results have all suggested that the part of the brain engaged in making a hand axe overlaps with areas associated with language. Longitudinal data, following people as they learn to master the art of making a hand axe, should provide a more definitive result, one way or the other. “This is a much more focused and rigorous test than any previous study,” Stout says.

At the same time, the researchers hope to develop the first systematic model for describing the syntax of natural human action. “We’re proposing a method to break actions down into ‘phrases’, quantify their ‘grammatical’ structure and relate this directly to processing in the brain,” Stout says. “Although we’re using the domain of tool knapping, the same method may apply more broadly to any complex series of actions.”

2. You will be making prehistory. 
The hand axe represents a pivotal point in prehistory and an ideal technology for homing in on key ways that we shifted from more ape-like hominids into full-fledged humans, Stout says. Simple Oldowan stone flakes are the earliest known tools, dating back 2.6 million years, before the human family emerged. The Late Acheulean hand axe, going back 500,000 years, embodies a much higher level of refinement and standardization.

“You see a clear increase in complexity in the hand axe,” Stout says. “It’s the oldest technology that pretty much everyone agrees is unique to humans.”

The stone tool experiment has its own logo, designed by Khreisheh.

3. You can get in touch with your Acheulean roots.
Stout says that doing the knapping himself gives him a unique connection to the past. “I can pick up a stone tool from an archeological site and see things that are so familiar to my own experience: A flake taken off here and there, and then there is a ding where the person who was knapping the stone tried something that didn’t work,” he says. “I’ll get this sense of what that prehistoric person might have been feeling. It gives you goose bumps.”

4. It is an interesting challenge. 
“Many people expect it to be easy, because it’s ancient technology,” Stout says, but it’s actually challenging to chip out the lens-shaped cross-section of the hand axe, and thin down its edges to expert sharpness. “A lot of self-control is involved in flint knapping. You have to not get frustrated and just start banging on the rock,” he says.

5. It’s fun, once you get the hang of it.
“I really enjoy it, it’s kind of addictive,” says Khreisheh, who began knapping a decade ago as part of her research at the University of Exeter, England. Greensand silicate, found near where she grew up, is her favorite material to work with, and she has amassed a large collection of her hand-made tools. “When I was packing up my stuff to move to Atlanta, I had so many rocks it was just ridiculous.”

6. It may give you an edge.
“It’s not just another skill, it will really set you apart,” Khreisheh says. “When I tell people that I’m a flint knapper, they usually have no idea what that means but they are always interested in hearing about it.”

Dietrich Stout surveys his stash of flint stones near the Anthropology building.

7. You will never have to knap alone. 
If you decide to turn it into an ongoing hobby, you can tap into an established community of knappers. They have conferences, publications and even exhibitions of lithic art. “Some of the better pieces are like sculpture,” Stout says.

8. It will change the way you look at rocks.
“It enriches your vision,” Stout says. “Other people may just see a rock, but you see all kinds of features in a stone, like, ‘This is where it would break if I hit it.’ The other day when I was walking on campus past some gravel landscaping, I thought, ‘That looks like it would make a good hammer stone. I could really use that one.’”

9. Stone tools have practical uses. 
You could not ask for a more impressive paperweight than a hand axe that you made yourself.

“I grab one of these things when I want to open a box,” says Stout, waving at the array of stone flakes spread out in his lab.

“One of my advisors made a lemon squeezer out of blade cores,” Khreisheh adds.

Another dedicated archeologist actually elected to have major surgery done with obsidian blades, rather than steel scalpels, to demonstrate to his students that stone tools are more technologically advanced than many people realize.

10. Understanding neural systems may lead to broader applications.
“Any insights into how people understand physical activity and language may lead to new ways to help people with brain damage or language difficulties,” Stout says. He hopes the tool-making experiment will benefit a range of neuroscience research.

“Neuroscientists tend to focus on the organism itself,” he says, “but humans are immersed in material culture. Much of our identity and experience is dictated by our stuff. As experimental archeologists, we bring a deep understanding of technology, culture and tools to the study of the human brain.”

The project is funded by the National Science Foundation and the John Templeton Foundation, through a program designed to integrate science across disciplines.

All photos by Bryan Meltz, Emory Photo/Video.

Related:
Brain trumps hand in Stone Age tool study
Hominid skull hints at later brain evolution
A brainy time traveler