Emory graduate student J.T. Fu, first author of the Nature paper, holds vials of the catalyst and the reagent used in the reaction.
By Carol Clark
For decades, chemists have aspired to do carefully controlled chemistry on carbon-hydrogen bonds. The challenge is staggering. It requires the power of a miniature wrecking ball to break these extremely strong bonds, combined with the finesse of microscopic tweezers to single out specific C-H bonds among the many crowded onto a molecule.
The journal Nature published a method that combines this power and finesse to make an inert C-H bond reactive — effectively turning chemical “trash” into “treasure.”
“We can change a cheap and abundant hydrocarbon with limited usefulness into a valuable scaffold for developing new compounds — such as pharmaceuticals and other fine chemicals,” says J.T. Fu, a graduate student at Emory University and first author of the paper.
The Nature paper is the latest in a series from Emory University demonstrating the ability to use a dirhodium catalyst to selectively functionalize C-H bonds in a streamlined manner, while also maintaining virtually full control of the three-dimensional shape of the molecules produced.
“This latest catalyst is so selective that it goes cleanly for just one C-H bond — even though there are several C-H bonds very similar to it within the molecule,” says Huw Davies, Emory professor of organic chemistry and senior author of the paper. “That was a huge surprise, even to us.”
Click here to read more about the discovery.
Related:
Creating global bonds
C-H center nets $20 million
A huge shortcut for synthesis
Tuesday, December 4, 2018
Your past is calling: Can you ID stone toolmaking 'ring' tones?
Emory anthropologist Dietrich Stout invites you to participate in an online experiment, Sounds of the Past, investigating the human ability to discriminate and interpret the sounds produced by stone toolmaking. (Photo by Ann Watson, Emory Photo/Video)
By Carol Clark
Long before everyone started carrying a smart phone everywhere they went — attuned to the sounds of a text, call or email — our ancestors carried a hand axe.
“Stone tools were the key human technology for two million years,” says Dietrich Stout, director of the Paleolithic Technology Laboratory at Emory University. In fact, he adds, the process of making them may have played an important role in our ability to communicate.
If you can spare just 10 minutes for science, you can use your smart phone and a pair of headphones to log onto a website to help Stout test whether ancient toolmaking promoted special acoustic abilities — perhaps even honing the development of spoken language.
Stout is an experimental archeologist who recreates prehistoric stone toolmaking, known as knapping, to study the evolution of the human brain and mind. In many of his experiments, subjects actually bang out the tools as activity in their brains is recorded via functional magnetic resonance imaging (fMRI). He’s already found evidence that the visual-spatial skills used in knapping activate areas of the brain that are involved in language processing.
But what about the sounds of knapping?
“An experienced knapper once told me that he would rather be blindfolded than wear ear plugs while making a stone tool, because he got so much valuable information out of the sound when he struck the stone,” Stout says. “That got me wondering: Do knappers just think that the sounds are giving them meaningful information? Could we give them a test to find out if that’s true?”
Stout teamed up with Robert Rein, from the German Sport University Cologne, to develop just such a test. The result is the online experiment Sounds of the Past, open to everyone — from expert knappers to those who have never knapped at all.
During stone tool production, a flake is struck off a stone core by hitting it with another stone used like a hammer. Factors like the geometry of the core, and the location and strength of the strike, determine the size of the flake that comes off.
The researchers recorded the sounds of flakes breaking off during stone tool production. Participants in the online experiment are presented with a series of these sounds, with no accompanying visuals, and asked to estimate the length of the flakes produced, within a range of parameters.
Participants are also asked whether they have prior experience knapping. The aim is to get as many experienced knappers as possible to participate, and at least an equal number of those without experience, then compare the results.
“No one is going to guess all the flake sizes, to the millimeter,” Stout says. “But if we plot out the results, we should see if there is a correlation between the level of accuracy and whether someone is an experienced or novice knapper.”
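For readers curious how such a comparison could be run, here is a minimal analysis sketch, not the researchers' actual pipeline: it computes each participant's average estimation error and asks whether experienced knappers are more accurate than novices. The file name, column names and choice of statistics are assumptions made for illustration.

```python
# Illustrative sketch (not the researchers' actual pipeline): comparing how
# accurately experienced vs. novice knappers estimate flake length from sound.
# The CSV file and column names below are hypothetical.
import pandas as pd
from scipy import stats

# Each row: one trial -- participant id, experience group, true flake length
# (mm), and the participant's estimate (mm).
df = pd.read_csv("sounds_of_the_past_responses.csv")

# Absolute estimation error per trial, then mean error per participant.
df["abs_error_mm"] = (df["estimated_length_mm"] - df["true_length_mm"]).abs()
per_person = (df.groupby(["participant_id", "experience"])["abs_error_mm"]
                .mean().reset_index())

novices = per_person.loc[per_person["experience"] == "novice", "abs_error_mm"]
experts = per_person.loc[per_person["experience"] == "expert", "abs_error_mm"]

# Do experienced knappers make smaller errors on average?
t, p = stats.ttest_ind(experts, novices, equal_var=False)
print(f"mean error: experts {experts.mean():.1f} mm, novices {novices.mean():.1f} mm")
print(f"Welch's t = {t:.2f}, p = {p:.3f}")

# Point-biserial correlation between experience (0/1) and accuracy.
per_person["is_expert"] = (per_person["experience"] == "expert").astype(int)
r, p_r = stats.pointbiserialr(per_person["is_expert"], per_person["abs_error_mm"])
print(f"experience vs. error correlation: r = {r:.2f}, p = {p_r:.3f}")
```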
The study is self-funded and does not provide compensation for participants. Individual test results are also not available. “It’s really something that we hope participants will just have fun doing, along with the satisfaction that they are providing data to help us understand the evolution of the human brain,” Stout says.
The length of time the experiment will be available is open ended, he adds, although the researchers hope to have enough results in hand for analysis sometime next year.
Click here to participate in the experiment.
Related:
Complex cognition shaped the Stone Age hand axe
Brain trumps hand in Stone Age tool study
Thursday, November 8, 2018
'Potato gene' reveals how ancient Andeans adapted to starchy diet
A woman sells potatoes at a market in the Andes. DNA analyses show that ancient populations of the Peruvian highlands adapted to the introduction of agriculture in ways distinct from other global populations.
By Carol Clark
Potatoes, native to South America, became an agricultural crop thousands of years ago in the Andean highlands of Peru. And just as the ancient Andean people turned wild tubers into the domesticated potato, the potato may have altered the genomes of the Andeans who made it a staple of their diet.
Science Advances published the findings. DNA analyses show that ancient populations of the Peruvian highlands adapted to the introduction of agriculture and an extreme, high-altitude environment in ways distinct from other global populations.
“We see a different configuration of a gene associated with starch digestion in the small intestine — MGAM — in the agricultural ancient Andean genome samples, but not in hunter-gatherers down the coast,” says Emory University geneticist John Lindo, first author of the paper. “It suggests a sort of co-evolution between an agricultural crop and human beings.”
In contrast, European populations that began consuming more grains with the rise of agriculture show different genomic changes. Research has shown that their genomes have an increased number of copies of the gene coding for amylase — an enzyme in saliva that breaks down starch.
Lindo, an assistant professor of anthropology at Emory, integrates the approaches of ancient whole genomes, statistical modeling and functional methods into ancient DNA research. The international team of 17 researchers also included Anna Di Rienzo of the University of Chicago (who specializes in physiology and genetics) and high-altitude archeologists Mark Aldenderfer, from the University of California, Merced, and Randall Haas, from the University of California, Davis.
The study looked at seven ancient whole genomes from the Lake Titicaca region of the Andean highlands of Peru, dating from 1,800 to 7,000 years ago. The researchers also compared the ancient whole genomes with 64 modern-day genomes from both highland Andean populations and lowland populations in Chile, to identify genetic adaptations that took place before the arrival of Europeans in the 1500s.
The hardy potato helped people adapt to the harsh environment of the Andean highlands.
The Andean highlands make an ideal natural laboratory for ancient DNA studies, due to the strong selective pressure needed for ancient populations to adapt to altitudes greater than 2,500 meters. “Frigid temperatures, low oxygen levels and intense ultraviolet radiation make the highlands one of the most extreme environments that human beings have occupied,” Lindo says. “It provides a glimpse of our potential for adaptability.”
Both the ancient and modern high-altitude populations showed strong positive selection on variants in the MGAM gene, which is evident by at least 1,800 years ago. That fits with archeological evidence indicating that the domesticated potato — a crop resistant to cold that grows mainly underground — became a staple of the Peruvian highlands as far back as 3,400 years ago.
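As a rough illustration of the kind of allele-frequency contrast that can underlie such a selection signal, the sketch below compares a single hypothetical MGAM variant between highland agriculturalists and coastal hunter-gatherers using Hudson's FST estimator. The genotype counts are invented for illustration only; the study's actual analysis applied formal selection statistics to whole genomes.

```python
# Minimal, hypothetical sketch of an allele-frequency contrast of the kind that
# underlies a selection signal. All counts below are invented for illustration.
def allele_freq(alt_count, n_chromosomes):
    """Frequency of the alternate allele in a sample of chromosomes."""
    return alt_count / n_chromosomes

def fst_hudson(p1, n1, p2, n2):
    """Hudson's estimator of FST between two populations at one biallelic site."""
    num = (p1 - p2) ** 2 - p1 * (1 - p1) / (n1 - 1) - p2 * (1 - p2) / (n2 - 1)
    den = p1 * (1 - p2) + p2 * (1 - p1)
    return num / den

# Hypothetical counts at one MGAM variant: ancient highland agriculturalists vs.
# coastal hunter-gatherers (chromosome counts = 2 x individuals).
p_high = allele_freq(alt_count=11, n_chromosomes=14)   # 7 ancient highland genomes
p_coast = allele_freq(alt_count=2, n_chromosomes=10)   # 5 hypothetical coastal genomes

print(f"highland freq = {p_high:.2f}, coastal freq = {p_coast:.2f}")
print(f"FST at this site = {fst_hudson(p_high, 14, p_coast, 10):.2f}")
```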
The researchers also discovered that the Andean highland population’s genomes do not share the same genetic changes previously seen in Tibetan genomes in response to hypoxia, or low levels of oxygen. That suggests that the Andean genomes adapted to high altitude in different ways.
One possibility uncovered in the study is differentiation in the DST gene, which has been linked to proper cardiac muscle development in mice. DST histone modifications in the Andean genomes associated with blood and the right ventricle of the heart may correlate with the tendency of Andean highlanders to have enlarged right ventricles. That finding fits with a previous study suggesting that Andeans may have adapted to high-altitude hypoxia via cardiovascular modifications.
A gene flow analysis found that the low- and high-elevation populations split between 8,200 and 9,200 years ago.
The arrival of the Spanish — who brought new diseases, along with social disruption and war, to South America — coincided with an estimated reduction of 90 percent of the total Andean population. The current analysis, however, showed that the effective breeding population in the highlands went down by only 27 percent.
“We found a very strong selection in the highlands population on an immune gene that has been correlated with smallpox, which may have had a protective effect,” Lindo says. The harsh environment of the highlands, he adds, may have also buffered them from the devastation seen in the lowlands.
“Understanding the diet, environment and historical events of various ancestries, and how those ancestries adapted to these factors, may be one way to understand some health disparities among different populations,” Lindo says.
Related:
DNA analysis adds twist to ancient story of a Native American group
Malawi yields oldest-known human DNA from Africa
Photos: Getty Images
Tuesday, October 23, 2018
Schadenfreude sheds light on the darker side of humanity
“We all experience schadenfreude but we don’t like to think about it too much because it shows how ambivalent we can be to our fellow humans,” says Emory psychologist Philippe Rochat.
By Carol Clark
Schadenfreude, the sense of pleasure people derive from the misfortune of others, is a familiar feeling to many — perhaps especially during these times of pervasive social media.
This common, yet poorly understood, emotion may provide a valuable window into the darker side of humanity, finds a review article by psychologists at Emory University. New Ideas in Psychology published the review, which drew upon evidence from three decades of social, developmental, personality and clinical research to devise a novel framework to systematically explain schadenfreude.
The authors propose that schadenfreude comprises three separable but interrelated subforms — aggression, rivalry and justice — which have distinct developmental origins and personality correlates.
They also singled out a commonality underlying these subforms.
“Dehumanization appears to be at the core of schadenfreude,” says Shensheng Wang, a PhD candidate in psychology at Emory and first author of the paper. “The scenarios that elicit schadenfreude, such as intergroup conflicts, tend to also promote dehumanization.”
Co-authors of the study are Emory psychology professors Philippe Rochat, who studies infant and child development, and Scott Lilienfeld, whose research focuses on personality and personality disorders.
Dehumanization is the process of perceiving a person or social group as lacking the attributes that define what it means to be human. It can range from subtle forms, such as assuming that someone from another ethnic group does not feel the full range of emotions as one’s in-group members do, all the way to blatant forms — such as equating sex offenders to animals. Individuals who regularly dehumanize others may have a disposition towards it. Dehumanization can also be situational, such as soldiers dehumanizing the enemy during a battle.
“Our literature review strongly suggests that the propensity to experience schadenfreude isn’t entirely unique, but that it overlaps substantially with several other ‘dark’ personality traits, such as sadism, narcissism and psychopathy,” Lilienfeld says. “Moreover, different subforms of schadenfreude may relate somewhat differently to these often malevolent traits.”
One problem with studying the phenomenon is the lack of an agreed definition of schadenfreude, which literally means “harm joy” in German. Since ancient times, some scholars have condemned schadenfreude as malicious, while others have perceived it as morally neutral or even virtuous.
“Schadenfreude is an uncanny emotion that is difficult to assimilate,” Rochat says. “It’s kind of a warm-cold experience that is associated with a sense of guilt. It can make you feel odd to experience pleasure when hearing about bad things happening to someone else.”
Psychologists view schadenfreude through the lens of three theories. Envy theory focuses on a concern for self-evaluation, and a lessening of painful feelings when someone perceived as enviable gets knocked down a peg. Deservingness theory links schadenfreude to a concern for social justice and the feeling that someone dealt a misfortune received what was coming to them. Intergroup-conflict theory concerns social identity and the schadenfreude experienced after the defeat of members of a rival group, such as during sporting or political competitions.
The authors of the current article wanted to explore how all these different facets of schadenfreude are interrelated, how they differ, and how they can arise in response to these concerns.
Their review delved into the primordial role of these concerns demonstrated in developmental studies. Research suggests that infants as young as eight months demonstrate a sophisticated sense of social justice. In experiments, they showed a preference for puppets that assisted a helpful puppet and for puppets that punished those that had exhibited antisocial behavior. Research on infants also points to the early roots of intergroup aggression, showing that, by nine months, infants preferred puppets that punished others who were unlike the infants themselves.
“When you think of normal child development, you think of children becoming good natured and sociable,” Rochat says. “But there’s a dark side to becoming socialized. You create friends and other in-groups to the exclusion of others.”
Spiteful rivalry appears by at least age five or six, when research has shown that children will sometimes opt to maximize their gain over another child, even if they have to sacrifice a resource to do so.
By the time they reach adulthood, many people have learned to hide any tendencies for making a sacrifice just for spite, but they may be more open about making sacrifices that are considered pro-social.
The review article posits a unifying, motivational theory: Concerns of self-evaluation, social identity and justice are the three motivators that drive people toward schadenfreude. What pulls people away from it is the ability to perceive others as fully human and to feel empathy for them.
Ordinary people may temporarily lose empathy for others. But those with certain personality disorders and associated traits — such as psychopathy, narcissism or sadism — are either less able or less motivated to put themselves in the shoes of others.
“By broadening the perspective of schadenfreude, and connecting all of the related phenomena underlying it, we hope we’ve provided a framework to gain deeper insights into this complex, multi-faceted emotion,” Wang says.
“We all experience schadenfreude but we don’t like to think about it too much because it shows how ambivalent we can be to our fellow humans,” Rochat says. “But schadenfreude points to our ingrained concerns and it’s important to study it in a systematic way if we want to understand human nature.”
Related:
What is a psychopath?
Sharing ideas about the concept of fairness
Monday, October 22, 2018
Study gives new insight into how the brain perceives places
Example of an image from the fMRI study. Participants were asked to imagine they were standing in the room and indicate through a button press whether it was a bedroom, a kitchen or a living room. On separate trials, they were asked to imagine that they were walking on the continuous path through the room and indicate which door they could leave through. (Image by Andrew Persichetti)
By Carol Clark
Nearly 30 years ago, scientists demonstrated that visually recognizing an object, such as a cup, and performing a visually guided action, such as picking the cup up, involved distinct neural processes, located in different areas of the brain. A new study shows that the same is true for how the brain perceives our environment — it has two distinct systems, one for recognizing a place and another for navigating through it.
The Journal of Neuroscience published the finding by researchers at Emory University, based on experiments using functional magnetic resonance imaging (fMRI). The results showed that the brain’s parahippocampal place area responded more strongly to a scene recognition task while the occipital place area responded more to a navigation task.
The work could have important implications for helping people to recover from brain injuries and for the design of computer vision systems, such as self-driving cars.
“It’s thrilling to learn what different regions of the brain are doing,” says Daniel Dilks, senior author of the study and an assistant professor of psychology at Emory. “Learning how the mind makes sense of all the information that we’re bombarded with every day is one of the greatest of intellectual quests. It’s about understanding what makes us human.”
Entering a place and recognizing where you are — whether it’s a kitchen, a bedroom or a garden — occurs instantaneously and you can almost simultaneously make your way around it.
“People assumed that these two brain functions were jumbled up together — that recognizing a place was always navigationally relevant,” says first author Andrew Persichetti, who worked on the study as an Emory graduate student. “We showed that’s not true, that our brain has dedicated and dissociable systems for each of these tasks. It’s remarkable that the closer we look at the brain the more specialized systems we find — our brains have evolved to be super efficient.”
Persichetti, who has since received his PhD from Emory and now works at the National Institute of Mental Health, explains that an interest in philosophy led him to neuroscience. “Immanuel Kant made it clear that if we can’t understand the structure of our mind, the structure of knowledge, we’re not going to fully understand ourselves, or even a lot about the outside world, because that gets filtered through our perceptual and cognitive processes,” he says.
The Dilks lab focuses on mapping how the visual cortex is functionally organized. “We are visual creatures and the majority of the brain is related to processing visual information, one way or another,” Dilks says.
Researchers have wondered since the late 1800s why people suffering from brain damage sometimes experience strange visual consequences. For example, someone might have normal visual function in all ways except for the ability to recognize faces.
It was not until 1992, however, that David Milner and Melvyn Goodale came out with an influential paper delineating two distinct visual systems in the brain: the ventral stream, which runs through the temporal lobe and is involved in object recognition, and the dorsal stream, which runs through the parietal lobe and guides actions related to the object.
In 1997, MIT’s Nancy Kanwisher and colleagues demonstrated that a region of the brain is specialized in face perception — the fusiform face area, or FFA. Just a year later, Kanwisher’s lab delineated a neural region specialized in processing places, the parahippocampal place area (PPA), located in the ventral stream.
While working as a post-doctoral fellow in the Kanwisher lab, Dilks led the discovery of a second region of the brain specialized in processing places: the occipital place area, or OPA, located near the boundary of the occipital and parietal lobes.
Dilks set up his own lab at Emory the same year that discovery was published, in 2013. Among the first questions he wanted to tackle was why the brain had two regions dedicated to processing places.
Persichetti designed an experiment to test the hypothesis that place processing was divided in the brain in a manner similar to object processing. Using software from The Sims life-simulation game, he created three digital images of places: A bedroom, a kitchen and a living room. Each room had a path leading through it and out one of three doors. Study participants in the fMRI scanner were asked to fixate their gaze on a tiny white cross. On each trial, an image of one of the rooms then appeared, centered behind the cross. Participants were asked to imagine they were standing in the room and indicate through a button press whether it was a bedroom, a kitchen or a living room. On separate trials, the same participants were also asked to imagine that they were walking on the continuous path through the exact same room and indicate whether they could leave through the door on the left, in the center, or on the right.
The resulting data showed that the two brain regions were selectively activated depending on the task: The PPA responded more strongly to the recognition task while the OPA responded more strongly to the navigation task.
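A minimal sketch of that region-by-task comparison appears below, using hypothetical per-participant response estimates (for example, mean beta values) extracted from each region of interest. It is not the authors' analysis code; the values and tests are placeholders chosen to show the logic of the contrast.

```python
# Illustrative region-by-task contrast with invented placeholder values
# (one number per participant per condition, per region of interest).
import numpy as np
from scipy import stats

ppa_recognition = np.array([0.82, 0.91, 0.77, 0.88, 0.95, 0.84])
ppa_navigation  = np.array([0.55, 0.62, 0.49, 0.58, 0.66, 0.60])
opa_recognition = np.array([0.48, 0.52, 0.44, 0.50, 0.57, 0.46])
opa_navigation  = np.array([0.79, 0.85, 0.72, 0.81, 0.90, 0.77])

# Paired t-tests within each region: does the PPA prefer recognition and
# the OPA prefer navigation?
t_ppa, p_ppa = stats.ttest_rel(ppa_recognition, ppa_navigation)
t_opa, p_opa = stats.ttest_rel(opa_navigation, opa_recognition)
print(f"PPA recognition > navigation: t = {t_ppa:.2f}, p = {p_ppa:.3f}")
print(f"OPA navigation > recognition: t = {t_opa:.2f}, p = {p_opa:.3f}")

# The key signature is the region-by-task interaction: the preference flips.
interaction = (ppa_recognition - ppa_navigation) - (opa_recognition - opa_navigation)
t_int, p_int = stats.ttest_1samp(interaction, 0.0)
print(f"region x task interaction: t = {t_int:.2f}, p = {p_int:.3f}")
```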
“While it’s incredible that we can show that different parts of the cortex are responsible for different functions, it’s only the tip of the iceberg,” Dilks says. “Now that we understand what these areas of the brain are doing we want to know precisely how they’re doing it and why they’re organized this way.”
Dilks plans to run causal tests on the two scene-processing areas. Repetitive transcranial magnetic stimulation, or rTMS, is a non-invasive technology that can be attached to the scalp to temporarily deactivate the OPA in healthy participants and test whether someone can navigate without it.
The same technology cannot be used to deactivate the PPA, due to its deeper location in the temporal lobe. The Dilks lab plans to recruit participants with brain injuries affecting the PPA region to test for any effects on their ability to recognize scenes.
Clinical applications for the research include more precise guidance for surgeons who operate on the brain and better brain rehabilitation methods.
“My ultimate goal is to reverse-engineer the human brain’s visual processes and replicate it in a computer vision system,” Dilks says. “In addition to improving robotic systems, a computer model could help us to more fully understand the human mind and brain.”
Related:
How babies see faces: New fMRI technology opens window onto infants' minds
Monday, October 15, 2018
Scientists chase mystery of how dogs process words
Eddie, one of the dogs that participated in the study, poses in the fMRI scanner with two of the toys used in the experiments, "Monkey" and "Piggy." (Photo courtesy Gregory Berns)
By Carol Clark
When some dogs hear their owners say “squirrel,” they perk up and become agitated. They may even run to a window and look out of it. But what does the word mean to the dog? Does it mean, “Pay attention, something is happening”? Or does the dog actually picture a small, bushy-tailed rodent in its mind?
Frontiers in Neuroscience published one of the first studies using brain imaging to probe how our canine companions process words they have been taught to associate with objects, conducted by scientists at Emory University. The results suggest that dogs have at least a rudimentary neural representation of meaning for words they have been taught, differentiating words they have heard before from those they have not.
“Many dog owners think that their dogs know what some words mean, but there really isn’t much scientific evidence to support that,” says Ashley Prichard, a PhD candidate in Emory’s Department of Psychology and first author of the study. “We wanted to get data from the dogs themselves — not just owner reports.”
“We know that dogs have the capacity to process at least some aspects of human language since they can learn to follow verbal commands,” adds Emory neuroscientist Gregory Berns, senior author of the study. “Previous research, however, suggests dogs may rely on many other cues to follow a verbal command, such as gaze, gestures and even emotional expressions from their owners.”
The Emory researchers focused on questions surrounding the brain mechanisms dogs use to differentiate between words, or even what constitutes a word to a dog.
Berns is founder of the Dog Project, which is researching evolutionary questions surrounding man’s best, and oldest, friend. The project was the first to train dogs to voluntarily enter a functional magnetic resonance imaging (fMRI) scanner and remain motionless during scanning, without restraint or sedation. Studies by the Dog Project have furthered understanding of dogs’ neural response to expected reward, identified specialized areas in the dog brain for processing faces, demonstrated olfactory responses to human and dog odors, and linked prefrontal function to inhibitory control.
For the current study, 12 dogs of varying breeds were trained for months by their owners to retrieve two different objects, based on the objects’ names. Each dog’s pair of objects consisted of one with a soft texture, such as a stuffed animal, and another of a different texture, such as rubber, to facilitate discrimination. Training consisted of instructing the dogs to fetch one of the objects and then rewarding them with food or praise. Training was considered complete when a dog showed that it could discriminate between the two objects by consistently fetching the one requested by the owner when presented with both of the objects.
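As a hypothetical illustration of such a completion criterion, the sketch below tests whether a dog's fetch accuracy over a block of two-object trials exceeds the 50 percent expected by chance. The trial counts and the test are assumptions; the study defined its own behavioral criterion.

```python
# Hypothetical check of a discrimination criterion: given a dog's fetch record
# over a block of two-object trials, test whether it retrieves the requested toy
# more often than expected by random choice between the two toys.
from scipy import stats

correct, total = 17, 20           # e.g., 17 correct fetches out of 20 trials (invented)
result = stats.binomtest(correct, total, p=0.5, alternative="greater")
print(f"accuracy = {correct / total:.0%}, p = {result.pvalue:.4f}")
# A small p-value means the performance is unlikely under chance, i.e., the dog
# reliably discriminates the two object names.
```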
During one experiment, the trained dog lay in the fMRI scanner while the dog’s owner stood directly in front of the dog at the opening of the machine and said the names of the dog’s toys at set intervals, then showed the dog the corresponding toys.
Eddie, a golden retriever-Labrador mix, for instance, heard his owner say the words “Piggy” or “Monkey,” then his owner held up the matching toy. As a control, the owner then spoke gibberish words, such as “bobbu” and “bodmick,” then held up novel objects like a hat or a doll.
The results showed greater activation in auditory regions of the brain to the novel pseudowords relative to the trained words.
“We expected to see that dogs neurally discriminate between words that they know and words that they don’t,” Prichard says. “What’s surprising is that the result is opposite to that of research on humans — people typically show greater neural activation for known words than novel words.”
The researchers hypothesize that the dogs may show greater neural activation to a novel word because they sense their owners want them to understand what they are saying, and they are trying to do so. “Dogs ultimately want to please their owners, and perhaps also receive praise or food,” Berns says.
Half of the dogs in the experiment showed the increased activation for the novel words in their parietotemporal cortex, an area of the brain that the researchers believe may be analogous to the angular gyrus in humans, where lexical differences are processed.
The other half of the dogs, however, showed heightened activity to novel words in other brain regions, including the other parts of the left temporal cortex and amygdala, caudate nucleus, and the thalamus.
These differences may be related to a limitation of the study — the varying range in breeds and sizes of the dogs, as well as possible variations in their cognitive abilities. A major challenge in mapping the cognitive processes of the canine brain, the researchers acknowledge, is the variety of shapes and sizes of dogs’ brains across breeds.
“Dogs may have varying capacity and motivation for learning and understanding human words,” Berns says, “but they appear to have a neural representation for the meaning of words they have been taught, beyond just a low-level Pavlovian response.”
This conclusion does not mean that spoken words are the most effective way for an owner to communicate with a dog. In fact, other research, also led by Prichard and Berns and recently published in Scientific Reports, showed that the neural reward system of dogs is more attuned to visual and scent cues than to verbal ones.
“When people want to teach their dog a trick, they often use a verbal command because that’s what we humans prefer,” Prichard says. “From the dog’s perspective, however, a visual command might be more effective, helping the dog learn the trick faster.”
Co-authors of the Frontiers in Neuroscience study include Peter Cook (a neuroscientist at the New College of Florida), Mark Spivak (owner of Comprehensive Pet Therapy) and Raveena Chhibber (an information specialist in Emory’s Department of Psychology).
Co-authors of the Scientific Reports paper also include Spivak and Chhibber, along with Kate Athanassiades (from Emory’s School of Nursing).
Related:
Do dogs prefer praise or food?
Scent of the familiar: You may linger like perfume in your dog's brain
Multi-dog experiment points to canine brain's reward center
Monday, October 1, 2018
Songbird data yields new theory for learning sensorimotor skills
"Our findings suggest that an animal knows that even the perfect neural command is not going to result in the right outcome every time," says Emory biophysicist Ilya Nemenman. (Image courtesy Samuel Sober.)
By Carol Clark
Songbirds learn to sing in a way similar to how humans learn to speak — by listening to their fathers and trying to duplicate the sounds. The bird’s brain sends commands to the vocal muscles to sing what it hears, and then the brain keeps trying to adjust the command until the sound echoes the one made by the parent.
During such trial-and-error processes of sensorimotor learning, a bird remembers not just the best possible command, but a whole suite of possibilities, suggests a study by scientists at Emory University.
Proceedings of the National Academy of Sciences (PNAS) published the study results, which include a new mathematical model for the distribution of sensory errors in learning.
“Our findings suggest that an animal knows that even the perfect neural command is not going to result in the right outcome every time,” says Ilya Nemenman, an Emory professor of biophysics and senior author of the paper. “Animals, including humans, want to explore and keep track of a range of possibilities when learning something in order to compensate for variabilities.”
Nemenman uses the example of learning to swing a tennis racket. “You’re only rarely going to hit the ball in the racket’s exact sweet spot,” he says. “And every day when you pick up the racket to play your swing is going to be a little bit different, because your body is different, the racket and the ball are different, and the environmental conditions are different. So your body needs to remember a whole range of commands, in order to adapt to these different situations and get the ball to go where you want.”
First author of the study is Baohua Zhou, a graduate student in physics. Co-authors include David Hofmann and Itai Pinkoviezky (post-doctoral fellows in physics) and Samuel Sober, an associate professor of biology.
Traditional theories of learning propose that animals use sensory error signals to zero in on the optimal motor command, based on a normal distribution of possible errors around it — what is known as a bell curve. Those theories, however, cannot explain the behavioral observations that small sensory errors are more readily corrected, while the larger ones may be ignored by the animal altogether.
For the PNAS paper, the researchers analyzed experimental data on Bengalese finches collected in previous work with the Sober lab. The lab uses finches as a model system for understanding how the brain controls complex vocal behavior and motor behavior in general.
Miniature headphones were custom-fitted to adult birds and used to manipulate the pitch the birds heard themselves sing, replacing their natural auditory feedback with a shifted version. The birds would try to correct the pitch they were hearing to match the sound they were trying to make. The experiments allowed the researchers to record and measure the relationship between the size of the vocal error a bird perceives and the probability of its brain making a correction of a specific size.
The researchers analyzed the data and found that the variability of errors in correction did not have the normal distribution of a bell curve, as previously proposed. Instead, the distribution had long tails of variability, indicating that the animal believed that even large fluctuations in the motor commands could sometimes produce a correct pitch. The researchers also found that the birds combined their hypotheses about the relationship between the motor command and the pitch with the new information that their brains received from their ears while singing. In fact, they did this surprisingly accurately.
“The birds are not just trying to sing in the best possible way, but appear to be exploring and trying wide variations,” Nemenman says. “In this way, they learn to correct small errors, but they don’t even try to correct large errors, unless the large error is broken down and built up gradually.”
The researchers created a mathematical model for this process, revealing the pattern of how small errors are corrected quickly and large errors take much longer to correct, and might be neglected altogether, when they contradict the animal’s “beliefs” about the errors that its sensorimotor system can produce.
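The sketch below illustrates the qualitative idea numerically under simple assumptions (it is not the fitted model from the paper): if the bird treats its observed pitch error as a correctable shift plus heavy-tailed sensory noise, Bayesian inference corrects small errors almost in proportion while attributing large errors mostly to noise and largely ignoring them.

```python
# Toy Bayesian model: observed error e = true correctable shift s + sensory noise.
# The noise is heavy-tailed (usually narrow, occasionally very broad), so large
# observed errors are mostly blamed on noise. Parameter values are illustrative.
import numpy as np

def gauss(x, sigma):
    return np.exp(-0.5 * (x / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

sigma_prior = 1.0                   # prior spread of "real", correctable pitch shifts
sigma_small, sigma_big = 0.5, 10.0  # noise components: narrow vs. broad
w_small = 0.9                       # weight of the narrow noise component

s = np.linspace(-15, 15, 4001)      # grid over the true (correctable) shift s
prior = gauss(s, sigma_prior)

def inferred_shift(e):
    """Posterior mean of the true shift s, given an observed pitch error e."""
    likelihood = (w_small * gauss(e - s, sigma_small)
                  + (1 - w_small) * gauss(e - s, sigma_big))
    post = likelihood * prior
    post /= post.sum()
    return float(np.sum(s * post))

for e in [0.5, 1.0, 2.0, 4.0, 8.0]:
    corr = inferred_shift(e)
    print(f"observed error {e:4.1f} -> inferred correctable part {corr:5.2f} "
          f"({corr / e:.0%} of the error)")
# Small errors are corrected almost fully in proportion; large errors are mostly
# attributed to the broad noise component, so the inferred correction is tiny.
```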
“Our model provides a new theory for how an animal learns, one that allows us to make predictions for learning that we have tested experimentally,” Nemenman says.
The researchers are now exploring if this model can be used to predict learning in other animals, as well as predicting better rehabilitative protocols for people dealing with major disruptions to their learned behaviors, such as when recovering from a stroke.
The work was funded by the National Institutes of Health BRAIN Initiative, the James S. McDonnell Foundation, and the National Science Foundation. The NVIDIA corporation donated high-performance computing hardware that supported the work.
Related:
BRAIN grant to fund study of how the mind learns
How songbirds learn to sing
Friday, September 21, 2018
Climate change calls for a fresh approach to water woes
An egret spreads its wings above waters of the Everglades. "Climate change is a game changer" when it comes to managing major water basins across the country, says Lance Gunderson, chair of Emory's Department of Environmental Sciences.
By Carol Clark
The Everglades National Park, the largest subtropical wilderness in the United States, is home to 16 different species of wading birds and rare and endangered species like the manatee, the American crocodile and the Florida panther. But the area is also home to humans. The park is a portion of a larger wetland ecosystem, more than half of which has been converted to agricultural production or urban development. The ecosystem must provide both flood protection and a water supply for the park, agricultural interests and South Florida’s rapidly growing population of nearly eight million people.
Meanwhile, a federal-state initiative to address this challenge, known as the Comprehensive Everglades Restoration Plan, is “sort of stuck in the muddle,” says Lance Gunderson, chair of Emory’s Department of Environmental Sciences. The plan was authorized in 2000 but it hasn’t made much progress.
Climate change throws another wrench in the works, affecting the Everglades and other large watersheds across the United States in new and unpredictable ways. Extreme weather events and rising sea levels, combined with a growing population, will lead to “more intense arguments” about already contested issues of water quality and water usage, Gunderson says.
Gunderson, a wetlands ecologist, recently partnered with Barbara Cosens, a legal scholar at the University of Idaho, to lead an interdisciplinary team of researchers in a project to assess the adaptive capacity of six major U.S. water basins to changing climates. In addition to the Everglades, the basins include the Anacostia, the Columbia, the Klamath, the Platte and the Rio Grande rivers. The project was funded by the NSF Social-Ecological Synthesis Center at the University of Maryland.
“Climate change is a game changer when it comes to the management of these regional-scale water systems across the country,” Gunderson says. “These systems are managed through assumptions about climate and models that are based on averages. Now, managers are struggling to adapt to more extremes — like earlier snow melts, more floods and droughts, and more intense storms.”
Even without extreme events, water management is complex. The Everglades, for example, is not just an issue of restoring biological diversity. It’s an economic problem that often puts government agencies, agriculture, developers, residents, and environmental groups at loggerheads.
“These are complex problems and we can’t plan or analyze our way out of them,” Gunderson says. “We have to learn our way out of them.”
Instead of relying on the court system or government policies, he says people need to come together in organic, self-organized ways for “adaptive governance.” Such approaches can forge new paths through a problem by trying small experiments to see if they work.
The Klamath River basin, for instance, benefited by farmers and Native Americans coming together informally, instead of going to court, to talk about possible ways to reallocate water to satisfy both sides.
“Informal, adaptive management lets you learn while you’re doing,” Gunderson says. “It allows people without resources to be engaged in the process. Change happens when little groups of people work together collectively on wicked problems that have no easy solutions or easy answers.”
As chair of Environmental Sciences, Gunderson is also confronted with the problem of how to train students to deal with the issues that will face them when they graduate. The department is blending facets of political science, ecology, sociology, biology, geology and health into its curriculum.
“These specialties are at the intersection of major environmental problems and we are trying to build some integrated understanding around them,” Gunderson says. “Our world is becoming more complex and we want students to have the skills to confront that complexity.”
Related:
Students develop device to help cope with climate change
Responding to climate change
The growing role of farming and nitrous oxide in climate change
Putting people into the climate change picture
Tuesday, August 28, 2018
The math of malaria: Drug resistance 'a numbers game' of competing parasites
"Computer models can sometimes give you insights that would be too difficult to get in a real-world setting," says Mary Bushman. She developed a malaria model for her PhD thesis, advised by Emory evolutionary biologist Jaap de Roode. (Ann Watson, Emory Photo/Video)
By Carol Clark
A new mathematical model for malaria shows how competition between parasite strains within a human host reduces the odds of drug resistance developing in a high-transmission setting. But if a drug-resistant strain does become established, that same competition drives the spread of resistance faster, under strong selection from antimalarial drug use.
“It’s basically a numbers game,” says Mary Bushman, who developed the model for her PhD thesis in Emory University’s Population Biology, Ecology and Evolution Graduate Program. “When you already have multiple strains of malaria within a population, and a drug-resistant strain comes along, it will usually go extinct simply because it’s a late-comer. Whichever strain is there first has the advantage.”
PLOS Biology published the findings, a computational framework that modeled a malaria epidemic across multiple scales: Transmission of parasites from mosquitos to humans, and the dynamics of parasites competing to infect blood cells while they also battle the immune system of a human host.
After creating the model, Bushman ran simulations tracking malaria in a population for roughly 14 years. The simulations included 400 theoretical people who were randomly bitten by 12,000 mosquitos that were infected with malaria parasites classified as either drug resistant or drug susceptible. Various levels of treatment with antimalarial drugs were also part of the simulations.
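As a rough feel for how such a simulation can be organized, the Python sketch below tracks which strain each host carries, lets mosquito bites transmit whichever strain a randomly chosen infected host harbors, gives the first strain to establish in a host a strong advantage over late arrivals, and applies drug treatment that clears only the susceptible strain. This is a much-simplified illustration, not the published PLOS Biology model; the 400 hosts and the 14-year horizon echo the simulations described above, while the bite, treatment and clearance rates and the priority-effect parameter are stand-in assumptions.

    import random

    def simulate(n_hosts=400, days=14 * 365, bites_per_day=30,
                 p_new_resistant=0.001, treatment_rate=0.02,
                 clearance_rate=0.005, priority_advantage=0.95, seed=1):
        # Toy multi-strain sketch with made-up rates (not the published model).
        # Each host carries at most one established strain: 'S' (drug-susceptible),
        # 'R' (drug-resistant) or None. Bites transmit whichever strain a randomly
        # chosen infected host carries; an established strain usually excludes a
        # late arrival; drug treatment clears only the susceptible strain.
        random.seed(seed)
        hosts = ['S'] * 50 + [None] * (n_hosts - 50)   # seed some susceptible infections
        for _ in range(days):
            infected = [h for h in hosts if h is not None]
            for _ in range(bites_per_day):
                if not infected:
                    break
                strain = 'R' if random.random() < p_new_resistant else random.choice(infected)
                i = random.randrange(n_hosts)
                if hosts[i] is None:
                    hosts[i] = strain                  # first strain to arrive establishes
                elif random.random() > priority_advantage:
                    hosts[i] = strain                  # a late-comer only rarely takes over
            for i in range(n_hosts):
                if hosts[i] == 'S' and random.random() < treatment_rate:
                    hosts[i] = None                    # drugs clear the susceptible strain
                elif hosts[i] is not None and random.random() < clearance_rate:
                    hosts[i] = None                    # natural clearance of either strain
        return hosts.count('S'), hosts.count('R')

    print("final (susceptible, resistant) host counts:", simulate())

Raising the treatment rate or the rate at which resistant parasites are introduced shifts whether resistance establishes at all, which is the numbers game Bushman describes.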
“Our model holds strong relevance for infectious diseases beyond malaria,” says Jaap de Roode, an evolutionary biologist at Emory and senior author of the paper. “We hope this research gives others a method to look at disease dynamics across scales of biological organisms to learn how drug resistance develops in a range of pathogens.”
The study’s authors also include Emory biologist Rustom Antia (a specialist in infectious disease modeling) and Venkatachalam Udhayakumar, a malaria expert from the Centers for Disease Control and Prevention’s Division of Parasitic Diseases and Malaria.
The researchers are now working to develop their specific model for malaria into a generalized software tool for infectious diseases. “Computer models can sometimes give you insights that would be too difficult to get in a real-world setting,” says Bushman, who is now a post-doctoral fellow in the Antia lab.
"The distinction between establishment and spread just jumped out of the data," Bushman says.
Malaria occurs in poor, tropical and subtropical areas of the world, and most of the global death toll consists of children in sub-Saharan Africa. People infected in this high-transmission region often carry multiple strains of the parasite and, by the time they reach adulthood, have usually developed partial immunity.
“It’s a baffling disease,” Bushman says. “Malaria has been studied for more than 100 years, much longer than most diseases, but there is still a lot that we don’t understand about it."
Malaria is caused by several species of Plasmodium parasites that are transmitted to humans by mosquitos. Plasmodium falciparum, the most common malaria parasite on the continent of Africa, is the one responsible for the most malaria-related deaths globally.
P. falciparum has developed resistance to former first-line therapies chloroquine and sulfadoxine-pyrimethamine. Resistance has also emerged in Southeast Asia to the third and last available treatment, artemisinin combination therapy, or ACT.
One of the mysteries about malaria is why drug-resistant strains tend to emerge first in low-transmission areas, like Southeast Asia, and not appear until much later in Africa, where transmission is high.
Previous research led by de Roode and Bushman showed that when people are co-infected with drug-resistant and drug-sensitive strains of malaria, both strains are competitively suppressed.
For the current paper, the researchers wanted to get a more detailed understanding of these dynamics. Some evidence had shown that within-host competition could suppress resistance, while other studies showed that it could ramp resistance up.
“It was a little bit of a puzzle, why the findings were conflicting,” Bushman says.
The new model, driven by evidence for how malaria parasites work within the immune system and the blood cells they infect, provided a solution to the puzzle.
“Some previous models were based on the assumption that when you put two strains of malaria into a host, they split 50-50,” Bushman says. “But our model showed that the system is asymmetrical. When you put two strains in a host they virtually never split 50-50.”
The late-comer will usually go extinct, which explains why in high-transmission areas drug resistant strains are at a big disadvantage. But in low transmission areas, such as Southeast Asia, a drug resistant strain has a better chance of arriving first in a host and getting established.
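A toy calculation makes the priority effect vivid. The numbers below are purely illustrative and this is not the study’s within-host model: a newly arrived strain starts from a handful of parasites, so whether it survives is largely a matter of chance, and any competitive handicap imposed by an already established strain pushes it toward extinction.

    import numpy as np

    def extinction_probability(mean_offspring, start=5, trials=2000, max_gen=60, seed=2):
        # Toy branching process for a newly arrived parasite strain: each parasite
        # leaves a Poisson(mean_offspring) number of descendants per generation.
        # Returns the fraction of trials in which the strain dies out.
        # All numbers are illustrative, not taken from the study.
        rng = np.random.default_rng(seed)
        extinct = 0
        for _ in range(trials):
            n = start
            for _ in range(max_gen):
                if n == 0:
                    extinct += 1
                    break
                if n > 1000:                      # clearly established; stop this trial early
                    break
                n = int(rng.poisson(mean_offspring, size=n).sum())
        return extinct / trials

    # Arriving in an empty host vs. arriving after another strain is established,
    # where competition for red blood cells pushes effective reproduction below 1.
    print("first strain in host:", extinction_probability(mean_offspring=1.3))
    print("late-arriving strain:", extinction_probability(mean_offspring=0.8))

In this sketch the strain that arrives in an empty host dies out only occasionally, while the same strain arriving after a competitor is established dies out almost every time.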
The new model also showed how once a drug-resistant strain becomes established in a high transmission area, it will spread much faster than it would in a low transmission area.
“The distinction between establishment and spread just jumped out of the data,” Bushman says. “Our model validated both sides of the argument — that within-host dynamics of competing parasites could both repress and accelerate the spread of resistance. The phenomena are occurring at different stages of the process so they both can happen.”
The results offer a new explanation for why chloroquine resistance arrived relatively late in Africa, appearing in Kenya and Tanzania in 1978, but then spread rapidly across the continent.
Related:
Mixed-strain malaria infections influence drug resistance
Zeroing in on 'super spreaders' and other hidden patterns of epidemics
Monday, August 27, 2018
Sensitivity to how others evaluate you emerges by 24 months
"Image management is fascinating to me because it's so important to being human," says Sara Valencia Botto, shown posing with a toddler. The Emory graduate student published a study on how toddlers are attuned to image, along with psychology professor Philippe Rochat. (Kay Hinton, Emory Photo/Video)
By Carol Clark
Even before toddlers can form a complete sentence, they are attuned to how others may be judging them, finds a new study by psychologists at Emory University.
The journal Developmental Psychology is publishing the results, documenting that toddlers are sensitive to the opinions of others, and that they will modify their behavior accordingly when others are watching.
“We’ve shown that by the age of 24 months, children are not only aware that other people may be evaluating them, but that they will alter their behavior to seek a positive response,” says Sara Valencia Botto, an Emory PhD candidate and first author of the study.
While previous research has documented this behavior in four- to five-year-olds, the new study suggests that it may emerge much sooner, Botto says.
“There is something specifically human in the way that we’re sensitive to the gaze of others, and how systematic and strategic we are about controlling that gaze,” says Philippe Rochat, an Emory professor of psychology who specializes in childhood development and senior author of the study. “At the very bottom, our concern for image management and reputation is about the fear of rejection, one of the main engines of the human psyche.”
This concern for reputation manifests itself in everything from spending money on makeup and designer brands to checking how many “likes” a Facebook post garners.
“Image management is fascinating to me because it’s so important to being human,” Botto says. “Many people rate their fear of public speaking above their fear of dying. If we want to understand human nature, we need to understand when and how the foundation for caring about image emerges.”
The researchers conducted experiments involving 144 children between the ages of 14 and 24 months using a remotely controlled robot toy.
In one experiment, a researcher showed a toddler how to use the remote to operate the robot. The researcher then either watched the child with a neutral expression or turned away and pretended to read a magazine. When the child was being watched, he or she showed more inhibition when hitting the buttons on the remote than when the researcher was not watching.
In a second experiment, the researcher used two different remotes when demonstrating the toy to the child. While using the first remote, the researcher smiled and said, “Wow! Isn’t that great?” And when using the second remote, the researcher frowned and said “Uh-oh! Oops, oh no!” After inviting the child to play with the toy, the researcher once again either watched the child or turned to the magazine.
The children pressed the buttons on the remote associated with the positive response from the researcher significantly more while being watched. And they used the remote associated with the negative response more when not being watched.
During a third experiment, which served as a control, the researcher gave a neutral response of “Oh, wow!” when demonstrating how to use the two remotes. The children no longer chose one remote over the other depending on whether the researcher was watching them.
The control experiment showed that, in the second experiment, the children really did take into account the values the researcher expressed about the toy, and that they changed their behavior based on those values depending on whether they were being watched, Botto says.
A final experiment involved two researchers sitting next to one another and using one remote. One researcher smiled and gave a positive response, “Yay! The toy moved!” when pressing the remote. The second researcher frowned and said, “Yuck! The toy moved!” when pressing the same remote. The child was then invited to play with the toy while the two researchers alternated between either watching or turning their back to the child. Results showed that the children were much more likely to press the remote when the researcher who gave the positive response was watching.
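For readers curious how such a contrast is typically quantified, the snippet below runs a standard chi-square test of independence on a made-up table of button presses. These counts are hypothetical, not the study’s data; the question is whether the split between the positively and negatively framed remotes depends on whether the researcher was watching.

    from scipy.stats import chi2_contingency

    # Hypothetical counts only (not the study's data): rows are the watching
    # conditions, columns are presses on the positively vs. negatively framed remote.
    presses = [[68, 22],   # researcher watching
               [35, 41]]   # researcher reading the magazine
    chi2, p_value, dof, expected = chi2_contingency(presses)
    print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")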
“We were surprised by the flexibility of the children’s sensitivity to others and their reactions,” Botto says. “They could track one researcher’s values of two objects and two researchers’ values of one object. It reinforces the idea that children are usually smarter than we think.”
Botto is continuing to lead the research in the Rochat lab for her PhD thesis. She is now developing experiments for children as young as 12 months to see if the sensitivity to being evaluated by others emerges even earlier than the current study documents.
And she is following the 14- to 24-month-old children involved in the published study, to see if the individual differences they showed in the experiments are maintained as they turn four and five. The researchers are measuring social and cognitive factors that may have predictive power for individual differences — such as language ability, temperament and a child’s ability to pick up on social norms and to understand that people can have beliefs different from their own.
“Ultimately, we hope to determine exactly when children begin to be sensitive to others’ evaluations and the social and cognitive factors that are necessary for that sensitivity to emerge,” Botto says.
Such basic research may translate into helping people in a clinical environment who are at the extremes of the spectrum of such sensitivity, she adds.
“It’s normal and necessary to a certain extent to care about our image with others,” Botto says. “But some people care so much that they suffer from social anxiety, while others care so little that it is not optimal in a society where cooperation is essential.”
The American Psychological Association contributed to this report.
Related:
Babies have logical reasoning before age one, study finds
Babies' spatial reasoning predicts later math skills
Wednesday, August 22, 2018
Students develop personal cooling device to help cope with climate change
The Vimband was developed by Emory undergraduates Ryan James, Jesse Rosen-Gooding and Hieren Helmn, in the hopes of winning the Hult Prize.
A trio of Emory students is on a globe-trotting million-dollar quest this summer to address one of the world’s most urgent challenges — helping people find physical comfort in the face of climate change.
One answer, they believe, might be the “Vimband,” their idea for a personal temperature-regulation device that could be worn to cool the body in extremely hot weather or warm individuals enduring severely cold temperatures.
Amid scientific reports that global temperatures are climbing, direct body cooling could go far in providing personal relief, especially for populations living in increasingly hot climates, says Ryan James, a sophomore from Highland, Maryland, majoring in business and computer science, who convened a team of Emory students eager to pose a solution to the problem.
“World-wide, the use of air-conditioning is expected to nearly triple by 2050, and with detrimental environmental effects, that isn’t a sustainable solution,” James says. “There needs to be an alternative.”
So instead of controlling the temperatures of large buildings or residences, the Emory team set their sights on a smaller, more efficient target — the individual. Together, they’ve created a prototype for a rechargeable device that essentially functions as a small, personalized heating and cooling unit. The compact box may be worn around the wrist, neck or head — pulse points on the human body near major arteries that play a critical role in regulating body temperature.
Click here to read more about the Vimband, and the students' quest to win the Hult Prize, an annual business innovation challenge open to students around the world.
Tuesday, August 7, 2018
The search for secrets of ancient remedies
Cassandra Quave is a world leader in the field of medical ethnobotany — studying how indigenous people used plants in their healing practices to identify promising candidates for modern drugs.
Cassandra Quave (it rhymes with “wave”) is an assistant professor in Emory’s Center for the Study of Human Health and in the School of Medicine’s Department of Dermatology. She is also a member of the Emory Antibiotic Resistance Center.
The Florida native looks at home in the sweltering heat of South Georgia, standing behind a pick-up truck parked on a dirt road that winds through a longleaf pine forest. She tilts a straw cowboy hat back from her face and waves off a flurry of gnats. Her utility belt bristles with shears and a hunting knife. The unfolded gate of the truck bed serves as her desk, as she wrangles a leafy vine of passionflower into a wooden plant press.
“The Cherokee pounded the roots of passionflower into a poultice to draw out pus from wounds, boils and abscesses,” Quave says. “Everywhere I look in this ecosystem I see plants that have a history of medicinal use by native peoples. The resin of the pine trees all around us, the fronds from the ferns beneath them and the roots of those beautiful yellow flowers over there — black-eyed Susans — were all used to treat wounds and sores.”
Read more here about Quave's field work this summer, and the undergraduates who helped her collect plants of importance to Native Americans.
Monday, August 6, 2018
Neuroscientists team with engineers to explore how the brain controls movement
The labs of Georgia Tech's Muhannad Bakir (far left) and Emory's Samuel Sober (far right) combined forces for the project. The work will be led by post-doctoral fellows in their labs, Georgia Tech's Muneeb Zia (center left) and Emory's Bryce Chung (center right). Photos by Ann Watson, Emory Photo/Video.
By Carol Clark
Scientists have made remarkable advances into recording the electrical activity that the nervous system uses to control complex skills, leading to insights into how the nervous system directs an animal’s behavior.
“We can record the electrical activity of a single neuron, and large groups of neurons, as animals learn and perform skilled behaviors,” says Samuel Sober, an associate professor of biology at Emory University who studies the brain and nervous system. “What’s missing,” he adds, “is the technology to precisely record the electrical signals of the muscles that ultimately control that movement.”
The Sober lab is now developing that technology through a collaboration with the lab of Muhannad Bakir, a professor in Georgia Tech’s School of Electrical and Computer Engineering. The researchers recently received a $200,000 Technological Innovations in Neuroscience Award from the McKnight Foundation to create a device that can record electrical action potentials, or “spikes,” within the muscles of songbirds and rodents. The technology will be used to help understand the neural control of many different skilled behaviors, potentially yielding insights into neurological disorders that affect motor control.
“Our device will be the first that lets you record populations of spikes from all of the muscles involved in controlling a complex behavior,” Sober says. “This technique will offer unprecedented access to the neural signals that control muscles, allowing previously impossible investigations into how the brain controls the body.”
“By combining expertise in the life sciences at Emory with the engineering expertise of Georgia Tech, we are able to enter new scientific territory,” Bakir says. “The ultimate goal is to make discoveries that improve the quality of life of people.”
A prototype of the proposed device has 16 electrodes that can record data from a single muscle. The McKnight Award will allow the researchers to scale up to a device with more than 1,000 electrodes that can record from 10 or more muscles.
The Sober lab previously developed a prototype device — electrodes attached to flexible wires — to measure electrical activity in a breathing muscle used by Bengalese finches to sing. The way birds control their song has a lot in common with human speech, both in how it is learned early in life and how it is produced in adulthood. The neural pathways for birdsong are also well known, and restricted to that one activity, making birds a good model system for studying nervous system function.
“In experiments using our prototype, we discovered that, just like in brain cells, precise spike timing patterns in muscle cells are critical for controlling behavior — in this case breathing,” Sober says.
The prototype device, however, is basic. Its 16 electrodes can only record activity from a single muscle — not the entire ensemble of muscles involved in birdsong. In order to gain a fuller picture of how neural signals control movement, neuroscientists need a much more sophisticated device.
The McKnight funding allowed Sober to team up with Bakir. Their goal is to create a micro-scale electromyography (EMG) sensor array, containing more than 1,000 electrodes, to record single-cellular data across many muscles.
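To give a sense of what recording “populations of spikes” means on the analysis side, here is a minimal Python sketch of threshold-based spike detection on multi-channel EMG data. The sampling rate, threshold and refractory period are illustrative assumptions, not the labs’ actual pipeline.

    import numpy as np

    def detect_spikes(emg, fs=30000.0, threshold_sd=5.0, refractory_ms=1.0):
        # Minimal threshold-crossing detector for a (channels x samples) EMG array.
        # Returns, per channel, the sample indices where the signal exceeds
        # threshold_sd robust standard deviations, with a refractory period.
        # Parameters are illustrative, not from the Sober or Bakir labs.
        refractory = int(fs * refractory_ms / 1000.0)
        spikes = []
        for channel in emg:
            noise_sd = np.median(np.abs(channel)) / 0.6745     # robust noise estimate
            crossings = np.flatnonzero(np.abs(channel) > threshold_sd * noise_sd)
            kept, last = [], -refractory
            for idx in crossings:
                if idx - last >= refractory:
                    kept.append(int(idx))
                    last = idx
            spikes.append(kept)
        return spikes

    # Example on simulated data: 4 channels of noise with a few injected spikes.
    rng = np.random.default_rng(0)
    data = rng.normal(0, 1, size=(4, 30000))
    data[2, [5000, 12000, 25000]] += 40.0
    print([len(s) for s in detect_spikes(data)])

A full pipeline would go on to sort the detected events by waveform shape so that spikes can be attributed to individual motor units; the thresholding step above is only the starting point.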
The engineering challenges are formidable. The arrays need to be flexible enough to fit the shape of small muscles used in fine motor skills, and to change shape as the muscles contract. The entire device must also be tiny enough not to impede the movement of a small animal.
“Our first step is to build a flexible substrate on the micro-scale that can support high-density electrodes,” Bakir says. “And we will need to use microchips that work in parallel with 1,000 electrodes, and then attach them to that substrate.”
To meet that challenge, the Bakir lab will create a 3D integrated circuit. “Essentially, it’s building a miniature skyscraper of electrical circuits stacked vertically atop one another,” Bakir says. This vertical design will allow the researchers to minimize the size of the flexible substrate.
“To our knowledge, no one has done what we are trying to do in this project,” Bakir says. “That makes it more difficult, but also exciting because we are entering new space.”
The Sober lab will use the new device to expand its songbird vocalization studies. And it will explore how the nervous system controls the muscles involved when a mouse performs skilled movements with its forelimbs.
An early version of the technology will also be shared with collaborators of the Sober lab at three different universities. These collaborators will further test the arrays, while also gathering data across more species.
“We know so little about how the brain organizes skilled behaviors,” Sober says. “Once we perfect this technology, we will make it available to researchers in this field around the world, to advance knowledge as rapidly as possible.”
The mission of the McKnight Foundation’s Technological Innovations in Neuroscience Award, as described on its website, is “to bring science closer to the day when diseases of the brain and behavior can be accurately diagnosed, prevented and treated.”
Related:
Singing in the brain: Songbirds sing like humans
Dopamine key to vocal learning, songbird study finds
Thursday, July 26, 2018
Templeton World Charity awards $550,000 to global STEM initiative
The Templeton World Charity Foundation awarded $550,000 to Emory mathematician Ken Ono, for a global program to identify and nurture gifted students in the areas of science, technology, engineering and math (STEM). The program, now known as the Spirit of Ramanujan STEM Talent Initiative, began in 2016 with pilot funding of $100,000 from the Templeton Foundation.
“This additional funding will allow us not only to continue the program, but to expand its mission and impact,” says Ono, Asa Griggs Candler Professor of Mathematics at Emory and vice president of the American Mathematical Society.
The pilot Spirit of Ramanujan program, or SOR, focused on finding exceptional young mathematicians, and awarded grants to 16 grade-school students from across the United States as well as from China, Egypt, India, Kenya and Qatar. SOR matched the mathematicians with mentors and the grants funded summer research and enrichment activities.
SOR will now also offer similar opportunities for individuals showing exceptional promise for STEM fields in which mathematics plays a prominent role, such as computational chemistry, computer science, electrical and computer engineering, mathematical biology, mathematical physics and statistics. Up to 30 eligible people each year will be awarded Templeton-Ramanujan Fellows Prizes (financial grants up to $5,000 per award to cover summer enrichment/research programs) or Templeton-Ramanujan Scholarly Development Prizes (educational materials such as STEM books).
"The Spirit of Ramanujan initiative aims to break the mold and find brilliant outliers who may not be thriving in the system, so we can match them up with the resources they need," says Emory mathematician Ken Ono, one of the founders of the initiative.
“We are looking for brilliant, creative people who have ideas and abilities that will drive the future of science,” Ono says. “Young people with great promise are often outliers, so far ahead of their classes that teachers don’t know what to do with them. Genius cannot be taught, it can only be nurtured.”
Ono founded the SOR program along with the Templeton World Charity Foundation; Expii.com, an open-source, personalized learning platform; and IFC Films and Pressman Film — producers of the 2015 biographical film, “The Man Who Knew Infinity.”
The SOR initiative was inspired by the subject of the film, Indian mathematician Srinivasa Ramanujan. A poor Hindu college dropout who was self-taught in mathematics, Ramanujan sent a letter containing some of his theories to British mathematician G.H. Hardy in 1913. Hardy was so impressed that he invited Ramanujan to Cambridge to study and collaborate. His mentorship burnished Ramanujan’s insights and brought them to a world stage. Ramanujan's work played a central role in the development of modern number theory and algebraic geometry, changing math and science forever.
Although the expanded SOR initiative is open to all ages, preference will be given to those under 32 — the age Ramanujan was when he died.
The SOR initiative invites people worldwide to solve creative mathematical puzzles via Expii.com’s Solve feature, to identify exceptional talent. The Art of Problem Solving, a web site that trains students in mathematical concepts and problem-solving techniques, is also advertising the initiative to its worldwide online community.
For more details about how to apply for an SOR grant, and the criteria for an award, visit the program’s web site: https://v1.expii.com/ramanujan
“The program is not intended to just benefit those who receive the awards,” Ono says. “We also hope they become important mathematicians and scientists who make the world a better place.”
Ono heads the SOR program, with an advisory board of other mathematicians, including Manjul Bhargava (Princeton), Olga Holtz (Berkeley), Po-Shen Loh (Carnegie Mellon) and Sujatha Ramdorai (University of British Columbia).
Sir John Templeton established the Templeton World Charity Foundation in 1996 to serve as “a global philanthropic catalyst for discoveries relating to big questions of life and the universe, in areas of science, theology, philosophy and human society.”
Related:
Templeton World Charity to fund 'Spirit of Ramanujan' fellows
Celebrating math, miracles and a movie
Mathematicians find 'magic key' to drive Ramanujan's taxi-cab number
Wednesday, July 11, 2018
Evidence reveals our fractured African roots
Anthropologists are challenging the long-held view that humans evolved from a single ancestral population in one region of Africa. Instead, a scientific consortium has found that human ancestors were diverse in form and culture and scattered across the continent. These populations were subdivided by different habitats and shifting environmental boundaries, such as forests and deserts.
The journal Trends in Ecology and Evolution published the findings, which drew from studies of bones (anthropology), stones (archaeology) and genes (population genomics), along with new and more detailed reconstructions of Africa’s climates and habitats over the last 300,000 years.
Emory University anthropologist Jessica Thompson was one of 23 authors on the paper. The research was led by the Max Planck Institute for the Science of Human History in Germany and the University of Oxford in England. In the following Q&A, Thompson explains the paper and its significance.
Can you provide some background on our understanding of human evolution?
Jessica Thompson: Even as early as 20 years ago, fossils were the main material we had to try to answer the question of where humans originated. A multi-regionalist theory hypothesized that Homo sapiens emerged in different places at the same time, evolving at the same rate across the Old World. This would mean that there was extensive gene exchange across ancient Asia, Europe and Africa, and that groups such as Neanderthals would not be a separate species but just a localized form of Homo sapiens. But it is difficult to get that level of resolution from bones alone.
By the 1990s, mitochondrial DNA analyses provided growing genetic evidence for the competing theory — that all modern humans originated in Africa and then dispersed from there around the globe. The implication of this is that groups such as the Neanderthals would actually have been different species, and that they were replaced by modern human groups dispersing from Africa.
Intense debate continued over the two theories but, by the early 2000s, it was clear that the out-of-Africa group had won. Only a small percentage of modern humans from the total population living in Africa actually left the continent, creating a genetic bottleneck in populations outside of Africa. So there is more diversity within the genomes of some living peoples in Africa today than there is, say, between an Australian aboriginal person and a Norwegian person.
As a final twist, whole-genome DNA now shows that there was some gene flow with Neanderthals as those first modern populations emerged from Africa. This could have happened several times over many thousands of years, and so a “leaky out-of-Africa” model seems to be the best fit for the data.
How does the current paper fit into this model?
JT: While it was well established that modern humans originated in Africa, there was still the question of where in Africa. East Africa and South Africa have been strong candidates, but that could be due to the long historical bias of where fossils were being found.
Our paper takes the global idea of multi-regionalism and shrinks it down to the boundaries of Africa. The answer to where humans originated appears to be lots of places within the continent, often separated for long periods, but again with leaky boundaries. Essentially, there is not a single ancestral human population. Who we are today probably evolved as a mosaic of populations of very near modern humans who were separated by geographic and cultural boundaries but were also all interacting with one another at different points in time. Our origin story is one of lots and lots of different humans that came together and then separated and later came together again in this really confusing manner. There are a lot of moving parts. Humans, for a very long time, have been a culturally and phenotypically diverse bunch.
What new questions does this paradigm shift bring up?
JT: Instead of seeking the origin of humans in one spot, we need to look for pieces of the puzzle in many different places. Then we can ask, what adaptations did different populations have that contributed to who we are today? How did they come to be present in the single species we are now? And, perhaps more philosophically, what are the unifying characteristics that bind us together as that species, in spite of our differences?
While we need more data from places like East Africa and South Africa, it’s apparent now that West Africa and Central Africa are also key players in the story. They’re at the crossroads for much of the continent and yet we know very little about ancient populations from those regions. I’m hoping I can contribute to that effort with my current work in Malawi, which is positioned between southern and eastern Africa. There, we find a long, but relatively unexplored cultural record of human behavior that goes back into the last Ice Age.
We also recently recovered some of the oldest DNA in Africa from a site in Malawi, which we published last year. This helped to actually show some of those ancient interactions between populations at least over the last 10,000 years or so — as well as some of the differences between them. The implications are that this kind of structure went back even farther in time, to our origins as a species.
Related:
Malawi yields oldest-known DNA from Africa
Bonding over bones, stones and beads
Have skull drill, will travel
Monday, July 9, 2018
Science on stage: Atlanta playwrights explore the human microbiome
Learning about the microbiome "is shifting my perspective of what it means to be human and an individual," says playwright Margaret Baldwin. "What bacteria are driving our dreams?"
Four Atlanta playwrights + 48 hours = four new plays at the forefront of art and science.
That’s the premise of Theater Emory’s “4:48,” a frenetic yet focused showcase of new works inspired by the human microbiome that will be performed July 14 at the Schwartz Center for Performing Arts.
The annual speed-writing challenge always yields compelling results, as talented local playwrights come together at Emory to quickly produce plays based on common source material. But this year, for the first time, the Playwriting Center of Theater Emory is teaming up with the Emory Center for the Study of Human Health for “4:48” — an innovative, interdisciplinary collaboration that promises to push the boundaries of both fields.
“Theater offers an exciting communication mechanism to convey cutting-edge research findings to a wide audience, while simultaneously encouraging curiosity and imagination,” says Amanda Freeman, instructor in the Center for the Study of Human Health.
The collaborators hope that this project will introduce the human microbiome — the trillions of microorganisms that live in us and on us — to a whole new audience, providing a spotlight for research that is being done right here on campus.
“I have found very few venues where new science and new art can emerge from a single exercise, so ‘4:48’ is special,” says David Lynn, Asa Griggs Candler Professor of Chemistry and Biology, one of several Emory science faculty offering support as resources for the writers.
Readings of the work developed during "4:48" begin at 4 pm on Saturday, July 14, in the Theater Lab of the Schwartz Center. All readings are free and open to the public. For the schedule of readings and play titles, visit the Theater Emory website.
Click here to learn more.
Related:
Learning to love our bugs: Meet the microorganisms that help keep us healthy
Environment, the microbiome and preterm birth