Emory professor Eri Saikawa (left) in the field with Historic Westside Gardens member and Westside resident Rosario Hernandez (center) and Xinyi Yao (right), an Emory student pursuing a master's degree in environmental sciences who is involved in the research project. (Photo by Carol Clark)
An Emory University collaboration with members of Atlanta’s Westside community to test urban soil for contaminants has led to a site investigation by the U.S. Environmental Protection Agency (EPA). The ongoing collaboration is funded by Emory’s HERCULES Exposome Research Center, which is dedicated to understanding how environmental exposures affect health and community well-being.
The EPA told the Atlanta Journal-Constitution that it is continuing to collect samples and has so far identified 64 sites where the soil contains elevated levels of lead — a dangerous neurotoxin. The agency plans to begin decontaminating properties, possibly by removing and replacing soil, in the first quarter of next year, at no expense to residents or homeowners, according to the AJC. The report also appeared in Georgia Health News.
“It’s important for people to know that soil contamination by heavy metals can be serious,” says Eri Saikawa, an associate professor of environmental sciences at Emory and the lead researcher on the original project that sparked the EPA investigation. “If you are thinking about gardening in an urban area, or if children are playing in your yard, it makes sense to test your soil and make sure that it’s not contaminated.”
Read the full story here.
Related:
Creating an atmosphere for change
The growing role of farming and nitrous oxide in climate change
Friday, December 13, 2019
Monday, May 13, 2019
Artificial intelligence and 'deep ethics'
In the sci-fi film “2001: A Space Odyssey,” astronauts go into a soundproof pod to discuss their concerns about some of the decisions made by the supercomputer Hal (seen through the window) without realizing that Hal knows how to lip read.
Advances in neurotechnology, genetics and artificial intelligence will not only change society as a whole, they will also challenge what it means to be human and reshape our ethics, argues Paul Root Wolpe, director of the Emory Center for Ethics, in a recent TEDxAtlanta talk.
He uses self-driving cars as just one example.
“These vehicles are going to be going down the road and in a crisis they’re going to have to make decisions about what to do,” Wolpe says. “Do I crash into the wall and endanger my passengers or do I turn left and hit those pedestrians? For the first time we’re going to have to create ethical algorithms. That is, we’re going to have to teach a vehicle to make ethical decisions. For the first time, machines will be making ethical decisions that will have a profound impact on human beings.”
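Wolpe’s “ethical algorithm” idea can be made concrete with a deliberately tiny sketch. Everything here is hypothetical: the option names, the harm scores and the utilitarian “minimize expected injuries” rule are illustrative assumptions, not how any real autonomous vehicle is programmed.

```python
# Toy "ethical algorithm" sketch -- purely illustrative, not a real
# autonomous-vehicle planner. It encodes one possible (utilitarian)
# rule: pick the crisis option with the lowest expected harm.

def choose_action(options):
    """Return the option whose expected_injuries value is smallest."""
    return min(options, key=lambda o: o["expected_injuries"])

# Hypothetical crisis scenario with made-up harm estimates.
options = [
    {"action": "brake_into_wall", "expected_injuries": 1},
    {"action": "swerve_toward_pedestrians", "expected_injuries": 3},
]

print(choose_action(options)["action"])  # -> brake_into_wall
```

Even this toy makes Wolpe’s point visible: a human had to choose the rule inside `min(...)`, and a different ethical theory would yield different code.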
Watch Wolpe’s talk in the video below to learn what he means by the term “deep ethics,” and how artificial intelligence may someday help us navigate through the ethical complexities raised by technology itself.
Related:
Why robots should care about their looks
The science and ethics of X-Men
Thursday, April 11, 2019
When do children alter behavior to please others?
“I have spent the past four years at Emory University investigating how an infant, who has no problem walking around the grocery store in her onesie, develops into an adult who fears public speaking for fear of being negatively judged,” says Sara Botto in her newly released TEDxAtlanta talk.
Botto is a doctoral candidate in the Cognition and Development program of Emory’s Department of Psychology. Together with Emory psychologist Philippe Rochat, she designed experiments to investigate when in development we become sensitive to others’ evaluations — a big part of being human.
Watch the TEDxAtlanta video below to see young children reacting to the opinions of others during the experiments, which take the form of a game called “The Robot Task.”
Botto’s research showed that, even before they can form a simple sentence, children are sensitive to the evaluations of others, and alter their behavior accordingly.
“Whether we’re aware of it or not, we’re constantly communicating values to others,” Botto says. “We’re communicating a value when we mostly compliment girls for their pretty hair or their pretty dress but boys for their intelligence. Or when we choose to offer candy as opposed to nutritious food as a reward for good behavior.”
Visit Botto’s website, AdultingWithKids.com, to learn more about credible, science-based child development research.
Related:
Sensitivity to how others evaluate you emerges by 24 months
Gender gap in spatial reasoning starts in elementary school, meta-analysis finds
"We're interested in the origins of gender differences in spatial skills because of their potential role in the gender gap we see in math and science fields," says Jillian Lauer, who is set to graduate from Emory in May with a PhD in psychology. (Getty Images)
By Carol Clark
It is well-established that, on average, men outperform women on a spatial reasoning task known as mental rotation — imagining multi-dimensional objects from different points of view. Men are not, however, born with this advantage, suggests a major meta-analysis by psychologists at Emory University. Instead, males gain a slight advantage in mental-rotation performance during the first years of formal schooling, and this advantage slowly grows with age, tripling in size by the end of adolescence.
Psychological Bulletin, a journal of the American Psychological Association, is publishing the findings.
“Some researchers have argued that there is an intrinsic gender difference in spatial reasoning — that boys are naturally better at it than girls,” says lead author Jillian Lauer, who is set to graduate from Emory in May with a PhD in psychology. “While our results don’t exclude any possibility that biological influences contribute to the gender gap, they suggest that other factors may be more important in driving the gender difference in spatial skills during childhood.”
Co-authors of the paper include Stella Lourenco, associate professor of psychology at Emory, whose lab specializes in the development of spatial and numerical cognition. Co-author Eukyung Yhang worked on the paper as an Emory undergraduate, funded by the university’s Institute for Quantitative Theory and Methods. Yhang graduated in 2018 and is now at Yale University School of Medicine.
The meta-analysis included 128 studies of gender differences in spatial reasoning, combining statistics on more than 30,000 children and adolescents aged three to 18 years. The authors found no gender difference in mental-rotation skills among preschoolers, but a small male advantage emerged in children between the ages of six and eight.
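The pooling step of a meta-analysis like this can be sketched in a few lines. The study effect sizes and variances below are invented for illustration (they are not the paper’s data); the inverse-variance weighting, though, is the standard fixed-effect approach to combining studies.

```python
# Sketch of inverse-variance pooling of gender-difference effect sizes
# (Cohen's d). The three (d, variance) pairs are hypothetical studies.
studies = [
    (0.10, 0.04),
    (0.35, 0.02),
    (0.55, 0.05),
]

# Each study is weighted by its precision (1 / variance), so larger,
# more precise studies pull the pooled estimate harder.
weights = [1.0 / var for _, var in studies]
pooled_d = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)

print(round(pooled_d, 3))  # -> 0.326
```

Averaging raw effect sizes would give each study equal say; precision weighting is what lets a synthesis of 128 studies speak with more authority than any one of them.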
While differences in verbal and mathematical abilities between men and women tend to be small or non-existent, twice as many men as women are top performers in mental rotation, making it one of the largest gender differences in cognition.
Mental rotation is considered one of the hallmarks of spatial reasoning. “If you’re packing your suitcase and trying to figure out how each item can fit within that space, or you’re building furniture based on a diagram, you’re likely engaged in mental rotation, imagining how different objects can rotate to fit together,” Lauer explains.
It takes most of childhood and adolescence for the gender gap in spatial skills to reach the size of the difference seen in adulthood.
Prior research has also shown that superior spatial skills predict success in male-dominated science, technology, engineering and math (STEM) fields, and that the gender difference in spatial reasoning may contribute to the gender disparity in those fields.
“We’re interested in the origins of gender differences in spatial skills because of their potential role in the gender gap we see in math and science fields,” Lauer says. “By determining when the gender difference can first be detected in childhood and how it changes with age, we may be able to develop ways to make educational systems more equitable.”
It takes most of childhood and adolescence for the gender gap in spatial skills to reach the size of the difference seen in adulthood, Lauer says. She adds that the meta-analysis did not address causes for why the gender gap for mental rotation emerges and grows.
Lauer notes that previous research has shown that parents use more spatial language when they talk to preschool sons than daughters. Studies have also found that girls report more anxiety about having to perform spatial tasks than do boys by first grade, and that children are aware of gender stereotypes about spatial intelligence during elementary school.
“Now that we’ve characterized how gender differences in spatial reasoning skills develop in children over time, we can start to home in on the reasons for those differences,” Lauer says.
Meanwhile, she adds, parents may want to encourage both their daughters and sons to play with blocks and other construction toys, which can help develop spatial reasoning skills; evidence shows that these skills improve with training.
“Giving both girls and boys more opportunities to develop their spatial skills is something that parents and educators have the power to do,” Lauer says.
Lauer has accepted a post-doctoral fellowship position at New York University. Her PhD advisor is Patricia Bauer, a professor of psychology at Emory focused on cognition and child development.
Related:
Babies' spatial reasoning predicts later math skills
How babies use numbers, space and time
Higher-math skills entwined with lower-order magnitude sense
Friday, March 15, 2019
A nod to World Sleep Day
World Sleep Day is March 15 this year. The annual event is a celebration of sleep and a call to action on issues related to sleep, including medicine, education and social aspects.
In Emory anthropologist Carol Worthman's research around the world, "sleep has emerged as both more flexible and more social than one would think from the perspective of the West," writes Todd Pitock in Aeon Magazine.
"When Worthman started exploring the anthropology of sleep more than a decade ago," the article continues, "the topic was way below the radar of colleagues who believed that culture was something you did while awake. But she found otherwise."
Read the whole article here.
Related:
Some eye-opening thoughts on sleep
What literature can teach us about sleep
Thursday, March 14, 2019
The importance of puberty: A call for better research models
“Due to the global slowdown in fertility, this is probably the biggest cohort of young people we will ever see,” says anthropologist Carol Worthman. “If we are ever going to get serious about helping adolescents reach their full potential, now is the time.”
By Carol Clark
Puberty is much more than a time of biological overdrive propelled by sexual maturation. Progress in developmental science has greatly broadened our perspective on this critical maturational milestone.
“We’ve moved beyond thinking of puberty as simply raging hormones,” says Carol Worthman, professor of anthropology at Emory University. “Major advances in understanding of brain development clearly show that the sociological and psychological impacts during puberty are just as important as the hormones.”
What’s needed now, Worthman argues as lead author on a new paper, is to integrate this understanding into more comprehensive research models. The Journal of Research on Adolescence published the paper, which reviews key theories and methods that are relevant to studies of puberty.
“Puberty was once thought of as the biological process of teen development and adolescence was considered the cultural process,” Worthman says. “We want to raise awareness that bracketing research in this way is no longer a useful approach.”
For decades, researchers have focused on improving the health of infants and children, resulting in substantial declines in child mortality worldwide.
While babies and children are labeled as cute and positive, full of possibility, adolescents are more often seen as problems. They have generally been less studied, Worthman says, even though the second decade of life is a critical time when risks spike for the development of mental illness, substance abuse and the escalation of injuries. And what happens in puberty, she adds, impacts health and well-being across the lifespan.
The global population is now bulging with young people aged 10 to 19, who today number more than 1.2 billion, or 17 percent of humanity. These young people must deal with finding their way into adulthood amid massive, rapid social transformations.
“Due to the global slowdown in fertility, this is probably the biggest cohort of young people we will ever see,” Worthman says. “If we are ever going to get serious about helping adolescents reach their full potential, now is the time.”
In her own research, Worthman uses a biocultural approach to conduct comparative interdisciplinary studies of human development. Samantha Dockray, a co-author of the paper from University College Cork, studies psychobiological mechanisms to understand their effects on adolescent health and behavior. The third co-author, Kristine Marceau of Purdue University, integrates genetics, prenatal risk, neuroendocrine development and the family environment into her developmental research.
The paper outlines minimally invasive methods to study different aspects of puberty. For instance, hair and fingernail clippings can be used to track stress levels and hormones over time. Changes in the microbiome, immune function and brain are other critical aspects of puberty that can be measured, along with cognition, behavior and ecological contexts.
“By taking advantage of new methods, and working in interdisciplinary teams, developmental scientists can explore more questions about adolescent development and welfare in more integrated ways,” Worthman says.
The review paper is part of a special section on puberty published by the Journal of Research on Adolescence. Topics covered include emerging genetic-environmental complexities of puberty, the role of puberty in the developing brain, how puberty impacts health and well-being across the lifespan and the need to explore puberty in understudied populations.
Related:
Scientists zeroing in on psychosis risk factors
Wednesday, February 20, 2019
Computer scientist explains 'Dangers of the Coded Gaze'
As a master’s student at MIT, Joy Buolamwini discovered that her own face read as male in many facial-analysis software systems — if her face was detected at all. And, no, it was not a personal slight; it turns out that many darker-skinned women of color also read as male. If you were what Buolamwini terms a “pale male,” though, most systems could categorize you with a high rate of accuracy.
Buolamwini, a computer scientist and digital activist, went on to found the Algorithmic Justice League, which raises awareness of bias in facial-analysis software and calls out companies that refuse to acknowledge it. She recently spoke at Emory as part of the Provost Lecture Series, which is designed to inspire the Emory community to think about big questions and collaborate on innovative solutions.
As Deboleena Roy, chair of Emory’s Department of Women’s, Gender, and Sexuality Studies and faculty in Neuroscience and Behavioral Biology, said in introducing Buolamwini, “Rather than sit back and see the creation of new technologies that work to reinforce dominant and discriminatory gender and racial norms, she instead is using her expertise as a scientist to envision a more socially just technological future. Applying her awareness of gender and race issues, she has dedicated her research and her career to creating more inclusive code and more inclusive coding practices.”
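The disaggregated audit at the heart of Buolamwini’s work can be sketched simply: score a classifier per demographic group rather than overall. The records below are invented placeholders, not data from her research.

```python
from collections import defaultdict

# Hypothetical classifier results, labeled by demographic group.
records = [
    {"group": "lighter_male", "correct": True},
    {"group": "lighter_male", "correct": True},
    {"group": "darker_female", "correct": True},
    {"group": "darker_female", "correct": False},
]

# Tally per-group totals and correct predictions.
totals, hits = defaultdict(int), defaultdict(int)
for r in records:
    totals[r["group"]] += 1
    hits[r["group"]] += int(r["correct"])

# Accuracy disaggregated by group exposes gaps an overall score hides.
accuracy = {g: hits[g] / totals[g] for g in totals}
print(accuracy)  # -> {'lighter_male': 1.0, 'darker_female': 0.5}
```

The aggregate accuracy here is 75 percent, which sounds respectable; only the per-group breakdown reveals that one group is served far worse than the other.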
Click here to read more about Buolamwini's Emory talk, entitled "Dangers of the Coded Gaze."
Tuesday, October 23, 2018
Schadenfreude sheds light on the darker side of humanity
“We all experience schadenfreude but we don’t like to think about it too much because it shows how ambivalent we can be to our fellow humans,” says Emory psychologist Philippe Rochat.
By Carol Clark
Schadenfreude, the sense of pleasure people derive from the misfortune of others, is a familiar feeling to many — perhaps especially during these times of pervasive social media.
This common, yet poorly understood, emotion may provide a valuable window into the darker side of humanity, finds a review article by psychologists at Emory University. New Ideas in Psychology published the review, which drew upon evidence from three decades of social, developmental, personality and clinical research to devise a novel framework to systematically explain schadenfreude.
The authors propose that schadenfreude comprises three separable but interrelated subforms — aggression, rivalry and justice — which have distinct developmental origins and personality correlates.
They also singled out a commonality underlying these subforms.
“Dehumanization appears to be at the core of schadenfreude,” says Shensheng Wang, a PhD candidate in psychology at Emory and first author of the paper. “The scenarios that elicit schadenfreude, such as intergroup conflicts, tend to also promote dehumanization.”
Co-authors of the study are Emory psychology professors Philippe Rochat, who studies infant and child development, and Scott Lilienfeld, whose research focuses on personality and personality disorders.
Dehumanization is the process of perceiving a person or social group as lacking the attributes that define what it means to be human. It can range from subtle forms, such as assuming that someone from another ethnic group does not feel the full range of emotions as one’s in-group members do, all the way to blatant forms — such as equating sex offenders to animals. Individuals who regularly dehumanize others may have a disposition towards it. Dehumanization can also be situational, such as soldiers dehumanizing the enemy during a battle.
“Our literature review strongly suggests that the propensity to experience schadenfreude isn’t entirely unique, but that it overlaps substantially with several other ‘dark’ personality traits, such as sadism, narcissism and psychopathy,” Lilienfeld says. “Moreover, different subforms of schadenfreude may relate somewhat differently to these often malevolent traits.”
One problem with studying the phenomenon is the lack of an agreed definition of schadenfreude, which literally means “harm joy” in German. Since ancient times, some scholars have condemned schadenfreude as malicious, while others have perceived it as morally neutral or even virtuous.
“Schadenfreude is an uncanny emotion that is difficult to assimilate,” Rochat says. “It’s kind of a warm-cold experience that is associated with a sense of guilt. It can make you feel odd to experience pleasure when hearing about bad things happening to someone else.”
Psychologists view schadenfreude through the lens of three theories. Envy theory focuses on a concern for self-evaluation, and a lessening of painful feelings when someone perceived as enviable gets knocked down a peg. Deservingness theory links schadenfreude to a concern for social justice and the feeling that someone dealt a misfortune received what was coming to them. Intergroup-conflict theory concerns social identity and the schadenfreude experienced after the defeat of members of a rival group, such as during sporting or political competitions.
The authors of the current article wanted to explore how all these different facets of schadenfreude are interrelated, how they differ, and how they can arise in response to these concerns.
Their review delved into the primordial role of these concerns as demonstrated in developmental studies. Research suggests that infants as young as eight months demonstrate a sophisticated sense of social justice. In experiments, they showed a preference for puppets who assisted a helpful puppet and who punished puppets that had exhibited antisocial behavior. Research on infants also points to the early roots of intergroup aggression, showing that, by nine months, infants preferred puppets who punished others unlike themselves.
“When you think of normal child development, you think of children becoming good natured and sociable,” Rochat says. “But there’s a dark side to becoming socialized. You create friends and other in-groups to the exclusion of others.”
Spiteful rivalry appears by at least age five or six, when research has shown that children will sometimes opt to maximize their gain over another child, even if they have to sacrifice a resource to do so.
By the time they reach adulthood, many people have learned to hide any tendencies for making a sacrifice just for spite, but they may be more open about making sacrifices that are considered pro-social.
The review article posits a unifying, motivational theory: concerns of self-evaluation, social identity and justice are the three motivators that drive people toward schadenfreude. What pulls people away from it is the ability to perceive others as fully human and to feel empathy for them.
Ordinary people may temporarily lose empathy for others. But those with certain personality disorders and associated traits — such as psychopathy, narcissism or sadism — are either less able or less motivated to put themselves in the shoes of others.
“By broadening the perspective of schadenfreude, and connecting all of the related phenomena underlying it, we hope we’ve provided a framework to gain deeper insights into this complex, multi-faceted emotion,” Wang says.
“We all experience schadenfreude but we don’t like to think about it too much because it shows how ambivalent we can be to our fellow humans,” Rochat says. “But schadenfreude points to our ingrained concerns and it’s important to study it in a systematic way if we want to understand human nature.”
Related:
What is a psychopath?
Sharing ideas about the concept of fairness
Friday, September 21, 2018
Climate change calls for a fresh approach to water woes
An egret spreads its wings above waters of the Everglades. "Climate change is a game changer" when it comes to managing major water basins across the country, says Lance Gunderson, chair of Emory's Department of Environmental Sciences.
By Carol Clark
The Everglades National Park, the largest subtropical wilderness in the United States, is home to 16 different species of wading birds and rare and endangered species like the manatee, the American crocodile and the Florida panther. But the area is also home to humans. The park is a portion of a larger wetland ecosystem, more than half of which has been converted into agricultural production or urban developments. The ecosystem must provide both flood protection and supply water for the park, the agricultural interests and South Florida’s rapidly growing population of nearly eight million people.
Meanwhile, a federal-state initiative to address this challenge, known as the Comprehensive Everglades Restoration Plan, is “sort of stuck in the muddle,” says Lance Gunderson, chair of Emory’s Department of Environmental Sciences. The plan was authorized in 2000 but it hasn’t made much progress.
Climate change throws another wrench in the works, affecting the Everglades and other large watersheds across the United States in new and unpredictable ways. Extreme weather events and rising sea levels, combined with a growing population, will lead to “more intense arguments” about already contested issues of water quality and water usage, Gunderson says.
Gunderson, a wetlands ecologist, recently partnered with Barbara Cosens, a legal scholar at the University of Idaho, to lead an interdisciplinary team of researchers in a project to assess the adaptive capacity of six major U.S. water basins to changing climates. In addition to the Everglades, the basins include the Anacostia, the Columbia, the Klamath, the Platte and the Rio Grande rivers. The project was funded by the National Socio-Environmental Synthesis Center (SESYNC) at the University of Maryland. (Watch the video below to learn more.)
“Climate change is a game changer when it comes to the management of these regional-scale water systems across the country,” Gunderson says. “These systems are managed through assumptions about climate and models that are based on averages. Now, managers are struggling to adapt to more extremes — like earlier snow melts, more floods and droughts, and more intense storms.”
Even without extreme events, water management is complex. The Everglades, for example, is not just an issue of restoring biological diversity. It’s an economic problem that often puts government agencies, agriculture, developers, residents, and environmental groups at loggerheads.
“These are complex problems and we can’t plan or analyze our way out of them,” Gunderson says. “We have to learn our way out of them.”
Instead of relying on the court system or government policies, he says people need to come together in organic, self-organized ways for “adaptive governance.” Such approaches can forge new paths through a problem by trying small experiments to see if they work.
The Klamath River basin, for instance, benefited from farmers and Native Americans coming together informally, instead of going to court, to talk about possible ways to reallocate water to satisfy both sides.
“Informal, adaptive management lets you learn while you’re doing,” Gunderson says. “It allows people without resources to be engaged in the process. Change happens when little groups of people work together collectively on wicked problems that have no easy solutions or easy answers.”
As chair of Environmental Sciences, Gunderson is also confronted with the problem of how to train students to deal with the issues that will face them when they graduate. The department is blending facets of political science, ecology, sociology, biology, geology and health into its curriculum.
“These specialties are at the intersection of major environmental problems and we are trying to build some integrated understanding around them,” Gunderson says. “Our world is becoming more complex and we want students to have the skills to confront that complexity.”
Related:
Students develop device to help cope with climate change
Responding to climate change
The growing role of farming and nitrous oxide in climate change
Putting people into the climate change picture
Tags:
Climate change,
Community Outreach,
Ecology,
Sociology
Monday, August 27, 2018
Sensitivity to how others evaluate you emerges by 24 months
"Image management is fascinating to me because it's so important to being human," says Sara Valencia Botto, shown posing with a toddler. The Emory graduate student published a study on how toddlers are attuned to image, along with psychology professor Philippe Rochat. (Kay Hinton, Emory Photo/Video)
By Carol Clark
Even before toddlers can form a complete sentence, they are attuned to how others may be judging them, finds a new study by psychologists at Emory University.
The journal Developmental Psychology is publishing the results, documenting that toddlers are sensitive to the opinions of others, and that they will modify their behavior accordingly when others are watching.
“We’ve shown that by the age of 24 months, children are not only aware that other people may be evaluating them, but that they will alter their behavior to seek a positive response,” says Sara Valencia Botto, an Emory PhD candidate and first author of the study.
While previous research has documented this behavior in four- to five-year-olds, the new study suggests that it may emerge much sooner, Botto says.
“There is something specifically human in the way that we’re sensitive to the gaze of others, and how systematic and strategic we are about controlling that gaze,” says Philippe Rochat, an Emory professor of psychology who specializes in childhood development and senior author of the study. “At the very bottom, our concern for image management and reputation is about the fear of rejection, one of the main engines of the human psyche.”
This concern for reputation manifests itself in everything from spending money on makeup and designer brands to checking how many “likes” a Facebook post garners.
“Image management is fascinating to me because it’s so important to being human,” Botto says. “Many people rate their fear of public speaking above their fear of dying. If we want to understand human nature, we need to understand when and how the foundation for caring about image emerges.”
The researchers conducted experiments involving 144 children between the ages of 14 and 24 months using a remotely controlled robot toy.
In one experiment, a researcher showed a toddler how to use the remote to operate the robot. The researcher then either watched the child with a neutral expression or turned away and pretended to read a magazine. When the child was being watched, he or she showed more inhibition when hitting the buttons on the remote than when the researcher was not watching.
In a second experiment, the researcher used two different remotes when demonstrating the toy to the child. While using the first remote, the researcher smiled and said, “Wow! Isn’t that great?” And when using the second remote, the researcher frowned and said “Uh-oh! Oops, oh no!” After inviting the child to play with the toy, the researcher once again either watched the child or turned to the magazine.
The children pressed the buttons on the remote associated with the positive response from the researcher significantly more while being watched. And they used the remote associated with the negative response more when not being watched.
In a third experiment, which served as a control, the researcher gave a neutral response of “Oh, wow!” when demonstrating how to use the two remotes. The children no longer chose one remote over the other depending on whether the researcher was watching them.
The control experiment showed that in the second experiment the children really did take into account the values expressed by the experimenter when interacting with the toy, and based on those values changed their behavior depending on whether they were being watched, Botto says.
A final experiment involved two researchers sitting next to one another and using one remote. One researcher smiled and gave a positive response, “Yay! The toy moved!” when pressing the remote. The second researcher frowned and said, “Yuck! The toy moved!” when pressing the same remote. The child was then invited to play with the toy while the two researchers alternated between either watching or turning their back to the child. Results showed that the children were much more likely to press the remote when the researcher who gave the positive response was watching.
“We were surprised by the flexibility of the children’s sensitivity to others and their reactions,” Botto says. “They could track one researcher’s values of two objects and two researchers’ values of one object. It reinforces the idea that children are usually smarter than we think.”
Botto is continuing to lead the research in the Rochat lab for her PhD thesis. She is now developing experiments for children as young as 12 months to see if the sensitivity to being evaluated by others emerges even earlier than the current study documents.
And she is following the 14- to 24-month-old children involved in the published study, to see if the individual differences they showed in the experiments are maintained as they turn four and five. The researchers are measuring social and cognitive factors that may have predictive power for individual differences — such as language ability, temperament and a child’s ability to pick up on social norms and to understand that people can have beliefs different from their own.
“Ultimately, we hope to determine exactly when children begin to be sensitive to others’ evaluations and the social and cognitive factors that are necessary for that sensitivity to emerge,” Botto says.
Such basic research may translate into helping people in a clinical environment who are at the extremes of the spectrum of such sensitivity, she adds.
“It’s normal and necessary to a certain extent to care about our image with others,” Botto says. “But some people care so much that they suffer from social anxiety, while others care so little that it is not optimal in a society where cooperation is essential.”
The American Psychological Association contributed to this report.
Related:
Babies have logical reasoning before age one, study finds
Babies' spatial reasoning predicts later math skills
Thursday, April 26, 2018
DNA analysis adds twists to ancient story of a Native American group
"I want to help Native American tribes to reclaim knowledge of their very ancient evolutionary histories — histories that have been largely wiped away because of colonialism," says Emory geneticist John Lindo. Photo by Kay Hinton, Emory Photo/Video.
By Carol Clark
The ancient genomes of the Tsimshian indigenous people left tell-tale markers on the trail of their past, revealing that at least 6,000 years ago their population size was on a slow but steady decline.
The American Journal of Human Genetics published the findings, which draw from the first population-level nuclear DNA analysis of a Native American group from ancient to modern times.
“The finding contradicts a popular notion,” says John Lindo, a geneticist in Emory University’s Department of Anthropology and first author on the paper. “There is this idea that after Native Americans came in through the Bering Strait that they were all expanding in population size until Europeans showed up. At least for this one population, we’ve shown that was not the case.”
A boom in next-generation DNA sequencing technology has opened the possibility to explore the evolutionary history of different populations. “Ancient nuclear DNA analysis is a relatively new field,” Lindo says. “Not until recently have we had methods to sequence an entire genome quickly and inexpensively.”
Nuclear DNA provides information on an individual’s lineages going back hundreds of thousands of years. Lindo is one of the few geneticists looking at ancient whole genomes of Native Americans. He is especially interested in understanding how the genomes of their different populations evolved over time.
“Their evolutionary histories are radically different,” Lindo says. “Over thousands of years, various Native American populations have adapted to living in every ecology throughout North and South America, from the Arctic to the Amazon. That’s about as extreme as you can get for differences in environments.”
The Tsimshian people historically lived in longhouses in coastal British Columbia and southern Alaska where they harvested the abundant sea life. Lindo and his colleagues sequenced the genomes of 25 living Tsimshian people and 25 ancient individuals who lived in the same region between 6,000 and 500 years ago, and confirmed that they were a continuous population through time.
Members of the Tsimshian Native American tribe hold a tea party near Fort Simpson, British Columbia, in 1889. Image from the Library and Archives Canada.
In a previous paper, drawing from the same data set, they found a dramatic shift between the two time periods in a class of genes associated with the immune system, suggesting a strong evolutionary pressure on the population to adapt to pathogens. A demographic model indicated a crash in the Tsimshian population size of about 57 percent during the early-to-mid 19th century. That finding fit with historical accounts of how smallpox, introduced by European colonization, devastated the Tsimshian population during two epidemics within that time frame.
The current paper looked at broader genetic variations between the ancient and modern DNA. An analysis showed how the variation declined slowly in the ancient population before the collapse but has since recovered.
“After a population collapse, only a subset of the genetic diversity remains,” Lindo says. “We find a more nuanced story, that despite the population collapse, the genetic diversity of modern Tsimshian people varies significantly.”
Intermarriage with other Native American groups and non-native populations increased the genetic diversity of some of the modern-day Tsimshian people so that it is near the levels prior to their population collapse, the analysis showed.
“A population with relatively high genetic diversity has a greater potential to fight off pathogens and avoid recessive traits,” Lindo says. “It exemplifies the benefits of gene flow between populations, especially following catastrophic events such as the smallpox epidemics that the Tsimshian endured.”
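The general mechanism behind these findings — that genetic drift erodes diversity faster in a small population, so a crash leaves only a subset of the original variation — can be illustrated with a toy Wright-Fisher simulation of a single biallelic locus. All parameters below (population sizes, generation counts) are illustrative placeholders, not values from the study:

```python
import random

def simulate_heterozygosity(pop_size, bottleneck_size, generations, crash_gen, p0=0.5):
    """Track expected heterozygosity, 2p(1-p), for one biallelic locus.

    Each generation, the next allele frequency is drawn by sampling
    2N allele copies from the current frequency (genetic drift).
    After crash_gen, the population shrinks to bottleneck_size,
    so drift strengthens and diversity erodes faster.
    """
    p = p0
    history = []
    for g in range(generations):
        n = bottleneck_size if g >= crash_gen else pop_size
        copies = sum(1 for _ in range(2 * n) if random.random() < p)
        p = copies / (2 * n)
        history.append(2 * p * (1 - p))
    return history

random.seed(42)
h = simulate_heterozygosity(pop_size=500, bottleneck_size=50,
                            generations=40, crash_gen=20)
```

In repeated runs, heterozygosity typically drifts only slightly over the first 20 generations at the larger size, then declines noticeably after the bottleneck — a cartoon of the diversity loss the researchers measured, which real analyses estimate from many loci across whole genomes rather than a single simulated one.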
Senior authors on the paper are Michael DeGiorgio from Pennsylvania State University and Ripan Malhi from the University of Illinois. The paper’s coauthors include Tsimshian representatives Joycelynn Mitchell and Barbara Petzelt from the Metlakatla Treaty Office in Prince Rupert, Canada.
Malhi, a leader in forging trusting relationships between genetic researchers and indigenous people, was a mentor to Lindo, who earned his PhD at the University of Illinois at Urbana-Champaign.
Lindo is continuing that tradition of building trust and working closely with indigenous populations. His ancient DNA research at Emory integrates the approaches of ancient whole genomes, statistical modeling and functional methods.
One of his projects is focused on genetic fluctuations to help understand ancient adaptations in various Native American populations. He is currently working with 10 different tribes from throughout North America.
“Community engagement is essential when working with indigenous communities,” says Lindo, explaining that he first meets personally with a tribal community to talk about how a genetic study might add to their knowledge of their own history.
“I listen to their stories and how they are working to keep their cultures alive,” he says. “One elder from a southwestern tribe told me that his grandfather was taken away in the early 1900s because he was a shaman and Christianity was swelling through the area. Each tribe’s stories are different but they are all powerful, and sometimes difficult, stories to hear.”
Most ancient DNA analyses have come out of Europe, where more ancient DNA labs are based and cold temperatures have helped preserve specimens.
Lindo wants to bring some of the same insights that those of European ancestry are gaining about their past to Native Americans.
“I’d like to disentangle this idea that Native Americans are part of a singular race,” he says. “I want to help Native American tribes to reclaim knowledge of their very ancient evolutionary histories — histories that have been largely wiped away because of colonialism.”
Related:
Malawi yields oldest-known DNA from Africa
Sunday, March 25, 2018
Frankenstein at 200 sparks wonder and debate
It’s the 200th anniversary year of “Frankenstein, Or the Modern Prometheus,” an enduring novel at the nexus of major questions of our time. Emory faculty explore many of them in a newly published anthology, “Frankenstein: How a Monster Became an Icon, the Science and Enduring Allure of Mary Shelley’s Creation.”
“When you see a contemporary film about androids, like ‘Blade Runner 2049,’ you’re seeing the ‘Frankenstein’ story in a 21st-century guise,” says Sidney Perkowitz, Emory emeritus physicist and co-editor of the new anthology. “The androids are sleek and modern instead of the shambling, stitched-together creature in ‘Frankenstein,’ but they have the same questions swirling around them. Even as we’re on the verge of artificially generating life, we’re no closer to knowing whether we should.”
You can read more here.
Related:
Chemists boldly go in search of 'little green molecules'
Prometheus: Seeding wonder and science
Monday, February 26, 2018
Ecosystems hanging by a thread
Emory disease ecologist Thomas Gillespie served on an international committee that developed best practice guidelines for health monitoring and disease control in great ape populations, part of a growing public education effort.
By Tony Rehagen
Emory Magazine
Thomas Gillespie’s parents and teachers always wanted him to go into medicine.
“Growing up in Rockford, Illinois, if you were smart and interested in biology, you were supposed to be a doctor,” he says.
Gillespie, meanwhile, was always more interested in primates. In seventh grade, he phoned animal psychologist Penny Patterson, famous for teaching the gorilla Koko how to use sign language, and interviewed the scientist about Koko’s diet while punching out notes on a typewriter. He was premed at the University of Illinois, but spent his internship at the Brookfield Zoo in Chicago, working in the “Tropic World” primate exhibit. His favorite undergrad course was biological anthropology, the study of biological and behavioral aspects of humans and nonhuman primates, looking at our closest relatives to better understand ourselves.
Gillespie eventually took a year off before graduate school to work with primate communities in the Peruvian Amazon. The apes finally won out — Gillespie would choose a doctorate in zoology over medical school.
But it wasn’t long before the two fields of study collided. While monitoring the group behavior of colobine monkeys in Africa, Gillespie observed that some of the animals were eating bark from the African cherry tree — not a typical food source for them. When he dug deeper, Gillespie learned that human doctors in the region used that same bark to treat parasites in their patients. The monkeys, he realized, were self-medicating.
“That discovery in these monkeys brought me back toward the health science side of biology,” says Gillespie.
Gillespie’s return to a medical approach to zoology came not a moment too soon—for the sake of the primates and maybe even all of humankind. As an associate professor in Emory’s Department of Environmental Sciences specializing in the disease ecology of primates, Gillespie and his team of researchers have helped uncover a crisis among our nearest taxonomic neighbors. According to an article coauthored by Gillespie and thirty other experts and published in the journal Science Advances, 75 percent of the world’s five-hundred-plus primate species are declining in population, and a whopping 60 percent face extinction, largely due to human encroachment.
Read more in Emory Magazine.
Related:
Experts warn of impending extinction of many of the world's primates
Chimpanzee studies highlight disease risks to all endangered wildlife
Friday, February 16, 2018
'Divine Felines' showcases Egypt's exaltation of cats
From ancient Egypt to modern times, cats rule many peoples' lives. Photo by Stephen Nowland, Emory Photo/Video.
By Leslie King
Emory Report
“In ancient Egypt, cats and dogs were gods, and they have not forgotten this!” says Melinda Hartwig, curator of Ancient Egyptian, Nubian and Near Eastern Art at the Michael C. Carlos Museum.
That exalted stature is illuminated in the exhibition “Divine Felines: Cats of Ancient Egypt,” which opened Feb. 10 at the museum and will be on view through Nov. 11.
The exhibit showcases cats and lions, plus dogs and jackals, as domesticated pets, creatures of the wild or mythic symbols of divinities, in ancient Egyptian mythology, kingship and everyday life. Animal burial practices and luxury items decorated with feline and canine features are also on display.
“Cats and dogs reveal so much about ancient Egyptian culture,” says Hartwig. “These animals were just as important to the ancient Egyptians as they are to us today.”
The kings of Egypt were associated with the lion; hence the sphinx, a human head on a lion’s body.
“Cats were first domesticated in Egypt around 4000 BC. They were lovable pets, hunters of vermin and divine embodiments of fertility and protection. Lions and jungle cats were admired for their power, and were linked with royalty and divinity,” Hartwig continues. “Dogs were also kept as pets. Their loyalty and hunting abilities were keenly valued. Often found roaming the ancient necropolises, dogs and jackals became embodiments of the gods who protected the dead.”
Read more in Emory Report.
Monday, February 5, 2018
Twitter reveals how future-thinking Americans are and how that affects their decisions
By Carol Clark
Individuals who tend to think further into the future are more likely to invest money and to avoid risks, finds a new paper by psychologists at Emory University. The Proceedings of the National Academy of Sciences (PNAS) published the research, which tapped big data tools to conduct text analyses of nearly 40,000 Twitter users, and to run online experiments of behavior of people who provided their Twitter handles.
The researchers also found an association between longer future-sightedness and less risky decision-making at a U.S. state population level.
“Twitter is like a microscope for psychologists,” says co-author Phillip Wolff, an Emory associate professor of psychology. “Naturalistic data mined from tweets appears to give insights not just into tweeters’ thoughts at a particular time, but into a relatively stable cognitive process. Using social media and big-data analytical tools opens up a new paradigm in the way we study human behavior.”
Co-author Robert Thorstad, an Emory PhD candidate in the Wolff lab, came up with the idea for the research, worked on the design and analyses, and conducted the experiments.
“I'm fascinated by how peoples’ everyday behavior can give away a lot of information about their psychology,” Thorstad says. “Much of our work was automated, so we were able to analyze millions of Tweets from thousands of individuals’ day-to-day lives.”
The future-sightedness found in individuals’ tweets was short, usually just a few days, which differs from prior research suggesting future-sightedness on the order of years.
“One possible interpretation is that the difference is due to a feature of social media,” Wolff says. Another possible reason, he adds, is that prior studies explicitly asked individuals how far they thought into the future while the PNAS paper used the implicit measure of previous tweets.
While the relationship between future-sightedness and decision-making may seem obvious, the researchers note that previous findings on the subject have not been consistent. Those inconsistencies may be due to factors such as observer bias in a laboratory setting and small sample sizes.
The PNAS paper used a suite of methods (such as the Stanford CoreNLP natural language processing toolkit and SUTime, a rule-based temporal tagger built on regular expression patterns) to automatically analyze Twitter text trails previously left by individual subjects. Experimental data was gathered using the Amazon crowdsourcing tool Mechanical Turk, a website where individuals can complete psychology experiments and other internet-based tasks. Participants in the Mechanical Turk experiments were asked to supply their Twitter handles.
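SUTime itself is a Java tool with far richer rules, but the flavor of rule-based temporal tagging can be sketched in a few lines: classify each tweet as past- or future-referencing with regular expressions, then compute the share of temporally tagged tweets that point forward. The patterns here are illustrative stand-ins, much cruder than the actual toolkit:

```python
import re

# Toy rule-based temporal classifier, loosely in the spirit of a regex
# tagger like SUTime. Real taggers normalize expressions to timeline
# values; here we only flag past vs. future reference.
FUTURE = re.compile(r"\b(tomorrow|next (week|month|year)|will|gonna|"
                    r"in \d+ (days?|weeks?|months?|years?))\b", re.I)
PAST = re.compile(r"\b(yesterday|last (week|month|year)|ago|was|were)\b", re.I)

def temporal_orientation(tweet):
    """Return 'future', 'past', or 'none' for a single tweet."""
    f, p = bool(FUTURE.search(tweet)), bool(PAST.search(tweet))
    if f and not p:
        return "future"
    if p and not f:
        return "past"
    return "none"

def future_orientation(tweets):
    """Share of temporally tagged tweets that refer to the future."""
    tags = [temporal_orientation(t) for t in tweets]
    tagged = [t for t in tags if t != "none"]
    return tags.count("future") / len(tagged) if tagged else 0.0

tweets = ["Can't wait for the concert next week!",
          "I saw that movie yesterday.",
          "Lunch was great."]
print(future_orientation(tweets))  # 1 future-oriented tweet of 3 tagged
```

How far into the future a tweet points (the "future-sightedness" measure) would additionally require normalizing each matched expression to a date and taking its distance from the tweet's timestamp.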
In one experiment for the PNAS paper, Mechanical Turk participants answered a classic delay discounting question, such as: Would you prefer $60 today or $100 in six months? The participants’ Tweets were also analyzed. Future orientation was measured by the tendency of participants to tweet about the future compared to the past. Future-sightedness was measured based on how often tweets referred to the future, and how far into the future.
The results showed that future orientation was not associated with investment behavior, but that individuals with far future-sightedness were more likely to choose to wait for future rewards than those with near future-sightedness. That indicates that investment behavior depends on how far individuals think into the future and not their tendency to think about the future in general.
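The $60-today-versus-$100-in-six-months question is a standard delay-discounting item. Under the commonly used hyperbolic model, a delayed amount A is worth V = A / (1 + kD) after a delay of D days, where k is an individual's discount rate, and a chooser prefers the delayed reward when V exceeds the immediate amount. A sketch with illustrative k values (the model choice and numbers are assumptions for illustration, not taken from the paper):

```python
# Hyperbolic discounting: present value of amount A delayed by D days
# is V = A / (1 + k * D). Smaller k means more far-sighted choices.
# The k values below are illustrative.

def discounted_value(amount, delay_days, k):
    return amount / (1.0 + k * delay_days)

def prefers_delayed(immediate, delayed, delay_days, k):
    """True if the delayed reward's discounted value beats the immediate one."""
    return discounted_value(delayed, delay_days, k) > immediate

# $60 today vs. $100 in six months (~180 days):
print(prefers_delayed(60, 100, 180, k=0.001))  # patient chooser: True
print(prefers_delayed(60, 100, 180, k=0.01))   # impulsive chooser: False
```

This makes the paper's distinction concrete: the delay-discounting choice depends on how steeply an individual discounts specific future delays, not on how often the future crosses their mind.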
A second Mechanical Turk experiment used a digital Balloon Analogue Risk Task (BART). Participants could earn real money every time they inflated a balloon, but each inflation could lead to the balloon popping, resulting in no money earned for that trial. If participants stopped inflating before the balloon popped, they could bank the money they had earned and proceed to the next trial.
The BART participants’ tweets were also analyzed. The results showed that those with longer future-sightedness were less likely to take the risk of fully inflating the balloon.
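In a simplified BART trial with a fixed pop probability per pump, the expected payoff of stopping after n pumps can be written down directly, which shows why stopping early can be the rational far-sighted move. The payout and pop-probability parameters below are illustrative assumptions, not the task's actual settings:

```python
# Simplified BART: each pump earns `cents_per_pump` but pops the
# balloon (zeroing the trial) with independent probability p per pump.
# Expected value of stopping after n pumps: n * cents * (1 - p)**n.
# Parameters are illustrative.

def expected_earnings(n_pumps, cents_per_pump=5, p_pop=0.05):
    return n_pumps * cents_per_pump * (1.0 - p_pop) ** n_pumps

def best_stopping_point(max_pumps=64, **kw):
    """Pump count that maximizes expected earnings."""
    return max(range(1, max_pumps + 1),
               key=lambda n: expected_earnings(n, **kw))

# With these parameters, expected value peaks around 19-20 pumps, well
# short of "fully inflated": pumping further trades a shrinking chance
# of banking anything for a larger nominal payout.
print(best_stopping_point())
```

Observed stopping points below or above this optimum are one way to read risk aversion or risk seeking off the behavior.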
Another study in the PNAS paper focused on Twitter users whose profiles tied them to a particular state. About eight million of their tweets were analyzed for future-sightedness.
The researchers measured a state’s risk-taking behaviors at the population level using the proxy of publicly available statistics, such as seat-belt compliance rates, drunken driving rates and teen-aged pregnancy rates. The results showed that shorter future-sightedness measures for tweets from individual states correlated closely to higher rates of risky behaviors, in a pattern similar to the results of the individual experimental studies.
To measure a state’s investment behavior, the researchers used state statistics for spending on state parks, pre-kindergarten education, highways and per-pupil education. The researchers found that states that invested more in these areas were associated with tweets from individuals with longer future-sightedness, but not at a statistically significant level.
The researchers controlled for state demographics such as political orientation, per capita income, household income and GDP. “We found that, while demographics are important, they couldn’t explain away the effects of future-thinking,” Wolff says.
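"Controlling for" demographics amounts to a partial correlation: regress both future-sightedness and the risk measure on the covariate, then correlate the residuals. A self-contained sketch with synthetic "state" data (the data-generating numbers are invented for illustration; the study's actual models and data differ):

```python
import random

# Partial correlation via the regression-residual method: regress both
# variables on the covariate, then correlate what is left over. A toy
# version of adjusting for a demographic variable like income.

def simple_residuals(y, x):
    """Residuals of y after ordinary least-squares regression on one x."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]

def corr(u, v):
    """Pearson correlation of two equal-length sequences."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return cov / (su * sv)

def partial_corr(x, y, covariate):
    """Correlation of x and y with one covariate partialled out."""
    return corr(simple_residuals(x, covariate), simple_residuals(y, covariate))

# Synthetic data: risk tracks sightedness negatively, and both are
# partly driven by a shared covariate (income). Coefficients invented.
random.seed(0)
income = [random.gauss(0, 1) for _ in range(200)]
sightedness = [0.5 * i + random.gauss(0, 1) for i in income]
risk = [-0.6 * s + 0.3 * i + random.gauss(0, 1)
        for s, i in zip(sightedness, income)]

print(corr(sightedness, risk))                  # raw association
print(partial_corr(sightedness, risk, income))  # income partialled out
```

If the adjusted correlation stays substantial, the covariate cannot explain away the association, which is the pattern Wolff describes.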
The estimated 21 percent of American adults who use Twitter tend to be younger and more technologically literate than the general population, Thorstad concedes. But he adds that Twitter’s demographics are not that far off from the general population in terms of gender, economic status and education levels. And the percentages of Twitter users living in rural, urban and suburban areas are virtually the same.
“Twitter can provide a much broader participant pool than many psychology experiments that primarily use undergraduates as subjects,” Thorstad notes. “Big-data methods may ultimately improve generalizability for psychology results.”
“Through social media, we’re amassing huge amounts of data on ourselves, behaviorally and over time, that is leaving behind a kind of digital phenotype,” Wolff adds. “We’re now in an age where we have big-data analytical tools that can extract information to tell us something indirectly about an individual’s cognitive life, and to predict what an individual might do in the future.”
Friday, January 26, 2018
Chimpanzee studies highlight disease risk to all endangered wildlife
Famed primatologist Jane Goodall with Emory disease ecologist Thomas Gillespie, who is working with the Jane Goodall Institute to study the health of chimpanzees in Tanzania's Gombe National Park.
The American Journal of Primatology just published a special edition bringing together experts who have contributed to the understanding of chimpanzee health at Gombe National Park in Tanzania and beyond. Gombe is the site where Jane Goodall pioneered her behavioral research of chimpanzees. Goodall’s work at Gombe began in 1960, and continues today through the Jane Goodall Institute, making it the longest field study of any animal.
Thomas Gillespie, associate professor in Emory’s Department of Environmental Sciences, was a guest editor of the special journal edition, along with fellow scientists Dominic Travis and Elizabeth Lonsdorf. Gillespie works at the interface of biodiversity conservation and global health. Much of his research examines how and why anthropogenic influences within tropical forests alter disease dynamics and place wild primates, people and other animals in such ecosystems at increased risk of pathogen exchange.
Following is an interview with Gillespie about the special journal issue and why research on chimpanzee health is important.
What is the current status of chimpanzees?
Both the common chimpanzee and the bonobo, the two species of chimpanzee, are endangered. Chimpanzees are the most closely related species to humans and we see them declining precipitously due to habitat loss and poaching. Typical estimates for the chimpanzee population are in the hundreds of thousands. That’s far less than the number of people in Atlanta for the entire chimpanzee species spread across all of Africa. There is a real risk of chimpanzees going locally extinct in core parts of their habitat. Chimpanzee communities in West Africa, for instance, have very little habitat left. They’re often found living in scraps of habitat between villages.
How important is health to conservation?
Wildlife health is a critical conservation issue, but that’s something that’s only recently been recognized. Wildlife populations already dealing with poaching and habitat loss are more vulnerable to being knocked out by disease. It becomes even more difficult when they are exposed to new pathogens, from humans or domesticated animals.
On top of that, primates are dealing with shifts in the dynamics of pathogens like Ebola. Ebola’s been around for a long time in natural systems but now we’re seeing big mortality events in wild chimpanzees and other apes. Lowland gorillas are actually listed as critically endangered due to Ebola.
How did you become involved with Gombe and the Jane Goodall Institute?
Fifteen years ago, as evidence mounted that disease was playing an important role in the population declines observed in Gombe chimpanzees in Tanzania, Dominic Travis and Elizabeth Lonsdorf developed a prospective health monitoring system. They began collecting specific behavioral data on signs of respiratory and gastrointestinal illness, combined with monthly body-condition scoring, for the chimpanzee communities at Gombe, an effort that paralleled the work of the Mountain Gorilla Veterinary Project in Rwanda and Uganda.
When I met Dom and Elizabeth at a workshop in Germany in 2004, I was six years into efforts to understand how logging and forest fragmentation in and around Kibale National Park, Uganda, affected disease dynamics in resident primates. My findings in Uganda highlighted that some forms of anthropogenic disturbance can alter the dynamics of natural pathogens in wildlife, such as a legacy of selective logging. It also revealed that other forms of disturbance, such as active forest fragmentation, can lead to opportunities for pathogens to jump between species, including the introduction of pathogens from people and domesticated animals to wild primates.
Dom and Elizabeth asked me to join their effort and expand the scope of their project to a One Health approach. I initiated diagnostic surveillance linked to geographical indicators of species overlap for Gombe’s chimpanzees and baboons, as well as the people and domesticated animals within the Greater Gombe Ecosystems. It serves as a map of all the places these species are interacting, for a greater sense of how transmission may be occurring. Integration of these new data streams, along with the ongoing observational health data and in-depth post-mortem necropsies, have allowed us to establish baselines of health indicators to inform outbreak contingency plans.
Dom, Elizabeth and I now co-direct this effort, which is known as the Gombe Ecosystem Health Project.
How does Gombe fit into the bigger picture of wildlife conservation?
As a result of Jane Goodall’s initial observations of disease outbreaks impacting Gombe’s chimpanzees, it became apparent that infectious diseases have the capacity to threaten the conservation of endangered species.
Some people call Gombe “a living laboratory.” It’s unique in the sense that it’s a place where there has been long-term data collection on the behavior patterns of chimpanzees, and for the past 15 years we’ve been collecting all this data on their health.
Methods have been developed at Gombe that allow us to monitor chimpanzee health non-invasively, through fecal sampling, so that we don’t have to dart the animals and tranquilize them to take blood samples. Many of the tools and approaches developed at Gombe have the capacity to manage disease-related threats to other wildlife populations globally.
Ashley Sullivan from the Jane Goodall Institute contributed to this report.
Related:
Disease poses risk to chimpanzee conservation, Gombe study finds
Sanctuary chimps show high rates of drug-resistant staph
Tuesday, December 5, 2017
Goldwater Rule 'gagging' psychiatrists no longer relevant, analysis finds
The Goldwater Rule takes its name from a 1964 incident during the failed presidential bid of Barry Goldwater. An article in a now defunct magazine declared, "1,189 Psychiatrists Say Goldwater is Psychologically Unfit to be President."
By Carol Clark
The rationale for the Goldwater Rule — which prohibits psychiatrists from publicly commenting on the mental health of public figures they have not examined in person — does not hold up to current scientific scrutiny, a new analysis finds.
Perspectives on Psychological Science is publishing the analysis, which concludes that the Goldwater Rule is not well-supported scientifically and is outdated in today’s media-saturated environment. A preprint of the article is available online.
“We reviewed a large body of published scientific literature and it clearly showed that examining someone directly is often not necessary if you compile other valid sources of information,” says Scott Lilienfeld, lead author of the analysis and a professor of psychology at Emory University.
As examples of those sources, the authors cite interviews with family members, friends and others who know a person well, and extensive public records such as media interviews, biographies, YouTube videos, social media accounts and other material that may reveal a person’s longstanding behavioral patterns. The authors also report that direct interviews are subject to a host of biasing factors that are difficult to eliminate, including efforts on the part of interviewees to create positive impressions.
“Even though it is often possible to make a reasonably valid psychiatric diagnosis at a distance, that doesn’t necessarily mean that a mental health professional should,” Lilienfeld cautions. “Such a diagnosis should only be made with great discretion and after a thorough investigation.”
By Carol Clark
The rationale for the Goldwater Rule — which prohibits psychiatrists from publicly commenting on the mental health of public figures they have not examined in person — does not hold up to current scientific scrutiny, a new analysis finds.
Perspectives on Psychological Science is publishing the analysis, which concludes that the Goldwater Rule is not well-supported scientifically and is outdated in today’s media-saturated environment. A preprint of the article is available online.
“We reviewed a large body of published scientific literature and it clearly showed that examining someone directly is often not necessary if you compile other valid sources of information,” says Scott Lilienfeld, lead author of the analysis and a professor of psychology at Emory University.
As examples of those sources, the authors cite interviews with family members, friends and others who know a person well, and extensive public records such as media interviews, biographies, YouTube videos, social media accounts and other material that may reveal a person’s longstanding behavioral patterns. The authors also report that direct interviews are subject to a host of biasing factors that are difficult to eliminate, including efforts on the part of interviewees to create positive impressions.
“Even though it is often possible to make a reasonably valid psychiatric diagnosis at a distance, that doesn’t necessarily mean that a mental health professional should,” Lilienfeld cautions. “Such a diagnosis should only be made with great discretion and after a thorough investigation.”
The Goldwater Rule, implemented in 1973 by the American Psychiatric Association (APA), gained new attention after Donald Trump entered the political arena. Some mental health professionals have expressed serious concerns about Trump’s mental health, most notably in the new book “The Dangerous Case of Donald Trump: 27 Psychiatrists and Mental Health Experts Assess a President.”
The Goldwater Rule takes its name from an incident during the failed presidential bid of Barry Goldwater. A 1964 article in a now defunct magazine declared, “1,189 Psychiatrists say Goldwater is Psychologically Unfit to be President.” Many of the psychiatrists described the candidate in terms such as “emotionally unstable,” “cowardly,” “grossly psychotic,” “paranoid,” “delusional” and a “dangerous lunatic.” Some of the psychiatrists went so far as to offer diagnoses of Goldwater, including schizophrenia and obsessive-compulsive disorder.
Goldwater lost the election to Lyndon B. Johnson, but went on to successfully sue the magazine for libel.
“Many psychiatrists who commented on Goldwater in that article crossed an ethical line,” Lilienfeld says. “A lot of unfair statements were made about him that were poorly supported or unwarranted.”
The APA later responded by passing what came to be known as the Goldwater Rule, in part to protect public figures from humiliation and in part to safeguard the integrity of the psychiatric profession.
The Goldwater Rule may have been more defensible at the time it was implemented, Lilienfeld says, because much less information was available on public figures.
Times have changed, however, particularly with the advent of the Internet and social media.
“If someone is running for the most powerful position in the world, behavioral professionals should be able to speak out if they take the time to properly investigate a candidate,” Lilienfeld says. “There should be a high threshold for doing so, but psychologists and psychiatrists should not feel gagged if they want to contribute to a national conversation about a presidential candidate or current president.”
While the authors of the analysis recommend abandoning the Goldwater Rule, they add that mental health professionals should avoid making diagnoses of celebrities in general, simply for the sake of prurient interest.
Lilienfeld’s co-authors are Joshua Miller from the University of Georgia and Donald Lynam from Purdue University.
Monday, November 13, 2017
The Lying Conference: Uncovering truths about deception
The Lying Conference will unmask the many factors involved in deception, including evolution, culture and the human affinity for storytelling and make-believe.
By Carol Clark
We grow up with this notion that we should always tell the truth. But can we live without lying?
That’s one of the questions to be explored in a day-long event, “The Lying Conference,” on Friday, November 17, from 8:30 am to 6:30 pm at Emory Conference Center. Emory’s Department of Psychology is bringing together scientists from psychology, neuroscience and anthropology — along with a leading journalist, a theater director and a professional magician — to discuss their insights into lying and deception. The conference is free and open to the public, but registration is requested.
Topics to be covered include: The deep, evolutionary roots of lying. How children learn to tell lies. Cultural differences in lying. How we decide whether someone is trustworthy. How technology and the changing media and political landscapes are affecting our collective beliefs. The role of deception in the arts and entertainment.
“Lying is kind of a hot topic right now, with all the buzz about fake news and accusations of cover-ups and deception,” says Emory developmental psychologist Philippe Rochat, lead organizer of the event. “When we talk about lying, what we are indirectly trying to understand is, what is the truth? It can be a profound question.”
Science uses probabilities to approximate the truth, Rochat notes. “It’s a never-ending journey and you keep trying to get closer.”
In day-to-day interactions, we regularly negotiate the truth with one another, trying to convince others of a point of view. “People put on makeup to exaggerate their features,” Rochat says. “We amplify some things about ourselves and hide others. We make believe. We seduce.”
People can lie maliciously, in an anti-social way. Or they can tell white lies, to be polite and avoid hurting another person’s feelings.
Rochat is particularly interested in the developmental trajectory of lying. Between the ages of two and three, children begin to engage in pretend play. By around age four, when children start to have ideas about what other people are thinking, lying emerges. “They can be explicit at this stage, because they can understand that someone can be deceived,” Rochat says. “But they still cannot lie very well. They tend to leak the truth.” By the age of six or seven, he adds, “we become much better at concealing the truth and keeping a secret tight.”
Whatever the reasons for lying, one thing is clear: “We’ve evolved to lie,” Rochat says. “It’s deeply rooted in our nature and somehow important to our survival.”
Following are the seven speakers of the conference and brief summaries of their topics.
“Perspective-taking and Dishonest Communication in Primates and Other Animals,” by Emory primatologist Frans de Waal: While there is plenty of evidence for functional deception in animals — such as the way a butterfly might use mimicry as camouflage — tactical deception requires anticipating the reactions of others. Tactical deception is clearly more developed in apes than in most other species, although there is also evidence for it in corvids.
“Lying, American Style,” by Emory anthropologist Bradd Shore: He will discuss the role of culture in lying and how lying differs across cultures. Shore will also touch on some of the ways the American cultural model has been politically deployed and manipulated in recent decades.
“Little Liars — How Children Learn to Tell Lies,” by Kang Lee, a developmental psychologist from the University of Toronto: Lee will use scientific evidence from his lab to show how lying begins early in life, what factors contribute to the development of lying, why children lie and whether adults can easily detect children’s lies. He will also discuss recent developments in technology that may help in detecting lies.
“Face Value — The Irresistible (and Misleading) Influence of First Impressions,” by neuroscientist Alexander Todorov from Princeton University: People form instantaneous impressions from faces and act on these impressions. In the last 10 years, data-driven computational methods have allowed scientists to visualize the configurations of facial features that lead to specific impressions, such as trustworthiness. But these appearance stereotypes are not often accurate. So why do we form first impressions?
“What Happened to the News? Technology, Politics and the Vanishing Truth,” by Jonathan Mann, former CNN International anchor: Many Americans believe that the news media intentionally lie to them. President Donald Trump is the best-known detractor of “fake news,” though he himself has been accused of lying more than any other public figure in recent memory. Mann will address the overlapping changes in technology, politics and business that have riddled our national conversation with deception and distrust.
“Onions and Identities — Theater and the True Self,” by Emory dramatist Tim McDonough: Drama is densely populated by duplicitous schemers, by power figures whose lies maintain the sociopolitical status quo, and by characters in search of themselves, who mirror to us our confusions and self-deceptions. Theater provides a template for understanding identity and insight into existentially and socially necessary forms of deceit.
“The Science of Magic and the Art of Deception,” by professional magician Alex Stone: Magicians trick our brains into seeing what isn’t real, and for whatever reason our brains let them get away with it. Through a mix of psychology, storytelling and sleight-of-hand, Stone will explore the cognitive underpinnings of misdirection, illusion, scams and secrecy, pulling back the curtain on the many curious and powerful ways our brains deceive us not just when we’re watching a magician but throughout our everyday lives.
Friday, October 20, 2017
Responding to climate change
By Martha McKenzie
Emory Public Health
Climate change. Partisan politicians debate its reality, and many citizens see it as a faraway threat, something that endangers the future of polar bears but not them personally.
The health effects of global warming, however, are already being felt. Extreme weather events such as wildfires, droughts, and flooding are becoming more frequent, resulting in more injuries, deaths, and relocations. Heat and air pollution are sending people with asthma and other respiratory ailments to the emergency room. Diseases carried by mosquitoes, fleas, and ticks are expanding their territory—dengue has become endemic in Florida, Lyme disease has worked its way up to Canada and over to California, and some fear that malaria may re-emerge in the U.S.
Couple these health burdens—which are only likely to worsen—with the current administration’s decision to pull out of the Paris climate agreement and dismantle environmental regulations, and the call to action becomes more urgent. “The federal government’s actions might be a headwind from a funding perspective, but they are also very much a tailwind from an inspiration and motivation perspective,” says Daniel Rochberg, an instructor in environmental health who worked for the U.S. State Department as special assistant to the lead U.S. climate negotiators under presidents Bush and Obama. “As others have said, ‘We are the first generation to feel the sting of climate change, and we are the last generation that can do something about it.’ We have to get busy doing something about it.”
Rollins School of Public Health has gotten busy. Faculty researchers are building the science of climate impacts, strategies for reducing greenhouse gas emissions, and approaches for increasing resilience to climate change. Climate@Emory, a university-wide organization of concerned students, faculty, and staff, is partnering with other academic institutions, industries, and governments to support education and climate remediation efforts. Through Climate@Emory’s initiative, Emory University is an accredited, official observer to the UN climate talks and has sent students and faculty to the climate conferences in Paris in 2015 and in Marrakech in 2016. And, of course, Rollins is educating the next generation of scientists who will be dealing with the fallout of today’s climate decisions.
“For environmental scientists, it’s a challenging climate,” says Paige Tolbert, O. Wayne Rollins Chair of Environmental Health. “That means we have to be creative, because we can’t step aside and wait four years. It’s more critical than ever that we keep moving forward and make whatever contributions we possibly can.”
Read more in Emory Public Health.
Related:
Georgia climate project creates state 'climate research roadmap'
Catalyst for change
How will the shifting political winds affect U.S. climate policy?
Peachtree to Paris: Emory delegation headed to U.N. climate talks
Thursday, September 21, 2017
Malawi yields oldest-known DNA from Africa
Emory anthropologist Jessica Thompson next to Malawi rock art paintings, likely made by hunter-gatherers. Thompson's work in Malawi is part of a major new paper in the journal Cell, filling in thousands of years of human prehistory of hunter-gatherers in Africa. (Photo by Suzanne Kunitz)
By Carol Clark
Emory anthropologist Jessica Thompson was at a human origins conference years ago when she heard a presenter lament: “Of course, there is no ancient DNA from Africa because of the poor preservation there.”
That’s when it clicked in Thompson’s mind: She had visited a place in Africa — the highlands of northern Malawi — that had neither extremes of heat nor wetness, the two main environmental factors that degrade DNA. She also knew that scant archaeological research had been done in the region, although a team had unearthed several ancient skeletons there decades ago.
“It’s a strange and fascinating landscape,” says Thompson, who made that 2005 visit as a tourist and was struck by the surreal beauty of the high mountain grassland.
It’s also remote and off the radar of most of the world. “We saw maybe three other tourists while we were there,” she recalls.
That fateful trip laid the groundwork for discoveries of the oldest-known DNA from Africa. The journal Cell just published an analysis of the new discoveries, filling in thousands of years of human prehistory of hunter-gatherers in Africa, led by Harvard geneticist David Reich.
Thompson is second author of the paper. She contributed and described the cultural context for nearly half of the 15 new DNA finds, including the oldest samples. Her fieldwork in Malawi uncovered human remains that yielded DNA ranging in age from about 2,500 to 6,100 years old. And her work is ongoing at a site where a skeleton recovered in 1950 was just dated to 8,100 years old and also yielded DNA.
The other DNA in the Cell paper ranges in age from 3,000 to 500 years ago and comes from South Africa, Tanzania and Kenya.
“Malawi is positioned in between where living hunter-gatherers survive,” Thompson says. “For the first time, we can see the distribution of ancient hunter-gatherer DNA across Africa, showing how these populations were connected in the past.”
Ancient hunter-gatherers have few living representatives in Africa today, surviving as remnant populations scattered across the continent. The remains of Malawi hunter-gatherers that Thompson is studying may represent a population that was once thriving but was subsequently pushed into marginal areas during the expansion of agriculturalists and pastoralists over the past 3,000 years.
Some of this population may have survived until much more recently.
“There are legends in Malawi of the original people who came there, passed down through oral histories,” Thompson says. “They are described as hunters and little people, short in stature. There is also a story of a last, epic battle — that occurred about 200 years ago — when these people got eradicated.”
Mount Hora, where the oldest DNA included in the Cell paper was obtained, from a woman who lived more than 8,000 years ago. (Photo by Jessica Thompson)
Malawi captivated Thompson during that first visit as a tourist, in 2005. She was a graduate student when she spent a summer working on a dig in the Serengeti. She and two companions decided to make a road trip before returning to the United States, including a stop in Malawi.
The landlocked country is located in southeast Africa, bordered by Zambia, Tanzania and Mozambique. It is one of the least-developed and smallest countries in Africa, about the size of the state of Tennessee, and runs north to south along the Rift Valley. An enormous body of water, Lake Malawi, makes up about one-third of the country.
“My traveling companions wanted to relax by the lake in the lowlands,” Thompson recalls. “I had read about the Malawi highlands and really wanted to see this unique ecosystem, so I convinced them to go there instead.”
Her companions complained of the cold — it’s windy and regularly freezes in the highlands of Malawi and summer temperatures peak at around 65 or 70 degrees Fahrenheit. Despite the cold, Thompson admired the rugged, isolated beauty of rocky outcrops and grasslands studded with orchids and fairy ferns where zebra and shaggy antelope grazed.
Thompson, who joined Emory as an assistant professor of anthropology in 2015, dug through the archaeological literature surrounding Malawi and started making exploratory trips there in 2009. She learned of two digs in the Malawi highlands — in 1950 and 1966 — that revealed human skeletons alongside rich cultural evidence of an extinct hunting-and-gathering lifeway.
Dancers at a festival in Malawi. The people living in the country today are the descendants of the Iron Age agriculturalists and pastoralists who swept across the African continent about 3,000 years ago. (Photo by Jessica Thompson)
The 1950 dig turned out to be led by the renowned archaeologist J. Desmond Clark, whom Thompson calls her “academic grandfather.” Although Clark died before Thompson could meet him, he served as the mentor to her mentor, Curtis Marean.
On the slopes of Mount Hora — a striking 1,500-meter peak and a major landmark in the highlands — Clark uncovered two skeletons: a woman who had died at around age 22 and, nearby, a male who had died in his 40s. The skeletons had been taken out of the country, to the Livingstone Museum in Zambia, and were never dated.
“It was impossible to accurately do radiocarbon dating on bone in 1950,” Thompson explains. “The skeletons became, quite frankly, forgotten over time.”
Guided by the clues from the previous excavations, Thompson began heading digs in the Malawi highlands. A site at a landmark outcrop, known as Fingira Rock, is particularly isolated, requiring the team to hike up a mountainside to more than 2,000 meters on the Nyika Plateau. “Working there you feel the wind, you feel the chill,” Thompson says.
Poachers are a hazard in the area, along with the occasional black mamba — one of the world’s deadliest snakes.
The Fingira site had not been excavated since 1966. “We were appalled to discover that it had been heavily disturbed since then,” Thompson says. Her team uncovered two human leg bones, from two different adult males, which yielded DNA that was about 6,100 years old.
The leg bone of a hunter-gatherer that lived 6,100 years ago, found at the Fingira Rock site. (Photo by Jessica Thompson)
In the back of a cave, they found fragments of a child’s skull in a termite mound. A tiny leg bone next to it indicated that the remains were from a baby younger than age one. DNA analysis revealed that she had been a girl, and radiocarbon dating showed that she had died about 2,500 years ago. The analysis also showed that the bones from the infant and the two men were from the same hunter-gatherer population — even though they were separated by thousands of years.
The archaeological sediments suggest that Fingira was a place where the dead were buried, although the skeletal material has become scattered over time. Human bones are mixed with the bones of animals that they hunted and ate, as well as with stone tools and shell beads that they used for ornaments.
“When you visit the site,” Thompson says, “you wonder, why were these people living up here when it’s not the most comfortable conditions you can imagine? What was bringing them here? Why were they burying their dead, over and over again, for many thousands of years, in the same place?”
Meanwhile, Thompson tracked down the skeletons that Clark had discovered at Mount Hora in 1950. She learned they had been moved from Zambia to the University of Cape Town in South Africa.
Here’s where Emory graduate student Kendra Ann Sirak enters the story. Sirak had the distinction of being the last graduate student of Emory anthropologist George Armelagos, one of the founders of the field of paleopathology. He spent decades working with graduate students to study the bones of ancient Sudanese Nubians to learn about patterns of health, illness and death in the past. Armelagos sent Sirak to one of the best ancient DNA labs in the world, at University College Dublin (UCD), in Ireland, with samples of the Nubian bones.
After Armelagos died in 2014, at age 77, Thompson stepped in as one of Sirak’s mentors.
Thompson, left, examines fragments of artifacts from the Malawi excavations in her lab with Emory graduate student Kendra Ann Sirak. Sirak helped with the radiocarbon dating and DNA extraction of the "forgotten" 8,100-year-old skeleton from Mount Hora. (Photo by Ann Borden, Emory Photo/Video)
Thompson contacted the curator of the two skeletons from Mount Hora, to ask about the possibility of getting DNA from them. Alan Morris, now Professor Emeritus at the University of Cape Town, had had the same idea. A sample from the female skeleton was already slated to be sent to the UCD lab where Sirak was working. So Thompson, Morris and Sirak teamed up on the quest.
The petrous bone, which contains components of the inner ear, is the most promising site to drill for ancient DNA. The skeleton's petrous bone had already broken away from the skull, so only this tiny, triangular-shaped piece of the skeleton was sent to Dublin.
"It was extremely fragile," says Sirak, whose job was to drill into the petrous bone and get about 200 milligrams of bone powder without shattering the specimen.
She drank a cup of coffee, donned a hair cover, overalls, a face mask, two pairs of gloves and shoe covers, then entered a small, sterile room where the petrous bone awaited. "I said to myself, 'Here we go, I've got this!'" Sirak recalls.
Sirak was successful. Her colleagues in Dublin processed the sample and then sent it to the genetics team at Harvard Medical School for DNA analysis, which was also successful.
Meanwhile, radiocarbon dating revealed that the skeleton was 8,100 years old.
"It was like Christmas," Sirak says, "knowing that we had DNA data on such an ancient specimen."
The skeleton's genetics connected her to the same population of hunter-gatherers who died thousands of years later and were found 70 kilometers away at Fingira.
Another surprise revealed by the genetic analysis of the Malawi hunter-gatherers: They did not contribute any detectable ancestry to the people living in Malawi today, the descendants of the Iron Age agriculturalists and pastoralists who began sweeping across the African continent about 3,000 years ago.
“In most parts of Africa, you see quite a bit of admixture,” Thompson says. “When you take genetic samples from modern people who are living today, you find that they are a combination of the folks who were expanding into a region and also the folks who were living there before. In Malawi we see that’s not the case. It appears that there was a complete replacement of the original hunter-gatherer people. They are not just gone as a lifeway, they are actually gone as a people as well.”
One of the mysteries Thompson hopes to solve is how that replacement happened. Was it violent? Was it a sudden or a slow process? Did the entrance of strange new technologies, like pottery and iron working, play a role?
“We can’t use genetics to answer these questions,” Thompson says. “We have to use the archaeology.”
Emory anthropology undergraduates assisting with the Malawi excavations this past summer included, from left: Alexa Rome, Alexandra Davis, Suzanne Kunitz and Aditi Majoe. Graduate student Grace Veatch is on the far right.
She continues to excavate in Malawi, aided by local technicians and other collaborators. This summer, five Emory anthropology students accompanied her in the field: Graduate student Grace Veatch, senior Alexandra Davis, juniors Aditi Majoe and Suzanne Kunitz, and sophomore Alexa Rome. They uncovered more human remains at Mount Hora — a charred bone from a human arm and parts of two legs. These bones, recently dated to between 9,500 and 9,300 years old, show that the Hora site still has many secrets to reveal.
While radiocarbon dating of charcoal samples from just above and below the bones establishes their age, it is not clear whether they will yield DNA. “We don’t have high hopes,” Thompson says, “as they were burned and that tends to create even more preservation problems.”
The students assisted in the tedious work of carefully sifting through grey dust and ash, marking coordinates through GPS and other surveying tools, and recording the data into a computer.
Back in her lab at Emory, Thompson uses the data to generate three-dimensional images of the digs and pinpoint where each bone fragment, shell bead or stone tool was found. Her digital model for this summer’s Mount Hora dig uses different-colored dots to give a glimpse of how hunter-gatherers were depositing both human remains and ordinary objects from their day-to-day lives over time.
“And then at this point,” Thompson says as she moves her cursor on her computer screen, “you see the introduction of pottery and iron technology. And right after that you see this fundamental change in the way that the site was used. People are no longer going there frequently. They’re no longer making these big bonfires. And they’re no longer interring their dead there.”
Thompson and her students are also sorting through hundreds of gallon-sized Ziploc plastic bags containing fragments from the Malawi sites. “As you excavate,” she explains, “you clean away the dirt and you’re left with all these tiny pieces of stone and bone artifacts. The bones are mostly animals. But every once in a while you find something that looks like it might be human. Any one one of them could be a new individual, a new piece to the story.”
She pulls out a small plastic bag labeled “Human distal phalanx.” It contains a piece of bone about the size of a Tic-Tac. “In this case, we think we have a finger bone, most likely from a child,” Thompson says.
Ultimately, Thompson seeks to understand how and when the earliest members of our species — Stone Age Homo sapiens — interacted with one another and with their environments in Africa.
“One thing that’s really easy to forget, when we look at the way people live today, is that for most of our evolution we lived as hunter-gatherers,” she says. “So if we want to understand our own origins as a species, we have to know what those lifeways looked like in the past.”
Related:
A bone to pick on origins of meat eating
Brain trumps hand in Stone Age tool study
Stone tools from Jordan point to dawn of division of labor
By Carol Clark
Emory anthropologist Jessica Thompson was at a human origins conference years ago when she heard a presenter lament: “Of course, there is no ancient DNA from Africa because of the poor preservation there.”
That’s when it clicked in Thompson’s mind: She had visited a place in Africa — the highlands of northern Malawi — that had neither extreme heat nor extreme wetness, the two main environmental factors that degrade DNA. She also knew that scant archaeological research had been done in the region, although a team had unearthed several ancient skeletons there decades ago.
“It’s a strange and fascinating landscape,” says Thompson, who made that 2005 visit as a tourist and was struck by the surreal beauty of the high mountain grassland.
It’s also remote and off the radar of most of the world. “We saw maybe three other tourists while we were there,” she recalls.
That fateful trip laid the groundwork for discoveries of the oldest-known DNA from Africa. The journal Cell just published an analysis of the new discoveries, led by Harvard geneticist David Reich, filling in thousands of years of the prehistory of hunter-gatherers in Africa.
Thompson is second author of the paper. She contributed and described the cultural context for nearly half of the 15 new DNA finds, including the oldest samples. Her fieldwork in Malawi uncovered human remains that yielded DNA ranging in age from about 2,500 to 6,100 years old. And her work is ongoing at a site where a skeleton recovered in 1950 was just dated to 8,100 years old and also yielded DNA.
The other DNA in the Cell paper ranges in age from 3,000 to 500 years ago and comes from South Africa, Tanzania and Kenya.
“Malawi is positioned in between where living hunter-gatherers survive,” Thompson says. “For the first time, we can see the distribution of ancient hunter-gatherer DNA across Africa, showing how these populations were connected in the past.”
Hunter-gatherers have few living representatives in Africa today, and those who remain survive as remnant groups scattered across the continent. The remains of Malawi hunter-gatherers that Thompson is studying may represent a population that was once thriving but subsequently pushed into marginal areas during the expansion of agriculturalists and pastoralists during the past 3,000 years.
Some of this population may have survived until much more recently.
“There are legends in Malawi of the original people who came there, passed down through oral histories,” Thompson says. “They are described as hunters and little people, short in stature. There is also a story of a last, epic battle — that occurred about 200 years ago — when these people got eradicated.”
Mount Hora, where the oldest DNA included in the Cell paper was obtained, from a woman who lived more than 8,000 years ago. (Photo by Jessica Thompson)
Malawi captivated Thompson during that first visit as a tourist, in 2005. She was a graduate student when she spent a summer working on a dig in the Serengeti. She and two companions decided to make a road trip before returning to the United States, including a stop in Malawi.
The landlocked country is located in southeast Africa, bordered by Zambia, Tanzania and Mozambique. It is one of the least-developed and smallest countries in Africa, about the size of the state of Tennessee, and runs north to south along the Rift Valley. An enormous body of water, Lake Malawi, makes up about one-third of the country.
“My traveling companions wanted to relax by the lake in the lowlands,” Thompson recalls. “I had read about the Malawi highlands and really wanted to see this unique ecosystem, so I convinced them to go there instead.”
Her companions complained of the cold — it’s windy and regularly freezes in the highlands of Malawi, and summer temperatures peak at around 65 to 70 degrees Fahrenheit. Despite the cold, Thompson admired the rugged, isolated beauty of rocky outcrops and grasslands studded with orchids and fairy ferns where zebra and shaggy antelope grazed.
Thompson, who joined Emory as an assistant professor of anthropology in 2015, dug through the archaeological literature surrounding Malawi and started making exploratory trips there in 2009. She learned of two digs in the Malawi highlands — in 1950 and 1966 — that revealed human skeletons alongside rich cultural evidence of an extinct hunting-and-gathering lifeway.
Dancers at a festival in Malawi. The people living in the country today are the descendants of the Iron Age agriculturalists and pastoralists who swept across the African continent about 3,000 years ago. (Photo by Jessica Thompson)
The 1950 dig turned out to be led by the renowned archaeologist J. Desmond Clark, who Thompson calls her “academic grandfather.” Although Clark died before Thompson could meet him, he served as the mentor to her mentor, Curtis Marean.
On the slopes of Mount Hora — a striking 1,500-meter peak and a major landmark in the highlands — Clark uncovered two skeletons: a woman who had died at around age 22 and, nearby, a man who had died in his 40s. The skeletons had been taken out of the country, to the Livingstone Museum in Zambia, and were never dated.
“It was impossible to accurately do radiocarbon dating on bone in 1950,” Thompson explains. “The skeletons became, quite frankly, forgotten over time.”
Guided by the clues from the previous excavations, Thompson began heading digs in the Malawi highlands. A site at a landmark outcrop, known as Fingira Rock, is particularly isolated, requiring the team to hike up a mountainside to more than 2,000 meters on the Nyika Plateau. “Working there you feel the wind, you feel the chill,” Thompson says.
Poachers are a hazard in the area, along with the occasional black mamba — one of the world’s deadliest snakes.
The Fingira site had not been excavated since 1966. “We were appalled to discover that it had been heavily disturbed since then,” Thompson says. Her team uncovered two human leg bones, from two different adult males, which yielded DNA that was about 6,100 years old.
In the back of a cave, they found fragments of a child’s skull in a termite mound. A tiny leg bone next to it indicated that the remains were from a baby younger than age one. DNA analysis revealed that she had been a girl and radiocarbon dating showed that she had died about 2,500 years ago. The analysis also showed that the bones from the infant and the two men were from the same hunter-gatherer population — even though they were separated by thousands of years of time.
The archaeological sediments suggest that Fingira was a place where the dead were buried, although the skeletal material has become scattered over time. Human bones are mixed with the bones of animals that they hunted and ate, as well as with stone tools and shell beads that they used for ornaments.
“When you visit the site,” Thompson says, “you wonder, why were these people living up here when it’s not the most comfortable conditions you can imagine? What was bringing them here? Why were they burying their dead, over and over again, for many thousands of years, in the same place?”
Meanwhile, Thompson tracked down the skeletons that Clark had discovered at Mount Hora in 1950. She learned they had been moved from Zambia to the University of Cape Town in South Africa.
Here’s where Emory graduate student Kendra Ann Sirak enters the story. Sirak had the distinction of being the last graduate student of Emory anthropologist George Armelagos, one of the founders of the field of paleopathology. He spent decades working with graduate students to study the bones of ancient Sudanese Nubians to learn about patterns of health, illness and death in the past. Armelagos sent Sirak to one of the best ancient DNA labs in the world, at University College Dublin (UCD), in Ireland, with samples of the Nubian bones.
After Armelagos died in 2014, at age 77, Thompson stepped in as one of Sirak’s mentors.
Thompson, left, examines fragments of artifacts from the Malawi excavations in her lab with Emory graduate student Kendra Ann Sirak. Sirak helped with the radiocarbon dating and DNA extraction of the "forgotten" 8,100-year-old skeleton from Mount Hora. (Photo by Ann Borden, Emory Photo/Video)
Thompson contacted the curator of the two skeletons from Mount Hora, to ask about the possibility of getting DNA from them. Alan Morris, now Professor Emeritus at the University of Cape Town, had had the same idea. A sample from the female skeleton was already slated to be sent to the UCD lab where Sirak was working. So Thompson, Morris and Sirak teamed up on the quest.
The petrous bone, which contains components of the inner ear, is the most promising site to drill for ancient DNA. The skeleton's petrous bone had already broken away from the skull, so only this tiny, triangular-shaped piece of the skeleton was sent to Dublin.
"It was extremely fragile," says Sirak, whose job was to drill into the petrous bone and get about 200 milligrams of bone powder without shattering the specimen.
She drank a cup of coffee, donned a hair cover, overalls, a face mask, two pairs of gloves and shoe covers, then entered a small, sterile room where the petrous bone awaited. "I said to myself, 'Here we go, I've got this!'" Sirak recalls.
Sirak was successful. Her colleagues in Dublin processed the sample and then sent it to the genetics team at Harvard Medical School for DNA analysis, which was also successful.
Meanwhile, radiocarbon dating revealed that the skeleton was 8,100 years old.
"It was like Christmas," Sirak says, "knowing that we had DNA data on such an ancient specimen."
The skeleton's genetics connected her to the same population of hunter-gatherers who died thousands of years later and were found 70 kilometers away at Fingira.
Another surprise revealed by the genetic analysis of the Malawi hunter-gatherers: They did not contribute any detectable ancestry to the people living in Malawi today, the descendants of the Iron Age agriculturalists and pastoralists who began sweeping across the African continent about 3,000 years ago.
“In most parts of Africa, you see quite a bit of admixture,” Thompson says. “When you take genetic samples from modern people who are living today, you find that they are a combination of the folks who were expanding into a region and also the folks who were living there before. In Malawi we see that’s not the case. It appears that there was a complete replacement of the original hunter-gatherer people. They are not just gone as a lifeway, they are actually gone as a people as well.”
One of the mysteries Thompson hopes to solve is how that replacement happened. Was it violent? Was it a sudden or a slow process? Did the entrance of strange new technologies, like pottery and iron working, play a role?
“We can’t use genetics to answer these questions,” Thompson says. “We have to use the archaeology.”
Emory anthropology undergraduates assisting with the Malawi excavations this past summer included, from left: Alexa Rome, Alexandra Davis, Suzanne Kunitz and Aditi Majoe. Graduate student Grace Veatch is on the far right.
She continues to excavate in Malawi, aided by local technicians and other collaborators. This summer, five Emory anthropology students accompanied her in the field: Graduate student Grace Veatch, senior Alexandra Davis, juniors Aditi Majoe and Suzanne Kunitz, and sophomore Alexa Rome. They uncovered more human remains at Mount Hora — a charred bone from a human arm and parts of two legs. These bones, recently dated to between 9,500 and 9,300 years old, show that the Hora site still has many secrets to reveal.
While radiocarbon dating of charcoal samples from just above and below the bones establishes their age, it is not clear whether they will yield DNA. “We don’t have high hopes,” Thompson says, “as they were burned and that tends to create even more preservation problems.”
The students assisted in the tedious work of carefully sifting through grey dust and ash, marking coordinates through GPS and other surveying tools, and recording the data into a computer.
Back in her lab at Emory, Thompson uses the data to generate three-dimensional images of the digs and pinpoint where each bone fragment, shell bead or stone tool was found. Her digital model for this summer’s Mount Hora dig uses different-colored dots to give a glimpse of how hunter-gatherers were depositing both human remains and ordinary objects from their day-to-day lives over time.
“And then at this point,” Thompson says as she moves her cursor on her computer screen, “you see the introduction of pottery and iron technology. And right after that you see this fundamental change in the way that the site was used. People are no longer going there frequently. They’re no longer making these big bonfires. And they’re no longer interring their dead there.”
Thompson and her students are also sorting through hundreds of gallon-sized Ziploc plastic bags containing fragments from the Malawi sites. “As you excavate,” she explains, “you clean away the dirt and you’re left with all these tiny pieces of stone and bone artifacts. The bones are mostly animals. But every once in a while you find something that looks like it might be human. Any one of them could be a new individual, a new piece to the story.”
She pulls out a small plastic bag labeled “Human distal phalanx.” It contains a piece of bone about the size of a Tic-Tac. “In this case, we think we have a finger bone, most likely from a child,” Thompson says.
Ultimately, Thompson seeks to understand how and when the earliest members of our species — Stone Age Homo sapiens — interacted with one another and with their environments in Africa.
“One thing that’s really easy to forget, when we look at the way people live today, is that for most of our evolution we lived as hunter-gatherers,” she says. “So if we want to understand our own origins as a species, we have to know what those lifeways looked like in the past.”
Related:
A bone to pick on origins of meat eating
Brain trumps hand in Stone Age tool study
Stone tools from Jordan point to dawn of division of labor
Tags:
Anthropology,
Biology,
Ecology,
Sociology