Wednesday, August 24, 2016
Chimpanzees choose cooperation over competition
Video shows chimpanzee cooperation task. All chimpanzees must manipulate the handles at the same time in order for the food to be delivered. Video from Yerkes National Primate Research Center.
From Woodruff Health Sciences Center:
When given a choice between cooperating and competing, chimpanzees choose to cooperate five times more frequently than they compete, Yerkes National Primate Research Center researchers have found. This finding, the researchers say, challenges the perceptions that humans are unique in our ability to cooperate and that chimpanzees are overly competitive, and it suggests the roots of human cooperation are shared with other primates.
The study results are reported in this week’s early online edition of the Proceedings of the National Academy of Sciences.
To determine if chimpanzees possess the same ability humans have to overcome competition, the researchers set up a cooperative task that closely mimicked the chimpanzees’ natural conditions, for example by giving the 11 great apes that voluntarily participated in the study an open choice of cooperation partners and plenty of ways to compete. Working beside the chimpanzees’ grassy outdoor enclosure at the Yerkes Research Center Field Station, the researchers gave the great apes thousands of opportunities to pull cooperatively at an apparatus filled with rewards. In half of the test sessions, two chimpanzees needed to participate to succeed; in the other half, three chimpanzees were needed.
While the setup provided ample opportunities for competition, aggression and freeloading, the chimpanzees overwhelmingly performed cooperative acts: 3,565 of them across 94 hour-long test sessions.
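For readers who like to quantify it, the figures in the article imply a steady rate of cooperation. A quick back-of-the-envelope sketch (only the act and session counts come from the article; the per-hour rate is derived here):

```python
# Figures reported in the article.
cooperative_acts = 3565   # cooperative pulls at the apparatus
sessions = 94             # hour-long test sessions

# Derived rate: roughly 38 cooperative acts per hour of testing.
acts_per_hour = cooperative_acts / sessions
print(f"~{acts_per_hour:.1f} cooperative acts per hour")
```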
Read more about the study.
Wednesday, August 17, 2016
Babies' spatial reasoning predicts later math skills
"Our work may contribute to the understanding of the nature of mathematics," says Emory psychologist Stella Lourenco, shown meeting with a young visitor to the Emory Child Study Center. (Photos by Emory Photo/Video.)
By Carol Clark
Spatial reasoning measured in infancy predicts how children do at math at four years of age, finds a new study published in Psychological Science.
“We’ve provided the earliest documented evidence for a relationship between spatial reasoning and math ability,” says Emory University psychologist Stella Lourenco, whose lab conducted the research. “We’ve shown that spatial reasoning beginning early in life, as young as six months of age, predicts both the continuity of this ability and mathematical development.”
Emory graduate student Jillian Lauer is co-author of the study. The researchers controlled the longitudinal study for general cognitive abilities of the children, including measures such as vocabulary, working memory, short-term spatial memory and processing speed.
“Our results suggest that it’s not just a matter of smarter infants becoming smarter four-year-olds,” Lourenco says. “Instead, we believe that we’ve honed in on something specific about early spatial reasoning and math ability.”
The findings may help explain why some people embrace math while others feel they are bad at it and avoid it. “We know that spatial reasoning is a malleable skill that can be improved with training,” Lourenco says. “One possibility is that more focus should be put on spatial reasoning in early math education.”
Previous research has shown that superior spatial aptitude at 13 years of age predicts professional and creative accomplishments in the fields of science, technology, engineering and math more than 30 years later.
To explore whether individual differences in spatial aptitude are present earlier, Lourenco’s lab tested 63 infants, ages six months to 13 months, for a visual-spatial skill known as mental transformation, or the ability to transform and rotate objects in “mental space.” Mental transformation is considered a hallmark of spatial intelligence.
The Lourenco lab uses computer eye-tracking technology to home in on the visual-spatial skills of babies.
The researchers showed the babies a series of paired video streams. Both streams presented a series of two matching shapes, similar to Tetris tile pieces, which changed orientation in each presentation. In one of the video streams, the two shapes in every third presentation rotated to become mirror images. In the other video stream, the shapes only appeared in non-mirror orientations. Eye-tracking technology recorded which video stream the infants looked at, and for how long.
This type of experiment is called a change-detection paradigm. “Babies have been shown to prefer novelty,” Lourenco explains. “If they can engage in mental transformation and detect that the pieces occasionally rotate into a mirror position, that’s interesting to them because of the novelty.”
As a group, the infants looked significantly longer at the video stream with the mirror images, but individual infants differed in how long they looked at it.
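A common way to summarize this kind of looking-time data is a preference score: the proportion of total looking time spent on the novel (mirror) stream, where 0.5 means no preference. A minimal sketch with invented numbers (the function name and all data below are illustrative, not from the study):

```python
def mirror_preference(mirror_ms, nonmirror_ms):
    """Proportion of looking time on the mirror stream; 0.5 = no preference."""
    total = mirror_ms + nonmirror_ms
    return mirror_ms / total if total else 0.5

# Hypothetical looking times (milliseconds) for three infants.
looks = [(6200, 4100), (5800, 5900), (7400, 3600)]
scores = [mirror_preference(m, n) for m, n in looks]
print([round(s, 2) for s in scores])
```

Individual differences in such scores are what the longitudinal follow-up at age four was testing against.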
Fifty-three of the children, or 84 percent of the original sample, returned at age four to complete the longitudinal study. The participants were again tested for mental transformation ability, along with mastery of simple symbolic math concepts. The results showed that the children who spent more time looking at the mirror stream of images as infants maintained these higher mental transformation abilities at age four, and also performed better on the math problems.
High-level symbolic math came relatively late in human evolution. Previous research has suggested that symbolic math may have co-opted circuits of the brain involved in spatial reasoning as a foundation to build on.
“Our work may contribute to our understanding of the nature of mathematics,” Lourenco says. “By showing that spatial reasoning is related to individual differences in math ability, we’ve added to a growing literature suggesting a potential contribution for spatial reasoning in mathematics. We can now test the causal role that spatial reasoning may play early in life.”
In addition to helping improve regular early math education, the finding could help in the design of interventions for children with math disabilities. Dyscalculia, for example, is a developmental disorder that interferes with doing even simple arithmetic.
“Dyscalculia has an estimated prevalence of five to seven percent, which is roughly the same as dyslexia,” Lourenco says. “Dyscalculia, however, has generally received less attention, despite math’s importance to our technological world.”
Related:
How babies use numbers, space and time
Babies have logical reasoning before age one, study finds
Tuesday, August 16, 2016
A dog's dilemma: Do canines prefer praise or food?
Chowhound: Ozzie, a shorthaired terrier mix, was the only dog in the experiments that chose food over his owner’s praise 100 percent of the time. “Ozzie was a bit of an outlier,” Berns says, “but Ozzie’s owner understands him and still loves him.”
By Carol Clark
Given the choice, many dogs prefer praise from their owners over food, suggests a new study published in the journal Social, Cognitive and Affective Neuroscience. The study is one of the first to combine brain-imaging data with behavioral experiments to explore canine reward preferences.
“We are trying to understand the basis of the dog-human bond and whether it’s mainly about food, or about the relationship itself,” says Gregory Berns, a neuroscientist at Emory University and lead author of the research. “Out of the 13 dogs that completed the study, we found that most of them either preferred praise from their owners over food, or they appeared to like both equally. Only two of the dogs were real chowhounds, showing a strong preference for the food.”
Dogs were at the center of the most famous experiments of classical conditioning, conducted by Ivan Pavlov in the early 1900s. Pavlov showed that if dogs are trained to associate a particular stimulus with food, the animals salivate in the mere presence of the stimulus, in anticipation of the food.
“One theory about dogs is that they are primarily Pavlovian machines: They just want food and their owners are simply the means to get it,” Berns says. “Another, more current, view of their behavior is that dogs value human contact in and of itself.”
Berns heads up the Dog Project in Emory’s Department of Psychology, which is researching evolutionary questions surrounding man’s best, and oldest, friend. The project was the first to train dogs to voluntarily enter a functional magnetic resonance imaging (fMRI) scanner and remain motionless during scanning, without restraint or sedation. In previous research, the Dog Project identified the ventral caudate region of the canine brain as a reward center. It also showed that this region of a dog’s brain responds more strongly to the scents of familiar humans than to the scents of other humans, or even to those of familiar dogs.
Praise Pooch: Most of the dogs in the experiments preferred praise over food, or liked them both equally. Kady, a Labrador-golden retriever mix, was the top dog when it came to the strength of her preference for praise.
For the current experiment, the researchers began by training the dogs to associate three different objects with different outcomes. A pink toy truck signaled a food reward; a blue toy knight signaled verbal praise from the owner; and a hairbrush signaled no reward, to serve as a control.
The dogs then were tested on the three objects while in an fMRI machine. Each dog underwent 32 trials for each of the three objects as their neural activity was recorded.
All of the dogs showed a stronger neural activation for the reward stimuli compared to the stimulus that signaled no reward, and their responses covered a broad range. Four of the dogs showed a particularly strong activation for the stimulus that signaled praise from their owners. Nine of the dogs showed similar neural activation for both the praise stimulus and the food stimulus. And two of the dogs consistently showed more activation when shown the stimulus for food.
The dogs then underwent a behavioral experiment. Each dog was familiarized with a room that contained a simple Y-shaped maze constructed from baby gates: One path of the maze led to a bowl of food and the other path to the dog’s owner. The owners sat with their backs toward their dogs. The dog was then repeatedly released into the room and allowed to choose one of the paths. If they came to the owner, the owner praised them.
“We found that the caudate response of each dog in the first experiment correlated with their choices in the second experiment,” Berns says. “Dogs are individuals and their neurological profiles fit the behavioral choices they make. Most of the dogs alternated between food and owner, but the dogs with the strongest neural response to praise chose to go to their owners 80 to 90 percent of the time. It shows the importance of social reward and praise to dogs. It may be analogous to how we humans feel when someone praises us.”
The experiments lay the groundwork for asking more complicated questions about the canine experience of the world. The Berns lab is currently exploring the ability of dogs to process and understand human language.
“Dogs are hypersocial with humans,” Berns says, “and their integration into human ecology makes dogs a unique model for studying cross-species social bonding.”
Related:
Dogs process faces in specialized brain area, study reveals
Scent of the familiar: You may linger like perfume in your dog's brain
Photos by Gregory Berns
Monday, August 8, 2016
Cardinals may reduce West Nile virus spillover in Atlanta
One more reason to love the northern cardinal: In addition to being beautiful to look at, in Atlanta these birds appear to help shield humans from West Nile virus. (Photo by Stephen Wolfe.)
By Carol Clark
Northern cardinals act as “super suppressors” of West Nile virus in Atlanta, slowing transmission and reducing the incidence of human cases of the mosquito-borne pathogen, suggests a new study published in the American Journal of Tropical Medicine and Hygiene.
“Previous research has shown that the American robin acts like a ‘super spreader’ for West Nile virus in Chicago and some other cities,” says Rebecca Levine, who led the research as a PhD student in Emory University’s Department of Environmental Sciences. “Now our study provides convincing data that northern cardinals and some other bird species may be ‘super suppressors’ of the virus in Atlanta.”
The researchers also found that birds in Atlanta’s old-growth forests had much lower rates of West Nile virus infection compared to birds tested in the city’s secondary forests and other urban micro-habitats.
“This finding suggests that old growth forests may be an important part of an urban landscape,” Levine says, “not just because of the natural beauty of ancient trees, but because these habitats may also be a means of reducing transmission of some mosquito-borne diseases.”
Levine has since graduated from Emory's Laney Graduate School and now works as an epidemiologist and entomologist for the Centers for Disease Control and Prevention.
Uriel Kitron, chair of Emory’s Department of Environmental Sciences and an expert in mosquito-borne pathogens, is senior author of the paper.
Rebecca Levine in the field with one of the cardinals that was tested. The birds in the study were captured with mist nets and released unharmed after blood samples were drawn. (Photo courtesy of Rebecca Levine.)
West Nile virus (WNV) is zoonotic, meaning that it is an infection of animals that can spill over to humans by a bridge vector, in this case Culex mosquitos. Since its introduction to the United States in 1999, WNV has become the most common zoonotic mosquito-borne pathogen in the country, infecting an estimated 780,000 people (including more than 1,700 fatal cases), in addition to birds and other mammals, such as horses.
The Kitron lab wanted to find out why Georgia’s infection rate for WNV since 2001 is relatively low, at about 3.3 cases per 100,000 people, compared to some states in the north. A 2002 outbreak in Illinois, for instance, recorded about 7.1 cases per 100,000 people.
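Incidence figures like these are simple ratios: cases per 100,000 people. A quick sketch of the conversion (the function name, case count and population below are invented for illustration; only the per-100,000 rates come from the article):

```python
def incidence_per_100k(cases, population):
    """Cases per 100,000 people."""
    return cases / population * 100_000

# Hypothetical example: 330 cases in a population of 10 million
# yields the ~3.3 per 100,000 rate cited for Georgia.
print(incidence_per_100k(330, 10_000_000))
```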
“When West Nile virus first arrived in the United States, we expected more transmission to humans in the South,” Kitron says, “because the South has a longer transmission season and the Culex mosquitos are common. But even though evidence shows high rates of the virus circulating in local bird populations, there is little West Nile virus in humans in Atlanta and the Southeast in general.”
During the three-year study, the research team collected mosquitoes and birds from different sites across Atlanta, tested them for WNV, and ran a DNA analysis of the mosquitoes’ blood meals to see which species of birds they had bitten.
“We found that the mosquitoes feed on American robins a lot from May to mid-July,” Levine says. “But for some unknown reason, in mid-July, during the critical time when the West Nile virus infection rate in mosquitos starts going up, they switch to feeding primarily on cardinals.”
American robins do a great job of amplifying the virus in their blood once they are infected. That trait means they can more efficiently pass the virus to other mosquitos that bite them, so robins are known as “super spreaders.” The virus does not efficiently reproduce, however, in the blood of northern cardinals.
“You can think of the cardinals like a ‘sink,’ and West Nile virus like water draining out of that sink,” Levine says. “The cardinals are absorbing the transmission of the virus and not usually passing it on.”
The study results showed that, to a somewhat lesser extent, birds in the mimid family – including mockingbirds, brown thrashers and gray catbirds – also appear to be acting like sinks for WNV in Atlanta.
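The “super spreader” versus “sink” distinction can be captured in a toy model: each host species contributes to onward transmission roughly in proportion to how often mosquitoes feed on it and how well the virus amplifies in its blood (its host competence). All numbers below are invented to illustrate the idea, not estimates from the study:

```python
# Toy per-species contribution to onward WNV transmission:
# feeding fraction (how often mosquitoes bite that species)
# times host competence (how well the virus amplifies in its blood).
hosts = {
    "robin":    {"feeding": 0.3, "competence": 0.8},   # super spreader
    "cardinal": {"feeding": 0.5, "competence": 0.1},   # sink
    "other":    {"feeding": 0.2, "competence": 0.4},
}

contrib = {name: h["feeding"] * h["competence"] for name, h in hosts.items()}
total = sum(contrib.values())
for name, c in contrib.items():
    print(f"{name}: {c / total:.0%} of onward transmission")
```

Even though the sink species is bitten most often in this toy example, its low competence means it soaks up bites without passing much virus on, which is the intuition behind the mid-July feeding shift mattering so much.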
The researchers found significantly fewer avian WNV infections in the old growth forest sites sampled in Atlanta – including Fernbank Forest and Wesley Woods Preserve – compared to secondary forests such as Grant Park and the Atlanta Botanical Garden. The rate of infections in mosquitos, however, was similar for both types of forests.
“These are really complex ecosystems, so we cannot single out the specific reasons for these findings,” Levine says. “They suggest that there is something unique about these old growth forests and how they affect avian systems in Atlanta.”
Atlanta, nicknamed “City in the Forest,” is one of only seven U.S. cities with a high population density to have urban tree cover of 40 percent or more. In contrast, Chicago retains only 11 percent tree cover.
“As new mosquito-borne diseases enter and spread in America, we need to better understand all aspects of pathogen transmission cycles,” said Stephen Higgs, president of the American Society of Tropical Medicine and Hygiene. “By shedding light on the reasons behind a curious discrepancy in West Nile virus human infection rates in different regions of the United States, this study has the potential to better protect Americans’ health while continuing to demonstrate the link between animal and human health.”
Co-authors of the paper also include researchers from the University of Georgia, Texas A&M and the Georgia Department of Transportation’s Office of Environmental Services.
Related:
Sewage raises West Nile virus risk
Why Zika risk is low for Olympic athletes in Rio