Thursday, August 13, 2015

Marks on 3.4-million-year-old bones not due to trampling, analysis confirms

Detail of the marks on a fossilized rib bone, one of the two controversial bones. “The best match we have for the marks, using currently available data, would still be butchery with stone tools,” says anthropologist Jessica Thompson. Photo by Zeresenay Alemseged.

By Carol Clark

Marks on two 3.4-million-year-old animal bones found at the site of Dikika, Ethiopia, were not caused by trampling, an extensive statistical analysis confirms. The Journal of Human Evolution published the results of the study, which developed new methods of fieldwork and analysis for researchers exploring the origins of tool making and meat eating in our ancestors.

“Our analysis clearly shows that the marks on these bones are not characteristic of trampling,” says Jessica Thompson, an assistant professor of anthropology at Emory University and lead author of the study. “The best match we have for the marks, using currently available data, would still be butchery with stone tools.”

The 12 marks on the two specimens – a long bone from a creature the size of a medium antelope and a rib bone from an animal closer in size to a buffalo – most closely resemble a combination of purposeful cutting and percussion marks, Thompson says. “When these bones were hit, they were hit with enormous force and multiple times.”

The paper supports the original interpretation, published in Nature in 2010, that the damage to the two bones is characteristic of stone tool butchery. That finding was sensational, since it potentially pushed back evidence for the use of stone tools, as well as the butchering of large animals, by about 800,000 years.

The Nature paper was followed in 2011 by a rebuttal in the Proceedings of the National Academy of Sciences (PNAS), suggesting that the bones were marked by incidental trampling in abrasive sediments. That sparked a series of debates about the significance of the discovery and whether the bones had been trampled.

Anthropologist Jessica Thompson at work in the field in Africa. She specializes in the study of what happens to bones after an animal dies.

For the current paper, Thompson and her co-authors examined the surfaces of a sample of more than 4,000 other bones from the same deposits. They then used statistical methods to compare more than 450 marks found on those bones to experimental trampling marks and to the marks on the two controversial specimens.

“We would really like to understand what caused these marks,” Thompson says. “One of the most important questions in human evolution is when did we start eating meat, since meat is considered a likely explanation for how we fed the evolution of our big brains.”

Evidence shows that our genus, Homo, emerged around 2.8 million years ago. Until recently, the earliest known stone tools were 2.6 million years old. Changes had already been occurring in the organization of the brains of the human lineage, but after this time there was also an increase in overall brain size. This increased size has largely been attributed to a higher quality diet.

While some other apes are known to occasionally hunt and eat animals smaller than themselves, they do not hunt or eat larger animals that store abundant deposits of fat in the marrow of their long bones. A leading hypothesis in paleo-anthropology is that a diet rich in animal protein combined with marrow fat provided the energy needed to fuel the larger human brain.

The animal bones in the Dikika site, however, have been reliably dated to long before Homo emerged. They are from the same sediments and only slightly older than the 3.3-million-year-old fossils unearthed from Dikika belonging to the hominid species Australopithecus afarensis.

Thompson specializes in the study of what happens to bones after an animal dies. “Fossil bones can tell you stories, if you know how to interpret them,” she says.

A whole ecosystem of animals, insects, fungus and tree roots modify bones. Did they get buried quickly? Or were they exposed to the sun for a while? Were they gnawed by a rodent or chomped by a crocodile? Were they trampled on sandy soil or rocky ground? Or were they purposely cut, pounded or scraped with a tool of some kind?

“Fossil bones can tell you stories, if you know how to interpret them,” Jessica Thompson says. For instance, the marks on this fossilized bone from the Dikika site are diagnostic of punctures made by crocodile teeth. Photo by Jessica Thompson.

One way that experimental archeologists learn to interpret marks on fossil bones is by modifying modern-day bones. They hit bones with hammer stones, feed them to carnivores and trample them on various substrates, then study the results.

Based on knowledge from such experiments, Thompson was one of three specialists who diagnosed the marks on the two bones from Dikika as butchery in a blind test, before being told the age of the fossils or their origin.

The PNAS rebuttal paper, however, also used experimental methods and came to the conclusion that the marks were characteristic of trampling.

Thompson realized that data from a larger sample of fossils were needed to chip away at the mystery.

For the current paper, the researchers examined under the microscope all non-hominin fossils collected from the Hadar Formation at Dikika. They took a random sample of fossils from the same deposits as the controversial specimens, as well as from nearby deposits, and measured the shapes and sizes of the marks on the fossil bones. They then statistically compared the characteristics of the fossil marks to the experimental marks reported in the PNAS rebuttal paper as being typical of trampling damage. They also investigated the angularity of sand grains at the site and found that they were rounded – not the angular type that might produce striations on a trampled bone.

“The random population sample of the fossils provides context,” Thompson says. “The marks on the two bones in question don’t look like other marks common on the landscape. The marks are bigger, and they have different characteristics.”

Trample marks tend to be shallow, sinuous or curvy. Purposeful cuts from a tool tend to be straight and create a narrow V-shaped groove, while a tooth tends to make a U-shaped groove. The study measured and quantified such damage to modern-day bones for comparison to the fossilized ones.
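
The shape distinctions described above lend themselves to a simple rule-of-thumb sketch. The feature names and thresholds below are invented for illustration only; the actual study quantified mark morphology with far more detailed measurements and statistics.

```python
# Toy illustration of sorting bone-surface marks by simple shape features.
# Features (cross-section, sinuosity, depth) and thresholds are hypothetical.

def classify_mark(cross_section, sinuosity, depth_mm):
    """Suggest a likely agent for a mark from simple shape descriptors.

    cross_section: 'V' (narrow groove) or 'U' (rounded groove)
    sinuosity: path length / straight-line length (1.0 = perfectly straight)
    depth_mm: maximum depth of the groove
    """
    if cross_section == 'V' and sinuosity < 1.1 and depth_mm > 0.5:
        return 'stone-tool cut'   # straight, deep, narrow V-shaped groove
    if cross_section == 'U' and depth_mm > 0.5:
        return 'tooth'            # deep, rounded U-shaped groove
    if sinuosity >= 1.1 and depth_mm <= 0.5:
        return 'trampling'        # shallow, sinuous striation
    return 'indeterminate'

marks = [
    ('V', 1.02, 1.2),   # straight, deep, V-shaped
    ('U', 1.05, 0.9),   # deep, rounded
    ('V', 1.30, 0.2),   # shallow and curvy
]
for m in marks:
    print(m, '->', classify_mark(*m))
```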

“Our analysis shows with statistical certainty that the marks on the two bones in question were not caused by trampling,” Thompson says. “While there is abundant evidence that other bones at the site were damaged by trampling, these two bones are outliers. The marks on them still more closely resemble marks made by butchering.”
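
The population-comparison logic behind the outlier claim can be illustrated with a toy z-score sketch. All numbers here are invented; the study's actual analysis compared many measured mark characteristics, not just one dimension.

```python
# Toy sketch of flagging suspect marks as outliers against a population
# of marks from the same deposits. Mark lengths (mm) are invented.
import statistics

population = [1.1, 0.8, 1.4, 0.9, 1.2, 1.0, 0.7, 1.3, 1.1, 0.95]
suspect_marks = [4.8, 5.2]   # marks on the two controversial bones

mean = statistics.mean(population)
sd = statistics.stdev(population)

for length in suspect_marks:
    z = (length - mean) / sd   # standard deviations from the population mean
    verdict = 'outlier' if abs(z) > 3 else 'within population range'
    print(f"mark {length} mm: z = {z:.1f} -> {verdict}")
```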

One hypothesis is that butchering large animals with tools occurred during that time period, but that it was an exceedingly rare behavior. Another possibility is that more evidence is out there, but no one has been looking for it because they have not expected to find it at a time period this early.

The Dikika specimens represent a turning point in paleoanthropology, Thompson says. “If we want to understand when and how our ancestors started eating meat and moving into that ecological niche, we need to refine our search images for the field and apply these new recovery and analytical methods. We hope other researchers will use our work as a recipe to go out and systematically collect samples from other sites for comparison.”

In addition to Dikika, other recent finds are shaking up long-held views of hominin evolution and when typical human behaviors emerged. This year, a team led by archeologist Sonia Harmand in Kenya reported unearthing stone tools that have been reliably dated to 3.3 million years ago, or 700,000 years older than the previous record.

“We know that simple stone tools are not unique to humans,” Thompson says. “The making of more complex tools, designed for more complex uses, may be uniquely human.”

Related:
Complex cognition shaped the Stone Age hand axe, study shows
Brain trumps hand in Stone Age tool study

Thursday, August 6, 2015

Chemists find new way to do light-driven reactions in solar energy quest

Emory physical chemist Tim Lian researches light-driven charge transfer for solar energy conversion. Photo by Bryan Meltz, Emory Photo/Video.

By Carol Clark

Chemists have found a new, more efficient method to perform light-driven reactions, opening up another possible pathway to harness sunlight for energy. The journal Science is publishing the new method, which is based on plasmon – a special motion of electrons involved in the optical properties of metals.

“We’ve discovered a new and unexpected way to use plasmonic metal that holds potential for use in solar energy conversion,” says Tim Lian, professor of physical chemistry at Emory University and the lead author of the research. “We’ve shown that we can harvest the high energy electrons excited by light in plasmon and then use this energy to do chemistry.”

Plasmon is a collective motion of free electrons in a metal that strongly absorbs and scatters light.

One of the most vivid examples of surface plasmon can be seen in the intricate stained glass windows of some medieval cathedrals, an effect achieved through gold nanoparticles that absorb and scatter visible light. Plasmon is highly tunable: Varying the size and shape of the gold nanoparticles controls the color that the glass displays.

Modern-day science is exploring and refining the use of these plasmonic effects for a range of potential applications, from electronics to medicine and renewable energy.

Surface plasmon effects can be seen in the stained glass of some medieval cathedrals. Photo of rose window in Notre Dame Cathedral by Krzysztof Mizera.

Lian’s lab, which specializes in exploring light-driven charge transfer for solar energy conversion, experimented with ways to use plasmon to make that process more efficient and sustainable.

Gold is often used as a catalyst – a substance that drives chemical reactions – but not as a photocatalyst: a material that absorbs light and then does chemistry with the energy the light provides.

During photocatalysis, a metal absorbs light strongly, rapidly exciting a lot of electrons. “Imagine electrons sloshing up and down in the metal,” Lian says. “Once you excite them at this level, they crash right down. All the energy is released as heat really fast – in picoseconds.”

The researchers wanted to find a way to capture the energy in the excited electrons before it was released as heat and then use hot electrons to fuel reactions.

Through experimentation, they found that coupling nanorods of cadmium selenide, a semiconductor, to a plasmonic gold nanoparticle tip allowed the excited electrons in the gold to escape into the semiconductor material.

“If you use a material with a certain energy level that can strongly bond to plasmon, then the excited electrons can escape into the material and stay at the high energy level,” Lian says. “We showed that you can harvest electrons before they crash down and relax, and combine the catalytic property of plasmon with its light absorbing ability.”

Transmission electron micrograph (TEM) of cadmium selenide nanorods with gold tips. Inset shows a high-res TEM of two nanorods. Micrographs courtesy Kaifeng Wu and Tianquan Lian (Emory) and James McBride (Vanderbilt).

Instead of using heat to do chemistry, this new process uses metals and light to do photochemistry, opening a new, potentially more efficient, method for exploration.

“We are now looking at whether we can find other electron acceptors that would work in this same process, such as a molecule or molecular catalyst instead of cadmium selenide,” Lian says. “That would make this process a general scheme with many different potential applications.”

The researchers also want to explore whether the method can drive light-driven water oxidation more efficiently. Using sunlight to split water to generate hydrogen is a major goal in the quest for affordable and sustainable solar energy.

“Using unlimited sunlight to move electrons around and tap catalytic power is a difficult challenge, but we have to find ways to do this,” Lian says. “We have no choice. Solar power is the only energy source that can sustain the growing human population without catastrophic environmental impact.”

The current study was funded by the U.S. Department of Energy. The study co-authors include: Emory graduate student Kaifeng Wu; Emory post-doctoral fellow Jinquan Chen; and chemist James McBride from Vanderbilt University.

Related:
Shining a light on green energy

Tuesday, August 4, 2015

Dogs process faces in specialized brain area, study reveals

The dogs were trained to view both video images and static images on a screen while undergoing fMRI. Photo by Gregory Berns.

By Carol Clark

Dogs have a specialized region in their brains for processing faces, a new study finds. PeerJ published the research, which provides the first evidence for a face-selective region in the temporal cortex of dogs.

“Our findings show that dogs have an innate way to process faces in their brains, a quality that has previously only been well-documented in humans and other primates,” says Gregory Berns, a neuroscientist at Emory University and the senior author of the study.

Having neural machinery dedicated to face processing suggests that this ability is hard-wired through cognitive evolution, Berns says, and may help explain dogs’ extreme sensitivity to human social cues.

Berns heads up the Dog Project in Emory’s Department of Psychology, which is researching evolutionary questions surrounding man’s best, and oldest, friend. The project was the first to train dogs to voluntarily enter a functional magnetic resonance imaging (fMRI) scanner and remain motionless during scanning, without restraint or sedation. In previous research, the Dog Project identified the caudate region of the canine brain as a reward center. It also showed how that region of a dog’s brain responds more strongly to the scents of familiar humans than to the scents of other humans, or even to those of familiar dogs.

For the current study, the researchers focused on how dogs respond to faces versus everyday objects. “Dogs are obviously highly social animals,” Berns says, “so it makes sense that they would respond to faces. We wanted to know whether that response is learned or innate.”

Dogs are hard-wired to respond to faces through cognitive evolution, the study suggests.

The study involved dogs viewing both static images and video images on a screen while undergoing fMRI. It was a particularly challenging experiment since dogs do not normally interact with two-dimensional images, and they had to undergo training to learn to pay attention to the screen.

A limitation of the study was the small sample size: Only six of the eight dogs enrolled in the study were able to hold a gaze for at least 30 seconds on each of the images to meet the experimental criteria.

The results were clear, however, for the six subjects able to complete the experiment. A region in their temporal lobe responded significantly more to movies of human faces than to movies of everyday objects. This same region responded similarly to still images of human faces and dog faces, yet significantly more to both human and dog faces than to images of everyday objects.
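
The contrast at the heart of this finding can be sketched with toy numbers. The signal values below are invented for illustration; a real fMRI analysis fits a full statistical model to the scan data.

```python
# Toy sketch of a face-versus-object contrast in a candidate brain region.
# Hypothetical mean responses (percent signal change) across six dogs.
import statistics

responses = {
    'human_faces': [0.42, 0.38, 0.45, 0.40, 0.36, 0.44],
    'dog_faces':   [0.40, 0.35, 0.43, 0.39, 0.37, 0.41],
    'objects':     [0.12, 0.10, 0.15, 0.09, 0.13, 0.11],
}

for category, vals in responses.items():
    print(f"{category}: mean = {statistics.mean(vals):.2f}")

# A face-selective region responds similarly to both face categories
# and much more weakly to everyday objects.
face_mean = statistics.mean(responses['human_faces'] + responses['dog_faces'])
object_mean = statistics.mean(responses['objects'])
print(f"face/object ratio: {face_mean / object_mean:.1f}")
```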

If the dogs’ response to faces was learned – by associating a human face with food, for example – you would expect to see a response in the reward system of their brains, but that was not the case, Berns says.

Decades ago, a previous study using electrophysiology found several face-selective neurons in sheep.

“That study identified only a few face-selective cells and not an entire region of the cortex,” says Daniel Dilks, an Emory assistant professor of psychology and the first author of the current dog study.

The researchers have dubbed the canine face-processing region they identified the dog face area, or DFA.

Humans have at least three face processing regions in the brain, including the fusiform face area, or FFA, which is associated with distinguishing faces from other objects. “We can predict what parts of your brain are going to be activated when you’re looking at faces,” Dilks says. “This is incredibly reliable across people.”

One hypothesis is that distinguishing faces is important for any social creature.

“Dogs have been cohabitating with humans for longer than any other animal,” Dilks says. “They are incredibly social, not just with other members of their pack, but across species. Understanding more about canine cognition and perception may tell us more about social cognition and perception in general.”

Related:
What is your dog thinking?
Scent of the familiar: You may linger like perfume in your dog's brain

Monday, August 3, 2015

Math shines with the stars in 'The Man Who Knew Infinity'



By Carol Clark

Call it a math bromance. Cambridge mathematician G. H. Hardy’s collaboration with the obscure, self-taught Indian Srinivasa Ramanujan – during the height of British colonialism – changed math and science forever. The story is finally going mainstream through a major motion picture, “The Man Who Knew Infinity,” starring Dev Patel and Jeremy Irons.

“It’s the story of a man who overcame incredible obstacles to become one of the most important mathematicians of his day,” says Emory mathematician Ken Ono, who served as a consultant for the film. “It’s a great human story. It’s true. And I’m glad that the world is finally going to get to enjoy it.”

The Mathematical Association of America (MAA) will feature a sneak peek of “The Man Who Knew Infinity” on August 6, as part of its centennial celebration, MathFest 2015, in Washington, D.C. Ono, a leading expert on Ramanujan’s theories, will lead a panel discussion at the screening event, which begins at 5 p.m. at the Marriott Wardman Park. Panelists will include Princeton mathematician Manjul Bhargava; Robert Kanigel, who wrote the 1991 book that the movie is based on; and Matt Brown, the screenwriter and director of the movie.

The movie’s world premiere is set for September at the Toronto International Film Festival.

In 1913, Ramanujan wrote a letter to Hardy, including creative formulas that clearly showed his brilliance. Hardy invited Ramanujan to come to Cambridge to study and collaborate, a daring move during a time of deep prejudice.

“Together, they produced phenomenal results,” Ono says. “They changed mathematics and they changed the course of science.”

Ken Ono on the set with Jeremy Irons, who plays Cambridge mathematician G. H. Hardy. (Photo by Sam Pressman.)

A relatively unknown director, Matt Brown spent eight years trying to get the movie project off the ground. He eventually found backing from the producer Ed Pressman of Pressman Films.

“This is not your typical Hollywood film,” Brown says of the final product. “A lot of movies that deal with scientific subjects just mention the science and go straight to the human story. We wanted to honor the math in this film, so that mathematicians could appreciate it as well as other audience members. One way we tried to do that was to show the passion the characters have for the subject.”

When Brown called Ono out of the blue last August and asked him to help with the math on the film, Ono did not hesitate. He was soon on a plane from Atlanta to London to begin putting in 16-hour days on the set at Pinewood Studios with the cast and crew.

“I’ve never met anybody with more energy and enthusiasm for his work than Ken,” Brown says. “It was invaluable to me as a director to have him go over the script and make sure that the math was accurate. He was incredibly kind and patient. It gave me confidence.”

Ono also worked closely with the art department, to get details of the math visuals right, and coached the stars, Dev Patel and Jeremy Irons. “Ken helped the actors understand philosophically what was behind the mathematics,” Brown says. “He gave them a little window into it. That’s important because when an actor grasps the meaning of the lines, he can add nuance and subtext to a performance.”

Ultimately, the film is about the relationship between Hardy and Ramanujan, Brown says. “Hardy fought really hard to get Ramanujan honored and bring him into the elite of Trinity College at Cambridge. Hardy basically staked his career on him.”

It was especially risky since Ramanujan did not work like a traditional academic. He did not see the need of providing proofs for his fantastic formulas, and believed that they came to him as visions from a goddess.

“Ramanujan saw the world, and math, in a spiritual way,” Brown says. “It’s incredible that he wound up at Cambridge with Hardy, an atheist, as his mentor.”

Unfortunately, while Hardy proved a great academic mentor for Ramanujan, it took longer for their friendship to evolve. “This movie tells a story about the cost that comes when people wait out of fear to connect more deeply in their relationships,” Brown says.

Related:
Doing math with movie stars

Sunday, August 2, 2015

Should babies be screened for autism risk?


Karen Rommelfanger, neuroethics program director at the Emory Center for Ethics, co-wrote an opinion piece for The Conversation with Jennifer Sarrett, lecturer at Emory's Center for the Study of Human Health. Below is an excerpt:

For children with autism, early intervention is critical. Therapies and education – especially in the first two years of life – can facilitate a child’s social development, reduce familial stress and ultimately improve quality of life.

But while we can reliably diagnose autism spectrum disorder (ASD) at 24 months, most children are diagnosed much later. This is largely due to a lack of resources, poor adherence to screening guidelines and the fact that primary care physicians are often uncomfortable talking about autism risk to parents.

But what if we could use a simple, routine test to screen every baby for autism? It’s not as far-fetched as it sounds. Larger-scale clinical trials for an eye-tracking device that could be used to predict autism are slated to begin this year. This presents a new and unique set of ethical concerns.

Technologies that predict the possibility of a neurological disorder have the weight of affecting conceptions of not just "what" these children have but "who" these children will become. As a neuroethicist and an autism researcher, we believe it is time to have a conversation about these technologies and what they will mean for parents and children, or for people with autism.

Many researchers have found that autistic children prefer to look at different things than typically developing children do. This is called gaze preference. In fact, changes in gaze preference can be detected prior to the onset of autism. Researchers have been using eye-tracking devices to record where babies gaze when viewing videos of social scenes. And they have been using these devices not to diagnose autism, but to predict it.

A 2013 study using an eye-tracking device found that differences in gaze preference can be detected in infants as young as two months. When viewing videos, infants who looked at mouths more than eyes, and at objects more than people, were more likely to later be diagnosed with autism. These infants experienced a decline in attention to other people’s eyes.
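
The gaze-preference measure works, in essence, by tallying how long an infant looks at each region of a scene. A minimal sketch, with invented fixation data:

```python
# Toy sketch of a gaze-preference measure from eye-tracking data.
# Region labels and fixation durations are invented for illustration.

def eye_fixation_fraction(fixations):
    """Fraction of total looking time spent on the eyes.

    fixations: list of (region, duration_ms) pairs, where region is one of
    'eyes', 'mouth', 'body', 'object'.
    """
    total = sum(d for _, d in fixations)
    eyes = sum(d for region, d in fixations if region == 'eyes')
    return eyes / total if total else 0.0

session = [('eyes', 400), ('mouth', 900), ('object', 500), ('eyes', 200)]
print(f"eye-looking fraction: {eye_fixation_fraction(session):.2f}")
```

A low and declining eye-looking fraction across sessions is the kind of signal the researchers associated with later autism diagnoses.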

The researchers from this study are working to replicate these findings in larger studies and are heading up the development of the eye-tracking device slated for clinical trials this year. Should the trials be successful, they will seek FDA approval for the device.

Read the whole article in The Conversation.