Monday, August 31, 2015

Why women rule, and other hot science topics at the Decatur Book Festival

Illustration: Don Morris

Women can forget about equality with men, warns Emory anthropologist Mel Konner.

It’s even better than that. Why should women embrace mere equality when their movement is toward superiority? It is maleness that has Konner worried in his latest book, “Women After All: Sex, Evolution and the End of Male Supremacy,” which looks at the history and future of gender and power dynamics.

Konner will be one of the featured authors in the ever-popular Science track of the Decatur Book Festival this weekend. He’ll take the stage at 3 pm on Saturday, September 5, at the Marriott Conference Center.

The last line of Konner’s book jacket reads: “Provocative and richly informed, ‘Women After All’ is bound to be controversial across the sexes.”

As Konner acknowledges on his personal website, the first murmurings came about after a short adaptation of the book ran in the Wall Street Journal. Hundreds of angry men responded within a couple of days. His wife, home alone during that period, double-locked the door. Konner’s editor at the Wall Street Journal apologized for failing to instruct him not to read the comments.

For his part, Konner is hiding in plain sight, saying “Clearly, I’ve touched a nerve, and I’m happy about that.”

Konner talks about a future that his grandson will inhabit, a “new world” that “will be better for him because women help run it.”

You can read more about Konner’s book in the latest issue of Emory Magazine.

Another provocative issue at the intersection of science and society is explored in “Vaccine Nation: America’s Changing Relationships with Immunization,” by Emory historian Elena Conis. She will discuss her book at 4:15 pm on Saturday at the Marriott Conference Center.

Tuesday, August 25, 2015

Biophysicists take small step in quest for 'robot scientist'

The researchers dubbed their algorithm "Sir Isaac," in a nod to one of the greatest scientists of all time, Sir Isaac Newton. 

By Carol Clark

Biophysicists have taken another small step forward in the quest for an automated method to infer models describing a system’s dynamics – a so-called robot scientist. Nature Communications published the finding – a practical algorithm for inferring laws of nature from time-series data of dynamical systems.

“Our algorithm is a small step,” says Ilya Nemenman, lead author of the study and a professor of physics and biology at Emory University. “It could be described as a toy version of a robot scientist, but even so it may have practical applications. For the first time, we’ve taught a computer how to efficiently search for the laws that underlie arbitrary, natural dynamical systems, including complex, non-linear biological systems.”

Nemenman’s co-author on the paper is Bryan Daniels, a biophysicist at the University of Wisconsin.

Everything that is changing around us and within us – from the relatively simple motion of celestial bodies, to weather and complex biological processes – is a dynamical system. A large part of science is guessing the laws of nature that underlie such systems, summarizing them in mathematical equations that can be used to make predictions, and then testing those equations and predictions through experiments.

“The long-term dream is to harness large-scale computing to make the guesses for us and speed up the process of discovery,” Nemenman says.

Isaac Newton contemplates gravity beneath an apple tree. The intuition of a genius like Newton is one quality that distinguishes human intelligence from even the highest-powered computer and algorithmic program.

While the quest for a true robot scientist, or computerized general intelligence, remains elusive, this latest algorithm represents a new approach to the problem. “We think we have beaten any automated-inference algorithm that currently exists because we focus on getting an approximate solution to a problem, which we can get with much less data,” Nemenman says.

In previous research, John Wikswo, a biophysicist at Vanderbilt University, along with colleagues at Cornell University, applied a software system to automate the scientific process for biological systems.

“We came up with a way to derive a model of cell behavior, but the approach is complicated and slow, and it is limited in the number of variables that it can track – it can’t be scaled to more complicated systems,” Wikswo says. “This new algorithm increases the speed of the necessary calculation by a factor of 100 or more. It provides an elegant method to generate compact and effective models that should allow prediction and control of complex systems.”

Nemenman and Daniels dubbed their new algorithm “Sir Isaac.”

The real Sir Isaac Newton serves as a classic example of how the scientific method involves forming hypotheses, then testing them by looking at data and experiments. Newton guessed that the same rules of gravity applied to a falling apple and to the moon in orbit. He used data to test and refine his guess and generated the law of universal gravitation.

To test their algorithm, Nemenman and Daniels created an artificial, model solar system by generating numerical trajectories of planets and comets that move around a sun. In this simplified solar system, only the sun attracted the planets and comets.

Images of the moon by NASA's Galileo spacecraft. Everything that is changing around us and within us – from the relatively simple motion of celestial bodies, to weather and complex biological processes – is a dynamical system.

“We trained our algorithm how to search through a group of laws which were limited enough to be practical, but also flexible enough to explain many different dynamics,” Nemenman explains. “We then gave the algorithm some simulated planetary trajectories, and asked it what makes these planets move. It gave us the universal gravitational force. Not perfectly, but with very good accuracy. The error was just a few percent.”

The algorithm also figured out that force changes velocity, not the position directly. “It gets Newton’s First Law,” Nemenman says, “the fact that in order to predict the possible trajectory of a planet, whether it stays near the sun or flies off into infinity, just knowing its initial position is not enough. The algorithm understands that you also need to know the velocity.”
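The idea of inferring a force law from trajectory data alone can be sketched in a short script. To be clear, this is not the authors' Sir Isaac code; the simulation, the finite-difference step and the log-log fit below are simplified stand-ins chosen for illustration.

```python
import numpy as np

# Toy illustration of law inference from time-series data: simulate a planet
# orbiting a sun under an inverse-square force, then recover the force-law
# exponent by regressing measured accelerations against distance.

G_M = 1.0    # gravitational parameter (arbitrary units)
dt = 1e-3

def accel(p):
    r = np.linalg.norm(p)
    return -G_M * p / r**3   # inverse-square law: |a| = G_M / r^2

# Note the state needs BOTH position and velocity, echoing Newton's First Law.
pos = np.array([1.0, 0.0])
vel = np.array([0.0, 1.1])   # slightly non-circular bound orbit

positions = []
for _ in range(20000):       # velocity Verlet integration, a few orbits
    positions.append(pos.copy())
    a = accel(pos)
    pos = pos + vel * dt + 0.5 * a * dt**2
    vel = vel + 0.5 * (a + accel(pos)) * dt
positions = np.array(positions)

# "Observe" only the positions; estimate accelerations by finite differences.
acc_est = (positions[2:] - 2 * positions[1:-1] + positions[:-2]) / dt**2
r = np.linalg.norm(positions[1:-1], axis=1)
a_mag = np.linalg.norm(acc_est, axis=1)

# Fit |a| = C * r^n in log space; n should come out close to -2.
n, logC = np.polyfit(np.log(r), np.log(a_mag), 1)
print(f"inferred exponent: {n:.3f}")   # close to -2
```

The recovered exponent lands within a few percent of the true value, mirroring the "very good accuracy" Nemenman describes for the gravitational test case.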

While most modern-day high school students know Newton’s First Law, it took humanity 2,000 years beyond the time of Aristotle to discover it.

One limitation of the algorithm is its inexactness. Getting an approximate model, however, is beneficial as long as the approximation is close enough to make good predictions, Nemenman says.

“Newton’s laws are also approximate, but they have been remarkably beneficial for 350 years,” he says. “We’re still using them to control everything from electron microscopes to rockets.”

Getting an exact description of any complex dynamical system requires large amounts of data, he adds. “In contrast, with our algorithm, we can get an approximate description by using just a few measurements of a system. That makes our method practical.”

The researchers demonstrated, for example, that the algorithm can infer the dynamics of a caricature of an immune receptor in a leukocyte. This type of model could lead to a better understanding of the time-course for the response to an infection or a drug.

In another experiment, the researchers fed the algorithm data on concentrations of just three of the chemical species involved in glycolysis in yeast. The algorithm generated a model that makes accurate predictions for the full system underlying this basic glucose-consuming metabolic process, which involves seven chemical species.

“If you applied other methods of automatic inference to this system it would typically take tens of thousands of examples to reliably generate the laws that drive these chemical transformations,” Nemenman says. “With our algorithm, we were able to do it with fewer than 100 examples.”

With their experimental collaborators, the researchers are now exploring whether the algorithm can model more complex biological processes, such as the dynamics of insulin secretion in the pancreas and its relationship to the onset of a disease like diabetes. “The biology of insulin secreting cells is extremely complex. Understanding their dynamics on multiple scales is going to be difficult, and may not be possible for years with traditional methods,” Nemenman says. “But we want to see if we can get a good enough approximation with our method to deliver a practical result.”

The intuition of a genius mind like that of Isaac Newton is one quality that distinguishes human intelligence from even the highest-powered computer and algorithmic program.

“You can’t give a machine intuition – at least for now,” Nemenman says. “What we’re hoping we can do is get our computer algorithm to spit out models of phenomena so that we, as scientists, can use them and our intuition to make useful generalizations. It’s easier to generalize from models of specific systems than it is to generalize from various data sets directly.”

Physicists eye neural fly data, find formula for Zipf's law
Biology may not be so complex after all

Friday, August 21, 2015

Chestnut leaves yield extract that disarms deadly bacteria

Ethnobotanist Cassandra Quave collecting chestnut leaf specimens in the field in Italy. Photo by Marco Caputo.

By Carol Clark

Leaves of the European chestnut tree contain ingredients with the power to disarm dangerous staph bacteria without boosting its drug resistance, scientists have found.

PLOS ONE is publishing the study of a chestnut leaf extract, rich in ursene and oleanene derivatives, that blocks Staphylococcus aureus virulence and pathogenesis without detectable resistance.

The use of chestnut leaves in traditional folk remedies inspired the research, led by Cassandra Quave, an ethnobotanist at Emory University. “We’ve identified a family of compounds from this plant that have an interesting medicinal mechanism,” Quave says. “Rather than killing staph, this botanical extract works by taking away staph’s weapons, essentially shutting off the ability of the bacteria to create toxins that cause tissue damage. In other words, it takes the teeth out of the bacteria’s bite.” 

The discovery holds potential for new ways to both treat and prevent infections of methicillin-resistant S. aureus, or MRSA, without fueling the growing problem of drug-resistant pathogens. 

Antibiotic-resistant bacteria annually cause at least two million illnesses and 23,000 deaths in the United States, according to the Centers for Disease Control and Prevention. MRSA infections lead to everything from mild skin irritations to fatalities. Evolving strains of this “super bug” bacterium pose threats to both hospital patients with compromised immune systems and young, healthy athletes and others who are in close physical contact.

Quave researches the interactions of people and plants – a specialty known as ethnobotany.

“We’ve demonstrated in the lab that our extract disarms even the hyper-virulent MRSA strains capable of causing serious infections in healthy athletes,” Quave says. “At the same time, the extract doesn’t disturb the normal, healthy bacteria on human skin. It’s all about restoring balance.”

Quave, who researches the interactions of people and plants – a specialty known as ethnobotany – is on the faculty of Emory’s Center for the Study of Human Health and Emory School of Medicine’s Department of Dermatology. She became interested in ethnobotany as an undergraduate at Emory. 

For years, she and her colleagues have researched the traditional remedies of rural people in Southern Italy and other parts of the Mediterranean. “I felt strongly that people who dismissed traditional healing plants as medicine because the plants don’t kill a pathogen were not asking the right questions,” she says. “What if these plants play some other role in fighting a disease?”

Hundreds of field interviews guided her to the European chestnut tree, Castanea sativa. “Local people and healers repeatedly told us how they would make a tea from the leaves of the chestnut tree and wash their skin with it to treat skin infections and inflammations,” Quave says.

For the current study, Quave teamed up with Alexander Horswill, a microbiologist at the University of Iowa whose lab focuses on creating tools for use in drug discovery, such as glow-in-the-dark staph strains.

The researchers steeped chestnut leaves in solvents to extract their chemical ingredients. “You separate the complex mixture of chemicals found in the extract into smaller batches with fewer chemical ingredients, test the results, and keep homing in on the ingredients that are the most active,” Quave explains. “It’s a methodical process and takes a lot of hours at the bench. Emory undergraduates did much of the work to gain experience in chemical separation techniques.”

The work produced an extract of 94 chemicals, of which ursene and oleanene based compounds are the most active.

Tests showed that this extract inhibits the ability of staph bacteria to communicate with one another, a process known as quorum sensing. MRSA uses this quorum-sensing signaling system to manufacture toxins and ramp up its virulence.

“We were able to trace out the pathways in the lab, showing how our botanical extract blocks quorum sensing and turns off toxin production entirely,” Quave says. “Many pharmaceutical companies are working on the development of monoclonal antibodies that target just one toxin. This is more exciting because we’ve shown that with this extract, we can turn off an entire cascade responsible for producing a variety of different toxins.”

A single dose of the extract, at 50 micrograms, cleared up MRSA skin lesions in lab mice, stopping tissue damage and red blood cell damage. The extract did not lose activity, and the bacteria did not develop resistance to it, even after two weeks of repeated exposure. And tests on human skin cells in a lab dish showed that the botanical extract does not harm the skin cells, or the normal skin micro-flora.

The Emory Office of Technology Transfer has filed a patent for the discovery of the unique properties of the botanical extract.

The researchers are doing further testing on individual components of the extract to determine if they work best in combination or alone. “We now have a mixture that works,” Quave says. “Our goal is to further refine it into a simpler compound that would be eligible for FDA consideration as a therapeutic agent.”

Potential uses include a preventative spray for football pads or other athletic equipment; preventative coatings for medical devices and products such as tampons that offer favorable environments for the growth of MRSA; and as a treatment for MRSA infections, perhaps in combination with antibiotics.

“It’s easy to dismiss traditional remedies as old wives’ tales, just because they don’t attack and kill pathogens,” Quave says. “But there are many more ways to help cure infections, and we need to focus on them in the era of drug-resistant bacteria.”

The research was funded by the NIH National Center for Complementary and Integrative Health. In addition to Quave and Horswill, the study’s authors include: Emory researchers James Lyles and Kate Nelson; and Jeffery Kavanaugh, Corey Parlet, Heidi Crosby and Kristopher Heilmann from the University of Iowa.

Her patient approach to health

Thursday, August 13, 2015

Marks on 3.4-million-year-old bones not due to trampling, analysis confirms

Detail of the marks on a fossilized rib bone, one of the two controversial bones. “The best match we have for the marks, using currently available data, would still be butchery with stone tools," says anthropologist Jessica Thompson. Photo by Zeresenay Alemseged.

By Carol Clark

Marks on two 3.4-million-year-old animal bones found at the site of Dikika, Ethiopia, were not caused by trampling, an extensive statistical analysis confirms. The Journal of Human Evolution published the results of the study, which developed new methods of fieldwork and analysis for researchers exploring the origins of tool making and meat eating in our ancestors.

“Our analysis clearly shows that the marks on these bones are not characteristic of trampling,” says Jessica Thompson, an assistant professor of anthropology at Emory University and lead author of the study. “The best match we have for the marks, using currently available data, would still be butchery with stone tools.”

The 12 marks on the two specimens – a long bone from a creature the size of a medium antelope and a rib bone from an animal closer in size to a buffalo – most closely resemble a combination of purposeful cutting and percussion marks, Thompson says. “When these bones were hit, they were hit with enormous force and multiple times.”

The paper supports the original interpretation that the damage to the two bones is characteristic of stone tool butchery, published in Nature in 2010. That finding was sensational, since it potentially pushed back evidence for the use of stone tools, as well as the butchering of large animals, by about 800,000 years.

The Nature paper was followed in 2011 by a rebuttal in the Proceedings of the National Academy of Sciences (PNAS), suggesting that the bones were marked by incidental trampling in abrasive sediments. That sparked a series of debates about the significance of the discovery and whether the bones had been trampled.

Anthropologist Jessica Thompson at work in the field in Africa. She specializes in the study of what happens to bones after an animal dies.

For the current paper, Thompson and her co-authors examined the surfaces of a sample of more than 4,000 other bones from the same deposits. They then used statistical methods to compare more than 450 marks found on those bones to experimental trampling marks and to the marks on the two controversial specimens.

“We would really like to understand what caused these marks,” Thompson says. “One of the most important questions in human evolution is when did we start eating meat, since meat is considered a likely explanation for how we fed the evolution of our big brains.”

Evidence shows that our genus, Homo, emerged around 2.8 million years ago. Until recently, the earliest known stone tools were 2.6 million years old. Changes had already been occurring in the organization of the brains of the human lineage, but after this time there was also an increase in overall brain size. This increased size has largely been attributed to a higher quality diet.

While some other apes are known to occasionally hunt and eat animals smaller than themselves, they do not hunt or eat larger animals that store abundant deposits of fat in the marrow of their long bones. A leading hypothesis in paleo-anthropology is that a diet rich in animal protein combined with marrow fat provided the energy needed to fuel the larger human brain.

The animal bones in the Dikika site, however, have been reliably dated to long before Homo emerged. They are from the same sediments and only slightly older than the 3.3-million-year-old fossils unearthed from Dikika belonging to the hominid species Australopithecus afarensis.

Thompson specializes in the study of what happens to bones after an animal dies. “Fossil bones can tell you stories, if you know how to interpret them,” she says.

A whole ecosystem of animals, insects, fungus and tree roots modify bones. Did they get buried quickly? Or were they exposed to the sun for a while? Were they gnawed by a rodent or chomped by a crocodile? Were they trampled on sandy soil or rocky ground? Or were they purposely cut, pounded or scraped with a tool of some kind?

"Fossil bones can tell you stories, if you know how to interpret them," Jessica Thompson says. For instance, the marks on this fossilized bone from the Dikika site are diagnostic of punctures made by crocodile teeth. Photo by Jessica Thompson.

One way that experimental archeologists learn to interpret marks on fossil bones is by modifying modern-day bones. They hit bones with hammer stones, feed them to carnivores and trample them on various substrates, then study the results.

Based on knowledge from such experiments, Thompson was one of three specialists who diagnosed the marks on the two bones from Dikika as butchery in a blind test, before being told the age of the fossils or their origin.

The PNAS rebuttal paper, however, also used experimental methods and came to the conclusion that the marks were characteristic of trampling.

Thompson realized that data from a larger sample of fossils were needed to chip away at the mystery.

The current paper investigated with microscopic scrutiny all non-hominin fossils collected from the Hadar Formation at Dikika. The researchers collected a random sample of fossils from the same deposits as the controversial specimens, as well as nearby deposits. They measured shapes and sizes of marks on the fossil bones. Then they compared the characteristics of the fossil marks statistically to the experimental marks reported in the PNAS rebuttal paper as being typical of trampling damage. They also investigated the angularity of sand grains at the site and found that they were rounded – not the angular type that might produce striations on a trampled bone.

“The random population sample of the fossils provides context,” Thompson says. “The marks on the two bones in question don’t look like other marks common on the landscape. The marks are bigger, and they have different characteristics.”

Trample marks tend to be shallow, sinuous or curvy. Purposeful cuts from a tool tend to be straight and create a narrow V-shaped groove, while a tooth tends to make a U-shaped groove. The study measured and quantified such damage to modern-day bones for comparison to the fossilized ones.

“Our analysis shows with statistical certainty that the marks on the two bones in question were not caused by trampling,” Thompson says. “While there is abundant evidence that other bones at the site were damaged by trampling, these two bones are outliers. The marks on them still more closely resemble marks made by butchering.”
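The outlier logic behind such a comparison can be sketched in a few lines. All feature names and numbers below are invented for illustration and are not the study's actual measurements; the approach shown is a generic multivariate outlier test, not the paper's specific statistical method.

```python
import numpy as np

# Hypothetical sketch: compare two questioned marks to a reference sample of
# experimental trample marks using Mahalanobis distance over simple shape
# features (depth, width, sinuosity). All values are synthetic.

rng = np.random.default_rng(0)

# Reference sample: shallow, sinuous trample marks (synthetic data).
trample = rng.normal(loc=[0.1, 0.3, 0.8],       # depth mm, width mm, sinuosity
                     scale=[0.03, 0.08, 0.1],
                     size=(450, 3))

# Two questioned marks: deep, wide, straight (butchery-like values, invented).
questioned = np.array([[0.6, 1.2, 0.15],
                       [0.7, 1.0, 0.10]])

mean = trample.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(trample, rowvar=False))

def mahalanobis(x):
    """Distance of a mark from the center of the trampling cloud."""
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

for m in questioned:
    print(f"distance = {mahalanobis(m):.1f}")   # far outside the trampling cloud
```

Marks lying many standard deviations from the trampling distribution would be flagged as outliers, which is the qualitative pattern the study reports for the two Dikika specimens.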

One hypothesis is that butchering large animals with tools occurred during that time period, but that it was an exceedingly rare behavior. Another possibility is that more evidence is out there, but no one has been looking for it because they have not expected to find it at a time period this early.

The Dikika specimens represent a turning point in paleoanthropology, Thompson says. “If we want to understand when and how our ancestors started eating meat and moving into that ecological niche, we need to refine our search images for the field and apply these new recovery and analytical methods. We hope other researchers will use our work as a recipe to go out and systematically collect samples from other sites for comparison.”

In addition to Dikika, other recent finds are shaking up long-held views of hominin evolution and when typical human behaviors emerged. This year, a team led by archeologist Sonia Harmand in Kenya reported unearthing stone tools that have been reliably dated to 3.3 million years ago, or 700,000 years older than the previous record.

“We know that simple stone tools are not unique to humans,” Thompson says. “The making of more complex tools, designed for more complex uses, may be uniquely human.”

Complex cognition shaped the Stone Age hand axe, study shows
Brain trumps hand in Stone Age tool study

Thursday, August 6, 2015

Chemists find new way to do light-driven reactions in solar energy quest

Emory physical chemist Tim Lian researches light-driven charge transfer for solar energy conversion. Photo by Bryan Meltz, Emory Photo/Video.

By Carol Clark

Chemists have found a new, more efficient method to perform light-driven reactions, opening up another possible pathway to harness sunlight for energy. The journal Science is publishing the new method, which is based on plasmon – a special motion of electrons involved in the optical properties of metals.

“We’ve discovered a new and unexpected way to use plasmonic metal that holds potential for use in solar energy conversion,” says Tim Lian, professor of physical chemistry at Emory University and the lead author of the research. “We’ve shown that we can harvest the high energy electrons excited by light in plasmon and then use this energy to do chemistry.”

Plasmon is a collective motion of free electrons in a metal that strongly absorbs and scatters light.

One of the most vivid examples of surface plasmon can be seen in the intricate stained glass windows of some medieval cathedrals, an effect achieved through gold nanoparticles that absorb and scatter visible light. Plasmon is highly tunable: Varying the size and shape of the gold nanoparticles in the glass controls the color of the light emitted.

Modern-day science is exploring and refining the use of these plasmonic effects for a range of potential applications, from electronics to medicine and renewable energy.

Surface plasmon effects can be seen in the stained glass of some medieval cathedrals. Photo of rose window in Notre Dame Cathedral by Krzysztof Mizera.

Lian’s lab, which specializes in exploring light-driven charge transfer for solar energy conversion, experimented with ways to use plasmon to make that process more efficient and sustainable.

Gold is often used as a catalyst, a substance to drive chemical reactions, but not as a photocatalyst: a material that absorbs light and then does chemistry with the energy provided by the light.

During photocatalysis, a metal absorbs light strongly, rapidly exciting a lot of electrons. “Imagine electrons sloshing up and down in the metal,” Lian says. “Once you excite them at this level, they crash right down. All the energy is released as heat really fast – in picoseconds.”

The researchers wanted to find a way to capture the energy in the excited electrons before it was released as heat and then use hot electrons to fuel reactions.

Through experimentation, they found that coupling nanorods of cadmium selenide, a semiconductor, to a plasmonic gold nanoparticle tip allowed the excited electrons in the gold to escape into the semiconductor material.

“If you use a material with a certain energy level that can strongly bond to plasmon, then the excited electrons can escape into the material and stay at the high energy level,” Lian says. “We showed that you can harvest electrons before they crash down and relax, and combine the catalytic property of plasmon with its light absorbing ability.”

Transmission electron micrograph (TEM) of cadmium selenide nanorods with gold tips. Inset shows a high-res TEM of two nanorods. Micrographs courtesy Kaifeng Wu and Tianquan Lian (Emory) and James McBride (Vanderbilt).

Instead of using heat to do chemistry, this new process uses metals and light to do photochemistry, opening a new, potentially more efficient, method for exploration.

“We are now looking at whether we can find other electron acceptors that would work in this same process, such as a molecule or molecular catalyst instead of cadmium selenide,” Lian says. “That would make this process a general scheme with many different potential applications.”

The researchers also want to explore whether the method can drive light-driven water oxidation more efficiently. Using sunlight to split water to generate hydrogen is a major goal in the quest for affordable and sustainable solar energy.

“Using unlimited sunlight to move electrons around and tap catalytic power is a difficult challenge, but we have to find ways to do this,” Lian says. “We have no choice. Solar power is the only energy source that can sustain the growing human population without catastrophic environmental impact.”

The current study was funded by the U.S. Department of Energy. The study co-authors include: Emory graduate student Kaifeng Wu; Emory post-doctoral fellow Jinquan Chen; and chemist James McBride from Vanderbilt University.

Shining a light on green energy

Tuesday, August 4, 2015

Dogs process faces in specialized brain area, study reveals

The dogs were trained to view both video images and static images on a screen while undergoing fMRI. Photo by Gregory Berns.

By Carol Clark

Dogs have a specialized region in their brains for processing faces, a new study finds. PeerJ published the research, which provides the first evidence for a face-selective region in the temporal cortex of dogs.

“Our findings show that dogs have an innate way to process faces in their brains, a quality that has previously only been well-documented in humans and other primates,” says Gregory Berns, a neuroscientist at Emory University and the senior author of the study.

Having neural machinery dedicated to face processing suggests that this ability is hard-wired through cognitive evolution, Berns says, and may help explain dogs’ extreme sensitivity to human social cues.

Berns heads up the Dog Project in Emory’s Department of Psychology, which is researching evolutionary questions surrounding man’s best, and oldest, friend. The project was the first to train dogs to voluntarily enter a functional magnetic resonance imaging (fMRI) scanner and remain motionless during scanning, without restraint or sedation. In previous research, the Dog Project identified the caudate region of the canine brain as a reward center. It also showed how that region of a dog’s brain responds more strongly to the scents of familiar humans than to the scents of other humans, or even to those of familiar dogs.

For the current study, the researchers focused on how dogs respond to faces versus everyday objects. “Dogs are obviously highly social animals,” Berns says, “so it makes sense that they would respond to faces. We wanted to know whether that response is learned or innate.”

Dogs are hard-wired to respond to faces through cognitive evolution, the study suggests.

The study involved dogs viewing both static images and video images on a screen while undergoing fMRI. It was a particularly challenging experiment since dogs do not normally interact with two-dimensional images, and they had to undergo training to learn to pay attention to the screen.

A limitation of the study was the small sample size: Only six of the eight dogs enrolled in the study were able to hold a gaze for at least 30 seconds on each of the images to meet the experimental criteria.

The results were clear, however, for the six subjects able to complete the experiment. A region in their temporal lobe responded significantly more to movies of human faces than to movies of everyday objects. This same region responded similarly to still images of human faces and dog faces, yet significantly more to both human and dog faces than to images of everyday objects.
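A face-versus-object contrast of this kind boils down to comparing response distributions. The sketch below uses synthetic response values as stand-ins; the real analysis involves full fMRI preprocessing and voxelwise statistics, none of which is shown here.

```python
import numpy as np

# Illustrative sketch of a face-vs-object contrast in a candidate brain region.
# One simulated mean BOLD response per trial (arbitrary units, invented values).

rng = np.random.default_rng(1)

faces   = rng.normal(loc=1.0, scale=0.3, size=40)   # human and dog face trials
objects = rng.normal(loc=0.2, scale=0.3, size=40)   # everyday-object trials

# Welch's two-sample t statistic: difference of means over its standard error.
se = np.sqrt(faces.var(ddof=1) / faces.size + objects.var(ddof=1) / objects.size)
t = (faces.mean() - objects.mean()) / se
print(f"t = {t:.2f}")   # a large t marks the region as face-selective
```

A region is called face-selective when this contrast is large and reproducible across subjects, which is the pattern the six dogs showed in their temporal lobes.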

If the dogs’ response to faces was learned – by associating a human face with food, for example – you would expect to see a response in the reward system of their brains, but that was not the case, Berns says.

A previous study, conducted decades ago using electrophysiology, found several face-selective neurons in sheep.

“That study identified only a few face-selective cells and not an entire region of the cortex,” says Daniel Dilks, an Emory assistant professor of psychology and the first author of the current dog study.

The researchers have dubbed the canine face-processing region they identified the dog face area, or DFA.

Humans have at least three face processing regions in the brain, including the fusiform face area, or FFA, which is associated with distinguishing faces from other objects. “We can predict what parts of your brain are going to be activated when you’re looking at faces,” Dilks says. “This is incredibly reliable across people.”

One hypothesis is that distinguishing faces is important for any social creature.

“Dogs have been cohabitating with humans for longer than any other animal,” Dilks says. “They are incredibly social, not just with other members of their pack, but across species. Understanding more about canine cognition and perception may tell us more about social cognition and perception in general.”

What is your dog thinking?
Scent of the familiar: You may linger like perfume in your dog's brain

Monday, August 3, 2015

Math shines with the stars in 'The Man Who Knew Infinity'

By Carol Clark

Call it a math bromance. Cambridge mathematician G. H. Hardy’s collaboration with the obscure, self-taught Indian Srinivasa Ramanujan – during the height of British colonialism – changed math and science forever. The story is finally going mainstream through a major motion picture, “The Man Who Knew Infinity," starring Dev Patel and Jeremy Irons.

“It’s the story of a man who overcame incredible obstacles to become one of the most important mathematicians of his day,” says Emory mathematician Ken Ono, who served as a consultant for the film. “It’s a great human story. It’s true. And I’m glad that the world is finally going to get to enjoy it.”

The Mathematical Association of America (MAA) will feature a sneak peek of “The Man Who Knew Infinity” on August 6, as part of its centennial celebration, MathFest 2015, in Washington, D.C. Ono, a leading expert on Ramanujan’s theories, will lead a panel discussion at the screening event, which begins at 5 pm at the Marriott Wardman Park. Panelists will include Princeton mathematician Manjul Bhargava; Robert Kanigel, who wrote the 1991 book that the movie is based on; and Matt Brown, the screenwriter and director of the movie.

The movie’s world premiere is set for September at the Toronto International Film Festival.

In 1913, Ramanujan wrote a letter to Hardy, including creative formulas that clearly showed his brilliance. Hardy invited Ramanujan to come to Cambridge to study and collaborate, a daring move during a time of deep prejudice.

“Together, they produced phenomenal results,” Ono says. “They changed mathematics and they changed the course of science.”

Ken Ono on the set with Jeremy Irons, who plays Cambridge mathematician G. H. Hardy. (Photo by Sam Pressman.)

A relatively unknown director, Matt Brown spent eight years trying to get the movie project off the ground. He eventually found backing from the producer Ed Pressman of Pressman Films.

“This is not your typical Hollywood film,” Brown says of the final product. “A lot of movies that deal with scientific subjects just mention the science and go straight to the human story. We wanted to honor the math in this film, so that mathematicians could appreciate it as well as other audience members. One way we tried to do that was to show the passion the characters have for the subject.”

When Brown called Ono out of the blue last August and asked him to help with the math on the film, Ono did not hesitate. He was soon on a plane from Atlanta to London to begin putting in 16-hour days on the set at Pinewood Studios with the cast and crew.

“I’ve never met anybody with more energy and enthusiasm for his work than Ken,” Brown says. “It was invaluable to me as a director to have him go over the script and make sure that the math was accurate. He was incredibly kind and patient. It gave me confidence.”

Ono also worked closely with the art department, to get details of the math visuals right, and coached the stars, Dev Patel and Jeremy Irons. “Ken helped the actors understand philosophically what was behind the mathematics,” Brown says. “He gave them a little window into it. That’s important because when an actor grasps the meaning of the lines, he can add nuance and subtext to a performance.”

Ultimately, the film is about the relationship between Hardy and Ramanujan, Brown says. “Hardy fought really hard to get Ramanujan honored and bring him into the elite of Trinity College at Cambridge. Hardy basically staked his career on him.”

It was especially risky since Ramanujan did not work like a traditional academic. He did not see the need to provide proofs for his fantastic formulas, and believed that they came to him as visions from a goddess.

“Ramanujan saw the world, and math, in a spiritual way,” Brown says. “It’s incredible that he wound up at Cambridge with Hardy, an atheist, as his mentor.”

Unfortunately, while Hardy proved a great academic mentor for Ramanujan, it took longer for their friendship to evolve. “This movie tells a story about the cost that comes when people wait out of fear to connect more deeply in their relationships,” Brown says.

Doing math with movie stars

Sunday, August 2, 2015

Should babies be screened for autism risk?

Karen Rommelfanger, neuroethics program director at the Emory Center for Ethics, co-wrote an opinion piece for The Conversation with Jennifer Sarrett, lecturer at Emory's Center for the Study of Human Health. Below is an excerpt:

For children with autism, early intervention is critical. Therapies and education – especially in the first two years of life – can facilitate a child’s social development, reduce familial stress and ultimately improve quality of life.

But while we can reliably diagnose autism spectrum disorder (ASD) at 24 months, most children are diagnosed much later. This is largely due to a lack of resources, poor adherence to screening guidelines and the fact that primary care physicians are often uncomfortable talking about autism risk to parents.

But what if we could use a simple, routine test to screen every baby for autism? It’s not as far-fetched as it sounds. Larger-scale clinical trials for an eye-tracking device that could be used to predict autism are slated to begin this year. This presents a new and unique set of ethical concerns.

Technologies that predict the possibility of a neurological disorder have the weight of affecting conceptions of not just "what" these children have but "who" these children will become. As a neuroethicist and an autism researcher, we believe it is time to have a conversation about these technologies, and what they will mean for parents and children, and for people with autism.

Many researchers have found that autistic children prefer to look at different things than typically developing children. This is called gaze preference. In fact, gaze preference changes can be detected prior to the onset of autism. Researchers have been using eye-tracking devices to record where babies gaze when viewing videos of social scenes. And they have been using this device not to diagnose autism, but to predict autism.

A 2013 study using an eye-tracking device found that differences in gaze preference can be detected in infants as young as two months. When viewing videos, infants who looked at mouths more than eyes, and at objects more than people, were more likely to later be diagnosed with autism. These infants experienced a decline in attention to other people’s eyes.

The researchers from this study are working to replicate these findings in larger studies, and are heading up the development of the eye-tracking device slated for clinical trials this year. Should the trials be successful, they will seek FDA approval for the device.

Read the whole article in The Conversation.