Thursday, May 26, 2011

The science and ethics of X-Men


“What really interests me in watching comic book super heroes move to the movie screen, 30 and 40 years after their origins, is the change in the way we portray the technology involved,” says bioethicist Paul Root Wolpe. The director of Emory’s Center for Ethics, Wolpe was a big comic book fan growing up.

In the original Spider-Man story, Peter Parker gained super powers after he was bitten by a spider that had been exposed to radiation.

At that time, Wolpe notes, there was an enormous fear that we were going to be “nuked” by nuclear weapons. “Radiation was the scientific phenomenon around which people had fear and mystery. That isn’t how we think about radiation anymore. Now it’s how we think about genetics.”

So when the first Spider-Man movie came out in 2002, the spider that bit Parker had been genetically manipulated instead of irradiated.

The X-Men movie series, about a group of genetically mutated super heroes, is a great example of a storyline that is on target with a technology that holds a sense of potential, along with fear and mystery, Wolpe says.

The latest in the series, “X-Men: First Class,” opens June 3.

“The story of the X-Men is really a dramatic representation about what happens when there is a minority group that’s feared by the majority,” Wolpe says. “They created that minority through genetics.”

The mutations portrayed in the series, such as the ability to shoot laser beams from your eyeballs, are implausible as sudden genetic mutations. But many enhanced powers are perfectly plausible as biotechnological developments, Wolpe says.

Bio-engineers are working on ways to improve things such as memory and strength that would mimic the best achievements of humans, and we might one day even be able to borrow traits from animals, Wolpe says.

“One of the great challenges for us is how do we resist the temptation to use genetic technology in human beings for reasons that are less than life-saving,” he says.


Related:
Blurring the lines between life forms

Friday, May 20, 2011

Does lack of fear drive psychopaths?

The higher children score on callous unemotionality, the slower they are to react to a fearful face. 

From the Association for Psychological Science:

Children with a particular risk factor for psychopathy don’t register fear as quickly as healthy children, suggests a new study, to be published in an upcoming issue of Psychological Science.

The hypothesis that psychopaths don’t feel or recognize fear dates back to the 1950s. “What happens is you’re born without that fear, so when your parents try to socialize you, you don’t really respond appropriately because you’re not scared,” says Patrick Sylvers of the University of Washington. By the same token, he adds, if you hurt a peer and they give you a fearful look, “most of us would learn from that and back off,” but a child with developing psychopathy would keep tormenting their classmate.

Sylvers, the study's primary author, did his graduate study at Emory University, where he worked with Patricia Brennan and Scott Lilienfeld, the study's co-authors.

Some recent research has suggested that the problem is one of attention: people with psychopathy simply don’t pay attention to fearful faces. That would mean you might be able to help troubled children recognize fear by training them to look into people’s eyes, for example. Some studies have suggested that might help.

The researchers wondered if something deeper was going on than a failure to pay attention. They recruited boys in the Atlanta area who got in a lot of trouble at home and school, and gave them and their parents a questionnaire about some aspects of psychopathy. For example, they asked the boys whether they felt guilty when they hurt other people. The researchers were most interested in “callous unemotionality” – a lack of regard for others’ feelings. Children who rank high on callous unemotionality are at risk of developing psychopathy later.

In this experiment, each boy watched a screen that showed a different picture to each eye. One eye saw abstract shapes in constant motion.

In the other eye, a still image of a face was faded up extremely quickly – even before subjects could consciously attend to it – while the abstract shapes were faded out just as quickly. The brain is drawn to the moving shapes, while the face is harder to notice. Each face showed one of four expressions: fearful, disgusted, happy, or neutral. The child was supposed to push a button when he saw the face.

Healthy people notice a fearful face faster than they notice a neutral or happy face, but this was not the case in children who scored high on callous unemotionality. In fact, the higher the score, the slower they were to react to a fearful face.

The children’s reaction to the face was unconscious. Healthy people are “reacting to a threat even though they’re not aware of it,” Sylvers says. That suggests that teaching children to pay attention to faces won’t help solve the underlying problems of psychopathy, because the difference happens before attention comes into play. “I think it’s just going to take a lot more research to figure out what you can do – whether it’s parenting, psychological interventions, or pharmacological therapy. At this point, we just don’t know,” Sylvers says.

Related:
Psychopathic boldness tied to U.S. presidential success
Anxious children confuse 'mad' and 'sad'
Top 10 facts about non-verbal communication

Thursday, May 19, 2011

Striking up conversations about smoking


“I’ve been affected by smoking since the day I was born,” says David Latov, a first-year medical student at Emory. “I never got to meet my grandpa on my mom’s side because he smoked his entire life and died of lung cancer when she was seven months pregnant with me.”

Latov is among the many people talking about how smoking has affected their lives, as the university considers the feasibility of becoming a tobacco-free campus, effective this fall.

“The message isn’t that if you smoke you’re a bad person,” Latov says. “We need to acknowledge that smoking has real consequences and that everyone’s affected, not just the people that smoke.”

Linda Rosen, a business office manager at Emory’s Wesley Woods Center, recently graduated from the university’s smoking cessation program, for the third time.

“I learned the hard way that I can’t smoke ‘sometimes,’” Rosen says. “I really now am a non-smoker.”

An epiphany for her was writing a letter to cigarettes. “It was basically a good-bye letter, which was painful and heartfelt because smoking had been there for me,” Rosen says, adding that she is glad to finally feel free of the need to light up.

“One of the things that people often don’t associate with smoking is the environmental impacts,” says Ciannat Howett, director of Emory’s office of sustainability.

“Smoking is the leading cause of deforestation,” she says. “In Brazil alone, about 60 million trees every year are consumed just for tobacco production.”

Tobacco requires a lot of chemicals and fertilizer that lead to ground water and surface water contamination, she adds. “Cigarette butts alone are highly toxic and non-biodegradable. Every year about 1.7 billion tons are contributed to our oceans, rivers and streams.”

Click here if you would like to weigh in on the idea of a tobacco-free Emory.

Related:
How college shapes health behaviors

Monday, May 16, 2011

Mummies tell history of a 'modern' plague

From ancient times to today, people along the Nile have adapted their farming techniques to the ebb and flow of the river.

By Carol Clark

Mummies from along the Nile are revealing how age-old irrigation techniques may have boosted the plague of schistosomiasis, a water-borne parasitic disease that infects an estimated 200 million people today.

An analysis of the mummies from Nubia, a former kingdom that was located in present-day Sudan, provides details for the first time about the prevalence of the disease across populations in ancient times, and how human alteration of the environment during that era may have contributed to its spread.

The American Journal of Physical Anthropology is publishing the study, led by Emory graduate student Amber Campbell Hibbs, who recently received her PhD in anthropology.


The analysis provides the first details about the prevalence of the disease across populations in ancient times. Photo by Dennis Van Gerven.

About 25 percent of the mummies in the study dating to about 1,500 years ago were found to be infected with Schistosoma mansoni, a species of the parasite associated with more modern irrigation techniques.

“Often in the case of prehistoric populations, we tend to assume that they were at the mercy of the environment, and that their circumstances were a given,” says Campbell Hibbs. “Our study suggests that, just like people today, these ancient individuals were capable of altering the environment in ways that impacted their health.”

The study was co-authored by Emory anthropologist George Armelagos; William Secor, an epidemiologist at the Centers for Disease Control and Prevention; and Dennis Van Gerven, an anthropologist at the University of Colorado at Boulder.

“We hope that understanding the impact of schistosomiasis in the past may help in finding ways to control what is one of the most prevalent parasitic diseases in the world today,” Campbell Hibbs says.

CDC graphic shows the lifecycle of schistosomiasis.

Schistosomiasis is caused by parasitic worms that live in certain types of freshwater snails. The parasite can emerge from the snails to contaminate fresh water, and then infect humans whose skin comes in contact with the water.

Infection can cause anemia and chronic illness that impairs growth and cognitive development, damages organs, and increases the risk for other diseases. Along with malaria, schistosomiasis ranks among the most socio-economically damaging parasitic diseases in the world.

As far back as the 1920s, evidence of schistosomiasis was detected in mummies from the Nile River region, but only in recent years did the analysis of the antigens and antibodies of some of the individuals become possible.

This latest study tested desiccated tissue samples from two Nubian populations for S. mansoni. The Kulubnarti population lived about 1,200 years ago, during an era when Nile flooding was at its highest average known height, and archaeological evidence for irrigation is lacking. The Wadi Halfa population lived further south along the Nile, about 1,500 years ago, when the average heights of the river were lower. Archaeological evidence indicates that the Wadi Halfa used canal irrigation to sustain multiple crops.

The analysis of tissue samples showed that 25 percent of the Wadi Halfa population in the study were infected with S. mansoni, while only 9 percent of the Kulubnarti were infected.

The standing water collected by irrigation canals is particularly favorable to the type of snail that spreads the S. mansoni infection. Another form of the disease, Schistosoma haematobium, is spread by snails that prefer to live in more oxygenated, free-flowing water.

An 1882 engraving, above right, shows irrigation along the Nile.

“Previously, it was generally assumed that in ancient populations schistosomiasis was primarily caused by S. haematobium, and that S. mansoni didn’t become prevalent until Europeans appeared on the scene and introduced intensive irrigation schemes,” Campbell Hibbs says. “That’s a sort of Euro-centric view of what’s going on in Africa, assuming that more advanced technology is needed to control the elements, and that irrigation conducted in a more traditional way doesn’t have a big influence on the environment.”

Co-author George Armelagos is a bioarcheologist who has been studying ancient Nubian populations for more than three decades. Through extensive analysis, he and colleagues have shown that nearly 2,000 years ago the Nubians were regularly consuming tetracycline, most likely in their beer, at levels high enough to show they were deliberately brewing it for its antibiotic effects.

“The Nubians were probably in healthier shape than many other populations of their time, due to the dry climate, which would reduce their bacterial load, and because they were getting tetracycline,” Armelagos says. “But the prevalence of schistosomiasis shown in this study suggests that their parasite load was probably quite heavy."

Related:
Ancient brewers tapped antibiotic secrets
Putting teeth into the Barker hypothesis

Wednesday, May 11, 2011

What's in Jimmy Carter's cornflakes?


Whether or not you agree with the politics of President Jimmy Carter, it’s hard to deny that the man is a dynamo, well into his 80s. Since Ronald Reagan beat him in the 1980 presidential election, a defeat he describes as his “forced retirement” at the age of 56, Carter has been running full speed. He established the Carter Center, teaches at Emory, has written numerous books, and he jets around the world to try to eliminate terrible plagues such as Guinea worm disease, and to help resolve brutal, dangerous conflicts.

In a recent talk with Emory sociology students, Carter told them it’s not too early to think about what they will do at the end of their working careers.

“The average American spends about half of their adult years after retirement. And very seldom do we make plans about how we’re going to spend those years,” he said. “About the only thing we plan for is how can I gather enough money and put it in the bank so I can support myself.”

Aging is an opportunity to do good things for other people, Carter said, adding that the key to having a gratifying old age comes down to two words: work and love.

He and his wife, Rosalynn, have been married 65 years. She tolerates his adventures, and he joins her in her favorite pastimes, like skiing and bird watching. “I had never seen skis until I was 62 years old,” Carter said.

“We’ve learned two things in our retirement, one is to give each other plenty of space. I don’t tell her what to do with her life, and she doesn’t tell me what to do with my life,” Carter said. “And we try to heal our differences before we go to sleep at night. Sometimes we stay up quite late, but we don't go to bed angry. Those are the two basic rules we have.”


Related:
Baby boomers raise midlife suicide rate