Showing posts with label Mathematics and Computer Science. Show all posts

Friday, April 3, 2026

Accuracy test for protein language models shines light into AI 'black box'

Yana Bromberg, right, professor of biology and computer science, and R. Prabakaran, a postdoctoral fellow in the Bromberg lab, are developing computational techniques to study biological complexity. (Photo by Carol Clark)

AI language models, used to generate human-like text to power chatbots and create content, are also revolutionizing biology by treating complex biological data like a language. Language models are increasingly used, for example, to find patterns in DNA and proteins to make predictions and speed research into biological complexity. 

A critical gap, however, is the lack of a method to estimate the reliability of these predictions. 

Computational biologists at Emory University have bridged this gap, developing a simple way to test the accuracy of a language model’s understanding of proteins. Nature Methods published their system, which scores the reliability of a model’s predictions by comparing how it “embeds,” or numerically codifies, synthetic random proteins versus proteins found in nature. 

“To the best of our knowledge, our framework is the first generalized method to quantify protein sequence embedding reliability,” says Yana Bromberg, senior author of the paper and Emory professor of biology and computer science. 

“Our method is a simple, elegant solution to a complex problem,” adds R. Prabakaran, first author of the study and a postdoctoral fellow in the Bromberg lab. “It’s a foundational method with a lot of scope for a range of language models in science.” 
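The paper itself defines the scoring procedure; as a toy illustration of the underlying comparison — how differently a model embeds natural sequences versus length-matched random ones — the sketch below stands in a k-mer count vector for the language model's embedding. Everything here (the `embed` function, the charge of sequence, the cosine score) is a hypothetical stand-in, not the published method.

```python
import random
from collections import Counter
from math import sqrt

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def embed(seq, k=2):
    """Toy embedding: a normalized k-mer count vector (a stand-in for a
    protein language model's learned embedding)."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: c / total for kmer, c in counts.items()}

def cosine(a, b):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(v * b.get(kmer, 0.0) for kmer, v in a.items())
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

random.seed(0)
natural = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQFEVVHSLAKWKR"
synthetic = "".join(random.choice(AMINO_ACIDS) for _ in range(len(natural)))

# The comparison idea: a model that "understands" proteins should embed a
# natural sequence differently from a synthetic random one.
similarity = cosine(embed(natural), embed(synthetic))
print(round(similarity, 3))
```

A real protein language model would replace `embed` with its learned representation; the framework's contribution is turning this natural-versus-random contrast into a reliability score.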




How the brain charts emotion in a map-like way

Co-authors Philip Kragel, assistant professor of psychology, and Yumeng Ma, a PhD student in Kragel's Emotion Cognition and Computation Lab. (Photos by Carol Clark)

It is well established in psychology that humans conceptualize emotions by features known as valence (the degree of pleasantness or unpleasantness) and arousal (the intensity of bodily reactions, such as rapid breathing or a racing heart). 

If you think of “pleasantness” as longitude and “bodily reaction” as latitude, you can imagine a “mental map,” with nodes that “chart” knowledge of emotion. 

The neural mechanisms giving rise to this configuration, however, have remained unclear. 

Now, a new study reveals that hippocampal-prefrontal circuits — neural structures implicated in forming other types of cognitive maps — could support the mental mapping of emotion. 

Nature Communications published the research by neuroscientists at Emory University. The results showed how the hippocampus represents emotion concepts in a structured hierarchy of “nodes” of pleasantness and bodily reaction, while the ventromedial prefrontal cortex more accurately tracks relationships between these different nodes, or how they are distributed on the mental map.




Wednesday, January 14, 2026

'Periodic table' for AI methods aims to drive innovation

Eslam Abdelaleem led the work as an Emory graduate student. The day of the final breakthrough, the AI health tracker on his watch recorded his racing heart as three hours of cycling. "That's how it interpreted the level of excitement I was feeling," Abdelaleem says. (Photo by Barbara Conner)

Artificial intelligence is increasingly used to integrate and analyze multiple types of data formats, such as text, images, audio and video. One challenge slowing advances in multimodal AI, however, is the process of choosing the algorithmic method best aligned to the specific task an AI system needs to perform. 

Scientists have developed a unified view of AI methods aimed at systemizing this process. The Journal of Machine Learning Research published the new framework for deriving algorithms, developed by physicists at Emory University. 

“We found that many of today’s most successful AI methods boil down to a single, simple idea — compress multiple kinds of data just enough to keep the pieces that truly predict what you need,” says Ilya Nemenman, Emory professor of physics and senior author of the paper. “This gives us a kind of ‘periodic table’ of AI methods. Different methods fall into different cells, based on which information a method’s loss function retains or discards.”

Wednesday, January 7, 2026

Unlocking design secrets of deep-sea microbes

"The molecular study of proteins is rapidly expanding as the technology supporting the field keeps advancing," says Vincent Conticello. "You're only limited by your interest and your imagination." (Photo by Carol Clark)

The microbe Pyrodictium abyssi is an archaeon — a member of what’s known as the third domain of life — and an extremophile. It lives in deep-sea thermal vents, at temperatures above the boiling point of water, without light or oxygen, withstanding the enormous pressure at ocean depths of thousands of meters. 

A biomatrix of tiny tubes of protein, known as cannulae, links cells of Pyrodictium abyssi together into a highly stable microbial community. No one knew how these single-celled microbes accomplished this feat of extreme engineering — until now. 

A study using advanced microscopy techniques reveals new details about the elegant design of the cannulae and the remarkable simplicity of their method of construction. Nature Communications published the work, led by scientists at Emory University; the University of Virginia, Charlottesville; and Vrije Universiteit Brussel in Belgium. 

The discovery holds the potential to inspire innovations in biotechnology, from the development of new “smart” materials to nanoscale drug delivery systems. 

“Not only are the cannulae strong enough to endure extreme conditions, they’re beautiful,” says Vincent Conticello, Emory professor of chemistry and co-senior author of the paper. “To me, they resemble columns from the classical architecture of ancient Greece or Rome,” he adds, citing their fluted edges and precise regularity.

Read the full story here.

Related:

Emory chemists invent shape-shifting nanomaterial

Tuesday, October 28, 2025

Electric charge connects jumping worm to prey


A tiny worm that leaps high into the air — up to 25 times its body length — to attach to flying insects uses static electricity to perform this astounding feat, scientists have found. The journal PNAS published the work on the nematode Steinernema carpocapsae, a parasitic roundworm, led by researchers at Emory University and the University of California, Berkeley. 

“We’ve identified the electrostatic mechanism this worm uses to hit its target, and we’ve shown the importance of this mechanism for the worm’s survival,” says co-author Justin Burton, an Emory professor of physics whose lab led the mathematical analyses of laboratory experiments. “Higher voltage, combined with a tiny breath of wind, greatly boosts the odds of a jumping worm connecting to a flying insect.” 

“You might expect to find big discoveries in big animals, but the tiny ones also hold a lot of interesting secrets,” adds Victor Ortega-Jiménez, co-lead author and assistant professor of biomechanics at the University of California, Berkeley. He conducted the experiments, including the use of high-speed microscopy techniques to film the parasitic worm — whose length is about the diameter of a needle point — as it leaped onto electrically charged fruit flies. 

The researchers showed how a charge of a few hundred volts, similar to that generated by an insect’s wings beating the air, initiates an opposite charge in the worm, creating an attractive force. They identified electrostatic induction as the charging mechanism driving this process.
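The scale of the attraction can be sketched with Coulomb's law. The charges and separation below are illustrative order-of-magnitude assumptions, not measurements from the paper:

```python
# Order-of-magnitude sketch of the electrostatic attraction between a
# charged insect and the oppositely charged worm (illustrative numbers).
K = 8.99e9           # Coulomb constant, N*m^2/C^2
q_insect = 1e-12     # assumed net charge on a fruit fly, ~1 picocoulomb
q_worm = -1e-12      # assumed opposite charge induced on the worm
r = 1e-3             # assumed separation of 1 mm, in meters

# Coulomb's law: F = K * |q1 * q2| / r^2
force = K * q_insect * abs(q_worm) / r**2
print(f"attractive force ~ {force:.2e} N")
```

Even picocoulomb-scale charges yield a force comparable to the weight of a microgram-scale animal at millimeter range, which is why a small induced charge can steer the worm onto its target.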

Tuesday, August 12, 2025

AI reveals new physics in dusty plasma


Physicists used a machine-learning method to identify surprising new twists on the non-reciprocal forces governing a many-body system.

The journal PNAS published the findings by experimental and theoretical physicists at Emory University, based on a neural network model and data from laboratory experiments on dusty plasma — ionized gas containing suspended dust particles. 

The work is one of the relatively few instances of using AI not as a data processing or predictive tool, but to discover new physical laws governing the natural world.

"We showed that we can us AI to discover new physics," says Justin Burton, an Emory professor of experimental physics and senior co-author of the paper. "Our AI method is not a black box: we understand how and why it works. The framework it provides is also universal. It could potentially be applied to other many-body systems to open to new routes to discovery."

Wednesday, July 2, 2025

Exploring the frontiers of data science

Satellite technology is transforming the field of geography, says Xiao Huang. "It's kind of like being an astronaut in that satellites give you a view of Earth from space."

As a high-tech geographer, Xiao Huang uses remote sensing and AI for insights into how to design more equitable cities, improve management of natural resources, lessen the impact of natural and human-caused disasters, and improve public health policies.

"I love geography and computer technology," says Huang, assistant professor in Emory's Department of Environmental Sciences. "I want to use my knowledge of these fields to help humanity, especially socially disadvantaged communities."

Read the full story here.

Related:

Developing a new approach to control a dangerous urban mosquito in Ethiopia

Friday, June 27, 2025

New AI tool supports best practices to prevent spread of dangerous C. diff infections

"At Emory, I look forward to continuing this line of work and exploring innovative ways AI can help improve patient care," says Shengpu Tang, who recently joined the university as assistant professor of computer science.

Decision-making forms the core of hospital patient care, involving an array of clinicians whose duties span diagnosis, treatment and resource allocation. The complexity of these interrelated decisions makes it challenging for physicians, nurses and other caretakers to connect all the dots in real time. 

Shengpu Tang, assistant professor of computer science at Emory University, is developing AI tools to identify, validate and transmit key data needed to most effectively support healthcare workers in decision-making processes. 
 
“The end goal is to improve patient care and patient outcomes,” Tang says. 

JAMA Network Open published the results of Tang’s latest collaborative project: the first AI guidance deployed in a hospital setting to support best practices that prevent the spread of dangerous Clostridioides difficile infections.

Analysis by the researchers found that the new AI-guided protocol significantly reduced antibiotic prescriptions — a factor that increases infection risk for vulnerable patients — at Michigan Medicine, resulting in 10% to 15% fewer days on antimicrobials. Importantly, reducing days on antimicrobials did not increase the length of stay, readmission rate or mortality among patients. The already low incidence of Clostridioides difficile trended downward during the study, but that reduction did not reach statistical significance.



Friday, May 2, 2025

Developing a new approach to control a dangerous, invasive mosquito in Ethiopia

Edilawit Mesfine, left, and Edel Seifu, both from Jigjiga University, collect data and larvae from a construction site. (Photo by Kim Awbrey)

Emory University received $2.8 million in funding from the Gates Foundation to support its work to develop and test a high-tech, low-cost method to control an invasive mosquito that poses a growing threat of urban malaria in Africa. The three-year project is focused on three cities in Ethiopia: Jigjiga, Semera and Logiya. 

The project’s novel approach to combating malaria combines on-the-ground knowledge of human and mosquito behaviors with detailed environmental imagery from drones and NASA satellites. Machine learning techniques will be applied to the data to develop a model — powered by artificial intelligence — for targeted public health interventions. 

The aim is to efficiently control populations of the invasive Anopheles stephensi mosquito, first by identifying the water sources most likely to harbor larvae during the dry season, and second by sharing maps of these precise targets with local public health authorities — via a mobile phone app — to guide their larvae-eradication efforts in the most efficient and effective manner. 

The strategy is based on research on the ecology of stephensi in Jigjiga led by Gonzalo Vazquez-Prokopec, Emory professor of environmental sciences and co-principal investigator for the grant. “It sounds counterintuitive to focus mosquito-control efforts on the dry season,” Vazquez-Prokopec says. “Our research, however, shows that the dry season offers a perfect window of opportunity to cost-effectively control these mosquitoes.” 

Vazquez-Prokopec is an expert on the disease ecology of pathogens spread by vectors, such as mosquitoes. His research considers environmental factors as well as the interactions of mosquitoes, the pathogens they carry, and people. 





Tuesday, April 15, 2025

New AI tool set to speed quest for advanced superconductors

Xu Chen, an Emory PhD student of theoretical chemistry, is first author of the paper. He says the team was inspired by the image-recognition training used for self-driving cars to create a powerful machine-learning framework.

Using artificial intelligence shortens the time to identify complex quantum phases in materials from months to minutes, finds a new study published in Newton. The breakthrough could significantly speed up research into quantum materials, particularly low-dimensional superconductors. 

The study was led by theorists at Emory University and experimentalists at Yale University. Senior authors include Fang Liu and Yao Wang, assistant professors in Emory’s Department of Chemistry, and Yu He, assistant professor in Yale’s Department of Applied Physics. 

The team applied machine-learning techniques to detect clear spectral signals that indicate phase transitions in quantum materials — systems where electrons are strongly entangled. These materials are notoriously difficult to model with traditional physics because of their unpredictable fluctuations. 

“Our method gives a fast and accurate snapshot of a very complex phase transition, at virtually no cost,” says Xu Chen, the study’s first author and an Emory PhD student in chemistry. “We hope this can dramatically speed up discoveries in the field of superconductivity.” 

One of the challenges in applying machine learning to quantum materials is the lack of sufficient high-quality experimental data needed to train models. To overcome this, the researchers used high-throughput simulations to generate large amounts of data. They then combined these simulation results with just a small amount of experimental data to create a powerful and efficient machine-learning framework.
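The sim-to-experiment strategy can be illustrated with a toy classifier: plentiful simulated "spectra" (here, 2-D points standing in for two quantum phases) train a model that is then checked against a handful of scarce experimental points. The data generator, the nearest-centroid classifier and the labels are all hypothetical stand-ins, not the team's actual framework:

```python
import random

random.seed(1)

def simulate(phase, n):
    """Cheap simulated samples for one of two 'phases' (toy 2-D points)."""
    cx = 0.0 if phase == 0 else 2.0
    return [(random.gauss(cx, 0.5), random.gauss(0.0, 0.5), phase) for _ in range(n)]

train = simulate(0, 500) + simulate(1, 500)   # abundant simulated data
experiment = simulate(0, 5) + simulate(1, 5)  # scarce "experimental" data

def centroid(points):
    xs, ys = zip(*[(x, y) for x, y, _ in points])
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Fit on simulation only: one centroid per phase.
c0 = centroid([p for p in train if p[2] == 0])
c1 = centroid([p for p in train if p[2] == 1])

def predict(x, y):
    """Assign the phase whose centroid is nearer (nearest-centroid rule)."""
    d0 = (x - c0[0]) ** 2 + (y - c0[1]) ** 2
    d1 = (x - c1[0]) ** 2 + (y - c1[1]) ** 2
    return 0 if d0 < d1 else 1

# Validate the simulation-trained model on the few experimental points.
accuracy = sum(predict(x, y) == phase for x, y, phase in experiment) / len(experiment)
print(accuracy)
```

The design point is the same as in the study: simulation supplies the training volume that experiments cannot, and the small experimental set is reserved for validation and calibration.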

Read more about the discovery.

Related:

Chatbot opens computational chemistry to nonexperts

Monday, April 7, 2025

Chatbot opens computational chemistry to nonexperts

The researchers hope their pioneering work to democratize computational chemistry will inspire similar initiatives across the natural sciences. (Liu Group)

Advanced computational software is streamlining quantum chemistry research by automating many of the processes of running molecular simulations. The complicated design of these software packages, however, often limits their use to theoretical chemists trained in specialized computing techniques. 

A new web platform developed at Emory University overcomes this limitation with a user-friendly chatbot. The chatbot guides nonexperts through a multistep process for setting up molecular simulations and visualizing molecules in solution. It enables any chemist — including undergraduate chemistry majors — to configure and execute complex quantum mechanical simulations through chatting. 

The free, publicly available platform — known as AutoSolvateWeb — operates primarily on cloud infrastructure, further expanding access to sophisticated computational research tools. 

The journal Chemical Science published a proof-of-concept for AutoSolvateWeb, which marks a significant step forward in the integration of AI into education and scientific research.




Wednesday, March 5, 2025

Atlanta Science Festival set to entertain, inspire and engage all ages

The festival culminates Saturday, March 22, in "Exploration Expo," a day-long celebration in Piedmont Park. Demonstrations by Emory chemist Douglas Mulford are among the perennial favorites.

By Carol Clark

Atlanta Science Festival returns March 8-22, with more than 100 events throughout the metro area, inviting the public to join fun, interactive and educational experiences. The acclaimed city-wide celebration, one of the largest of its kind in the country, showcases the myriad science, technology, engineering and mathematics (STEM) innovations happening in Atlanta, including at Emory. 

“Not only does the Atlanta Science Festival spotlight the wonder of science in its various forms, we strive to do so by curating a two-week experience that’s as exciting and intriguing as possible,” says Meisa Salaita, executive co-director of Science ATL, the non-profit organization that engineers the festival. “We want to open minds, educate, inspire, entertain, and spark the interest of the scientists of tomorrow.” 

Now in its 12th year, the Atlanta Science Festival was co-founded by Emory, Georgia Tech and the Metro Atlanta Chamber. 

Members of the Emory community will help participants experience the wonders of science through spectacles like the chemistry of fireballs, musical entertainment combined with a biology talk on the surprising abilities of animals to use medicine, a walking tour of campus science landmarks, a behind-the-scenes look at the latest advances in healthcare technology and much more. 

Creative events to engage participants with technology include “Data Poetics,” which will combine slam poetry and computer science on Thursday, March 13 at 7 p.m. at the Supermarket event space in Atlanta. The introductory workshop in how to use software to visualize data and add power to poetic expression will be co-hosted by Emily Wall, Emory assistant professor of computer science, Keke Wu, Emory postdoctoral researcher, and W. J. Lofton, an Atlanta poet. 

The idea for the event grew out of an Emory class that Wall and Lofton co-taught as part of the Emory Arts and Social Justice Fellows program, which pairs faculty with local artists to explore how creative thinking and artistic expression can inspire change. Their class was so successful that the duo wanted to introduce the concept to the wider public. 

Participants will write a data-driven poem about a social issue affecting Atlanta and then amplify their message through information visualizations. “Many people think of computer science as intimidating and too ‘mathy’ to be interesting,” Wall says. 

That attitude often changes when people learn simple ways to directly apply computer science to better communicate a human problem, she adds. “We want to give artists another tool, a way to make their art even more compelling.”

Tuesday, January 7, 2025

Bittersweet secrets of the fruit fly brain

Fruit flies have served as an important laboratory organism for more than 100 years. (Sanjay Archaya/Wikipedia)

The sense of taste carries evolutionary benefits key to survival. A sweet taste, for instance, signals energy-dense nutrients important to animals foraging for food — including humans. A bitter taste may warn of a toxic substance. 

“We use our sense of taste to decide what to eat and how much to eat,” says Anita Devineni, a neuroscientist and assistant professor in Emory University’s Department of Biology. 

Despite the importance of taste, little is known about how taste cues spark the firing of cells across a brain and evoke a variety of behavioral responses. Devineni is exploring this mystery by mapping the neural circuitry for the taste system of the fruit fly, Drosophila melanogaster. 

Tinier than a poppy seed, the fruit fly brain contains around 140,000 neurons. 

“That’s 1,000 times fewer neurons than a mouse brain and a million times fewer than a human brain,” Devineni explains, making the fly brain a simple starting point for studying general mechanistic principles of cognition. 

Compared to the incredible complexity of its cognitive powers, the human brain’s basic biology appears relatively straightforward. 

“The brain is just an organ like any other organ in your body,” Devineni says. “It’s made up of neurons that are cells like any other cells — lipid membranes containing proteins, DNA and other molecules. What makes a brain cell different from a skin cell or a lung cell is that a brain cell fires. Firing means that sodium ions flow in and out of the cell. Everything that you do, from thinking to talking to walking, is a result of patterns of neurons firing. How could this be?”




Wednesday, March 6, 2024

Atlanta Science Festival returns to inspire discovery for all ages

A middle-school student experiences an Emory chemistry lab during a recent community outreach event. 

The Atlanta Science Festival returns March 9 to 23, inviting curious kids and adults to explore all things science, technology, engineering and mathematics (STEM). Experts in these fields — including many members of the Emory community — will serve as educational guides for more than 150 interactive events. 

“The Atlanta Science Festival aims to bring the community together through their shared love of science,” says Meisa Salaita, co-founder and co-executive director of Science ATL, the engineers of the festival. “Through these events, we hope to inspire and empower the next generation to pursue their dreams.” 

Participants can take a crash course on the basics of AI, create an herbarium of medicinal plants, go into the field with researchers studying microplastic pollution in a stream, take a behind-the-scenes tour of the latest advances in healthcare technology and even get a taste of the physics of cheese making. 

Now in its 11th year, the Atlanta Science Festival was co-founded by Emory, Georgia Tech and the Metro Atlanta Chamber. 

“We have grown into a mainstay of Atlanta,” says Salaita, noting that many of the events fill up quickly. “The festival is something that people look forward to every spring.” 

Wednesday, January 24, 2024

Computer scientists create simple method to speed cache sifting

"Computer performance fascinates me," says Emory graduate student Yazhuo Zhang, co-first author of the discovery, shown on a visit to Switzerland. Set to receive her PhD in May, Zhang accepted a post-doctroal fellowship at the Federal Institute of Technology Zurich (ETH Zurich).

By Carol Clark

Computer scientists have invented a highly effective, yet incredibly simple, algorithm to decide which items to toss from a web cache to make room for new ones. Known as SIEVE, the new open-source algorithm holds the potential to transform the management of web traffic on a large scale. 

SIEVE is a joint project of computer scientists at Emory University, Carnegie Mellon University and the Pelikan Foundation. The team’s paper on SIEVE will be presented at the 21st USENIX Symposium on Networked Systems Design and Implementation (NSDI) in Santa Clara, California, in April. 

A preprint of the paper is already making waves. SIEVE became a hot topic on Hacker News and the subject of a feature in the influential tech newsletter TLDR, driving tens of thousands of visits to the SIEVE website. 

“SIEVE is bigger and greater than just us,” says Yazhuo Zhang, an Emory PhD student and co-first author of the paper. “It is already performing well but we are getting a lot of good suggestions to make it even better. That’s the beauty of the open-source world.” 

Zhang shares first authorship of the paper with Juncheng (Jason) Yang, who received his master’s degree in computer science at Emory and is now a PhD candidate at Carnegie Mellon. 

“SIEVE is an easy improvement of a tried-and-true cache-eviction algorithm that’s been in use for decades — which is literally like centuries in the world of computing,” says Ymir Vigfusson, associate professor in Emory’s Department of Computer Science. 

Vigfusson is co-senior author of the paper, along with Rashmi Vinayak, an associate professor in Carnegie Mellon’s computer science department. Yao Yue, a computer engineer at the Pelikan Foundation, is also a co-author. 

In addition to its speed and effectiveness, a key factor sparking interest in SIEVE is its simplicity, lending it scalability. 

“Simplicity is the ultimate sophistication,” Vigfusson says. “The simpler the pieces are within a system designed to serve billions of people within a fraction of a second, the easier it is to efficiently implement and maintain that system.” 

Keeping ‘hot objects’ handy 

Many people understand the value of regularly reorganizing their clothing closet. Items that are never used can be tossed and those that are rarely used can be moved to the attic or some other remote location. That leaves the items most commonly worn within easy reach so they can be found quickly, without rummaging around. 

A cache is like a well-organized closet for computer data. The cache is filled with copies of the most popular objects requested by users, or “hot objects” in IT terminology. The cache maintains this small collection of hot objects separately from a computer network’s main database, which is like a vast warehouse filled with all the information that could be served by the system. 

Caching hot objects allows a networked system to run more efficiently, rapidly responding to requests from users. A web application can effectively handle more traffic by popping into a handy closet to grab most of the objects users want rather than traveling down to the warehouse and searching through a massive database for each request. 

“Caching is everywhere,” Zhang says. “It’s important to every company, big or small, that is using web applications. Every website needs a cache system.” 

And yet, caching is relatively understudied in the computer science field. 

A logo for SIEVE, designed by Zhang, portrays hotter objects in shades of red and colder objects in shades of blue. Zhang also designed a website for SIEVE, including a motion graphic demonstrating how it works.

A sense of wonder 

Zhang, who received her undergraduate and master’s degrees at universities in her hometown of Guangzhou, China, started off majoring in software engineering. “It’s fun to code and to make a website,” she says, “but it’s not fundamentally challenging once you learn how to do it. I wanted to gain more understanding of the backbone of technology. Computer performance fascinates me.” 

Zhang applied to Emory to work with Vigfusson given his focus on fundamental topics such as computer security and caching, and his skill at talking about them in simple terms. “It’s important to make complex ideas easy to understand,” she says. 

In turn, Vigfusson appreciates how Zhang approaches intractable problems with a sense of wonder. “She’s doing science for all the right reasons,” he says. “She is delighted by the process of exploration and by traversing the frontiers of the unknown.” 

In 2016, Vigfusson received a National Science Foundation Faculty Early Career Development Program (CAREER) grant to explore cache systems. Yang took the lead on the project while he was an Emory master’s student. As a PhD student at Carnegie Mellon, Yang continued to collaborate with Vigfusson and helped to mentor Zhang when she arrived at Emory in 2019. 

How caching works 

While caching can be thought of as a well-organized closet for a computer, it is difficult to know what should go into that closet when millions of people, with constantly changing needs, are using it. 

The fast memory of the cache is expensive to run yet critical to a good experience for web users. The goal is to keep the most useful future information within the cache. Other objects must be continuously winnowed out, or “evicted” in tech terminology, to make room for the changing array of hot objects.

Cache-eviction algorithms determine what objects to toss and when to do so. 

FIFO, or “first-in, first-out,” is a classic eviction algorithm developed in the 1960s. Imagine objects lined up on a conveyor belt. Newly requested objects enter on the left and the oldest objects get evicted when they reach the end of the line on the right. 

In the LRU, or “least recently used,” algorithm the objects also move along the line towards eviction at the end. However, if an object is requested again while it moves down the conveyor belt, it gets moved back to the head of the line. 
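The two classic schemes can be sketched in a few lines of Python — a minimal illustration of the conveyor-belt behavior, not a production cache:

```python
from collections import OrderedDict

def fifo_get(cache, capacity, key):
    """FIFO: evict the oldest inserted item; hits do not change order."""
    if key in cache:
        return True                   # hit: the object keeps its place in line
    if len(cache) >= capacity:
        cache.popitem(last=False)     # evict the item at the end of the belt
    cache[key] = True
    return False                      # miss

def lru_get(cache, capacity, key):
    """LRU: like FIFO, but a hit moves the item back to the head of the line."""
    if key in cache:
        cache.move_to_end(key)        # requested again: back to the head
        return True
    if len(cache) >= capacity:
        cache.popitem(last=False)
    cache[key] = True
    return False

fifo, lru = OrderedDict(), OrderedDict()
for k in ["a", "b", "a", "c", "d"]:   # with capacity 3, FIFO evicts "a", LRU evicts "b"
    fifo_get(fifo, 3, k)
    lru_get(lru, 3, k)
print(list(fifo), list(lru))
```

The only difference between the two is the `move_to_end` call — exactly the "back to the head of the line" step described above.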

Hundreds of variations of eviction algorithms exist, but they have tended to take on greater complexity to gain efficiency. That complexity generally makes them harder to reason about and maintain, especially when dealing with massive workloads. 

“If an algorithm is very complicated, it tends to have more bugs, and all of those bugs need to be fixed,” Zhang explains. 

A simple idea 

Like LRU and some other algorithms, SIEVE makes a simple tweak on the basic FIFO scheme. 

SIEVE initially labels a requested object as a “zero.” If the object is requested again as it moves down the belt, its status changes to “one.” When an object labeled “one” makes it to the end of the line, it is reset to “zero” and kept in place rather than evicted, earning another pass. 

A pointer, or “moving hand,” also scans the objects as they travel down the line. The pointer starts at the end of the line and then jumps to the head, moving in a continuous circle. Anytime the pointer hits an object labeled “zero,” the object is evicted. 

“It’s important to evict unpopular objects as quickly as possible, and SIEVE is very fast at this task,” Zhang says. 

In addition to this quick demotion of objects, SIEVE manages to maintain popular objects in the cache with minimal computational effort, known as “lazy promotion” in computer terminology. The researchers believe that SIEVE is the simplest cache-eviction algorithm to effectively achieve both quick demotion and lazy promotion. 
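The mechanism described above fits in a few dozen lines. This is a minimal sketch of the idea; the team's open-source implementation uses a linked list so the hand and evictions run in constant time:

```python
class SieveCache:
    """Minimal sketch of SIEVE: a FIFO queue, one "zero"/"one" label per
    object, and a hand that sweeps the line evicting unvisited objects."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.q = []          # q[0] is the oldest object, q[-1] the newest
        self.visited = {}    # the "zero" (False) / "one" (True) label
        self.hand = 0        # sweeps from the oldest end toward the newest

    def get(self, key):
        if key in self.visited:
            self.visited[key] = True   # requested again: "zero" becomes "one"
            return True                # hit; lazy promotion, nothing moves
        if len(self.q) >= self.capacity:
            self._evict()
        self.q.append(key)
        self.visited[key] = False      # new objects enter labeled "zero"
        return False                   # miss

    def _evict(self):
        h = self.hand % len(self.q)
        while self.visited[self.q[h]]:       # a "one": reset to "zero", keep it
            self.visited[self.q[h]] = False
            h = (h + 1) % len(self.q)
        victim = self.q.pop(h)               # first "zero" the hand meets goes
        del self.visited[victim]
        self.hand = h % max(len(self.q), 1)  # hand resumes where it left off

cache = SieveCache(3)
for k in ["a", "b", "c", "a", "d"]:
    cache.get(k)
print(cache.q)
```

Note how a hit only flips a bit (lazy promotion) while the sweeping hand discards "zeros" on sight (quick demotion) — the two properties the researchers credit for SIEVE's combination of speed and simplicity.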

A lower miss ratio 

The purpose of caching is to achieve a low miss ratio — the fraction of requested objects that must be fetched from “the warehouse.” 

To evaluate SIEVE, the researchers conducted experiments on open-source web-cache traces from Meta, Wikimedia, X and four other large datasets. The results showed that SIEVE achieves a lower miss ratio than nine state-of-the-art algorithms on more than 45% of the traces. The next best algorithm has a lower miss ratio on only 15%. 

The ease and simplicity of SIEVE raise the question of why no one came up with the method before. The SIEVE team’s focus on how patterns of web traffic have changed in recent years may have made the difference, Zhang theorizes. 

“For example,” she says, “new items now become ‘hot’ quickly but also disappear quickly. People continuously lose interest in things because new things keep coming up.” 

Web-cache workloads tend to follow what are known as generalized Zipfian distributions, where a small subset of objects account for a large proportion of requests. SIEVE may have hit a Zipfian sweet spot for current workloads. 
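A quick simulation shows why such skew makes caching pay off. The object count, exponent and request volume below are arbitrary illustrative choices:

```python
import random

random.seed(42)
N_OBJECTS, ALPHA, N_REQUESTS = 1000, 1.0, 100_000

# Zipfian popularity: the i-th most popular object has weight 1 / i**ALPHA.
weights = [1 / (i + 1) ** ALPHA for i in range(N_OBJECTS)]
requests = random.choices(range(N_OBJECTS), weights=weights, k=N_REQUESTS)

# Share of all requests captured by just the 10% most popular objects.
top_share = sum(1 for r in requests if r < N_OBJECTS // 10) / N_REQUESTS
print(f"top 10% of objects serve {top_share:.0%} of requests")
```

Because a small hot set dominates the traffic, a cache holding only that set can answer most requests, and an eviction rule that quickly discards everything outside it, as SIEVE does, performs well.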

“It is clearly a transformative moment for our understanding of web-cache eviction,” Vigfusson says. “It changes a construct that’s been used blindly for so long.” 

Marquee companies that manage massive amounts of web traffic are making inquiries, he notes, adding, “Even a tiny improvement in a web-caching system can save millions of dollars at a major data center.”

Zhang and Yang are on track to receive their PhDs in May. 

“They are doing incredible work,” Vigfusson says. “It’s safe to say that both of them are now among the world experts on web-cache eviction.”




Thursday, October 19, 2023

Math trio makes new points about size of the smallest triangle

"It's a very rich area, to study tiny, small-scale shapes and uncover what the math hidden there can tell us," says Emory mathematician Cosmin Pohoata.

By Carol Clark

Think of a square dotted with points. Now imagine the smallest triangle that could be made by connecting three of those points. That’s the Heilbronn triangle problem in a nutshell. 

“The problem is very easy to state and can sound frivolous,” says Cosmin Pohoata, a theoretical mathematician and Emory assistant professor of mathematics. “When I have conversations with non-math friends, they often ask me why we should study problems like this. The beauty of them is that they are often more complex than they seem. They can have unexpected connections that open new doors for understanding all sorts of phenomena.” 
 
Pohoata and two MIT graduate students, Alex Cohen and Dmitrii Zakharov, recently opened some of those new doors. They completed a new proof for the Heilbronn triangle problem showing that the smallest triangle in a confined space is much smaller than previously realized, breaking a record that had stood for 40 years. 
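In symbols: for n points placed in the unit square, let Δ(n) be the largest area of the smallest triangle that a best-possible placement can force. A sketch of the bounds, with exponents as reported in accounts of the result:

```latex
% \Delta(n): best-possible area of the smallest triangle among n points
% placed in the unit square
\Delta(n) \;=\; \max_{\substack{P \subset [0,1]^2 \\ |P| = n}}\;
\min_{\{p,q,r\} \subseteq P} \operatorname{area}(p,q,r)

% Koml\'os--Pintz--Szemer\'edi upper bound (the 40-year record):
\Delta(n) \;\le\; n^{-8/7 + o(1)}

% Cohen--Pohoata--Zakharov (2023): a polynomially smaller bound
\Delta(n) \;\le\; n^{-8/7 - 1/2000}
```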
 
Their proof, available online, has been submitted to the Journal of the American Mathematical Society and is already making waves in the math world. 

“I think it’s a stunning result,” Anthony Carbery, a mathematician at the University of Edinburgh, told Quanta Magazine. And Thomas Bloom of the University of Oxford told Quanta that he expects the new proof to “prompt a renaissance” of progress on the triangle problem. 
 
“What makes the proof special to me,” Pohoata says, “is the way that we connected the triangle problem to different areas of math. In particular, harmonic analysis (the study of how waves interact with one another) and projection theory (the behavior of fractals under projections).” 

A graphic representation of the Heilbronn triangle problem.

The making of a mathematician 

Pohoata loved math from the time he was a small child growing up in Romania. He cites an elementary school teacher, who encouraged his love for numbers and patterns, as one key influence. 

In middle school he began competing in International Mathematical Olympiads (IMO). Romania is the original home of the IMO, which dates back to 1959, making it the oldest of the International Science Olympiads. Today more than 100 countries compete in the annual event. 

“I thought I knew a lot about math because the problems in class had been so easy for me,” Pohoata recalls. “When I started competing in the Olympiads, I began to realize how little I knew and how much math was out there to learn.” 

He began to think about math as a career. “It’s quite fun to get to think about problems that interest you,” he says. 

Pohoata attended Princeton as an undergraduate, got his PhD at the California Institute of Technology and taught at Yale before joining Emory this fall. 

Patterns in points and lines 

As a theoretical mathematician, Pohoata focuses his research on three specialized fields: discrete geometry, additive number theory and extremal combinatorics. 

Extremal combinatorics examines how large or small finite objects such as graphs can be, if placed under certain restrictions. For centuries, it seemed like an esoteric endeavor. 

“The breadcrumbs to extremal combinatorics trace back to ancient Greece,” Pohoata says, “but the field didn’t really come alive until the late 20th century with the rise of computers and the internet. Graphs are at the heart of many things to do with computer science and the internet.” 

Facebook “friend” networks, for example, are large collections of data that are essentially graphs. “You can think of people in the world as points on paper and then draw arrows connecting the ones who are friends,” Pohoata explains. “Then you can look at basic questions underlying these structures. If you have at least seven points, do you always have triangles? When do you see a big cluster of connected vertices? Are there areas of the world that are less connected than others?” 
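The friend-graph questions above are concrete computations. Here is a small Python sketch (the names are invented for illustration) that checks for a triangle by asking whether the endpoints of any friendship share a mutual friend:

```python
def has_triangle(edges):
    """Return True if the undirected graph given by `edges` contains a
    triangle, i.e. some edge whose endpoints share a common neighbor."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    # A triangle exists exactly when two connected people have a
    # friend in common.
    return any(adj[u] & adj[v] for u, v in edges)

# Three mutual friends form a triangle; a chain of friendships does not.
friends = [("ana", "bo"), ("bo", "cy"), ("cy", "ana"), ("cy", "dee")]
chain = [("ana", "bo"), ("bo", "cy"), ("cy", "dee")]
```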

Real-life problems, like how to make algorithms run faster, fuel interest in studying problems about graphs and related areas. 

“As a theorist, I’m driven simply by the math behind shapes and the beauty of them,” Pohoata says, “but I do get excited when I hear that some math breakthrough has been used in a cool way to help with a practical problem.” 

The human side of math 

“I wasn’t interested in the history of math when I first started out,” Pohoata says. “Why learn the progression of results if you have the latest result?” 

But when he taught an introductory course to number theory, it forced him to look more closely at the history of the greats and trace the chronology of events. “I started realizing that you can get many new ideas by following the progress of the past rather than just focusing on the latest thing,” he says. “And I personally learn better when I follow the story of the people in the history of math. You feel the math differently, too, when you put yourself in someone else’s shoes.” 

Pohoata’s interest in the Heilbronn triangle problem inspired him to delve deeper into the work of Klaus Roth, a German-British mathematician who won math’s highest honor, the Fields Medal. “Roth does elegant math that has inspired a lot of activity,” Pohoata says. 

In 1951, Roth developed a strategy for finding the smallest possible triangle within the parameters of the Heilbronn triangle problem, or its so-called “upper bound.” Austrian mathematician Wolfgang Schmidt pushed the upper bound further in a paper published in 1972. That inspired Roth to jump back into the game: he improved on Schmidt’s result just a few months after Schmidt’s breakthrough. 

A tiny problem 

“Roth and Schmidt had a kind of rivalry to see who could come up with the best recipe to find even smaller triangles,” Pohoata says. “They were writing beautiful papers, improving on each other’s work. I learned a lot by studying them.” 

In 1980, a trio of mathematicians — Komlós, Pintz and Szemerédi — pushed the envelope even further, finding a new upper bound to the Heilbronn triangle problem. 

While the problem in its simplest form can be thought of as dots and lines drawn on paper, the mathematicians are working with triangles far too tiny to be “seen” without special tools. 

“You can think of these triangles as microscopic,” Pohoata says. “We’re talking about billions of points crammed within a confined space.” 

Just as scientists keep making improvements in microscopy to get an ever more detailed view of the tiniest parts of a living system or of distant galaxies imperceptible to human eyes on Earth, theoretical mathematicians create tools to get closer and sharper views of the math underlying the universe and everything in it. 

Making connections 

Pohoata had pondered the Heilbronn triangle problem for several months with Zakharov. He met Cohen last year in a chance encounter at MIT, where he had traveled to give a presentation. 

“When math people get together, they like to talk about recent problems on their minds,” he says. “We were excited to learn that we were taking similar approaches to the Heilbronn triangle problem. And that we were all stuck in the same place.” 

The trio decided to join forces. Unlike many of their math heroes of the past, who communicated across distances by letter, they exchanged ideas in real time through Zoom and the Discord instant-messaging platform. 

“Math research is becoming more of a social experience as the world has become more connected,” Pohoata says. “Technology facilitates collaboration.” 

Rather than a single, euphoric eureka moment, he describes the process of creating their 40-page proof as a series of smaller insights. “There are many moving pieces to this proof and each one had to come together,” Pohoata explains. “There was a lot of going back and forth to get all the pieces to fit. You can think of it like putting together a really complex Lego structure.” 

Ultimately, their breakthrough revealed new connections between the Heilbronn triangle problem and other areas of mathematics, including harmonic analysis and fractals — self-similar figures that repeat at smaller and smaller scales. 

Pohoata and his two MIT colleagues are continuing to work on explaining this web of connections in more detail. “It’s a very rich area, to study tiny, small-scale shapes and uncover what the math hidden there can tell us,” Pohoata says. “What makes math fascinating is that it’s the language for how things work in the world.” 


Monday, August 7, 2023

Physicists open new path to exotic form of superconductivity

"Everything we learn about the world has potential applications," says Emory physicist Luiz Santos, senior author of the paper.

By Carol Clark

Physicists have identified a mechanism for the formation of oscillating superconductivity known as pair-density waves. Physical Review Letters published the discovery, which provides new insight into an unconventional superconductive state seen in certain materials, including high-temperature superconductors. 

“We discovered that structures known as Van Hove singularities can produce modulating, oscillating states of superconductivity,” says Luiz Santos, assistant professor of physics at Emory University and senior author of the study. “Our work provides a new theoretical framework for understanding the emergence of this behavior, a phenomenon that is not well understood.” 

First author of the study is Pedro Castro, an Emory physics graduate student. Co-authors include Daniel Shaffer, a postdoctoral fellow in the Santos group, and Yi-Ming Wu from Stanford University. 

The work was funded by the U.S. Department of Energy’s Office of Basic Energy Sciences. 

The puzzle of superconductivity 

Santos is a theorist who specializes in condensed matter physics. He studies the interactions of quantum materials — tiny things such as atoms, photons and electrons — that don’t behave according to the laws of classical physics. 

Superconductivity, or the ability of certain materials to conduct electricity without energy loss when cooled to a super-low temperature, is one example of intriguing quantum behavior. The phenomenon was discovered in 1911 when Dutch physicist Heike Kamerlingh Onnes showed that mercury lost its electrical resistance when cooled to 4 Kelvin, or about minus 452 degrees Fahrenheit. That’s even colder than Uranus, the coldest planet in the solar system. 

It took scientists until 1957 to come up with an explanation for how and why superconductivity occurs. At normal temperatures, electrons roam more or less independently. They bump into other particles, causing them to shift speed and direction and dissipate energy. At low temperatures, however, electrons can organize into a new state of matter. 

“They form pairs that are bound together into a collective state that behaves like a single entity,” Santos explains. “You can think of them like soldiers in an army. If they are moving in isolation they are easier to deflect. But when they are marching together in lockstep it’s much harder to destabilize them. This collective state carries current in a robust way.” 

A holy grail of physics 

Superconductivity holds huge potential. In theory, it could allow for electric current to move through wires without heating them up, or losing energy. These wires could then carry far more electricity, far more efficiently. 

“One of the holy grails of physics is room-temperature superconductivity that is practical enough for everyday-living applications,” Santos says. “That breakthrough could change the shape of civilization.”

Many physicists and engineers are working on this frontline to revolutionize how electricity gets transferred. 

Meanwhile, superconductivity has already found applications. Superconducting coils power electromagnets used in magnetic resonance imaging (MRI) machines for medical diagnostics. A handful of magnetic levitation trains are now operating in the world, built on superconducting magnets that are 10 times stronger than ordinary electromagnets. The magnets repel each other when the matching poles face each other, generating a magnetic field capable of levitating and propelling a train. 

The Large Hadron Collider, a particle accelerator that scientists are using to research the fundamental structure of the universe, is another example of technology that runs through superconductivity. 

Superconductivity continues to be discovered in more materials, including many that are superconductive at higher temperatures.  

An accidental discovery 

One focus of Santos’ research is how interactions between electrons can lead to forms of superconductivity that cannot be explained by the 1957 description of superconductivity. An example of this so-called exotic phenomenon is oscillating superconductivity, in which the paired electrons dance in waves of changing amplitude. 
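Schematically, in textbook notation rather than anything taken from the paper itself: a conventional superconductor has a spatially uniform pairing amplitude, while in a pair-density-wave state the amplitude modulates at a finite wavevector Q:

```latex
% Uniform (BCS-like) superconductor: constant pairing amplitude
\Delta(\mathbf{r}) = \Delta_0

% Pair-density wave: the pairing amplitude oscillates in space,
% modulating at wavevector \mathbf{Q}
\Delta(\mathbf{r}) = \Delta_{\mathbf{Q}}\, e^{i\mathbf{Q}\cdot\mathbf{r}}
                   + \Delta_{-\mathbf{Q}}\, e^{-i\mathbf{Q}\cdot\mathbf{r}}
```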

In an unrelated project, Santos asked Castro to investigate specific properties of Van Hove singularities, structures where many electronic states become close in energy. Castro’s project revealed that the singularities appeared to have the right kind of physics to seed oscillating superconductivity. 

That sparked Santos and his collaborators to delve deeper. They uncovered a mechanism that would allow these dancing-wave states of superconductivity to arise from Van Hove singularities. 

“As theoretical physicists, we want to be able to predict and classify behavior to understand how nature works,” Santos says. “Then we can start to ask questions with technological relevance.” 

Some high-temperature superconductors — which function at temperatures about three times as cold as a household freezer — have this dancing-wave behavior. 

The discovery of how this behavior can emerge from Van Hove singularities provides a foundation for experimentalists to explore the realm of possibilities it presents. 

“I doubt that Kamerlingh Onnes was thinking about levitation or particle accelerators when he discovered superconductivity,” Santos says. “But everything we learn about the world has potential applications.” 

Related:

Chemists crack complete quantum nature of water 

New evidence for a unifying theory of granular physics

Wednesday, July 26, 2023

Merck Prize boosts work on air sensor for pandemic pathogens

"There is a need for viral-detecting devices for public indoor air spaces as we enter an era when pandemics will likely become more common," says Emory chemist Khalid Salaita.

Merck KGaA, Darmstadt, Germany, awarded its 2023 Future Insight Prize to Khalid Salaita, professor of chemistry at Emory University. The award comes with $540,000 to fund the next phase of research into an air sensor that can continuously monitor indoor spaces for pathogens that can cause pandemics. 

“I’m extremely thankful to receive the Future Insight Prize as this enables us to continue our path toward an early-warning system for emerging threats,” Salaita says. “Our research sets the stage for fully automated detection of airborne pathogens without human intervention or sample processing.” 

The Merck Future Insight Prize recognizes groundbreaking ideas to solve some of the world’s most pressing challenges in health, nutrition and energy. 

The Salaita lab’s sensor, a rolling micro-motor called “Rolosense,” holds the potential to help mitigate, or even prevent, a pandemic. 

Read the full story here.

Wednesday, March 22, 2023

As the worm turns: New twists in behavioral association theories

The researchers conducted experiments on C. elegans, a roundworm with just 300 neurons, that offers a simple laboratory model for studying how an animal learns.

By Carol Clark

Physicists have developed a dynamical model of animal behavior that may explain some mysteries surrounding associative learning going back to Pavlov’s dogs. The Proceedings of the National Academy of Sciences (PNAS) published the findings, based on experiments on a common laboratory organism, the roundworm C. elegans. 

“We showed how learned associations are not mediated by just the strength of an association, but by multiple, nearly independent pathways — at least in the worms,” says Ilya Nemenman, an Emory professor of physics and biology whose lab led the theoretical analyses for the paper. “We expect that similar results will hold for larger animals as well, including maybe in humans.” 

“Our model is dynamical and multi-dimensional,” adds William Ryu, an associate professor of physics at the Donnelly Centre at the University of Toronto, whose lab led the experimental work. “It explains why this example of associative learning is not as simple as forming a single positive memory. Instead, it’s a continuous interplay between positive and negative associations that are happening at the same time.” 

First author of the paper is Ahmed Roman, who worked on the project as an Emory graduate student and is now a postdoctoral fellow at the Broad Institute. Konstantine Palanski, a former graduate student at the University of Toronto, is also an author. 

The conditioned reflex

More than 100 years ago, Ivan Pavlov discovered the “conditioned reflex” in animals through his experiments on dogs. For example, after a dog was trained to associate a sound with the subsequent arrival of food, the dog would start to salivate when it heard the sound, even before the food appeared. 

About 70 years later, psychologists built on Pavlov’s insights to develop the Rescorla-Wagner model of classical conditioning. This mathematical model describes conditioned associations by their time-dependent strength. That strength increases when the conditioned stimulus (in the case of Pavlov’s dogs, the sound) can be used by the animal to reduce its surprise at the arrival of the unconditioned stimulus (the food). 
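The Rescorla-Wagner update can be written in a few lines. Below is a minimal single-cue sketch; the learning rate and outcome values are illustrative, and the full model includes a second rate and sums over all cues present on a trial:

```python
def rescorla_wagner(trials, alpha=0.3, lam=1.0, v0=0.0):
    """Single-cue Rescorla-Wagner rule: each trial nudges the
    association strength V toward the outcome by a fraction alpha of
    the prediction error. `trials` holds 1 (food) or 0 (no food)."""
    v = v0
    history = []
    for food in trials:
        v += alpha * (lam * food - v)   # delta-V = alpha * (outcome - prediction)
        history.append(v)
    return history

# Acquisition: 20 sound+food pairings strengthen the association...
acquisition = rescorla_wagner([1] * 20)
# ...extinction: 20 sound-alone trials drive it back toward zero.
extinction = rescorla_wagner([0] * 20, v0=acquisition[-1])
```

Once V decays to zero during extinction, nothing in this single-strength model can bring it back, which is why the spontaneous recovery Pavlov observed is hard to capture this way.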

Such insights helped set the stage for modern theories of reinforcement learning in animals, which in turn enabled reinforcement learning algorithms in artificial intelligence systems. But many mysteries remain, including some related to Pavlov’s original experiments. 

After Pavlov trained dogs to associate the sound of a bell with food, he would then repeatedly expose them to the bell without food. During the first few trials without food, the dogs continued to salivate when the bell rang. If the trials continued long enough, the dogs “unlearned” and stopped salivating in response to the bell. The association was said to be “extinguished.” 

Pavlov discovered, however, that if he waited a while and then retested the dogs, they would once again salivate in response to the bell, even if no food was present. Neither Pavlov nor more recent associative-learning theories could accurately explain or mathematically model this spontaneous recovery of an extinguished association. 

Teasing out the puzzle

Researchers have explored such mysteries through experiments with C. elegans. The one-millimeter roundworm has only about 1,000 cells, and 300 of them are neurons. That simplicity gives scientists a tractable system for testing how the animal learns. At the same time, C. elegans’ neural circuitry is just complicated enough to connect some of the insights gained from studying its behavior to more complex systems. 

Earlier experiments have established that C. elegans can be trained to prefer a cooler or warmer temperature by conditioning it at a certain temperature with food. In a typical experiment, the worms are placed in a petri dish with a gradient of temperatures but no food. Those trained to prefer a cooler temperature will move to the cooler side of the dish, while the worms trained to prefer a warmer temperature go to the warmer side. 

But what exactly do these results mean? Some believe that the worms crawl toward a particular temperature in expectation of food. Others argue that the worms simply become habituated to that temperature, so they prefer to hang out there even without a food reward. 

The puzzle could not be resolved due to a major limitation of many of these experiments — the lengthy amount of time it takes for a worm to traverse a nine-centimeter petri dish in search of the preferred temperature. 

Measuring how learning changes over time

Nemenman and Ryu sought to overcome this limitation. They wanted to develop a practical way to precisely measure the dynamics of learning, or how learning changes over time. 

Ryu’s lab used a microfluidic device to shrink the experimental model of nine-centimeter petri dishes into four-millimeter droplets. The researchers could rapidly run experiments on hundreds of worms, each worm encased within its individual droplet. 

“We could observe in real time how a worm moved across a linear gradient of temperatures,” Ryu says. “Instead of waiting for it to crawl for 30 minutes or an hour, we could much more quickly see which side of the droplet, the cold side or the warm side, that the worm preferred. And we could also follow how its preferences changed with time.” 

Their experiments confirmed that if a worm is trained to associate food with a cooler temperature it will move to the cooler side of the droplet. Over time, however, with no food present, this memory preference seemingly decays. 

“We found that suddenly the worms wanted to spend more time on the warm side of the droplet,” Ryu says. “That’s surprising because why would the worms develop a different preference and even avoidance of the temperature they had come to associate with food?” 

Eventually the worm begins moving back and forth between the cooler and warmer temperatures. The researchers hypothesized that the worm does not simply forget the positive memory of food associated with cooler temperatures but instead starts to negatively associate the cooler side with no food. That spurs it to head for the warmer side. Then as more time passes, it begins to form a negative association of no food with the warmer temperature, which combined with the residual positive association to the cold, makes it migrate back to the cooler one. 

“The worm is always learning, all the time,” Ryu explains. “There is an interplay between the drive of a positive association and a negative association that causes it to start oscillating between cold and warm.” 

'It's like when you lose your keys'

Nemenman’s team developed theoretical equations to describe the interactions over time between the two independent variables — the positive, or excitatory, association that drives a worm toward one temperature and the negative, or inhibitory, association that drives it away from that temperature. 

“The side that the worm gravitates toward depends on when exactly you take the measurements,” Nemenman explains. “It’s like when you lose your keys you may check the desk where you usually keep them first. If you don’t see them there right away, you run around different places looking for them. If you still don’t find them, you go back to the original desk figuring you just didn’t look hard enough.” 
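A toy version of this two-pathway picture (with illustrative rates, not the published equations) already reproduces extinction followed by spontaneous recovery: because the negative, no-food association fades faster than the positive one, the old preference resurfaces after a rest:

```python
def two_pathway(n_train=30, n_extinct=30, n_rest=60,
                a_pos=0.2, a_neg=0.2, decay_pos=0.002, decay_neg=0.05):
    """Toy two-pathway learning dynamics (illustrative rates, not the
    paper's model): a positive and a negative association are learned
    independently, and behavior follows their difference. The negative
    ("no food") memory fades faster, so the old response reappears
    after a rest - spontaneous recovery."""
    pos = neg = 0.0
    net = []
    for _ in range(n_train):       # stimulus + food: positive pathway learns
        pos += a_pos * (1 - pos)
        net.append(pos - neg)
    for _ in range(n_extinct):     # stimulus alone: negative pathway learns
        neg += a_neg * (1 - neg)
        net.append(pos - neg)
    for _ in range(n_rest):        # rest: both memories decay, negative faster
        pos *= 1 - decay_pos
        neg *= 1 - decay_neg
        net.append(pos - neg)
    return net

net = two_pathway()
# Strong preference after training, roughly zero after extinction,
# and a substantial preference again after the rest period.
```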

The researchers repeated the experiments under different conditions. They trained the worms at different starting temperatures and starved them for different durations before testing their temperature preference, and the worms’ behaviors were correctly predicted by the equations. 

They also tested their hypothesis by genetically modifying the worms, knocking out the insulin-like signaling pathway known to serve as a negative association pathway. 

“We perturbed the biology in specific ways and when we ran the experiments, the worm’s behavior changed as predicted by our theoretical model,” Nemenman says. “That gives us more confidence that the model reflects the underlying biology of learning, at least in C. elegans.” 

The researchers hope that others will test their model in studies of larger animals across species. 

“Our model provides an alternative quantitative model of learning that is multi-dimensional,” Ryu says. “It explains results that are difficult, or in some cases impossible, for other theories of classical conditioning to explain.” 

Related:

Physicists develop theoretical model for neural activity of mouse brain

Machine learning used to understand and predict dynamics of worm behavior

Thursday, November 17, 2022

New chemistry toolkit speeds analyses of molecules in solution

"We've freed the researchers from most of the tedious, manual tasks of data input," says Emory theoretical chemist Fang Liu, center. Her team members who developed the toolkit include Emory graduate student Ariel Gale, left, and postdoctoral fellow Eugen Hruska, right. Not shown is Xiao Huang, who worked on the project as an undergraduate.

By Carol Clark

A new open-source toolkit automates the process of computing molecular properties in the solution phase, clearing new pathways for artificial-intelligence design and discovery in chemistry and beyond. The Journal of Chemical Physics published the free, open-source toolkit developed by theoretical chemists at Emory University. 

Known as AutoSolvate, the toolkit can speed the creation of large, high-quality datasets needed to make advances in everything from renewable energy to human health. “By using our automated workflow, researchers can quickly generate 10, or even 100 times, more data compared to the traditional approach,” says Fang Liu, Emory assistant professor of chemistry and corresponding author of the paper. “We hope that many researchers will access our toolkit to perform high-throughput simulation and data curation for molecules in solution.” 

Such datasets, Liu adds, will provide a foundation for applying state-of-the-art machine-learning techniques to drive innovation in a broad range of scientific endeavors. 

First author of the paper is Eugen Hruska, a postdoctoral fellow in the Liu lab. Co-authors include Emory PhD candidate Ariel Gale and Xiao Huang, who worked on the paper as an Emory undergraduate and is now a graduate student of chemistry at Duke University. 

Exploring the quantum world 

A theoretical chemist, Liu leads a team specializing in computational quantum chemistry, including modeling and deciphering molecular properties and reactions in the solution phase. 

The world becomes much more complex as it shrinks down to the scale of atoms and small molecules, where quantum mechanics describes the wave-particle duality of energy and matter. 

Theoretical chemists use supercomputers to simulate the structures of molecules and the vast array of interactions that can occur during a reaction so that they can make predictions about how a molecule will behave under certain conditions. Understanding these dynamics is key to identifying promising molecules for various applications and for driving reactions efficiently. 

Researchers have already generated datasets for the properties of many molecules in the gas phase. Molecular properties in the solution phase, however, remain relatively unexplored in the context of big data and machine learning, despite the fact that most reactions occur in solution. 

The problem is that studying a molecule in solution requires much more time and effort. 

A complicated process 

“In the gas phase, molecules are far from each other,” Liu explains, “so when we study a molecule of interest, we don’t have to consider its neighbors.” 

In the solution phase, however, a molecule is closely immersed among many other molecules, making the system much larger. “Imagine a solute molecule surrounded by layers and layers of water molecules,” Liu says. “Depending on its size and structure, a molecule may be covered by tens, or even up to hundreds, of water molecules. In systems of such large size, the computation will be slow and may not even be feasible.” 

Before running a quantum chemistry program for a molecule in the solution phase it’s necessary to first determine the geometry of the molecule and the location and orientation of the surrounding solvent molecules. 

“This process is difficult to do,” Liu says. “It takes so much time and effort, and it’s so complicated, that a researcher can only perform this calculation for a few systems that they care about in one paper.” 

Technical issues can also arise during each step in the process, she adds, leading to errors in the results.

A streamlined solution 

Liu and her colleagues replaced the complicated steps required to perform these calculations with their automated system AutoSolvate. 

Previously, a computational chemist might have to type hundreds of lines of code into a supercomputer to run a simulation. The command-line interface for AutoSolvate, however, requires just a few lines of code to conduct hundreds of calculations automatically. 

“The time for running the simulations may be long, but that’s a job for the computer,” Liu says. “We’ve freed the researchers from most of the tedious, manual tasks of data input so that they can focus on analyzing their results and other creative work.” 

In addition to the command-line interface geared toward more experienced theoretical chemists, AutoSolvate includes an intuitive graphical interface that is suitable for graduate students who are learning to run simulations. 

Labs can now efficiently generate many data points for solvated molecules and then use the dataset to build machine-learning models for chemical design and discovery. AutoSolvate also makes it easier to build and share datasets across different research groups. 

Setting the stage for machine learning 

“During the past 10 years, machine learning has become a popular tool for chemistry but the lack of computational datasets has been a bottleneck,” Liu says. “AutoSolvate will allow the research community to curate a huge number of datasets for molecular properties in the solution phase.” 

Determining the redox potential of a solvated molecule, or the likelihood for an oxidation to occur, is just one example of a key research area that AutoSolvate could help advance. Redox-active molecules hold potential for applications in the development of anticancer drugs and chemical batteries for renewable-energy storage. 

“Building up redox-potential datasets will then allow us to use machine learning to look at millions of different compounds to rapidly find the ones with redox potential within the desired range,” Liu says.
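As a cartoon of that screening loop, here is a hedged Python sketch; the descriptor vectors, potentials and the nearest-neighbor “model” are all invented stand-ins for a real computed dataset and a real trained regressor:

```python
# Hypothetical dataset: (descriptor vector) -> redox potential in volts.
# Values are fictional; a real pipeline would use large computed datasets
# (e.g. produced with a tool like AutoSolvate) and a proper ML model.
labeled = {
    (0.1, 2.0): -0.8,
    (0.4, 1.1): -0.3,
    (0.9, 0.5):  0.2,
    (1.3, 0.2):  0.6,
}

def predict(x, data):
    """1-nearest-neighbor prediction: return the potential of the
    closest labeled descriptor vector (a stand-in for a trained model)."""
    nearest = min(data, key=lambda d: sum((a - b) ** 2 for a, b in zip(d, x)))
    return data[nearest]

def screen(candidates, data, lo, hi):
    """Keep only candidates whose predicted potential falls in [lo, hi]."""
    return [c for c in candidates if lo <= predict(c, data) <= hi]
```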

Instead of a black-box result, such analyses of large datasets can yield interpretable artificial intelligence, or basic rules for molecular models.  

“The ultimate goal is to identify rules that can then be applied to solve a broad range of fundamental science problems,” Liu says. 

The development of AutoSolvate was funded by Emory University with computational resources provided by the National Science Foundation.

Related:

A new spin on computing: Chemist leads $2.9 million DOE quest for quantum software

Chemists crack complete quantum nature of water

Chemists map cascade of reactions for producing atmosphere's 'detergent'