Information Sciences Institute leads initiative to increase bandwidth availability around the nation

ISI leads an initiative in partnership with the National Science Foundation (NSF), Idaho National Laboratory (INL) and the University of Utah (UoU) to expand spectrum access across the nation
As the world experiences unprecedented waves of technological innovation, communication needs are multiplying. Between our smartphones, tablets, computers, and even smartwatches, it’s becoming harder for wireless communication services to meet the ever-increasing demand.

Each new development requires more of a limited resource that makes communicating over airways possible, known as the electromagnetic spectrum. The more we innovate, the more spectrum access, or bandwidth, we need.

Scientists envision a future where we communicate through virtual reality or avatars, which would (you guessed it) require even more bandwidth to function.

Here’s the kicker: these wireless communication services are also competing with scientific activities, such as radio astronomy and climate research, for spectrum access. Right now, there’s simply not enough to go around.

Limited spectrum availability is quite literally preventing advancements in science and the development of faster communications for society as a whole.

We need a solution, and we need it fast.

Alefiya Hussain of the USC Viterbi Information Sciences Institute (ISI), Arupjyoti Bhuyan of Idaho National Laboratory (INL), and Robert Ricci of the University of Utah are collaborating on a proposal known as the Advanced Spectrum Initiative for Research and Experimentation (ASPIRE).

ASPIRE seeks to create this bandwidth availability through a project sponsored by the National Science Foundation (NSF) known as Spectrum Innovation Initiative: National Radio Dynamic Zones (SII-NRDZ). The goal of SII-NRDZ is to address these issues through dynamic spectrum sharing.

The Project
The SII-NRDZ program funds promising project proposals from spectrum-sharing researchers. ASPIRE received an Engineering and Execution Lead award from NSF and launched just a few months ago in January.

The project is centered around radio dynamic zones: geographically bounded areas that can autonomously regulate and control the electromagnetic energy entering or leaving their perimeter.

Alefiya Hussain, lead researcher at ISI, said the plan is to use designated radio dynamic zones as testing sites to experiment with dynamic spectrum sharing through field trials, and look for ways that “multiple entities can harmoniously coexist.” In other words, the team is finding new ways for the needs of commercial and scientific groups to be met at the same time.

“The radio dynamic zone is creating essentially these experimentation spaces for testbeds that allows us to investigate what is a good combination of frequency multiplexing or time-based multiplexing within the spectrum space to be able to effectively use it,” she said.
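
Hussain's point about combining frequency- and time-based multiplexing can be pictured with a toy scheduler. This is an illustration only, not ASPIRE's actual software: the shared band is treated as a grid of channel-by-time-slot cells, and each entity is packed into free cells so no two transmissions collide.

```python
# Toy spectrum-sharing sketch (hypothetical, not ASPIRE's software):
# a band is a grid of (channel, time-slot) cells; a greedy scheduler
# packs each user's request so no two users occupy the same cell.

def schedule(requests, n_channels, n_slots):
    """Greedily assign (channel, slot) cells to each user.

    requests: dict mapping user name -> number of cells needed.
    Returns a dict mapping (channel, slot) -> user.
    """
    grid = {}
    free = [(c, t) for c in range(n_channels) for t in range(n_slots)]
    for user, need in requests.items():
        if need > len(free):
            raise ValueError(f"not enough spectrum for {user}")
        for cell in free[:need]:
            grid[cell] = user
        free = free[need:]
    return grid

# Two entities coexist in a 4-channel x 4-slot band (hypothetical names).
grid = schedule({"radio_astronomy": 6, "5G_carrier": 8}, 4, 4)
print(len(grid), "of 16 cells in use")  # → 14 of 16 cells in use
```

A real dynamic zone would make these assignments continuously and adaptively, but the grid picture is the core of frequency/time multiplexing.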

The Current Method
The United States has tackled the management of spectrum access through the creation of an allocation chart that segments off, in color codes, which frequencies belong to each service. It worked for decades, but now that we’re using up all of the spectrum, smoothing out inefficiencies in the chart is critical to opening up more access.

For example, under the chart, an individual service can only operate in its designated slot, which Hussain said can be wasteful because available spectrum is often left unused.

“Traditionally, one entity was given that spectrum, and only they used it. There were many times when they didn’t use it, but since nobody else was allowed to use it, it goes wasted with this sort of fixed allocation mechanism,” she explained.

The goal, she said, is to have a more “dynamic, flexible allocation” so that one day, the chart can be replaced by a self-regulating radio dynamic zone that both allocates spectrum access more efficiently and redistributes it to meet immediate needs.

The United States currently has a National Radio Quiet Zone (NRQZ) in Virginia where radio astronomy takes place. This protects experimental activities that need to pick up faint astronomical signals from interference. Hussain said the NRQZ is essentially a “radio vacuum” where the use of any sort of wireless device (phones, Bluetooth, Wi-Fi, and other means) is banned.

The NRQZ creates space for passive experimentation, whereas the NRDZ would allow for active experimentation.

Think about it this way: you’re in a room with a large group of people talking loudly among themselves. In the NRQZ scenario, everybody nearby is silenced so you can hear conversations far away. In the NRDZ scenario, everyone shares the space effectively, each talking at the right time, so every conversation can happen and still be heard.

The Vision: A National Radio Dynamic Zone
After experimenting with regional field trials and finding out what works and what doesn’t, the big-picture objective is to use the information gathered from rigorous testing to create a permanent national experimentation facility somewhere in the United States.

The NRDZ would tackle coexistence and maximize utility through dynamic spectrum sharing, while also opening up a new avenue to support the next generation of spectrum science through active experimentation.

The average person would see an improvement in the speed and communication abilities of their devices, while the scientific community would gain bandwidth for its cutting-edge projects. It’s a win-win.

The new science made possible with the spectrum includes radio astronomy and remote sensing, which Hussain said will involve advancements in environmental sciences, such as climate monitoring in urban areas that could help scientists “observe phenomena they had not observed before.”

Hussain noted that the NRDZ aims to provide “larger protections for next generation telescopes” now being built and slated for future deployment. These telescopes are highly sensitive and require this kind of spectrum innovation.

Green Lights Ahead
The project is still in its early stages. In fact, the team is currently in Phase I: designing field trials. Phase II involves actually conducting the trials in regional radio dynamic zones.

The spectrum allocation chart, although it worked well for decades, can no longer meet society’s wireless communication demands. We urgently need a new, more effective method of spectrum management, and the national radio dynamic zone could be just what the doctor ordered.

The task ahead is not an easy feat, but the implications have the potential to transform spectrum solutions for the better. Hussain said the project will “require not only technological support but also legislative support to include breakthroughs in economic, social, and behavioral sciences as well.”

It looks like Plato might have been right in this case: necessity is, in fact, the mother of invention. Society needs better spectrum-sharing ability, and ASPIRE is setting out to create it.

The post Information Sciences Institute leads initiative to increase bandwidth availability around the nation appeared first on USC News.

Researcher uses mammal DNA to zoom into human genome with unprecedented resolution

USC researcher uses mammal DNA to zoom into the human genome with unprecedented resolution

Steven Gazal has identified base pairs of DNA that play a crucial role in human disease.
“Why do humans have disease if they went through millions of years of evolution?” It’s a question Steven Gazal, PhD, assistant professor of population and public health sciences at the Keck School of Medicine of USC, hopes to answer.

Gazal is part of an international team of researchers who have become the first to precisely identify base pairs of the human genome that remained consistent over millions of years of mammalian evolution, and which play a crucial role in human disease. The findings were published in a special Zoonomia edition of Science.

Gazal and his team analyzed the genomes of 240 mammals, including humans, zooming in with unprecedented resolution to compare DNA. They were able to identify base pairs that were “constrained” – meaning they remained generally consistent – across mammal species over the course of evolution. Individuals born with mutations at these sites may not have been as successful within their species or were otherwise not likely to pass down the genetic variation. “We were able to identify where gene mutations are not tolerated in evolution, and we demonstrated that these mutations are significant when it comes to disease,” explains Gazal.

The team found that 3.3% of bases in the human genome are “significantly constrained,” including 57.6% of the coding bases that determine amino acid position, meaning these bases had unusually few variants across species in the dataset. The most constrained base pairs in mammals were over seven times more likely to be causal for human disease and complex traits, and over 11 times more likely when researchers looked at the most constrained base pairs in primates alone.
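
For readers curious how a figure like “seven times more likely” is derived, a fold-enrichment compares the rate of disease-causal bases inside the constrained set with the genome-wide rate. The sketch below uses entirely hypothetical counts, not the Zoonomia figures:

```python
# How a fold-enrichment such as "constrained bases are ~7x more likely
# to be disease-causal" is computed. All counts below are hypothetical,
# chosen for illustration; they are not the Zoonomia data.

def fold_enrichment(causal_in_group, group_size, causal_total, genome_size):
    """Rate of causal bases inside a group divided by the genome-wide rate."""
    rate_in_group = causal_in_group / group_size
    overall_rate = causal_total / genome_size
    return rate_in_group / overall_rate

# Toy genome of 1,000,000 bases with 3.3% (33,000) constrained, and
# 231 of 1,000 known causal bases falling inside the constrained set.
enrichment = fold_enrichment(231, 33_000, 1_000, 1_000_000)
print(f"{enrichment:.1f}x enrichment")  # → 7.0x enrichment
```

The same ratio underlies the primate-only comparison; only the group and its counts change.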

The dataset was provided by the Zoonomia consortium, which according to the project website, “is applying advances in DNA sequencing technologies to understand how genomes generate the tremendous wealth of animal diversity.” Gazal gives credit to Zoonomia for making this type of data available to researchers and anticipates it will be widely used by human geneticists. “It’s a cheap resource to generate, as opposed to datasets generated in human genetic studies,” says Gazal.

His team’s findings are a significant step forward, as Gazal notes, “we do not understand 99% of the human genome, so it is fundamental to understand which part has been constrained by evolution and is likely to have an impact on human phenotypes.” Their discoveries and methods could become crucial tools for further research.

The next step for Gazal and his team is to repeat the process with a primate-only dataset. By restricting the subjects, they hope to focus on functions of DNA that appeared more recently in human evolution. “We expect this to be even more useful in determining information on human disease,” says Gazal.


For more information and a complete list of authors, access the paper here.


How vaccine hesitant are you? A third of Americans aren’t fully protected against COVID

Are You More Vaccine Hesitant Than A 57-Year-Old?
In May 2021, U.S. President Joe Biden announced a goal of getting at least 70% of Americans partially vaccinated against COVID-19 by July of that year. However, government records indicate that as late as September 2022, more than 31% of Americans were still not fully vaccinated. This has been shown to be due not to supply constraints but to vaccine hesitancy among certain segments of the population.

Why were so many Americans hesitant about the COVID vaccine?

This is what researchers at the USC Viterbi School of Engineering set out to answer. Mayank Kejriwal, Research Lead at the USC Information Sciences Institute (ISI) and a Research Assistant Professor in the Daniel J. Epstein Department of Industrial and Systems Engineering, along with PhD student Ke Shen, analyzed socio-demographic variables in their paper, “Using Conditional Inference to Quantify Interaction Effects of Socio-Demographic Covariates of U.S. COVID-19 Vaccine Hesitancy,” which was recently published in PLOS Global Public Health.

With this research, they hope to lay the groundwork for future pandemic preparedness with regard to vaccine hesitancy.

Survey Says…
Kejriwal conducted a retrospective analysis on data from a COVID-19 cross-sectional Gallup survey that was administered to a representative sample of U.S.-based respondents. It was an online survey that began in March 2020, and included daily random samples of U.S. adults.

“We wanted to see whether we could predict, based on socio-demographic variables, what specific groups might be more vaccine hesitant than others,” said Kejriwal. He explained, “If we can predict that, then you could target the communication. You might know that these are the communities where we need more vaccine awareness, for example.”

Using responses from 16,322 participants, he analyzed the relative effects of different categories of demographic variables on vaccine hesitancy. These variables were: annual household income, race/ethnicity, political party, employment status, gender, education, and “trust in the Trump administration.”

For this final variable, the 2020 Gallup question asked: ‘Please think about the recent impact of the coronavirus (COVID-19) on your life when responding to the following and indicate your level of agreement or disagreement: I have confidence in the leadership of President Donald Trump to successfully manage emerging health challenges.’ Responses to this question were recorded on a five-point scale, from strongly disagree (1) to strongly agree (5). Those who responded greater than 3 were identified as individuals who had trust in the Trump administration.
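
The dichotomization described above is simple to express in code. This is a sketch of the cutoff rule only, not the authors' analysis pipeline:

```python
# Sketch of the dichotomization described above: five-point Likert
# responses (1 = strongly disagree ... 5 = strongly agree) are mapped
# to a binary "trust" indicator, with scores above 3 counted as trust.

def had_trust(response: int) -> bool:
    """True if a Likert response indicates trust (score greater than 3)."""
    if not 1 <= response <= 5:
        raise ValueError("Likert responses run from 1 to 5")
    return response > 3

# Hypothetical responses, not the Gallup data.
responses = [1, 2, 3, 4, 5, 3, 4]
trusting = sum(had_trust(r) for r in responses)
print(f"{trusting} of {len(responses)} respondents counted as trusting")
```

Note that a neutral response of 3 falls on the "no trust" side of the cutoff.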

How (and By How Much) Do These Variables Affect Vaccine Acceptance?
Kejriwal had two goals in mind for the survey data: 1) find the associations between the variables and vaccine acceptance; and 2) quantify and visualize the interactions between those variables and vaccine acceptance.

Using univariate regression – a model that looks to find the relationship between one variable and a target variable (vaccine acceptance in this case) – Kejriwal analyzed the Gallup data to find and measure the associations.

Additionally, Kejriwal used machine learning and deep statistical analysis to take the variables and the associations between them and vaccine hesitancy and organize them into a conditional inference tree. This tree is a way to quantify and visualize the relative importance of the variables, and also to show the interaction effects between them.

The tree shows, for example, that a male non-Black Democrat who did not trust the Trump administration had high vaccine acceptance, whereas a female under age 57 who trusted the Trump administration had very low vaccine acceptance. Both of these might seem intuitive, but with the conditional inference tree, the degree of vaccine acceptance and the relationships between the variables are quantified and visualized.
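
A heavily simplified sketch of the two-step analysis: score each binary covariate by its univariate association with acceptance, then split greedily on the strongest one. Real conditional inference trees (e.g., R's partykit::ctree) select splits with permutation tests; the data below are hypothetical, not the Gallup sample.

```python
# Heavily simplified stand-in for the paper's analysis (illustration
# only). Step 1 scores each binary covariate by its univariate
# association with vaccine acceptance; step 2 greedily splits on the
# strongest covariate, mimicking a (much cruder) inference tree.

def acceptance_rate(rows):
    return sum(r["accepts"] for r in rows) / len(rows)

def univariate_effect(rows, var):
    """Absolute difference in acceptance rate between var = 1 and var = 0."""
    yes = [r for r in rows if r[var]]
    no = [r for r in rows if not r[var]]
    if not yes or not no:
        return 0.0
    return abs(acceptance_rate(yes) - acceptance_rate(no))

def grow_tree(rows, variables, min_leaf=2):
    if not variables or len(rows) < 2 * min_leaf:
        return {"leaf": True, "rate": acceptance_rate(rows)}
    best = max(variables, key=lambda v: univariate_effect(rows, v))
    if univariate_effect(rows, best) == 0.0:
        return {"leaf": True, "rate": acceptance_rate(rows)}
    rest = [v for v in variables if v != best]
    return {
        "split": best,
        "yes": grow_tree([r for r in rows if r[best]], rest, min_leaf),
        "no": grow_tree([r for r in rows if not r[best]], rest, min_leaf),
    }

# Tiny hypothetical sample (not the Gallup data): trusting the Trump
# administration is associated with lower vaccine acceptance here.
rows = [
    {"trusts_admin": 1, "democrat": 0, "accepts": 0},
    {"trusts_admin": 1, "democrat": 1, "accepts": 0},
    {"trusts_admin": 1, "democrat": 0, "accepts": 1},
    {"trusts_admin": 0, "democrat": 1, "accepts": 1},
    {"trusts_admin": 0, "democrat": 0, "accepts": 1},
    {"trusts_admin": 0, "democrat": 1, "accepts": 1},
]
tree = grow_tree(rows, ["trusts_admin", "democrat"])
print(tree["split"])  # the covariate with the strongest univariate effect
```

Each leaf's acceptance rate plays the role of the quantified "degree of vaccine acceptance" the article describes for subgroups.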

And the thought is that, with this level of precision, communications strategies could be more targeted and effective. Kejriwal found clear patterns between vaccine acceptance among different socio-demographic groups in the U.S. and hopes that his methods can be used to predict vaccine hesitancy if we ever face another pandemic.


Unlocking the ocean’s secret: Natural carbon capture

Scientists around the world are racing to develop new methods for combating the rising levels of carbon dioxide in our atmosphere that are driving climate change and threatening the health of our planet.

Ocean carbon capture, which involves using natural ocean processes to trap and store greenhouse gases out at sea, is one promising method. Two L.A. researchers — William Berelson of USC and Jess Adkins of Caltech — are looking to harness this technology to address the problem.

“Behind every potential solution for a more sustainable world lies a story of hard work and collaboration,” USC President Carol L. Folt said. “This promising research to reduce carbon emissions between USC and Caltech will help us all achieve a more sustainable future — starting right here in Southern California.”

We met up with Berelson, professor of earth sciences, environmental studies and spatial sciences at the USC Dornsife College of Letters, Arts and Sciences, and Adkins, the Smits Family Professor of Geochemistry and Global Environmental Science at Caltech, at the docks of AltaSea at the Port of Los Angeles — one of the largest harbors in the world and a leading gateway for international trade in North America.

How does the shipping industry play a role in climate change?

Berelson: At seaports around the world, huge quantities of goods arrive daily that feed the global economy. Those goods are transported across the ocean on container ships, cargo ships and other vessels that burn diesel fuel. Collectively, all the ships in the world are contributing about 3% of the carbon dioxide that’s being added to our atmosphere every year.

Adkins: Over 90% of the products we use in our daily lives traveled on a ship at some point. If we’re going to think about how to deal with our CO2 problem as a society, we have to be mindful of the fact that we can’t electrify all parts of the industry. Shipping is a good example of an industry that doesn’t electrify well. It’s hard to imagine ships running off batteries, even though we must, as a society, get ourselves onto renewable energy.


How do carbon emissions affect our oceans?

Berelson: As carbon dioxide from the atmosphere dissolves in ocean water, it increases the water’s acidity, causing ocean acidification. The rising annual rate of CO2 emissions leads to a corresponding increase in ocean acidification, with dramatic impacts on marine ecosystems such as corals and other organisms that use calcium carbonate to build shells.

People care about corals for their beauty, but these organisms are also crucial to biodiversity and sustaining the populations of fish and other marine life that live in and among the coral communities.

Adkins: Exactly. As you acidify the ocean, you make it harder for the main components of the ecosystem to grow. But another reason we should care about ocean acidification, over and above photogenic megafauna like corals, is the algae out in the middle of the ocean, far from the coast. They are the primary producers at the bottom of the marine food chain, where sunlight is first turned into organic matter that then becomes food for the rest of the system, humans included.

How does the ocean naturally capture carbon?

Adkins: The planet has been capturing carbon for billions of years. As the ocean absorbs excess carbon, the CO2 reacts with calcium carbonate, or limestone, that naturally occurs at the sea floor — this reaction makes the neutral salts of bicarbonate and calcium ions.

Berelson: The natural reaction that happens in the ocean is exactly what happens when you treat an upset stomach. The analogy we like to use is that when you have excess acid in your tummy, you take a little antacid tablet, which is effectively ground up calcium carbonate, to neutralize the acid.
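
Written out, the neutralization Adkins and Berelson describe is standard carbonate chemistry (the general reaction, not a formula taken from their paper): dissolved carbon dioxide reacts with water and limestone to yield calcium and bicarbonate ions.

```latex
\mathrm{CO_2} + \mathrm{H_2O} + \mathrm{CaCO_3} \;\longrightarrow\; \mathrm{Ca^{2+}} + 2\,\mathrm{HCO_3^-}
```

The products are the "neutral salts" Adkins mentions; accelerating this dissolution is the idea behind the startup described below.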

What are you working on now?

Berelson: An idea came about during our research on how the ocean naturally mitigates excess CO2. We discovered that if we could accelerate the dissolution of limestone, it could be a way to mitigate CO2 at a larger scale. We’re developing a startup company that could one day build machinery that would allow this reaction to happen at a fast enough scale and at the right quantity to make a greater impact on CO2 reduction.

Adkins: Right. Although the ocean naturally captures carbon, it does so at a slow rate. We want to find ways of speeding up the neutralization of that extra CO2. All we have to do is follow the natural process of what happens when, say, volcanoes erupt and release CO2 into the atmosphere.

What inspired this collaboration?

Adkins: We’ve known each other for decades as friends in the field and have always talked about finding something to work on together. But it was our shared concern about ocean acidification and the idea that we might be able to make a breakthrough that brought us together to think about joining labs.

Berelson: True, we initially bonded over our common interest in chemical oceanography. That and all things having to do with major league baseball.


Air pollution particles trigger cellular defense mechanisms

Air pollution particles trigger cellular defense mechanisms

New research from the Keck School of Medicine of USC shows that air pollution particles activate a cellular defense mechanism known as autophagy, which may reduce the ability of cells to fight off other harms.
The link between air pollution and lung disease has long been recognized. Now a new USC study reveals one biological process that may be behind that link — a discovery which could provide new insights on better ways to treat or prevent diseases related to pollution exposure.

“We know that diseases, especially lung diseases, can result from air pollution exposure. What we don’t know are the mechanisms by which that occurs,” said Edward Crandall, PhD, MD, professor of pathology, member of the Hastings Center for Pulmonary Research and director of the Will Rogers Institute Pulmonary Research Center at the Keck School of Medicine of USC.

In their research, Crandall and his team discovered a key step along the path between air pollution exposure and disease. Exposure to ambient nanoparticles, or very small pollutants in the air, limits the ability of cells to defend themselves against other potential harms. The findings were published in the journal Autophagy Reports.

Crandall, the study’s senior author, and his colleagues studied a cellular defense process known as autophagy, which cells use to destroy damaged or abnormal internal materials. For the first time, the researchers found that, when exposed to nanoparticles, autophagy activity in cells seems to reach an upper threshold.

“The implication of these studies is that autophagy is a defense mechanism that has an upper limit, beyond which it can’t defend the cell any further,” Crandall said.

An upper threshold

The researchers conducted a series of tests using lung adenocarcinoma cells. They first exposed the cells to nanoparticles, then to rapamycin (a chemical known to stimulate autophagy), then to both nanoparticles and rapamycin. In every case, autophagy activity reached the same upper threshold and did not increase further.

Consequently, cells may lack the ability to further boost autophagy to defend against other dangers, such as smoke inhalation or a viral or bacterial infection. This may help explain why air pollution increases a person’s risk for a number of acute and chronic lung diseases, including lung cancer, interstitial pulmonary fibrosis, and chronic obstructive pulmonary disease.

As part of the research, Crandall and his team also developed a new method of studying autophagy, which can support future studies on the subject. They used a combination of fluorescent dyes and a powerful imaging method, known as confocal microscopy, to document the amount of autophagy taking place inside individual cells.

“What’s special is that we can now measure the autophagic activity of single living cells in real time. It’s a novel method for studying autophagy,” said Arnold Sipos, MD, PhD, assistant professor of research pathology at the Keck School of Medicine and the study’s first author.

More research on autophagy

The new findings can help support ongoing research on autophagy, including for cancer treatment. While autophagy is a boon for healthy cells, it makes cancer cells harder to destroy. Developing methods to raise or lower autophagy in cells could be a key way to protect against and treat disease.

“The more we know about the mechanisms by which diseases occur, the more opportunity we have to find places in the pathway where we can intervene and prevent or treat the disease,” Crandall said.

Next, Crandall, Sipos and their colleagues will conduct further research to test whether adding nanoparticles to a cell directly increases its vulnerability to other threats, such as an infection. They plan to study the link in both healthy cells and cancer cells.

About this study

In addition to Crandall and Sipos, the study’s other authors are Kwang-Jin Kim of the Department of Pathology, Keck School of Medicine of USC, and the Department of Biomedical Engineering, USC Viterbi School of Engineering; and Constantinos Sioutas of the Department of Civil and Environmental Engineering, USC Viterbi School of Engineering.

This work was supported by the Will Rogers Motion Picture Pioneers Foundation; Whittier Foundation; Hastings Foundation; and the National Institutes of Health [R01ES017034, U01HL108364, P01AG055367].


AI helps place drones in remote areas for faster emergency response

For residents of rural and underserved areas, access to emergency medical care can be a matter of life and death. With limited access to health care services and long ambulance wait times due to distance, these communities face challenges that can significantly affect their health and well-being. In the case of cardiac arrest, when every minute counts, finding solutions to improve response times is critical to saving lives.

USC researchers are exploring the use of AI-powered decision-making to deploy life-saving equipment in data-scarce settings like rural neighborhoods to enable faster emergency response times, improve the design of emergency response systems and potentially save lives. Results from a recent study show the potential for AI to help emergency responders make informed and efficient decisions in settings where data is limited.

The study, published in the journal Operations Research, focuses on developing a new method for using data to choose between candidate ways to design a system. To demonstrate their method, the researchers examined a case study involving a Toronto-based pilot program that deploys drones in conjunction with ambulances to respond to calls about cardiac arrest events.

“Our methods have the potential to revolutionize the way we design and optimize systems in data-scarce settings that extend beyond emergency response. It can help us make more informed and efficient decisions across a range of fields where data is limited,” said corresponding author Michael Huang, a doctoral candidate in the Data Science and Operations department at the USC Marshall School of Business.

No data, no problem: AI-driven methods fill the gaps

When a bystander calls in to report someone near them is experiencing cardiac arrest, emergency responders in the Toronto pilot program have two options: They can either send an ambulance, or they can send an ambulance and deploy a drone with an automated external defibrillator (AED) attached. The AED is a small device that bystanders can use — with no medical training — to attach to the patient and restart their heart before the ambulance arrives. The drone’s ability to get to the patient faster than the ambulance can significantly improve their chances of survival.
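
The dispatch decision can be sketched as a toy model (hypothetical distances and speeds, not the Toronto program's actual logic): send the drone whenever the nearest depot can beat the ambulance's estimated arrival.

```python
# Toy dispatch sketch (hypothetical, not the Toronto pilot's system):
# given estimated travel times, decide whether sending an AED drone
# alongside the ambulance buys meaningful response time.

import math

def travel_minutes(src, dst, speed_kmh):
    """Straight-line travel time in minutes between two (x, y) points in km."""
    dist_km = math.dist(src, dst)
    return 60 * dist_km / speed_kmh

def dispatch(patient, ambulance, depots, amb_speed=60, drone_speed=100):
    """Send the drone too if the nearest depot beats the ambulance's ETA."""
    amb_eta = travel_minutes(ambulance, patient, amb_speed)
    drone_eta = min(travel_minutes(d, patient, drone_speed) for d in depots)
    return {"ambulance_eta": amb_eta,
            "drone_eta": drone_eta,
            "send_drone": drone_eta < amb_eta}

# Rural call: ambulance 20 km away, nearest depot only 5 km away.
plan = dispatch(patient=(0, 0), ambulance=(20, 0), depots=[(5, 0), (40, 0)])
print(plan["send_drone"])  # → True
```

This toy model also hints at the harder first-order question the researchers raise: the choice of depot coordinates drives every downstream dispatch decision, and it must be made with scarce travel-time data.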

This raises key questions about where to place drone depots and how to determine the appropriate response to an emergency situation.

“We initially thought that the main question was where to deploy the drone, but in reality, the first-order question is where to put the drone depots,” said Vishal Gupta, an associate professor of data sciences and operations at USC Marshall.


“We want to strategically place them in locations that are both close to where cardiac arrests occur, but also in areas that are difficult to reach by ambulance. The challenge here is that data on ambulance travel times to remote locations is scarce, making it difficult to estimate. Ambulances rarely go to these remote locations, so we don’t have a lot of data on travel times,” said Gupta, who also holds a courtesy appointment in the Daniel J. Epstein Department of Industrial and Systems Engineering at the USC Viterbi School of Engineering.

The researchers found that for cardiac arrest events in rural areas where ambulance wait times are longer than in urban areas and where there is limited data, their method leads to significantly more effective decisions on when to dispatch the drone and where to place depots compared to conventional approaches.

The AI-driven methodology can be applied to various fields and areas of public policy, including where to place speed bumps to reduce traffic fatalities or the most efficient location for power lines, where the true construction costs are often unknown and estimates are made based on rough figures.

“We often hear about big data and its potential, but in many cases, data is still scarce, especially in settings where data collection is expensive or limited by privacy concerns,” Gupta said. “There are also cases where collection events are rare, which can make it challenging to design systems and make informed decisions. With AI tools, we can address these challenges and make better decisions even in data-limited settings.”

About the study: Paat Rusmevichientong, the Justin Dart Professor of Operations Management and a professor of data sciences and operations at USC Marshall, served as co-author of the study.

The study was partially funded by the National Science Foundation under Grant No. CMMI-1661732. All three authors thank Justin J. Boutilier and Timothy C.Y. Chan of the University of Toronto for sharing simulation results and details pertaining to their 2019 paper, “Response time optimization for drone-delivered automated external defibrillators.”


Heavy drinking poses even greater risk for 1 in 3 Americans



A USC study shows that metabolic syndrome — a cluster of conditions that raise the risk of heart disease, diabetes, stroke and other health problems — more than doubles the risk of advanced liver disease among heavy drinkers.
LOS ANGELES — Two people regularly have a few alcoholic drinks daily. One develops liver disease. The other doesn’t.

What explains the different outcomes?

The answer may lie in a condition known as metabolic syndrome, a cluster of conditions that together raise the risk of coronary heart disease, diabetes, stroke and other serious health problems. This syndrome, characterized by symptoms such as abdominal fat, high blood pressure, high cholesterol and high blood sugar, affects more than one in three Americans.

A new study from Keck Medicine of USC published in the Annals of Internal Medicine shows that heavy alcohol use may be dramatically more damaging to the liver for people with metabolic syndrome.

“Our research suggests that metabolic syndrome and alcohol interact in such a way that they multiply the effect of alcohol on the liver, more than doubling the risk of advanced liver disease among heavy drinkers,” said Brian P. Lee, MD, MAS, a hepatologist and liver transplant specialist with Keck Medicine who is the lead author on the study. “Drinking is harmful to the liver, but especially so for this segment of the population.”

In the study, heavy alcohol use was defined as two drinks (a total of 12 fluid ounces) a day for women and three drinks (a total of 18 fluid ounces) per day for men.
Lee and his colleagues were motivated to research a connection between advanced liver disease, alcohol use and metabolic syndrome after noticing that between 2009 and 2018, deaths from alcohol-associated liver disease surged in the United States by more than 30%, while alcohol use, including heavy drinking, remained stable or declined.

During the last 20 years, the number of Americans with metabolic syndrome increased significantly. Previous research has shown that metabolic syndrome can cause liver abnormalities.

“We therefore hypothesized that metabolic syndrome could be an important contributor to this unexplained surge in advanced liver disease,” said Lee.
For the study, Lee and his fellow researchers used data from the National Health and Nutrition Examination Survey, which assesses the health and nutritional status of adults and children in the United States, pulling together samples representing the U.S. population 20 years or older between 1999 and 2018.

While the data revealed a slight increase in advanced liver disease with heavy alcohol use without metabolic syndrome, the greatest increase in advanced liver disease was found in those with combined heavy alcohol use and metabolic syndrome.

Lee believes that the increased risk of liver damage from drinking is a result of an increase in the amount of fat in the liver. A healthy liver contains less than five percent fat; any more than that can lead to inflammation and cirrhosis (scarring) of the liver, liver cancer and liver failure.

“Both metabolic syndrome and drinking increase liver fat, and we think that the combination of the two accelerates the accumulation of fat in the liver and fuels inflammation, resulting in a greater chance of liver disease,” said Lee.
He hopes the study will encourage physicians who screen and diagnose patients with metabolic syndrome to also ask about alcohol use and look for liver disease.

“Our study indicates that these conditions may often coexist, and it is in patients’ best interest to address both issues,” he said. “It’s also important for people with metabolic syndrome to realize they may be at an increased likelihood of advanced liver disease, and to monitor their drinking accordingly,” he added.

The other authors of the study are Jennifer Dodge, MPH, assistant professor of research medicine and population and public health sciences with the Keck School of Medicine of USC; Wendy Mack, PhD, professor of population and public health sciences with the Keck School of Medicine; Adam Leventhal, PhD, professor of population and public health sciences with the Keck School of Medicine and director of the USC Institute for Addiction Science; and Norah Terrault, MD, MPH, a Keck Medicine gastroenterologist and division chief of gastroenterology and liver diseases with the Keck School of Medicine.
For more information about Keck Medicine of USC, please visit

Photos: Shutterstock/headshot courtesy of Keck Medicine of USC

The post Heavy drinking poses even greater risk for 1 in 3 Americans appeared first on USC News.

Immigrant adults with liver cancer have higher survival rates than those born in the U.S.


Story Headline and Deck – USC News *
New USC study shows immigrant adults with liver cancer have higher survival rates than those born in the US
Body Copy *
Immigrant adults with liver cancer in the United States have higher survival rates than people with the disease who were born in the U.S., according to new research from the USC Norris Comprehensive Cancer Center.

Hepatocellular carcinoma (HCC), the most common form of liver cancer, contributes to more than 27,000 deaths annually in the United States. Immigrants comprise a significant proportion of those diagnosed with HCC in the U.S. Research has shown that birthplace, also referred to as nativity, impacts incidence and risk factors for HCC, but little was known about its influence on survival after diagnosis.

The new study, just published in the Journal of the National Cancer Institute, identified a previously unrecognized disparity in survival after a diagnosis of liver cancer across all major racial/ethnic groups, with immigrants having better survival compared to those born in the U.S. This study is one of the first to robustly address nativity status as a predictor of overall survival for adults with HCC and provides important estimates of HCC survival by region of birth.

“Liver cancer is one of few cancers with increasing deaths. We identified a novel disparity by birthplace, whereby immigrants with liver cancer demonstrated better survival than their U.S.-born counterparts,” said study author Kali Zhou, MD, member of the Cancer Epidemiology Program at the USC Norris cancer center and a transplant hepatologist specializing in the treatment of chronic liver disease at Keck Medicine of USC. “This was true across different racial/ethnic groups. This finding is important as liver cancer rates are rising among U.S.-born and understanding why immigrants have better outcomes may help us create strategies to improve the survival of those born here.”

California has a high concentration of immigrants, representing about a quarter of the foreign-born population nationwide. This study used California Cancer Registry data to investigate whether birthplace impacts survival among patients with liver cancer, a cancer with poor prognosis that is common among immigrants, though rising in those born in the U.S.

Zhou, who is also an Assistant Professor of Clinical Medicine at the Keck School of Medicine of USC, and her colleagues identified 51,533 adults with HCC with available birthplace data in the California Cancer Registry between 1988 and 2017, of whom 20,400 were born outside the U.S. Cases were categorized as U.S.-born or foreign-born, then stratified by four mutually exclusive race and ethnicity groups: Hispanic, non-Hispanic (NH) White, NH Black, and NH Asian/Pacific Islander. Results showed that 40% of all HCC cases were among those born outside the U.S., and that their five-year survival rate was higher than that of U.S.-born patients with HCC across all four major race and ethnicity groups. Among foreign-born people, lower mortality was observed in those from Central and South America compared to Mexico for Hispanics, East Asia compared to Southeast Asia for Asian/Pacific Islanders, and East Europe and the Greater Middle East compared to West/South/North Europe for Whites.

The population-based California Cancer Registry provided a unique opportunity to compare HCC survival by nativity overall and by region of origin within individual race and ethnicity groups. Understanding the reasons for better survival among immigrants may help researchers address the disparity in survival rates by identifying ways to improve outcomes for people born in the U.S.

In addition to Kali Zhou, other contributors from the USC Norris Comprehensive Cancer Center and the Keck School of Medicine include Lihua Liu, PhD; Mariana Stern, PhD; Wendy Setiawan, PhD; Norah Terrault, MD, MPH; and Myles Cockburn, PhD.

This work was supported by the California Department of Public Health, the Centers for Disease Control and Prevention’s National Program of Cancer Registries, and the National Cancer Institute’s Surveillance, Epidemiology and End Results Program.


USC computer scientists tackle dental health and birth defects

Story Headline and Deck – USC News *
USC Computer Scientists Are Tackling Dental Health and Birth Defects
Body Copy *
Roughly half of all birth defects involve the face and skull, yet scientists remain unclear about why most occur.

The way to address tough medical challenges like this one is through data – lots of it. But how to best manage the data, integrate it into meaningful information, and create a comprehensive picture that is useful and accessible to researchers is another question. FaceBase offers an answer.

FaceBase is a research resource that provides open access to genetic, molecular and imaging data to the dental, oral and craniofacial (DOC) research community.

“Through FaceBase, USC is playing a role in the next generation of dental and craniofacial research,” said Carl Kesselman, FaceBase’s co-Principal Investigator. Kesselman is the William H. Keck Professor of Engineering, a Professor in the Daniel J. Epstein Department of Industrial and Systems Engineering, and Director of the Informatics Systems Research Division at the Information Sciences Institute (ISI) in the USC Viterbi School of Engineering.

He continued, “We are assembling all of the data, organizing the research community, and providing this service to the National Institute of Dental and Craniofacial Research [NIDCR] and the research community at large.”

The Basics of FaceBase
FaceBase is a collaborative NIDCR-funded project that houses comprehensive data in support of advancing research into craniofacial development and malformation.

Kesselman, who is an ISI Fellow, leads the team of researchers and staff at ISI who run FaceBase’s coordinating center (i.e., the Hub).

The Hub is where large datasets are curated and shared. Researchers in the DOC community can submit their projects to FaceBase, and datasets from approved projects are added.

How does a large database help research?

Rob Schuler, the technical lead for FaceBase and Senior Computer Scientist at ISI, gave examples of how some researchers are using FaceBase data: “to have a larger patient cohort; to compare their own clinical results with research being done on animal models; some of them do analysis and use the large datasets to train neural networks and produce models that can, for example, predict a phenotype based on a patient’s face.”

But FaceBase is more than just an ever-growing database.

FaceBase Connects the Dots
“We don’t think of FaceBase as a data repository, although we do operate a repository as part of FaceBase. But really, we are an overall data resource,” said Schuler.

One of the missions of the project is to facilitate cooperation and collaboration between the Hub and the craniofacial research community.

“There’s a desire to be able to use a data resource like FaceBase to assist researchers in making connections to other people who are possibly working on a similar disease,” said Schuler.

Kesselman said, “We connect the dots. In the absence of something like FaceBase, you have a little piece of data over here and a little piece of data over there, and you can’t figure out how they connect. But we do that. We take all these different aspects and research projects, we integrate them so that they’re more cohesive, and it represents more of the total knowledge of the community rather than isolated silos.”

The AADOCR Conference
Kesselman and Schuler, along with Computer Scientist Alejandro Bugacov, Research Engineer Cris Williams, and USC Ostrow School of Dentistry Associate Dean of Research Yang Chai (Co-PI of FaceBase), recently made a big impact at the American Association for Dental, Oral, and Craniofacial Research (AADOCR) conference in Portland, Oregon.

From March 15 to 18, 2023, the team showcased their work on data sharing and management in the FaceBase platform. Bugacov presented a poster and provided demos at the NIDCR Trainee Research Presentation, which highlighted the platform’s user-friendly interface and powerful search capabilities.

Meanwhile, Schuler presented two talks: an interactive talk on building FAIR data sharing communities (where he also served as session co-chair); and an invited talk in the Knowledge and Database Symposium.

Big Praise From the Biggest Name in DOC Research
The AADOCR conference also included a celebration of the 75th anniversary of NIDCR.

The National Institute of Dental Research – which would later become the NIDCR – was founded as one of the earliest institutes of the National Institutes of Health (NIH), created in response to the tooth decay epidemic during World War II. At the time, oral health was an issue of national security, as potential military recruits were being disqualified from service due to tooth decay.

Today, the mission of NIDCR is to “advance fundamental knowledge about DOC health and disease and translate these findings into prevention, early detection, and treatment strategies that improve overall health for all individuals and communities across the lifespan.” It has an annual budget of $475 million, funding approximately 770 grants, 6,500 researchers, 350 trainees and 200 organizations.

With NIDCR being such a major player in the DOC research community, the FaceBase team was particularly excited to hear how their project is valued by the institute. In a video commemorating 75 years of NIDCR, current NIH director and former NIDCR director Lawrence Tabak mentioned FaceBase as one of the top achievements during his tenure – an endorsement that speaks volumes about the impact FaceBase has had in advancing the field of dental and craniofacial research.

What’s Next for FaceBase?
“We’re in the third phase of FaceBase right now, and we’ve opened it up to more projects,” said Williams. She continued, “Previously, we had specific ‘spoke’ projects around us, the ‘Hub.’ Ten to 12 projects were contributing data at a time. But now it’s open to the community, which has definitely widened the scope even further of what we’re taking in and how we’re building up our database.”

How wide of a scope? Kesselman gave examples of two current projects: “We’ve got a large dataset contributed by colleagues looking at the genetic foundations of tooth enamel. And another large set of data from researchers studying oral health in Appalachia. They’ve looked at social factors, along with all kinds of various health factors associated with oral health.”

In addition to opening it up to more research, the FaceBase team is also looking at applications for clinicians, the people who are actually treating and diagnosing patients. Williams explained, “That’s not something FaceBase had focused on before. We’re working on a pilot project about what it would take to serve the clinician community, and that opens up a whole new frontier.”

Kesselman said, “We’ve been working with people in the Ostrow Dental School for the last eight years now, applying very sophisticated computer science and research that we’ve developed at ISI. And by applying it in this area we are ultimately making a real impact on dental health and childhood development.”

About the FaceBase Co-PIs
Carl Kesselman is the William H. Keck Chair of Engineering in the USC Viterbi School of Engineering and is a Professor in the Daniel J. Epstein Department of Industrial and Systems Engineering. He also holds joint appointments as Professor in Computer Science at the USC Viterbi School of Engineering, the Department of Population and Public Health Sciences in the Keck School of Medicine and in the Herman Ostrow School of Dentistry. He is the director of the Informatics Systems Research Division at ISI and an ISI Fellow, the institute’s highest honor.

Yang Chai is a University Professor and he holds the George and MaryLou Boone Chair in Craniofacial Molecular Biology. He is the director of the Center for Craniofacial Molecular Biology and is Associate Dean of Research in the Herman Ostrow School of Dentistry.


USC Viterbi team to participate in multi-university study on quantum computing



Story Headline and Deck – USC News *
Daniel Lidar to Lead MURI on Quantum Computing Research
The Department of Defense Multidisciplinary University Research Initiative Award will allow Daniel Lidar’s team to investigate techniques that may unlock quantum computing’s full potential.
Body Copy *
A research team led by Daniel Lidar, the holder of the Viterbi Professorship of Engineering and Professor in the Ming Hsieh Department of Electrical and Computer Engineering, has been named the recipient of a Multidisciplinary University Research Initiative (MURI) Award. These highly competitive and sought-after grants support basic research projects in areas of strategic importance to the Department of Defense. Lidar’s team will receive a maximum of $6.25 million over five years.

Lidar, who is the Director of the USC Center for Quantum Information Science and Technology, will be collaborating with colleagues at the Massachusetts Institute of Technology and Iowa State University — along with Dr. Robert Kosut, a quantum control expert at the company SC Solutions, and a separately funded team based in Australia led by Professor Kavan Modi — to investigate quantum error correction and quantum control. These techniques hold the promise of facilitating the development of quantum computers that can be exponentially faster than the best state-of-the-art classical computers for certain problems.

“Quantum computers have the potential to solve problems that are currently impossible for classical computers, like simulating complex chemical reactions or breaking modern cryptographic codes,” said Lidar. “However, one major challenge in building a practical quantum computer is dealing with errors.”

By researching improvements in quantum error correction and quantum control, Lidar and his team aim to overcome the challenges posed by errors and the delicate nature of quantum systems.

Minimizing errors

Errors in quantum computing can arise from various sources, such as the environment (heat, radiation or magnetic fields) or imperfections in the hardware. These errors can cause qubits — which are the fundamental units of information in quantum computing — to lose their fragile quantum state or introduce unwanted changes, potentially ruining the computation. That’s where quantum error correction comes in.

One widely used method is the error-correcting code approach, which involves encoding the information of a single qubit across multiple “physical” qubits. These extra qubits essentially provide redundancy so that if an error occurs, it can be detected and corrected without losing the original information.

“Imagine a game of ‘telephone,’ where a message is passed down a line of people,” says Lidar. “If each person only whispers to the next one, errors can easily creep in. But if everyone repeats the message to multiple neighbors who share the messages they received, it becomes easier to identify and correct any mistakes. Quantum error correction works in a similar way, but with qubits and quantum correlations called entanglement instead of correlated people.”
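The redundancy idea behind Lidar’s “telephone” analogy can be illustrated with a toy classical repetition code. This is only a hypothetical sketch: real quantum error correction encodes information using entanglement and extracts error syndromes without directly reading the protected data, which a classical simulation cannot capture.

```python
# Toy classical analogue of the redundancy behind quantum error correction:
# a 3-bit repetition code decoded by majority vote. (Illustrative only;
# quantum codes use entanglement and syndrome measurements instead.)
import random

def encode(bit):
    """Copy one logical bit into three physical bits."""
    return [bit, bit, bit]

def noisy_channel(bits, flip_prob):
    """Independently flip each bit with probability flip_prob."""
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    """Majority vote: recovers the logical bit if at most one bit flipped."""
    return 1 if sum(bits) >= 2 else 0

if __name__ == "__main__":
    random.seed(0)
    flip_prob, trials = 0.1, 100_000
    raw_errors = sum(noisy_channel([0], flip_prob)[0] != 0
                     for _ in range(trials))
    coded_errors = sum(decode(noisy_channel(encode(0), flip_prob)) != 0
                       for _ in range(trials))
    print(raw_errors / trials)    # near 0.10 (the raw flip probability)
    print(coded_errors / trials)  # near 0.028 (theory: 3p^2 - 2p^3)
```

With a 10% flip probability per bit, the encoded error rate drops to roughly 2.8%, since two of the three copies must fail before the majority vote is wrong.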

Ensuring accuracy

Lidar’s team will be looking at how quantum error correction intersects with quantum control, which involves manipulating quantum systems to perform specific tasks or computations. Quantum control focuses on the precise control of qubits to ensure that the desired quantum operations are executed with high accuracy.

“The need for quantum control arises because it’s crucial to accurately perform the quantum operations while minimizing errors and maintaining the qubits’ coherence, which is the ability to maintain their quantum state,” said Lidar.

Achieving precise quantum control is challenging because quantum systems are so prone to errors. Lidar and his team will be exploring how to improve the effectiveness of quantum control approaches, including open-loop and closed-loop control, in dealing with unexpected errors.

Leading the charge

This is the second MURI Award team that Lidar will be leading. The current project will build on results from the quantum computing research he spearheaded with a MURI Award in 2011.

Lidar, who has also been the recipient of a Guggenheim Fellowship for his groundbreaking work in quantum computing, notes that his research group at USC Viterbi has a longstanding collaboration with the researchers at both MIT and Iowa State University, dating back to the previous MURI Award and even earlier in the case of MIT.

“It’s incredibly exciting to have our team selected for this award,” said Lidar. “We’ve assembled some of the top people globally working at the intersection of quantum error correction and quantum control and worked long and hard to put together a competitive proposal. We’re all very gratified that our ideas were selected for funding, and we’re eager to start work on them as a team.”

