With the Trump administration making sweeping cuts to staff and research grants at science-related agencies, artificial intelligence could offer a tempting way to keep labs going. But scientists say there are limits to the technology’s uses.
The Trump-appointed leaders of the National Institutes of Health, the U.S. Centers for Disease Control and Prevention and the Department of Health and Human Services have moved in recent months to cut grants that fund university research and laboratory needs.
The federal government may be eyeing artificial intelligence to bridge the gap created by these cuts. In February, the U.S. Department of Energy’s national labs partnered with AI companies OpenAI and Anthropic on a day-long event in which 1,000 scientists across various disciplines tested the companies’ AI models and shared feedback. Some figures in Trump’s cabinet have suggested that artificial intelligence models may help fill some of those gaps.
But scientists and builders of AI say it’s not that simple.
AI is already playing a major role in scientific discovery: last year’s Nobel Prize in Chemistry went to three scientists for discoveries that used AI to predict the shapes of proteins and to invent new ones.
But we aren’t looking at a future where algorithms replace researchers and doctors, said Jennifer Kang-Mieler, department chair and professor of biomedical engineering at Stevens Institute of Technology in New Jersey.
“It’s a tool they may use to enhance clinical decision-making,” she said. “But I think that clinical expertise is not going to be something that we can completely match with AI.”
Kang-Mieler and other researchers say AI has its limitations but is playing an increasingly important role in analyzing data, speeding up lab work, assisting in diagnostics, making personalized treatment plans and cutting some costs related to research.
AI uses in scientific labs and healthcare

Artificial intelligence technologies have been a part of some healthcare and laboratory settings, like image recognition in radiology, for at least a decade, said Bradley Bostic, chairman and CEO of healthcare software company hc1, based in Indiana. But Bostic said the industry is still early in exploring its uses.
“It feels to me similar to 1999, with the World Wide Web,” Bostic said. “We didn’t realize how early days it was. I think that’s where we are right now with AI and specifically in healthcare.”
While AI’s potential is broad, its best uses in scientific and healthcare settings are repetitive, operational tasks, Bostic said. Ideally, AI makes processes more efficient and frees up humans’ time for more important work.
Stephen Wong, the John S. Dunn Presidential Distinguished Chair in Biomedical Engineering at Houston Methodist, uses machine learning, deep learning and large language models every day in his lab, which researches cost-effective strategies for disease management.
He said he uses AI models for medical image analysis, for processing massive datasets in genomics, proteomics (the study of proteins) and drug screening, and for sifting through existing research and lab data. His goal is to cut down on tedious tasks and make sense of large-scale data.
“Even tasks like locating crucial information buried in lab notebooks, scientific publications and patents become far more efficient,” he said.
Efficiency is also the goal of Kang-Mieler’s research, which was funded last fall by an NIH grant. Kang-Mieler and colleague Yu Gan are developing an AI-powered diagnostic tool for retinopathy of prematurity (ROP), an eye disorder that can cause vision loss in premature infants.
There was a lack of quality human images for AI models to train on, Kang-Mieler said, so the team is using images of animal eyes that feature ROP to create “synthetic” images of what the condition would look like in humans. The neural networks in the AI model will learn to categorize those synthetic images and eventually assist eye doctors in spotting ROP. Before AI tools, this work would have been done by the human eye and taken much longer, Kang-Mieler said.
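The general idea behind training on synthetic data can be sketched in a few lines of Python. This is a toy illustration, not the Stevens group’s actual pipeline: each “image” is reduced to a single made-up feature value, the two classes and their distributions are invented for the example, and the classifier is a simple nearest-centroid rule trained entirely on synthesized samples.

```python
# Toy sketch of training a classifier on synthetic labeled data when
# real labeled images are scarce. Feature values and class means are
# hypothetical stand-ins for image-derived measurements.
import random

def synthesize(mean, spread, n, seed):
    """Generate n synthetic feature values for one class."""
    rng = random.Random(seed)
    return [rng.gauss(mean, spread) for _ in range(n)]

# Synthetic training data: healthy eyes vs. eyes showing ROP-like features.
healthy = synthesize(mean=1.0, spread=0.2, n=100, seed=1)
rop_like = synthesize(mean=2.0, spread=0.2, n=100, seed=2)

# Nearest-centroid classifier trained entirely on synthetic samples.
centroids = {
    "healthy": sum(healthy) / len(healthy),
    "suspected ROP": sum(rop_like) / len(rop_like),
}

def classify(feature):
    """Label a new (real) measurement by its closest class centroid."""
    return min(centroids, key=lambda label: abs(feature - centroids[label]))

print(classify(1.9))
```

The point of the sketch is the workflow, not the model: labeled synthetic examples stand in for the scarce real ones, and the trained classifier is then applied to real measurements.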
“The way I saw it was also that if we can be really successful in developing and doing this, we can actually take this into other types of diseases, rare diseases, that are hard to diagnose,” she said.
Automation and human capital

Many scientific labs require a lot of physical tasks, like handling liquids, following steps at specific times and sometimes handling hazardous materials. With AI algorithms and hardware, much of that work can be done without humans physically present, researchers at the University of North Carolina are finding.
Ron Alterovitz, the Lawrence Grossberg Distinguished Professor in the Department of Computer Science, has worked with Jim Cahoon, chair of the Department of Chemistry, on an approach to make lab work more autonomous. The pair have studied how an AI model could instruct an autonomous robot to execute lab processes, and then how AI models could turn experimental results into findings. Alterovitz called it a “make and test” model.
“So once people can set it in motion, the AI comes up with a design, the robotic automation will make and test it, and the AI will analyze the results and come up with a new design,” he said. “And this whole loop can essentially run autonomously.”
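The loop Alterovitz describes can be sketched in Python. This is a minimal toy, not the UNC system: the “design” is a single number, the “robot” is a function that scores it, and the “AI” simply proposes small variations and keeps whichever design tested best.

```python
# Toy sketch of the autonomous "make and test" loop: propose a design,
# make and test it, analyze the result, propose the next design.
import random

def propose(best_design, step=0.5):
    """AI stand-in: perturb the current best design."""
    return best_design + random.uniform(-step, step)

def make_and_test(design):
    """Robot stand-in: run the experiment and return a measured score."""
    return -(design - 3.0) ** 2  # hidden optimum at design = 3.0

def autonomous_loop(iterations=200, seed=0):
    random.seed(seed)
    best_design, best_score = 0.0, make_and_test(0.0)
    for _ in range(iterations):
        candidate = propose(best_design)       # AI designs
        score = make_and_test(candidate)       # robot makes and tests
        if score > best_score:                 # AI analyzes the results
            best_design, best_score = candidate, score
    return best_design

print(autonomous_loop())
```

Even this crude version converges toward the hidden optimum without a human in the loop, which is the property the researchers are after; real systems swap in chemistry-aware design models and physical robots.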
The pair have outlined several levels of automation a lab could deploy, from assistive automation, where tasks like liquid handling are automated and humans do everything else, all the way up to the fully autonomous loop Alterovitz described.
Alterovitz sees many benefits to automated labs. Robots offer a safer method of handling hazardous materials, and allow researchers to conduct experiments 24 hours a day, instead of just when lab techs are clocked in. The robots also provide high accuracy and precision, and can replicate experiments easily, he said.
“If you ask two different people to do the same synthesis process, there’ll be subtle differences in how they do some of the details that can lead to some variance in the results sometimes,” Alterovitz said. “With robots, it’s just done the same way every time, very repeatedly.”
While there are fears that AI and automation will cut jobs in science, Alterovitz said automation frees humans to do higher-level tasks. Many labs already face a shortage of trained technicians, who do a majority of the physical work involved.
AI-assisted labs will likely heighten the need for other types of jobs, like data scientists, AI specialists and interdisciplinary experts who can bridge technology with real-world scientific applications, Wong said.
To continue innovating and learning new things, labs will still need the “chemical intuition” and problem-solving skills that trained scientists have, Alterovitz said.
AI’s limitations

Kang-Mieler says AI’s current limitations are one reason the industry isn’t rushing to apply the technology to everything. AI models are only as good as the data sets they’re trained on, which can carry bias or incomplete information that won’t paint a full picture.
And AI models can’t do an essential function of researchers, Kang-Mieler said: discover new information.
“I suppose that AI models can help formulate new hypotheses, but I don’t think that capability is the same as discovery,” Kang-Mieler said. “Current AI models are not developed to make independent discoveries or have original thoughts.”
Bostic has built other technology companies in his career, but said the stakes in scientific research and healthcare are much higher. Inaccurate data in an AI model could lead to a missed diagnosis or other serious harm to a patient. He said the best approach is what he calls “reinforcement learning through human feedback.”
“This is where you don’t have models that are just running independent of people,” Bostic said. “You have the models that are complementing the people and actually being informed by the people.”
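One common way to keep people in that loop is confidence-based routing: model outputs the system is unsure about go to a human instead of being acted on automatically. The sketch below is a hypothetical illustration of that pattern, not hc1’s product; the model, the cases and the threshold are all invented for the example.

```python
# Toy sketch of human-in-the-loop triage: act on high-confidence model
# output, route low-confidence output to a human expert. The "model"
# and "reviewer" here are hypothetical stand-ins.
def model_predict(case):
    """Stand-in model: returns (diagnosis, confidence)."""
    return case["likely_diagnosis"], case["confidence"]

def human_review(case):
    """Stand-in for an expert who sees the patient's total picture."""
    return f"flagged for clinician review: {case['id']}"

def triage(case, threshold=0.9):
    diagnosis, confidence = model_predict(case)
    if confidence >= threshold:
        return f"auto-report: {diagnosis}"
    return human_review(case)  # low confidence -> human decides

print(triage({"id": "A-103", "likely_diagnosis": "normal", "confidence": 0.97}))
print(triage({"id": "A-104", "likely_diagnosis": "anomaly", "confidence": 0.55}))
```

The clinician’s decisions on flagged cases can then be fed back to retrain the model, which is the feedback half of the pattern Bostic describes.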
As the tech industry evolves, Bostic said, AI will play a role in shortening drug trials, providing patients more specialized care and helping research teams make do with fewer skilled workers. But it’s not a fix-all, set-it-and-forget-it solution.
“I don’t see a scenario where clinical decisions are being independently made by machines and there aren’t the experts, who are trained and seeing the total picture of what’s going on with the patient, involved with those decisions anytime soon,” he said.
This story is republished from