The pharmaceutical giant says the drug's effectiveness may vary among different racial and ethnic groups.
Black Americans, for example, are twice as likely as white Americans to develop multiple myeloma, yet they make up only 4.8% of participants in clinical trials for the bone marrow cancer. Johnson & Johnson says it has already increased that share with the help of artificial intelligence.
The company used algorithms to identify centers that Black patients with the disease could actually reach, raising Black participation in five studies to roughly 10%. The established academic centers and clinics that have traditionally run trials are often hard for minority or low-income patients to get to because of distance or cost.
In total, Johnson & Johnson is using AI to diversify 50 different trials, with plans to double that number next year.
One skin disease study that let patients participate remotely, using phone screenshots and electronic consent forms, increased Black participation to about 50%.
“A machine learning algorithm calculates and creates a heat map of where the patients eligible for this study are located,” explains Najat Khan, chief data scientist for the pharmaceutical division.
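The article doesn't describe J&J's pipeline in any detail, but the heat-map idea Khan describes can be sketched simply: bin eligible patients' locations into grid cells and count them. The records, coordinates, and cell size below are invented for illustration.

```python
from collections import Counter

# Hypothetical eligibility records: (patient_id, lat, lon, meets_criteria).
# All data here is made up; this is not J&J's actual pipeline.
records = [
    ("p1", 33.75, -84.39, True),   # Atlanta area
    ("p2", 33.76, -84.40, True),
    ("p3", 41.88, -87.63, True),   # Chicago area
    ("p4", 41.87, -87.62, False),  # screened but not eligible
    ("p5", 29.76, -95.37, True),   # Houston area
]

def heat_map(records, cell_deg=0.5):
    """Bin eligible patients into lat/lon grid cells.

    Returns a Counter mapping (lat_cell, lon_cell) -> eligible-patient
    count, i.e. the density surface a recruiting team could render as
    a heat map.
    """
    counts = Counter()
    for _, lat, lon, eligible in records:
        if eligible:
            cell = (round(lat / cell_deg), round(lon / cell_deg))
            counts[cell] += 1
    return counts

hm = heat_map(records)
# The hottest cell marks where a new trial site would reach the most
# eligible patients.
hottest_cell, n_eligible = hm.most_common(1)[0]
```

In practice the inputs would be de-identified medical records and the eligibility test would encode the trial's inclusion criteria; the binning and ranking logic stays the same.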
In recent years, there has been growing evidence that a drug can affect different groups of people differently, and the Covid-19 pandemic has highlighted deep ethnic disparities in access to care. Regulators and advocacy groups have pressured drug makers to include underrepresented racial and ethnic groups in new trials, not only to improve biomedical knowledge but also to build trust in medical systems among minority groups. Many companies are turning to AI for help.
A recent analysis in the journal Health Affairs found that fewer than 20% of drugs approved in 2020 had data on treatment benefits or side effects for Black patients. Financially and socially, a lack of diversity in trials will cost the US “billions of dollars” over the next three decades, a report from the National Academies of Sciences, Engineering and Medicine says.
About 75% of participants in clinical trials of new drugs approved in 2020 were white, 11% were Hispanic and 8% were Black.
Since the 1980s, researchers have reported that white patients tend to respond better than Black patients to beta blockers, a class of antihypertensive drugs, and to ACE inhibitors, a widely used class of heart disease drugs. Other studies have shown that Asian cancer patients treated with immune checkpoint inhibitors targeting PD-1 and PD-L1 have significantly better survival rates.
Clinical trials are difficult to conduct because they involve coordination with multiple parties: patients, hospitals, and contract research companies. So pharmaceutical companies have often simply relied on well-known academic medical centers, where the population may not be as diverse.
At the same time, computer algorithms can help researchers quickly sift through vast amounts of data on past medical research, search millions of medical records from patients around the world, and quickly estimate the prevalence of diseases in populations. The data could help drugmakers find new networks of doctors and clinics with access to more diverse patients who fit more easily into their clinical trials—sometimes months faster and cheaper than if people combed through the data.
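The site-finding step described above can be sketched as a simple ranking: given per-clinic summaries of eligible patients, score each clinic by the number of enrollees it could contribute from underrepresented groups. The clinic names, field names, and figures are assumptions for illustration, not real data.

```python
# Hypothetical clinic-level summaries derived from de-identified records.
clinics = [
    {"name": "Clinic A", "eligible": 120, "underrepresented_share": 0.55},
    {"name": "Clinic B", "eligible": 400, "underrepresented_share": 0.08},
    {"name": "Clinic C", "eligible": 150, "underrepresented_share": 0.40},
]

def rank_sites(clinics):
    """Rank sites by expected enrollees from underrepresented groups.

    A site with fewer eligible patients overall can still outrank a
    large academic center if more of its patients come from groups the
    trial is missing.
    """
    return sorted(
        clinics,
        key=lambda c: c["eligible"] * c["underrepresented_share"],
        reverse=True,
    )

best_site = rank_sites(clinics)[0]["name"]
```

Here the large "Clinic B" scores lowest despite having the most eligible patients, which is exactly the trade-off the article describes against defaulting to well-known academic centers.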
However, artificial intelligence poses new challenges for drug makers, because the technology risks amplifying so-called algorithmic bias. In 2019, for example, scientists reported unintentional racial bias in a software product sold by Optum. The algorithm based its predictions on patients' past healthcare costs rather than on the severity of their illness or their medical needs. According to a study of the algorithm's effects at one of the company's centers, published in the journal Science, Black patients made up only 18% of those flagged for additional help, when a need-based ranking would have flagged about 47%.
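The failure mode in the Optum case is easy to reproduce in miniature: if one group has historically lower spending (for instance because of access barriers), ranking patients by cost flags fewer of them than ranking by health need, even when needs are identical. The patients and numbers below are invented to show the mechanism.

```python
# Toy illustration of cost-as-proxy bias. Groups A and B contain
# patients with identical health needs, but group B has historically
# lower recorded spending. All numbers are made up.
patients = [
    {"id": 1, "group": "A", "need": 0.9, "cost": 9000},
    {"id": 2, "group": "B", "need": 0.9, "cost": 4000},
    {"id": 3, "group": "A", "need": 0.3, "cost": 5000},
    {"id": 4, "group": "B", "need": 0.3, "cost": 2000},
]

def flag_top(patients, key, k=2):
    """Flag the k patients ranked highest by the given field."""
    ranked = sorted(patients, key=lambda p: p[key], reverse=True)
    return {p["id"] for p in ranked[:k]}

by_cost = flag_top(patients, "cost")  # misses the high-need group-B patient
by_need = flag_top(patients, "need")  # catches high-need patients in both groups
```

Ranking by cost flags both group-A patients, including a low-need one, while the high-need group-B patient is passed over; ranking by need flags the two sickest patients regardless of group.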
The study authors say such bias is typical of the risk prediction tools that health centers and government agencies use to manage care for some 200 million people nationwide, and that it is likely at work in other software as well.
At the same time, critics are suggesting alternatives to artificial intelligence. Otis Brawley, a professor of oncology at Johns Hopkins University, believes that remote testing, or providing transportation or parking vouchers for participants, would attract more people than algorithms. Black people in the U.S. are disproportionately poor, and the hospitals that care for them often lack the capacity for extra projects such as clinical trials, he says.
“Artificial intelligence can do this, just like I could if I were allowed to pay for people to park, as I have done in many places,” Brawley says.
Walgreens Boots Alliance, which began conducting clinical trials for drug makers in 2022, has a different approach to promoting diversity in research. The company uses AI tools to quickly find relevant patients from different groups, but relies on local pharmacists at its nearly 9,000 U.S. stores.
“We have posters and flyers with information about the tests. Pharmacists also talk to patients about this,” says Ramita Tandon, who heads the clinical trials business at the pharmacy chain.
The method helped increase participation of Black patients in one cardiovascular study to 15%, Tandon says, a share higher than the percentage of Black people in the general population. The FDA's new diversity requirements have sparked a lot of interest in Walgreens' clinical trials from big pharmaceutical companies, she says.
In other countries, the use of artificial intelligence goes beyond race and ethnicity. Japanese drug maker Takeda Pharmaceutical, for example, uses AI to personalize complex consent letters for patients from minority groups such as the LGBT community. The technology can adjust language to how people identify by gender and sexual orientation, building more trust in the process.
New York-based H1, which uses generative artificial intelligence to match drug makers with trial sites, says it is working to remove bias from the data it collects. Its race and ethnicity data may come from sources such as credit cards and bank statements, for example, which can miss people with less financial means, says Ariel Katz, H1's chief executive officer.
“We're doing a lot of work to make sure our data sets are comprehensive and not biased, but there's still a lot of work to be done,” he says.
J&J now has a separate artificial intelligence ethics board, which includes scientists who oversee the trials to remove bias from the data.
“Our team probably spends 60% or 70% of its time on this aspect versus anything else: making sure the data is fit for purpose, relevant and representative, and if not, purchasing other data sets to make it so,” the company said.