
Editorial

AI in medicine: creating a safe and equitable future


The meteoric progress of generative artificial intelligence (AI)—such as OpenAI's ChatGPT, capable of holding realistic conversations, or other models that create realistic images and video from simple prompts—has renewed interest in the transformative potential of AI, including for health. It has also sparked sobering warnings. Addressing the UN Security Council in July, Secretary General António Guterres spoke of the "horrific levels of death and destruction" that malicious AI use could cause. How can the medical community navigate AI's substantial challenges to realise its health potential?

AI in medicine is nothing new. Non-generative machine learning can already perform impressively at discrete tasks, such as interpreting medical images. The Lancet Oncology recently published one of the first randomised controlled trials of AI-supported mammography, demonstrating a similar cancer detection rate and a nearly halved screen-reading workload compared with unassisted reading. AI has driven progress in infectious diseases and molecular medicine and has enhanced field-deployable diagnostic tools. But the medical applications of generative AI remain largely speculative. Automation of evidence synthesis and identification of de novo drug candidates could expedite clinical research. AI-enabled generation of medical notes could ease the administrative burden for health-care workers, freeing up time to see patients. Initiatives such as the Bill & Melinda Gates Foundation's Global Grand Challenges seek innovative uses of large language models in low-income and middle-income countries (LMICs).

These advances come with serious risks. AI performs best at well defined tasks and when models can augment rather than replace human judgement. Applying generative AI to heterogeneous data is complicated. The black box nature of many models makes it challenging to appraise their suitability and generalisability. Large language models can make mistakes easily missed by humans or hallucinate non-existent sources. Transfer of personal data to technology firms without adequate regulation could compromise patient privacy. Health equity is a particularly serious concern. Algorithms trained on health-care datasets that reflect bias in health-care spending, for example, worsened racial disparities in access to care in the USA. Most health data come from high-income countries, which could bias models, exacerbating historical injustice and discrimination when used elsewhere. These issues all risk eroding patient trust.

How then to ensure that AI is a force for good in medicine? The scientific community has a key role in the rigorous testing, validation, and monitoring of AI. The UN is assembling a high-level advisory body to build global capacity for trustworthy, safe, and sustainable AI; it is crucial that health and medicine are well represented. An equitable approach will require a diversity of local knowledge. WHO has partnered with the International Digital Health and AI Research Collaborative to boost participation from LMICs in the governance of safe and ethical AI in health through cross-border collaboration and common guidance. But without investment in local infrastructure and research, LMICs will remain reliant on AI developed in the USA and Europe, and costs could be prohibitive without open-access alternatives. At present, the pace of technological progress far outstrips the guidance, and the power imbalance between the medical community and technology firms is growing.

Allowing private entities undue influence is dangerous. The UN Secretary General has urged the Security Council to help ensure transparency, accountability, and oversight of AI. Regulators must act to ensure safety, privacy, and ethical practice. The EU's AI Act, for example, will require high-risk AI systems to be assessed before approval and subjected to monitoring. Regulation should be a key concern of the first major global summit on AI safety, being held in the UK later this year. Although technology companies should be part of the regulatory conversation, there are already signs of resistance. Amazon, Google, and Epic have objected to proposed US rules to regulate AI in health technologies. The tension between commercial interests and transparency risks compromising patient wellbeing, and marginalised groups will suffer first.

There is still time for us to create the future we want. AI could continue to bring benefits if integrated cautiously. It could change practice for the better as an aid—not a replacement—for doctors. But doctors cannot ignore AI. Medical educators must prepare health-care workers for a digitally augmented future. Policy makers must work with technology firms, health experts, and governments to ensure that equity remains a priority. Above all, the medical community must amplify the urgent call for stringent regulation. ■ The Lancet

See World Report page 517
For the AI-assisted mammography trial see Articles Lancet Oncology 2023; 24: 936–44
For more on AI in infectious diseases see Science 2023; 381: 164–70
For more on AI in molecular medicine see N Engl J Med 2023; 388: 2456–65
For more on generative AI in medicine see Comment Lancet Digit Health 2023; 5: e107–8
For more on the Global Grand Challenges see https://gcgh.grandchallenges.org/challenge/catalyzing-equitable-artificial-intelligence-ai-use
For more on the dangers of biased health data see Science 2019; 366: 447–53
For more on WHO's efforts to improve access to AI see https://www.who.int/news/item/06-07-2022-who-and-i-dair-to-partner-for-inclusive-impactful-and-responsible-international-research-in-artificial-intelligence-and-digital-health
For more on the UN Secretary General's remarks see https://press.un.org/en/2023/sgsm21880.doc.htm
For more on the AI Act see https://artificialintelligenceact.eu
For more on the global summit on AI see https://www.gov.uk/government/news/uk-to-host-first-global-summit-on-artificial-intelligence

www.thelancet.com Vol 402 August 12, 2023 503
