
CORRESPONDENCE

Letter: Machine Learning and Artificial Intelligence in Neurosurgery: Status, Prospects, and Challenges

To the Editor:

I read with great interest the article by Dagi and colleagues.1 I would like to congratulate the authors on introducing artificial intelligence (AI) to neurosurgeons and presenting a comprehensive overview of AI concepts, applications, and challenges to the adoption of AI algorithms in clinical practice.

AI has the potential to revolutionize diagnostics, clinical decision making, prognostication, and operative interventions in neurosurgery, and it has an important role to play in advancing precision medicine for neurosurgery in the future.2-6 Dagi et al highlighted that, as of 2018, only 23 neurosurgical studies reported using machine learning (ML) or AI,7 although this figure is likely an underestimate and may be closer to 70 to 155 studies in recent years.8,9 Despite the exponential increase in the number of publications on the use of AI in medicine and neurosurgery,10,11 actual clinical applications and medical devices using AI algorithms remain limited.12,13 There is still a large gap between expectations of AI and the reality of applying AI in neurosurgical clinical practice.

In their article, Dagi et al1 introduced the need for validation and verification of AI algorithms and highlighted the regulatory challenges of translating AI algorithms into real-world applications. To bridge this translational gap, I would like to emphasize, first, that the development of accurate and unbiased AI algorithms is needed.14 This allows the neurosurgical community to trust the use of AI algorithms in clinical practice and mitigates the risks of unintended and negative consequences of using AI in neurosurgery.5 These concerns are real: existing reviews of AI algorithms in neurosurgery have highlighted methodological issues related to training models on biased data and to applying models outside the populations intended for clinical use.7,15,16 Furthermore, the risks of widespread implementation of inaccurate AI algorithms were illustrated by a recent example of a proprietary sepsis prediction model that showed poor discrimination and calibration in predicting the onset of sepsis, raising fundamental concerns about its nationwide use.17 Thus, for AI algorithms to be translated into real-world applications in neurosurgery, challenges related to algorithm development and ML science would need to be addressed.18
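For illustration, the short sketch below shows, on entirely synthetic data, how these two properties might be checked when a binary risk model is evaluated on an external cohort: discrimination (how well the model separates patients who develop the outcome from those who do not, summarized by the area under the ROC curve) and calibration (how closely predicted risks match observed event rates). It is a minimal example in Python using scikit-learn; the cohorts, predictors, and model are hypothetical and are not drawn from the sepsis study or any other work cited in this letter.

import numpy as np
from sklearn.calibration import calibration_curve
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import brier_score_loss, roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical development and external-validation cohorts (synthetic stand-ins:
# 5 unnamed predictors, binary outcome).
X_dev, y_dev = rng.normal(size=(500, 5)), rng.integers(0, 2, 500)
X_ext, y_ext = rng.normal(size=(200, 5)), rng.integers(0, 2, 200)

model = LogisticRegression().fit(X_dev, y_dev)
risk_ext = model.predict_proba(X_ext)[:, 1]  # predicted risks in the external cohort

# Discrimination: area under the ROC curve (0.5 = chance, 1.0 = perfect separation).
print("AUROC:", round(roc_auc_score(y_ext, risk_ext), 2))

# Calibration: Brier score, plus observed vs predicted event rates across 5 risk bins.
print("Brier score:", round(brier_score_loss(y_ext, risk_ext), 2))
observed, predicted = calibration_curve(y_ext, risk_ext, n_bins=5)
for obs, pred in zip(observed, predicted):
    print(f"predicted risk {pred:.2f} -> observed event rate {obs:.2f}")

In a well-performing model, the AUROC would be well above 0.5 and the observed event rates would track the predicted risks; the sepsis prediction model cited above was reported to fall short on both counts in external validation.17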
In order to facilitate the development of AI algorithms for use in neurosurgery, the neurosurgical community would need to better understand the algorithm development pipeline,19 AI performance metrics,20,21 and potential bias in AI model development,22 and to collaborate with data scientists to address problems suited for AI solutions in neurosurgery. As neurosurgeons best understand the myriad clinical problems related to the management of neurosurgical diseases, they play a pivotal role in guiding the development of AI algorithms for clinical application in neurosurgery. Importantly, before widespread adoption and translation into clinical practice can occur, AI algorithms would also need to be understood and trusted by neurosurgeons.18 Thus, the perceived opacity of AI algorithms, described by Dagi et al as “a black box whose internal logic may be elusive,” is a barrier to the translation of AI algorithms into neurosurgical practice.23,24

To enable the neurosurgical community to better understand how AI algorithms work and to reach the minimum level of trust that would allow AI algorithms to be applied in neurosurgical practice, I would like to introduce the concept of explainable AI.25-27 Explainable AI focuses on understanding and interpreting the behavior of AI algorithms, aiming to provide insight into how and why these algorithms produce their predictions while maintaining high predictive performance.28,29 Thus, an AI algorithm is explainable if neurosurgeons are able to understand how it arrives at its predictions. Methods used to develop explainable AI may be classified into explainable modeling and post-hoc explanations, and these methods can provide either global or local explanations of the algorithms being used,28 as illustrated in the sketch that follows. Examples of explainable AI have been reported in the medical literature,30-32 and in neurosurgery an explainable deep-learning algorithm has been developed for the detection of acute intracranial hemorrhage on noncontrast computed tomography scans.33 Further developments in the field of explainable AI may allow both neurosurgeons and neurosurgical patients to better understand how AI algorithms work and may enable the translation of these algorithms into neurosurgical practice.
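As a minimal illustration of this classification, the sketch below (again Python with scikit-learn on synthetic data; the predictor names, models, and outcome are hypothetical and not taken from any cited study) contrasts explainable modeling, in which an inherently transparent model such as logistic regression is used so that its coefficients can be inspected directly, with a post-hoc, global explanation of an opaque model; local, per-patient methods such as SHAP or LIME are noted in the final comment.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
features = ["age", "gcs", "hematoma_volume", "midline_shift"]  # hypothetical predictors
X = rng.normal(size=(300, len(features)))
# Synthetic outcome driven mainly by the last two predictors.
y = (X[:, 2] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=300) > 0).astype(int)

# Explainable modeling: an inherently transparent model whose coefficients can be read directly.
transparent = LogisticRegression().fit(X, y)
for name, coef in zip(features, transparent.coef_[0]):
    print(f"{name:16s} log-odds coefficient {coef:+.2f}")

# Post-hoc, global explanation of an opaque model: permutation importance measures how much
# the model's accuracy drops when each feature is shuffled, i.e. which inputs it relies on overall.
opaque = RandomForestClassifier(random_state=0).fit(X, y)
result = permutation_importance(opaque, X, y, n_repeats=10, random_state=0)
for name, importance in sorted(zip(features, result.importances_mean), key=lambda t: -t[1]):
    print(f"{name:16s} permutation importance {importance:.3f}")

# Local explanations (not shown): methods such as SHAP or LIME instead attribute a single
# patient's predicted risk to that patient's individual feature values.

Whichever approach is used, such explanations would still need to be checked against clinical reasoning before they can support the level of trust described above.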
Lastly, the true test of clinical applicability for AI algorithms may come in the form of robust clinical evaluation through real-world randomized controlled trials (RCTs).18,34 Such RCTs of AI are an invaluable tool for demonstrating causal inference and the effectiveness of AI algorithms in improving clinical outcomes in neurosurgical practice.34 Data scientists alone cannot conduct such RCTs in neurosurgical practice. Neurosurgeons are thus best positioned to lead the development of AI solutions in neurosurgery, to ensure that the neurosurgical community understands the intended use of these algorithms, and to translate these algorithms into real-world neurosurgical applications.

I hope that this letter complements Dr Dagi’s article1 by highlighting further challenges to the clinical application of AI algorithms in neurosurgery, introducing explainable AI to neurosurgeons, and encouraging the development of AI algorithms for neurosurgical practice in the future.

Funding

This study did not receive any funding or financial support.




Disclosures

The author has no personal, financial, or institutional interest in any of the drugs, materials, or devices described in this article.

Mervyn J. R. Lim, MBBS, MPH
Division of Neurosurgery
University Surgical Centre
National University Hospital
Singapore

REFERENCES
1. Dagi TF, Barker FG, Glass J. Machine learning and artificial intelligence in neurosurgery: status, prospects, and challenges. Neurosurgery. 2021;89(2):133-142.
2. Topol EJ. High-performance medicine: the convergence of human and artificial intelligence. Nat Med. 2019;25(1):44-56.
3. Bohr A, Memarzadeh K. The rise of artificial intelligence in healthcare applications [published online ahead of print: June 26, 2020]. Artif Intell Healthc. doi:10.1016/B978-0-12-818438-7.00002-2.
4. Davenport T, Kalakota R. The potential for artificial intelligence in healthcare. Future Healthc J. 2019;6(2):94-98.
5. Panesar SS, Kliot M, Parrish R, Fernandez-Miranda J, Cagle Y, Britz GW. Promises and perils of artificial intelligence in neurosurgery. Neurosurgery. 2020;87(1):33-44.
6. Perez-Breva L, Shin JH. Artificial intelligence in neurosurgery: a comment on the possibilities. Neurospine. 2019;16(4):640-642.
7. Senders JT, Arnaout O, Karhade AV, et al. Natural and artificial intelligence in neurosurgery: a systematic review. Neurosurgery. 2018;83(2):181-192.
8. Segato A, Marzullo A, Calimeri F, De Momi E. Artificial intelligence for brain diseases: a systematic review. APL Bioeng. 2020;4(4):041503.
9. Buchlak QD, Esmaili N, Leveque JC, et al. Machine learning applications to clinical decision support in neurosurgery: an artificial intelligence augmented systematic review. Neurosurg Rev. 2020;43(5):1235-1253.
10. Tran BX, Vu GT, Ha GH, et al. Global evolution of research in artificial intelligence in health and medicine: a bibliometric study. J Clin Med. 2019;8(3):360.
11. Secinaro S, Calandra D, Secinaro A, Muthurangu V, Biancone P. The role of artificial intelligence in healthcare: a structured literature review. BMC Med Inform Decis Mak. 2021;21(1):125.
12. Staartjes VE, Stumpo V, Kernbach JM, et al. Machine learning in neurosurgery: a global survey. Acta Neurochir (Wien). 2020;162(12):3081-3091.
13. Benjamens S, Dhunnoo P, Mesko B. The state of artificial intelligence-based FDA-approved medical devices and algorithms: an online database. NPJ Digit Med. 2020;3(1):118.
14. Parikh RB, Teeple S, Navathe AS. Addressing bias in artificial intelligence in health care. JAMA. 2019;322(24):2377-2378.
15. Gregorio T, Pipa S, Cavaleiro P, et al. Prognostic models for intracerebral hemorrhage: systematic review and meta-analysis. BMC Med Res Methodol. 2018;18(1):145.
16. Gravesteijn BY, Nieboer D, Ercole A, et al. Machine learning algorithms performed no better than regression models for prognostication in traumatic brain injury. J Clin Epidemiol. 2020;122:95-107.
17. Wong A, Otles E, Donnelly JP, et al. External validation of a widely implemented proprietary sepsis prediction model in hospitalized patients. JAMA Intern Med. 2021;181(8):1065-1070.
18. Kelly CJ, Karthikesalingam A, Suleyman M, Corrado G, King D. Key challenges for delivering clinical impact with artificial intelligence. BMC Med. 2019;17(1):195.
19. Ngiam KY, Khor IW. Big data and machine learning algorithms for health-care delivery. Lancet Oncol. 2019;20(5):e262-e273.
20. Menzies T, Pecheur C. Verification and validation and artificial intelligence. In: Advances in Computers. Vol 65. Elsevier; 2005:153-201.
21. Park SH, Choi J, Byeon JS. Key principles of clinical validation, device approval, and insurance coverage decisions of artificial intelligence. Korean J Radiol. 2021;22(3):442-453.
22. Gianfrancesco MA, Tamang S, Yazdany J, Schmajuk G. Potential biases in machine learning algorithms using electronic health record data. JAMA Intern Med. 2018;178(11):1544-1547.
23. Price WN. Big data and black-box medical algorithms. Sci Transl Med. 2018;10(471):eaao5333.
24. Burrell J. How the machine ‘thinks’: understanding opacity in machine learning algorithms. Big Data Soc. 2016;3(1):1-12.
25. Lauritsen SM, Kristensen M, Olsen MV, et al. Explainable artificial intelligence model to predict acute critical illness from electronic health records. Nat Commun. 2020;11(1):3852.
26. Tonekaboni S, Joshi S, McCradden M, Goldenberg A. What clinicians want: contextualizing explainable machine learning for clinical end use. In: Machine Learning for Healthcare—MLHC 2019; 2019:359-380.
27. Cadario R, Longoni C, Morewedge CK. Understanding, explaining, and utilizing medical artificial intelligence [published online ahead of print: June 28, 2021]. Nat Hum Behav. doi:10.1038/s41562-021-01146-0.
28. Markus AF, Kors JA, Rijnbeek PR. The role of explainability in creating trustworthy artificial intelligence for health care: a comprehensive survey of the terminology, design choices, and evaluation strategies. J Biomed Inform. 2021;113:103655.
29. Linardatos P, Papastefanopoulos V, Kotsiantis S. Explainable AI: a review of machine learning interpretability methods. Entropy. 2020;23(1):18.
30. Singh A, Sengupta S, Lakshminarayanan V. Explainable deep learning models in medical image analysis. J Imaging. 2020;6(6):52.
31. Jansen T, Geleijnse G, Van Maaren M, Hendriks MP, Ten Teije A, Moncada-Torres A. Machine learning explainability in breast cancer survival. Stud Health Technol Inform. 2020;270:307-311.
32. Payrovnaziri SN, Chen Z, Rengifo-Moreno P, et al. Explainable artificial intelligence models using real-world electronic health record data: a systematic scoping review. J Am Med Inform Assoc. 2020;27(7):1173-1185.
33. Lee H, Yune S, Mansouri M, et al. An explainable deep-learning algorithm for the detection of acute intracranial haemorrhage from small datasets. Nat Biomed Eng. 2019;3(3):173-182.
34. Angus DC. Randomized clinical trials of artificial intelligence. JAMA. 2020;323(11):1043-1045.

© Congress of Neurological Surgeons 2021. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

https://doi.org/10.1093/neuros/nyab337

