
What many people don't realize is that medicine as a whole is already some sort of expert system (i.e., a flavor of AI).

There are researchers who conduct experiments to produce meaningful data and extract conclusions from that data. Then there are expert panels that produce guidelines from the results of that research. Most diagnostics and treatments are prescribed following decision diagrams that doctors themselves call... algorithms!
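
As a toy illustration (invented conditions and cutoffs, not a real guideline or clinical advice), one branch of such a decision diagram looks like ordinary code:

    # Toy sketch of a guideline-style decision algorithm. The structure is the
    # point, not the medicine: every condition and cutoff here is made up.
    def chest_pain_triage(ecg_st_changes: bool, troponin_elevated: bool, age: int) -> str:
        if ecg_st_changes:
            return "activate emergency cardiac pathway"
        if troponin_elevated:
            return "admit, repeat troponin, cardiology consult"
        if age >= 40:
            return "observe and re-test before discharge"
        return "consider non-cardiac causes, discharge with follow-up"

    print(chest_pain_triage(ecg_st_changes=False, troponin_elevated=True, age=55))
    # admit, repeat troponin, cardiology consult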

There are several limitations that prevent us from applying other AI techniques to the
problem. Off the top of my head:

- We do not have the technology for machines to capture the contextual and communication nuances that doctors pick up on. There can be a world of difference between the exact same statement given by two different patients, or even by the same patient in two different situations. Likewise, the effect of a doctor's statement can be quite literally the opposite depending on who the patient is and their state of mind. One of the most important aspects of the GP's job is to handle these differences to achieve the best possible outcomes for their patients.

- Society at large is not ready to trust machines to make such intimately relevant decisions. It is not uncommon for patients to hide relevant information from their doctors, or to blatantly ignore their recommendations. This would be many times worse if the "doctor" part weren't human.

- We cannot apply modern inference techniques (e.g., deep learning) to the global problem because we have strict rules that prevent medical data collection and analysis without a clear purpose. Furthermore, these techniques tend to produce unexplainable results, which is unacceptable in this field. As a result, there's not enough political capital to relax those rules.


derefr 1 hour ago | root | parent | prev | next [–]

“Super important” — more like “super nice-to-have.” Hospitals don’t have any single
person on staff who stays attached to particular in-patients. Who knows you? Your
chart.

Yes, of course, hospital care would be better in many ways if we did have somebody
who statefully understood particular patients’ needs.
But what I’m saying is, the GPs in hospitals could be replaced with stateless diagnostic
AI without making hospital care any worse than it is now. And hospital care is a large
part of the medical system, so only replacing diagnostics there (while leaving
primary-care GPs alone) would still be a major optimization, freeing many doctors to
provide better care, go into specialties, etc.


nradov 1 hour ago | root | parent | next [–]

That's simply false. You obviously have no idea how hospital care is actually delivered.
To start with, every admitted patient has an assigned attending physician who is
responsible for coordinating the care team. Some things can be documented in the patient chart, but there are always gaps. Clinical decision support systems for partially automating diagnosis could potentially be helpful in some limited circumstances, but the ones built so far mostly don't work very well.

robbiep 23 minutes ago | root | parent | next [–]

I’ll second how misguided that view of hospital care is. There is ALWAYS a treating team, always an admitting consultant/attending.

HaZeust 3 hours ago | root | parent | prev | next [–]

Knowing the ontology of your patients and their risk is also a core part of a doctor's job, but we can do it with AI too. Hell, ontological engineering had a revamp specifically so that we could have a standardized model to describe any and all "parts" of a "whole" in a way that machines could understand.
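
As a rough illustration of what "machine-readable parts of a whole" means, here is a minimal plain-Python sketch with invented terms (a real system would use a standard clinical ontology such as SNOMED CT, not this toy dictionary):

    # Toy partonomy: part-of relations stored as machine-readable pairs,
    # plus a query that walks from a part up to the whole it belongs to.
    PART_OF = {
        "left_ventricle": "heart",
        "heart": "cardiovascular_system",
        "cardiovascular_system": "human_body",
    }

    def parts_chain(entity: str) -> list[str]:
        """Return the chain from an entity up through everything it is part of."""
        chain = [entity]
        while chain[-1] in PART_OF:
            chain.append(PART_OF[chain[-1]])
        return chain

    print(parts_chain("left_ventricle"))
    # ['left_ventricle', 'heart', 'cardiovascular_system', 'human_body']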

nradov 1 hour ago | root | parent | next [–]


No, we really can't do that with AI yet. Current AI technology is nowhere near that level.

telxosser 2 hours ago | root | parent | prev | next [–]

What data is being collected on you? A once-a-year blood test, if even that?
I actually suspect it would be trivial to beat my doctor after 5 years of higher-frequency full blood panel data collection.

Take 10 full blood panel samples a year, have 20 million people do that, and you have a data set we can do classification on. I think my doctor is kind of out of business then.
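
A minimal sketch of the kind of classification this imagines, using entirely synthetic data in place of real blood panels (the features, label, and model choice are all assumptions for illustration):

    # Hypothetical example: classify an outcome from longitudinal blood-panel
    # features (say, per-marker means and trends over years of samples).
    # Everything here is randomly generated; this is a sketch, not a clinical model.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_people, n_features = 20_000, 40  # e.g., 20 markers x (mean, slope)
    X = rng.normal(size=(n_people, n_features))
    # Synthetic label loosely driven by a couple of features, so there is something to learn.
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n_people) > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))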

It will never happen in my lifetime, though, given health insurance and health bureaucracy.


nradov 1 hour ago | root | parent | next [–]

Beat your doctor on what? You can already get 10 full blood panel tests per year if you want; just pay for them out of pocket, no insurance needed. But what will you do with the data? For most people the results won't tell you anything useful.
https://www.ondemand.labcorp.com/lab-tests/comprehensive-hea...


zo1 9 minutes ago | root | parent | prev | next [–]

It won't happen primarily due to government regulation. Medical information has "dangerous, don't touch this" written all over it, and everyone is scared to try.

chromatin 4 hours ago | root | parent | prev | next [–]

It also helps to have a relationship with a patient (or person).


There are some people who will never, ever complain about anything. When they
complain of severe abdominal pain, for example, you pull out all the stops immediately
to figure out what's wrong, because it's probably really bad.

On the other hand, there are hypochondriacs and people with low pain tolerance. While they can certainly also become seriously ill (and one must never forget this), the tempo and pace of the workup and the order of interventions are markedly different, absent other information that shifts the pretest probabilities.
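
To make the "pretest probability" point concrete, here is a minimal sketch with invented numbers (not clinical values): the same positive test result moves the posttest probability very differently depending on the prior you start from.

    # Combine a pretest probability with a test's positive likelihood ratio,
    # i.e. sensitivity / (1 - specificity), to get a posttest probability.
    # The numbers below are made up purely for illustration.
    def posttest_probability(pretest: float, likelihood_ratio: float) -> float:
        pretest_odds = pretest / (1 - pretest)
        posttest_odds = pretest_odds * likelihood_ratio
        return posttest_odds / (1 + posttest_odds)

    lr_positive = 8.0  # hypothetical positive likelihood ratio for some test
    for pretest in (0.02, 0.30):  # low prior (frequent complainer) vs. higher prior (stoic patient)
        print(pretest, "->", round(posttest_probability(pretest, lr_positive), 2))
    # 0.02 -> 0.14  (same positive result, serious disease still unlikely)
    # 0.3 -> 0.77  (same positive result, serious disease now probable)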


KerrAvon 3 hours ago | root | parent | next [–]

Sometimes a relationship is bad. If you think someone’s a hypochondriac, but in fact they’re unusually sensitive, you’ll dismiss a lot of what they say, and that can be quite damaging over time. (Especially if they’re female: https://www.health.harvard.edu/blog/women-and-pain-dispariti...).

I wouldn’t eliminate GPs from the process, but many people actually would like to hear what the robots have to say about their medical conditions. Having second opinions of this sort available might lead to better patient outcomes.


nradov 1 hour ago | root | parent | next [–]

There is no evidence that diagnostic robots would actually produce better outcomes.
The hypochondriacs are already able to Google their symptoms and make themselves
sick with anxiety.

Spooky23 2 hours ago | root | parent | prev | next [–]

Lol. Maybe people who don’t have any medical problems.


There isn’t enough humanity in healthcare to begin with. Replacement of doctors with AI
sounds pretty horrific. General practice isn’t where healthcare costs are going bonkers,
and it seems weird to want to cost-cut something that actually kind of works in favor of
bullshit.

Know what would be a great use of AI? Something real, like analyzing all of the telemetry in EMRs to give doctors better guidance so they can proactively help people. Some CVSHealth chatbot telling me whatever is a waste of time.


Elof 1 hour ago | root | parent | next [–]

I think the person was suggesting using the results of the AI to inform the doctors, not replacing them, which is something I would like as well.

nradov 57 minutes ago | root | parent | next [–]

Automated diagnosis applications have existed for decades. They have proven useful in limited circumstances for certain specialties and rare conditions, but for routine medical care they're more hassle than they're worth.

Aeolun 3 hours ago | root | parent | prev | next [–]

> On the other hand, there are hypochondriacs


That’s me. I really, really appreciate a GP who both understands that I’m not doing it on purpose and can either reassure me that nothing is wrong or figure out that we actually do need more testing this time.

Unfortunately it’s been years since I had one like that :/


PragmaticPulp 49 minutes ago | parent | prev | next [–]


> My mom worked for a GP for about 20 years, and it seemed to me that most of what made that guy a doctor was bedside manner + being able to remember a lot of things.

That’s exactly right, but there’s nothing wrong with that.

A good doctor’s memory of patients spanning decades of a career, and of all the various treatments those patients did or did not respond to, is very valuable. It’s a good thing that doctors offload as much as possible to other people so they can focus on doing what they do best.
