
CONTINUOUS NURSING CARE IN PSYCHIATRIC AND MENTAL HEALTH NURSING CARE IN THE INDUSTRIAL REVOLUTION 4.0

By: Dr Deeni Rudita Idris
Cirebon, Jawa Barat, Indonesia
Email: Deeni.idris@ubd.edu.bn
Jan 2020
SPEAKER'S PROFILE

Dr Deeni Rudita Idris
Assistant Professor, UBD

Education, training & research interests:

 BSc. (Hons) Adult Nursing (City University, UK), MSc. in Clinical Education (UEA, UK), Post-basic Diploma in Mental Health Nursing (Brunei), Diploma in Nursing (Brunei).

 Men's health: masculinities and health help-seeking behavior, HIV/AIDS & STI prevention measures, men and aging (elderly care), men and mental health.
CONTENTS

 Definition of Industrial Revolution 4.0

 Pros and Cons of IR 4.0

 How it has been utilised in our psychiatric nursing care

 How it would benefit the health care system and patient care

 Challenges of IR 4.0
INDUSTRIAL REVOLUTION 4.0 (FIR)

 The "Fourth Industrial Revolution" (FIR) is an age of advanced technology based on information and communication, e.g. the use of Artificial Intelligence (AI) and high-tech robotics.

 AI can seem daunting to some. However, AI in general has grown considerably over the past 50 years, and is the current driving force behind the Fourth Industrial Revolution.

 AI proposes improvements to almost every field that it touches, including the medical sciences.

 Experts say that we need to prepare for the FIR because it will change the way people work, how they consume, and even how they think.

 The industrial revolution led to changes in the labor market, with machines replacing human labor.
PHASES OF INDUSTRIAL
REVOLUTION
UTILIZATION OF AI IN
MENTAL HEALTH CARE
(1) USING AI FOR DETECTION, RISK FLAGGING AND PREDICTION

 In Europe, the WHO estimated that 44.3 million people suffer from depression and 37.3 million suffer from anxiety  Accessing mental health services can be a problem, and patients may have little interaction with doctors.

 AI could not only help with the diagnostics and early detection of mental health issues, but it could also participate meaningfully in the management of disorders.

 As compared to a human psychiatrist or psychologist, the most advantageous features of smart algorithms could be their anonymity and accessibility.
(1) USING AI FOR DETECTION, RISK FLAGGING AND PREDICTION – USING MOBILE APPS

 Smartphone-based applications have been developed in recent years that are able to proactively check on patients, be ready to listen and chat anytime, anywhere, and recommend activities that improve the users' wellbeing.

 More affordable than therapy itself, so people who could otherwise not access any counselling at all can still get some help.

(1) USING AI FOR DETECTION, RISK FLAGGING AND PREDICTION

 Newly developed mobile apps include Woebot, Wysa, Chatbot and Pacifica

 These apps ask patients 

 To answer a five- to 10-minute series of questions by talking into their phone

 About their emotional state

 To tell a short story

 To listen to a story and repeat it

 Patients are also given a series of touch-and-swipe motor skills tests

 The apps include tools such as meditation, relaxation, and mood and health tracking tools

 A referral is made if necessary (a minimal scoring sketch of this kind of check-in follows below)
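For illustration only, a minimal sketch of such a scored check-in and referral step, assuming a hypothetical question set, answer scale and referral threshold rather than the logic of Woebot, Wysa or any other named app:

```python
# Minimal mood check-in sketch. The questions, answer scale and referral
# threshold are illustrative assumptions, not any real app's algorithm.

QUESTIONS = [
    "Over the last week, how often did you feel down or hopeless?",
    "How often did you have little interest or pleasure in doing things?",
    "How often did you have trouble sleeping?",
]
SCALE = {"not at all": 0, "several days": 1,
         "more than half the days": 2, "nearly every day": 3}
REFERRAL_THRESHOLD = 5  # hypothetical cut-off for suggesting a human professional


def check_in(answers):
    """Score answers that match the scale and decide whether to suggest a referral."""
    score = sum(SCALE.get(a.strip().lower(), 0) for a in answers)
    if score >= REFERRAL_THRESHOLD:
        suggestion = "please contact your mental health service"
    else:
        suggestion = "try a 2-minute breathing exercise"
    return {"score": score,
            "refer_to_clinician": score >= REFERRAL_THRESHOLD,
            "suggestion": suggestion}


print(check_in(["several days", "nearly every day", "more than half the days"]))
```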


SMARTPHONE APPS TO MANAGE MENTAL HEALTH

 Woebot, an app-based mood tracker  It promises to meaningfully connect with the user, to show bits and pieces of empathy while giving you a chance to talk about your troubles to a virtual robot, and have some counseling in return.

 A chatbot that combines AI and principles from cognitive behavioral therapy with guided meditation, breathing, and yoga.

 The chatbot was developed in collaboration with researchers from Columbia and Cambridge universities, and aims to help users manage their emotions and thoughts.

 Still in trials to determine whether these apps can be used in clinical settings.
CONT..

 Woebot has undergone a trial in Norway: the team asked human clinicians to listen to and assess speech samples of 225 participants – half with severe psychiatric issues, half healthy volunteers – in Louisiana and Norway. They then compared those results to those of the machine learning system.

 The researchers are currently working to refine their measurements, and to see how the tool could be applied to a range of mental health conditions, from schizophrenia to mild cognitive impairment.

 Larger studies are needed to test generalizability.


APPS: CALORIE COUNTING & FITNESS DEVICES AND EATING DISORDERS

 This study explored associations between the use of calorie counting and fitness tracking devices and eating disorder symptomatology.

 Participants (N=493) were college students who reported their use of tracking technology and completed measures of eating disorder symptomatology.

 Individuals who reported using calorie trackers manifested higher levels of eating concern and dietary restraint, controlling for BMI (a sketch of this kind of adjusted analysis follows below).

 Although preliminary, overall results suggest that for some individuals, these devices might do more harm than good.
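A hedged sketch of what "controlling for BMI" means in practice: an ordinary least-squares model with tracker use and BMI as predictors, run here on synthetic data (not the study's dataset or code):

```python
# Illustrative only: regress eating-concern scores on calorie-tracker use while
# adjusting for BMI. The data below are simulated, not the study's real data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 493  # mirrors the reported sample size, but the values are made up
df = pd.DataFrame({
    "uses_tracker": rng.integers(0, 2, n),   # 1 = reports using a calorie tracker
    "bmi": rng.normal(24, 3, n),
})
df["eating_concern"] = (1.0 + 0.6 * df["uses_tracker"]
                        + 0.05 * df["bmi"] + rng.normal(0, 1, n))

# The coefficient on uses_tracker estimates the association with tracker use
# after holding BMI constant.
model = smf.ols("eating_concern ~ uses_tracker + bmi", data=df).fit()
print(model.params)
```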
TRACKING MENTAL WELL-BEING

 Wearable devices offer further opportunities. Many people already use wearables to track their sleep and physical activity, both of which are closely related to mental well-being.

 Robots making friends in elderly care – residents with dementia and a few other mental health issues appear happy and interact with them.
AI TO DETECT SUICIDE RISK

 Written language is also a promising area for AI-assisted mental health care.

 Studies have shown that machine learning algorithms trained to assess word choice and order are better than clinicians at distinguishing between real and fake suicide notes  able to pick up on signs of distress.

 Using these systems to regularly monitor a patient's writing, perhaps through an app or periodic remote check-in with mental health professionals, could feasibly offer a way to assess their risk of self-harm.

 Still in development and trial.


RESEARCH FINDINGS

 Suicide  2nd leading cause of death among 25-34 year olds and the 3rd leading cause of death among 15-25 year olds in the US.

 In the ED  risk assessment is left to clinical judgment.

 Aim: to determine the role of computational algorithms in understanding patients' thoughts as represented by suicide notes.

 Methods: comparing suicide notes from 33 suicide completers with 33 matched elicited notes from a healthy control group.

 Participants: 11 mental health professionals and 31 psychiatric trainees were asked to decide whether each note was genuine or elicited.

 Results: trainees were 49% accurate, mental health professionals 63%, and the machine 78% (an illustrative text-classification sketch follows below).

 = An important step in developing an evidence-based predictor of suicide attempts.
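As an illustration of how "word choice and order" can drive such a classifier, a minimal baseline using TF-IDF word n-grams and logistic regression; the texts and labels below are placeholders, and this is not Pestian et al.'s actual pipeline:

```python
# Generic genuine-vs-elicited note classifier, illustrative only. The two example
# texts are placeholders; real work needs a properly governed, much larger dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "i cannot carry this weight any longer, please forgive me",
    "i am writing this note as part of the study exercise",
]
labels = [1, 0]  # 1 = genuine, 0 = elicited (placeholder labels)

# Word uni- and bi-grams capture word choice plus some local word order.
clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),
    LogisticRegression(max_iter=1000),
)
clf.fit(texts, labels)
print(clf.predict(["i just cannot go on like this"]))
```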
(2) AS INTERACTIVE COMPANION

 A question-answering computer program that is able to


converse with humans.

 Increase interactions
(3) AI TO CONDUCT ASSESSMENT

 Digital interviewers by the side of human doctors

 Another area where algorithmic analysis could help is the automation of certain tasks, e.g. conducting structured clinical interviews could in the future be done by virtual humans – interviewees would not be as burdened by sharing their secrets with a virtual, anonymous entity as with another, possibly judgmental, human.
Cont..

 In one study, a virtual human conducted interviews with real people in emotional distress.

 Distinct speech patterns, such as slurring vowel sounds, and patterns in body language, such as the direction someone is looking, were analyzed.

 The machine sends a notification to healthcare professionals (a sketch of such a rule follows below).

 Such technology could find patterns and behaviors that human interviewers might miss!
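A hedged sketch of such a notification rule; the feature names and thresholds are hypothetical assumptions for illustration, not values reported by the study:

```python
# Illustrative rule for when an automated interviewer might flag an interview
# for a clinician. Features and thresholds are hypothetical, not from the study.
from dataclasses import dataclass


@dataclass
class InterviewFeatures:
    vowel_slur_score: float        # 0..1, higher = more slurred vowel sounds
    downward_gaze_fraction: float  # share of the interview spent looking down
    speech_rate_wpm: float         # words per minute


def should_notify_clinician(f: InterviewFeatures) -> bool:
    """Notify a human professional when several distress markers co-occur."""
    markers = [
        f.vowel_slur_score > 0.6,
        f.downward_gaze_fraction > 0.5,
        f.speech_rate_wpm < 90,
    ]
    return sum(markers) >= 2


print(should_notify_clinician(InterviewFeatures(0.7, 0.8, 85)))  # True -> notify
```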
Why should we use FIR in our patients' care?
BENEFITS OF AI IN HEALTHCARE
SETTINGS

 Its benefits have been extensively discussed in the medical literature.

 Benefits include:

 It can assist physicians by storing and locating relevant and current medical information from journals, textbooks and clinical practices, which in turn could inform patient care (see the retrieval sketch after this list).

 It can assist with managing large-scale data collection processes, and with the interpretation of the information collected.

 E.g. the medical informatics system in Japan - a centralised database, consultation and referral system; BruHIMS in Brunei Darussalam.
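A toy retrieval sketch of the "locating relevant information" idea: rank a few made-up guideline snippets against a clinician's query by TF-IDF cosine similarity (not how BruHIMS or any named system actually works):

```python
# Toy information-retrieval example with made-up guideline snippets.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

snippets = [
    "Cognitive behavioural therapy is a first-line treatment for mild depression.",
    "Monitor lithium levels regularly to avoid toxicity.",
    "Structured risk assessment should follow any disclosure of suicidal ideation.",
]

vectoriser = TfidfVectorizer()
snippet_vectors = vectoriser.fit_transform(snippets)

query = "first line treatment for depression"
scores = cosine_similarity(vectoriser.transform([query]), snippet_vectors)[0]
print(snippets[scores.argmax()])  # returns the most relevant snippet
```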
CONT..

 AI offers alternative solutions to communities where trained doctors and resources are scarce – e.g. in Japan, a device was manufactured that can be used to monitor uterine contractions and the fetal heart rate.

 Based on research in collaboration with Kagawa University, Japan  this device has proven to be helpful in Japan, as it is a big country and geographical location may make it harder for some patients to have a regular check-up with their doctor.

 This machine is capable of sending all this relevant and important clinical information to the doctor without the patient having to physically go to the hospital (a minimal telemetry sketch follows below).
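A minimal sketch of the "send readings to the doctor remotely" idea, assuming a hypothetical clinic endpoint and payload format rather than the Japanese device's real protocol:

```python
# Hypothetical remote-monitoring upload; the endpoint URL and payload fields are
# placeholders for illustration only, not the actual device's interface.
import datetime
import requests

CLINIC_ENDPOINT = "https://clinic.example.org/api/fetal-monitoring"  # placeholder

reading = {
    "patient_id": "demo-001",
    "fetal_heart_rate_bpm": 142,
    "contraction_interval_min": 6.5,
    "recorded_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
}

response = requests.post(CLINIC_ENDPOINT, json=reading, timeout=10)
print(response.status_code)  # clinician-side software would triage the new reading
```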
CONT..

 Patients and practitioners are able to engage the program through


various means (SMS, voice notes and online messaging services),
and ask questions related to current stock of medications,
including when more stock of a specific drug will be ordered and
available for use.

THE STATE OF MENTAL HEALTH
SERVICES

 Generally, the world is experiencing a mental health crisis.


Approximately 15.5% of the global population is affected by
mental illnesses, and those numbers are rising.
 Although there are many who require treatment, more than
50% of mental illnesses remain untreated.
 The critical shortfall of psychiatrists and other mental health
specialists to provide treatment exacerbates this crisis.
 In fact, nearly 40% of Americans live where there is
a shortage of mental health professionals; 60% of U.S.
counties don’t have a psychiatrist.
HOW AI HELPS WITH THE MENTAL HEALTH CRISIS

 Support mental health professionals

Algorithms can analyze data much faster than humans, can suggest possible treatments, monitor a patient's progress and alert the human professional to any concerns (see the monitoring sketch after this slide). In many cases, AI and a human clinician would work together.

 24/7 Access

Due to the lack of human mental health professionals, it can take


months to get an appointment. If patients live in an area without
enough mental health professionals, their wait will be even longer.
AI provides a tool that an individual can access all the time, 24/7
without waiting for an appointment.
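A minimal sketch of "monitor a patient's progress and alert the human professional": flag the clinician when the latest self-reported score rises sharply above the recent average (the scores and threshold are made up):

```python
# Illustrative progress monitor; the scores and rise threshold are assumptions.

def needs_clinician_alert(weekly_scores, rise_threshold=4):
    """Return True if the latest score is well above the average of earlier weeks."""
    if len(weekly_scores) < 3:
        return False  # not enough history to judge a trend
    recent_average = sum(weekly_scores[:-1]) / (len(weekly_scores) - 1)
    return weekly_scores[-1] - recent_average >= rise_threshold


scores = [6, 7, 6, 12]  # e.g. weekly self-reported anxiety scores
if needs_clinician_alert(scores):
    print("Alert sent to the supervising clinician for review.")
```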
CONT..(Enhancing accessibility)

 Given the increasing interest of many national health care


systems in extending the accessibility of services and
treatment programmes for mental disorders, several new
technological strategies have been used, from telemedicine
to Internet approaches, vodcast and virtual reality scenarios

 Previous pilot studies have suggested that computer games


in general could be of help as additional interventions, in
areas such as ADHD, Schizophrenia, anxiety disorders
PlayMancer: a video game for
treating mental disorders

 Developed in Spain

 It introduces the player to an interactive scenario, where the


final goal is to increase emotional self-control skills in
patients and self-control over their general impulsive
behaviours.

 A multidisciplinary team of clinicians, engineers and programmers developed this video game, considering user requirements and emotional reactions as well as personality profiles of the targeted patients.

 Evaluation trials are still ongoing.


Cont..(PlayMancer)

 New interaction modes, such as emotion recognition from


speech, face and physiological reactions, and specific
impulsive reactions were integrated into the game.

 The video game uses feedback to help patients learn relaxation skills, acquire better self-control strategies and develop new emotional regulation strategies (a toy biofeedback loop is sketched below).
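A toy sketch of the biofeedback-loop idea: pause the challenge and prompt relaxation when a (simulated) arousal signal crosses a threshold; this is not PlayMancer's implementation:

```python
# Toy biofeedback loop with a simulated arousal signal; the threshold and prompts
# are illustrative assumptions, not PlayMancer code.
import random


def read_arousal_level():
    """Stand-in for a sensor reading (e.g. heart-rate-derived arousal on 0..1)."""
    return random.random()


def game_step(threshold=0.7):
    arousal = read_arousal_level()
    if arousal > threshold:
        return f"arousal {arousal:.2f}: pause the challenge, guide slow breathing"
    return f"arousal {arousal:.2f}: continue normal gameplay"


for _ in range(5):  # five simulated game ticks
    print(game_step())
```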
CONT..

 Not expensive

The cost of care prohibits some individuals from seeking help. Artificially intelligent tools could offer a more accessible solution.

 Comfort talking to a bot

While it might take some people time to feel comfortable


talking to a bot, the anonymity of an AI algorithm can be
positive. What might be difficult to share with a therapist in
person is easier for some to disclose to a bot.
The UK experience of using AI in health care services
FIR & AI IN THE WEST

 In Western countries, e.g. Scotland, health and care policy is shifting from a paternalistic medical model to a co-managed and integrated approach.
 FIR is transforming manufacturing in line with the digital consumer revolution.
 Digital health and care initiatives are beginning to use some of the same capabilities to optimize healthcare provision.
CARE 4.0: ITS CONCEPT

 Driven by the need to have a more person-centered


application of Industry 4.0 capabilities for care.

 Scotland introduces ‘Care 4.0’, a new paradigm that could


change the way people develop digital health and care
services, focusing on trusted, integrated networks of
organizations, people and technologies.

 In Scotland, policy and strategy are driving towards more


preventative, co-managed, integrated and community-based
care, with digital technology seen as a key asset to deliver
change at scale
CONT..

 Care 4.0 emphasises that technology should enable person-


centered care, whether this is from the perspective of those
providing or those receiving care and services.

 Technology needs to enable the right care at the right time through
providing access and ease of use for citizens to have control of
interactions with systems and services.

 Developing technology that ‘enables’ the provision and receipt of


care also alleviates fears that technology will replace human
interaction where it is most valued and appropriate.

 By 2020, 90% of UK adults will use a smart phone and 50% will have
an average of four online media subscription services
IMPORTANT TO CONSIDER..

 It is important to understand people’s lived experience


 Therefore, involving people who are likely to be the 'end-users' of technology in the design process helps to foster a culture of innovation by giving people permission and a safe space to generate ideas and critically reflect on and evaluate potential solutions.
AI & QUALITY NURSING CARE

 Care and treatment are provided for the patient. Thus the care plan has to be discussed and agreed by all, including the patient and family.

Consider the following:

 What works for them? Do they prefer it?

 Does he/she have access to the technology?

 Knowledge and skills - HCPs and patients

 Enough evidence?

 Health literacy level – important as it focuses on empowerment


EVIDENCE-BASED PRACTICE
When AI goes wrong…
UNFORTUNATE EVENT..

 AI can prevent ‘human error’ from occurring in clinical


practice.

BUT….THINGS HAVE GONE WRONG IN THE PAST (IT HAS


IMPROVED NOW!)

 Should an AI system make an error, this could have serious


implications for the practitioner, patients and institution
involved.
KILLED BY THE MACHINE: THE THERAC-25

 Six unfortunate patients were harmed in 1986 and 1987.

 The Therac-25 exposed them to massive overdoses of radiation, killing four and leaving two others with lifelong injuries.

How did it happen?

 The software controlling the machine contained bugs which proved to be fatal.

 The design of the machine relied on the controlling computer alone for safety. There were no hardware interlocks or supervisory circuits to ensure that software bugs couldn't result in catastrophic failures (an illustrative software interlock is sketched below).
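A tiny teaching sketch of the design lesson: an independent safety check that refuses out-of-range doses no matter what the upstream logic requests (purely illustrative; real systems also need hardware interlocks):

```python
# Illustrative software interlock; the dose limit is a made-up teaching value.

MAX_SAFE_DOSE_CGY = 200  # hypothetical hard limit, in centigray


class DoseInterlockError(RuntimeError):
    pass


def deliver_dose(requested_cgy):
    """Independent final check: reject any dose outside the hard safety range."""
    if not 0 < requested_cgy <= MAX_SAFE_DOSE_CGY:
        raise DoseInterlockError(f"blocked: {requested_cgy} cGy is out of range")
    print(f"delivering {requested_cgy} cGy")


deliver_dose(150)        # allowed
try:
    deliver_dose(12500)  # a corrupted or buggy request is blocked before delivery
except DoseInterlockError as err:
    print(err)
```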
SYSTEM HACKED

 With privacy breaches in respect of healthcare data continually being placed in the spotlight, added complexities around the safety of data contained within, or generated from, the AI system come into question.

 There is potential for the AI system itself to be hacked, manipulated or spammed with 'fake' data, which could become problematic if not detected in time (a simple plausibility check is sketched below).

 Concerns have also been raised that AI may not be used for its intended purpose.
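One simple line of defence against "fake" or manipulated readings is a plausibility check before data enters the record; a hedged sketch with made-up limits (not clinical reference ranges):

```python
# Illustrative plausibility filter; the ranges are assumptions for the example only.

PLAUSIBLE_RANGES = {
    "heart_rate_bpm": (25, 250),
    "sleep_hours": (0, 24),
    "mood_score": (0, 10),
}


def accept_reading(metric, value):
    """Reject values outside a plausible range before they reach the record."""
    low, high = PLAUSIBLE_RANGES.get(metric, (float("-inf"), float("inf")))
    return low <= value <= high


print(accept_reading("heart_rate_bpm", 72))    # True
print(accept_reading("heart_rate_bpm", 9000))  # False: likely spam or manipulation
```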
IMPACT OF DATA BEING HACKED

 Stigma: risk of discrimination, particularly as mental health is a sensitive and uncomfortable issue to talk about in some cultures.
STILL NEED HUMAN INTERACTION

 Substituting the human element in the practice of healthcare


may lead to situations of social isolation, if AI replaces the
practitioner’s role entirely.

 Remember!!!
AI is not designed to replace doctors and psychiatrists, just to further improve their care!
TRACKING DEVICES & APPS

 There are significant privacy concerns as well as making


people comfortable and willing to accept various levels of
being monitored in their day-to-day lives.

 In addition, there is no regulation for these applications, so it


is advised that any app be used in conjunction with a mental
health professional
CONCLUSION

 There is no denying that AI, as the driving force behind the Fourth Industrial
Revolution, will have a positive impact on the future of medicine, especially in
developing-world settings, where resources are scarce.

 However, the excitement around AI must be balanced with broader ethical,


legal and social concerns when its use in medicine is contemplated.

 AI has the promise to provide critical resources we need to overcome our


mental health crisis.

 While FIR proposes improvements to almost every field that it touches, including the medical sciences, ethical, social and legal challenges associated with its implementation arise.

 Interacting with an AI device, while beneficial in terms of providing


empowerment and independence to the patient, should not aim to replace
human interaction entirely
REFERENCES

 Chute, C. & French, T. (2019). Introducing Care 4.0: An integrated care paradigm built on Industry 4.0 capabilities. International Journal of Environmental Research and Public Health, (16): 22-47.

 Creamer Media's Engineering News. Artificial intelligence is solving African healthcare challenges. Available online at: http://www.engineeringnews.co.za/article/artificial-intelligence-is-solving-african-healthcare-challenges-2018-07-20/rep_id:4136

 Chandler, C., Foltz, P.W. & Elvevåg, B. Using machine learning in psychiatry: The need to establish a framework that nurtures trustworthiness. Schizophrenia Bulletin. Available online at: https://doi.org/10.1093/schbul/sbz105, https://academic.oup.com/schizophreniabulletin/advance-article-abstract/doi/10.1093/schbul/sbz105/5611057?redirectedFrom=fulltext

 DiSanzo, D. Watson Health is committed to using AI to tackle major healthcare challenges. Available online at: https://www.ibm.com/blogs/watson-health/ai-healthcare-challenges/

 Jiang, F., Jiang, Y., Zhi, H., et al. (2017). Artificial intelligence in healthcare: Past, present and future. Stroke and Vascular Neurology, 2(4): 230-243.

 Mahomed, S. (2018). Healthcare, artificial intelligence and the Fourth Industrial Revolution: Ethical, social and legal considerations. South African Journal of Bioethics and Law, 11(2): 93-95.

 Pestian, J., Nasrallah, H., Matykiewicz, P., Bennett, A. & Leenaars, A. (2010). Suicide note classification using natural language processing: A content analysis. Biomedical Informatics Insights, (3): 19-28.

 Simpson, C.C. & Mazzeo, S.E. (2017). Calorie counting and fitness tracking technology: Associations with eating disorder symptomatology. Eating Behaviors, (26): 89-92.
