Professional Documents
Culture Documents
2: July 2019
Richard Swann
Abstract
What forces might bring new pressures on the individual as the user of
therapy and the therapist as the provider, over the next thirty years?
Artificial Intelligence, Big Data and advancing medical science are
shaping new ways to think about the ontology of Dasein. Using the
technique of scenario planning, three visions of future worlds of therapy
are presented.
Key Words
Enframing, transindividuation, post-humanism, AI, big data, being-in-the-world, Caring
‘If we think of the world’s future, we always mean the place it
will get to if it keeps going as we see it going now and it doesn’t
occur to us that it is not going in a straight line but in a curve
& that its direction is constantly changing.’
(Wittgenstein, 1998: p 5e)
Introduction
The occasion of the thirtieth anniversary conference of the Society of
Existential Analysis presents an opportunity to look back on the history
of the Society, but also to look forward. If such a thing is feasible, this
paper proposes looking forward thirty years to where we might be and
what challenges we may face in the work that we do. To look so far into
the future might seem, and probably is, a wholly foolhardy proposition,
as Wittgenstein suggests in the epigraph. The wondering of this paper is,
in the spirit of Foucault’s (2003: pp 1-2) lectures at the Collège de France,
1. Succession
Augé (2014: p 3) describes a view of the future that is an extrapolation
of what we see, experience and understand about the past and present
and that we believe will evolve into the future in more or less discernible
ways – trends and signs in the present can be seen as reliable way markers
to the future. This is a way of looking at the future that Wittgenstein
(1998) clearly criticizes when he comments ‘it doesn’t occur to us…’ that
the future cannot be so easily predicted (see epigraph). It is also the view
of history that psychoanalytic and psychologistically based schools of
psychotherapy espouse in their models of childhood development into
adulthood. There are aspects of life that we can more or less entertain as
predictable in order for us to be able to plan a future at all. There are also
significant political dimensions to this view of the future in the form of
a proposed human right – a right to a future tense, or futurity as Zuboff
(2019: p 329) describes – the right for the world not to be so chaotic and
unpredictable that all sense of a future becomes obscured in doubt, fear
and precarious living.
2. Inauguration
Alternatively, Augé (2014: p 19) describes a future that is a rupture from
the past and is therefore essentially unknowable, because innovation and
change are so rapid, so profound and so disturbing as to make prediction
over anything other than a very short term nonsensical, even as an exercise
in constructive misunderstanding (to which I hope this paper aspires,
for example). There is reason to believe that the future, as currently
imagined, has strong elements of an inauguration, and extrapolation can
take place only over truncated time frames. Within this framework of
thinking, the question arises of whether we can speak meaningfully of a
future for the existential phenomenological tradition of psychotherapy,
except in the starkest of, quite literally, existential terms.
3. Culmination
Augé’s two views of the future are, I believe, based on the notion that
history will continue indefinitely to be a dynamic process of unfolding
of the human-focused set of narratives about the past, present and potential
futures of humankind. However, there is a scenario, which I name a
‘culmination’, where humanity, as we understand it now, comes to an end
point – not necessarily a catastrophic one – but perhaps we might think
of it as a stasis, beyond which human-level evolution is no longer possible,
desirable and/or necessary. The notion is inspired by Dante’s visions of
Inferno (Hell) and Paradiso (Heaven), as end points beyond which further
travel is impossible. As we will see later, there are some ideas for a post-
human future that have the flavor of an inauguration and a culmination
– one thinks specifically of the Singularity (Kurzweil, 2005) or, a bit less
dramatically, the ideology of post-humanism and techno-humanism (e.g.
Braidotti, 2013).
Symbolic Misery
Within the economy of digital reticulated traces, Stiegler identifies two
important processes bearing on Being and a consequent outcome that
flows from these processes. The processes are Proletarianization (Stiegler,
2014: p 30 et passim) and Automatization (ibid: p 20), leading to the
outcome of symbolic misery (ibid: p 1). The first of these processes,
Proletarianization, is a rapid de-skilling of many forms of work. Originally,
this process was a mechanical one in which the artisanal skills used in
the end of the species of humanity. Braidotti poses the question with a
number of different variations:
There is a case to say that we are no longer just ourselves but also our digital
history, which we author, like a book, every day. But also, like a book,
once published we authors no longer have full, or even much, control over
the history that our digital traces record. Not only governments, but especially
commercial entities can now record and store data-fied versions of our
behaviour and actions at a temporal granularity finer than
second-by-second.
At another level, the anti-Humanistic project of undermining the sense
of the individual as a self is also recognition of the reality of ‘unprecedented
degrees of intimacy and intrusion’ (ibid: p 89) that we now experience in
our relationship with technology. For Braidotti, this presents the Post-
Human with an aporia that forces a re-examination of ‘ontological categories,
for instance between the organic and the inorganic, the born and the
manufactured, flesh and metal, electronic circuits and organic nervous
systems’ (ibid: p 89).
Finally, Braidotti’s concept of ‘spiritual death’ is of interest. This concept
accompanies and amplifies the call to re-examine ontological categories.
Certain activities and states of being, such as ‘…addictions, eating disorders,
melancholia, burn-out and states of apathy and disaffection…’ are described
as ‘…embodied social practices…’ using a language that removes these
words from the language practices of medical pathology, and re-appropriates
them for the language of ‘…neutral manifestations of interactions with
and resistance to the political economy of commodification of all that live’
(ibid: p 114). This was written before the scale of the opioid crisis in the
United States was fully understood (even if it is now?), but within this
earlier context Braidotti is already talking of the blurring of lines between
self-destructive and fashionable behaviour presented by choices of legal
drugs such as Prozac and Ritalin and illegal drugs. It was also written
before the scale of personal data harvesting, to which we are all now
subject, became clear.
These ideas present a contemporary take on the alienation of the individual
that is a familiar trope of the existential critiques of Kierkegaard, Nietzsche,
Marcuse and French existentialism. The twenty-first century has magnified
both the scale of the problem and the mechanisms by which it is effected
and sustained. We are becoming our data, but we do not own or have
access, in large part, to that data.
PART II – DRIVERS OF CHANGE
Artificial Intelligence
If we hold the potential global crises of climate change, political instability,
population growth, mass migration and resource depletion, already known
as existential risks, beyond the scope of this paper, then Artificial
1. A capacity to learn;
2. An ability to deal effectively with uncertainty and probabilistic
information;
3. An ability to extract useful concepts from sensory data and internal
states; and
4. A capacity to leverage acquired concepts into flexible combinatorial
representations for use in logical and intuitive reasoning (ibid:
p 23).
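Capacities 1 and 2 above, learning and the handling of probabilistic information, are commonly formalized as Bayesian belief updating. The following toy sketch is my own illustration, not drawn from the source; the hypothesis, the likelihood values and the number of observations are all invented for the example:

```python
# Toy illustration (not from the source): capacities 1 and 2, learning and
# handling probabilistic information, modelled as Bayesian belief updating.
from fractions import Fraction

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior P(hypothesis | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    evidence = numerator + (1 - prior) * p_evidence_if_false
    return numerator / evidence

belief = Fraction(1, 2)            # start maximally uncertain
for _ in range(3):                 # three pieces of supporting evidence
    belief = bayes_update(belief, Fraction(9, 10), Fraction(2, 10))

print(belief)   # 729/737: the belief strengthens with each observation
```

Using exact fractions rather than floats keeps the arithmetic checkable by hand, which makes the "learning" visible step by step.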
guru and futurologist Ray Kurzweil in his book of the same name (2005).
This has profound existential implications. However, not all AI practitioners
are as certain about the prospects of being able to achieve AGI. Some
believe it is going to take much longer and some believe it will never be
possible. There are others who believe it will come more quickly; on the
balance of probabilities, the arrival of AGI is currently placed at around
2047, or somewhere between 2040 and 2055 (Bostrom, 2014; Tegmark, 2017).
One objection might be that AGI and human consciousness, whatever
that might be, will never be the same thing. In particular, the argument
that consciousness will simply emerge out of complexity, much like the
first self-replicating organisms eventually emerged out of the increasingly
complex chemistry of the early Earth, or human consciousness eventually
emerged through the long march of evolutionary complexity, is a guess,
or a hope, rather than a racing certainty. Another objection might be that
it is actually the more limited capabilities of the human brain in terms of
its powers to render intelligible its environment – and its embodiment in
a particular locus at any given moment – that render meaning for a human
in a way that an AGI entity cannot achieve. It is the very Being-there-ness
of Being that simply cannot be replicated by an AGI entity. The AGI cannot
and will not experience the aspect of randomness in the thrownness of life
– why this life rather than any other life? Nor can it experience learning,
growth and development in the many ways that a human being does, with
our generally limited capacities for memory and recall, and which involve
effort and pain and failure, as well as success and achievement. When
recall is perfect and instantaneous, and can draw on ubiquitous sensory
data and memory, where is the pain of doubt or of being found in error,
and where is the pleasure of achieving understanding? These aspects of
the Being of Dasein are rendered dead or useless by mechanical omniscience.
The sentimental renderings of ‘intelligent’ robots as depicted in films such
as Her (2013) or I, Robot (2004) can likely only ever be simulations of
emotions that have no grounds in hope, doubt or pain. It may well turn
out that pain, that most inconvenient of human sensations, is what is
inimitable by a machine.
Big Data
In 2008, Chris Anderson, then the Editor of Wired, declared the death of
theory. He argued that the integration of increasing amounts of digital data
and the evolving sophistication of algorithmic analysis rendered redundant
the need for theorizing about reasons and causes. One could simply crunch
the numbers, so to speak, and watch as computers churned out the results
of correlations in data sets that otherwise might never have been revealed
or taken months, even years of researcher time to analyse and uncover.
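The kind of theory-free number-crunching Anderson had in mind can be sketched in a few lines. This is my own toy illustration, not Anderson's code; the variables, the synthetic data and the 0.5 threshold are all invented for the example:

```python
# Illustrative sketch (mine, not Anderson's): 'crunching the numbers' as the
# paragraph describes it -- scanning a data set for strong pairwise
# correlations without any prior theory about causes.
import random
import statistics

random.seed(0)
n = 200
# A synthetic 'data set' of three variables, two of them secretly related.
x = [random.gauss(0, 1) for _ in range(n)]
y = [xi * 0.9 + random.gauss(0, 0.3) for xi in x]   # driven by x
z = [random.gauss(0, 1) for _ in range(n)]          # independent noise

def corr(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    var_a = sum((u - ma) ** 2 for u in a)
    var_b = sum((v - mb) ** 2 for v in b)
    return cov / (var_a * var_b) ** 0.5

pairs = {"x~y": corr(x, y), "x~z": corr(x, z), "y~z": corr(y, z)}
strong = {k: round(v, 2) for k, v in pairs.items() if abs(v) > 0.5}
print(strong)   # the x~y link surfaces without any hypothesis being stated
```

The point of the sketch is Anderson's: the machine reports *that* x and y move together, while saying nothing about *why* they do.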
This is not the place to rehearse those arguments, but only to note a few
future concerns, such as:
• The ethics of super-long life: have these been fully explored? What
questions do these possibilities pose for the exploration of
being-toward-death and the meaning that can be made of life?
• What do the possibilities of radical body alteration and body
enhancement represent in terms of Dasein’s struggle with its
thrownness into the world, an aspect of which must include Dasein’s
genetic inheritance, as well as its expression through the environment
in which body and mind grow and age?
• As the genetic revolution unfolds, perhaps along a certain path
where embryos become increasingly carefully selected and modified
to maximize health, vigour and mental prowess, do we risk a kind
of reversion to an artificial genetic mean, one that fixes an early
twenty-first-century aesthetic in order to purchase certainty in the
present rather than the flexibility of randomness into the future?
Again, this poses questions about the bearing of Dasein toward the
future if so much of the future has already been resolved at the
genetic level: the nature of historicity and futurity subtly alters
when genetic pre-determination becomes significantly alterable.
• If there is a pill for everything, or an army of nanobots to repair
any damage caused by accident, illness or genetic malfunction,
then what responsibility to ourselves do we continue to owe?
Particularly in the field of psychotherapy, a pill to fix any mood
tends to obviate, or undercut, the power of the healing struggle.
Yet who would deny the opportunity to alleviate pain quickly and
easily (and non-addictively) if it were possible? Our whole notion
of healing, of helping in the repair and recuperation of others, is
open to question, especially in an environment where a narrow
idea of efficiency predominates.
• Finally, such developments as may become possible in these fields
over the next thirty years will almost certainly be unequally available
to those in need. For the wealthy, money is certainly more likely to
buy the best of health and, in the future, will be key to the initial
distribution of new body and mind enhancements. In a world where
a few can essentially fulfil the fantasy of acquiring enhanced
capabilities, what of the rest who cannot and may never be able to?
References
Anderson, C. (2008). The end of theory: The data deluge that makes
scientific method obsolete. Wired Magazine.
https://www.wired.com/2008/06/pb-theory/ [Accessed on 4 July 2018]
Atkinson, K. (2018). This Robot Co-Taught a Course at West Point.
https://www.axios.com/robot-ai-teaching-college-course-at-west-point-98ce5888-873b-4b72-8de5-0f7c592d66b0.html
[Accessed on 25 October 2018]
Augé, M. (2014). The Future. Trans. Howe, J. London: Verso.
Bostrom, N. (2014). Superintelligence: Paths, dangers, strategies.
Oxford: Oxford University Press.
Braidotti, R. (2013). The Posthuman. Cambridge: Polity Press.
Dante, A. (2000). The Divine Comedy. Trans. Dale, P. London: Anvil
Press Poetry.
Foer, F. (2017). World Without Mind: The existential threat of big tech.
London: Penguin.
Foucault, M. (2003). Society Must be Defended. Trans. Macey, D. London:
Penguin.
Heidegger, M. (1962/1927). Being and Time. Trans. Macquarrie, J. &
Robinson, E. Oxford: Blackwell.
Heidegger, M. (1971). Poetry, Language, Thought. Trans. Hofstadter, A.
New York: HarperCollins.
Heidegger, M. (1977). The Question Concerning Technology and Other
Essays. Trans. Lovitt, W. New York: Harper Perennial.
Kurzweil, R. (2005). The Singularity is Near: When humans transcend
biology. London: Penguin.
Manyika, J. and Sneader, K. (2018). AI, Automation and the Future of
Work: Ten things to solve for. McKinsey Global Institute: New York
- https://www.mckinsey.com/featured-insights/future-of-organiza...
e50&hctky=1979692&hdpid=bb9f89f0-458b-4b4e-a1ee-ad99e602294e
[Accessed on 12 July 2018]
Nietzsche, F. (1986). Human, All Too Human: A book for free spirits.
Trans. Hollingdale, R. Cambridge, UK: Cambridge University Press.
Quora.com. (2018). https://www.quora.com/What-is-the-difference-between-artificial-Intelligence-Machine-Learning-and-Human-Computer-Interaction-And-how-where-are-these-courses-applied-to-real-world-problems
[Accessed on 1 November 2018]
Stiegler, B. (2014). Symbolic Misery, Volume 1: The hyperindustrial
epoch. Trans. Norman, B. Cambridge: Polity Press.
Stiegler, B. (2016). Automatic Society, Volume 1: The future of work.
Trans. Ross, D. Cambridge, UK: Polity Press.
Tegmark, M. (2017). Life 3.0: Being human in the age of artificial
intelligence. London: Allen Lane.
Trollope, A. (1994). The Way We Live Now. London: Penguin Classics.
Wittgenstein, L. (1998/1929). Culture and Value: Revised edition. Ed.
Wright, G. H. von. Trans. Winch, P. Oxford: Blackwell Publishing.
Wittgenstein, L. (2009/1953). Philosophical Investigations. Trans.
Anscombe, G., Hacker, P. & Schulte, J. Oxford: Blackwell.
Zuboff, S. (2019). The Age of Surveillance Capitalism: The fight for
a human future at the new frontier of power. London: Profile Books.