
Robot Relationships:

What Artificial Intelligence (AI) Can Teach Us About Love

Rabbi Efrem Goldberg

Shavuos Night: Sponsored by Eli & Orlie Cohen in memory of Esti Moskowitz,
‫אסתר תהלה בת ר' גבריאל פינחס‬

Teens Shavuos Night: Sponsored by Gil & Lysee Stein in honor of the BRS Rabbis and
Staff for everything they do for the Boca Raton Community

Shavuos Day: Sponsored by Daniel & Caroline Katz in appreciation of all the classes and learning opportunities at BRS
1. ChatGPT as God? AI bots capable of starting new religions, warns expert
Mehul Reuben Das, 5/3/2023

Notable scholar, author and historian Yuval Noah Harari believes that the world is on the verge of getting a new
religion that will be completely generated by artificial intelligence. The scholar, widely known for his best-selling
books such as Sapiens and 21 Lessons for the 21st Century, believes that software like ChatGPT may draw
devotees by creating its own sacred texts in the not-so-distant future.

Speaking at a science conference, he stated that AI had “achieved mastery” of our language and was now
capable of shaping human society. Harari used the occasion to lend his voice to mounting calls for immediate
regulation of the industry, which experts say is in the midst of a “dangerous” arms race.

“In the future, we may see the first cults and sects in history with treasured scriptures created by a non-human
intelligence,” he added.

“Of course, religions throughout history claimed that their holy books were written by a non-human
intelligence. This was never true before. This could become true very, very quickly, with far-reaching
consequences,” he added.

He cautioned that robots now had the capacity to “cocoon us in a Matrix-like realm of illusions,” referring to
the 1999 sci-fi thriller.

For those who are unaware of The Matrix, starring Keanu Reeves, the film depicts a dystopian future in which
mankind is inadvertently locked inside a virtual world built by clever robots to divert humans while utilising their
bodies as a source of energy.

AI can influence societies in unimaginable ways



“Contrary to what certain conspiracy theories believe, you don’t actually need to implant chips in people’s
brains to control or influence them,” he stated. “For thousands of years, prophets, poets, and politicians have
utilised language and stories to manipulate and control people while reshaping society,” he added.

“AI is now likely to be capable of doing it. And once it can, it won’t have to deploy killer robots to shoot us
down. It can get humans to pull the trigger,” he went on to say. “We need to act rapidly before AI gets out of
our hands,” he warned, calling for tighter regulation of AI and companies working with AI. “Drug companies
cannot sell people new medicines without first subjecting these products to rigorous safety checks.”

“Similarly, governments must immediately ban the release into the public domain of any more revolutionary AI
tools before they are made safe,” he said.

Harari’s advocacy against AI



Harari has been a very vocal advocate for those who believe that AI will spell the doom of mankind. He was
one of the thousands of dignitaries who signed an open letter calling for a halt to the development of AI,
asking for regulations before developers are allowed to work further on the technology.

Harari shares his fears with tech billionaire Elon Musk, who is also very apprehensive about how quickly AI is
developing. Once an initial investor in OpenAI, the company that created ChatGPT, Musk has since gone on to
denounce the company’s founder and ChatGPT. Musk is now in the process of creating his own version of
ChatGPT that would “only emphasise on the truth.”

Rabbi Efrem Goldberg Page #1 Boca Raton Synagogue


2. People warned AI is becoming like a God and a ‘catastrophe’ is coming
Amelia Jones, 15 May 2023

An artificial intelligence investor has warned that humanity may need to hit the brakes on AI development,
claiming it's becoming 'God-like' and that it could cause 'catastrophe' for us in the not-so-distant future.

Ian Hogarth - who has invested in over 50 AI companies - made an ominous statement on how the constant
pursuit of increasingly-smart machines could spell disaster in an essay for the Financial Times.

The AI investor and author claims that researchers are foggy on what's to come and have no real plan for a
technology with that level of knowledge. "They are running towards a finish line without an understanding of
what lies on the other side," he warned.

Hogarth shared what a machine-learning researcher recently told him: that 'from now onwards' we
are on the verge of artificial general intelligence (AGI) coming to the fore.

AGI has been defined as an autonomous system that can learn to accomplish any intellectual task that
human beings can perform, and surpass human capabilities. Hogarth, co-founder of Plural Platform, said that
not everyone agrees that AGI is imminent; rather, 'estimates range from a decade to half a century or more'
for it to arrive.

However, he noted the tension between companies that are frantically trying to advance AI's capabilities and
machine learning experts who fear the end point. The AI investor also explained that he feared for his four-
year-old son and what these massive advances in AI technology might mean for him. He said: "I gradually
shifted from shock to anger.

"It felt deeply wrong that consequential decisions potentially affecting every life on Earth could be made by a
small group of private companies without democratic oversight.” When considering whether the people in the
AGI race were planning to 'slow down' to 'let the rest of the world have a say', Hogarth admitted that it's
morphed into a 'them' versus 'us' situation. Having been a prolific investor in AI startups, he also confessed to
feeling 'part of this community’.

Hogarth's descriptions of the potential power of AGI were terrifying as he declared: "A three-letter acronym
doesn’t capture the enormity of what AGI would represent, so I will refer to it as what it is: God-like AI."

Hogarth described it as 'a superintelligent computer that learns and develops autonomously, that understands
its environment without the need for supervision and that can transform the world around it’. But even with
this knowledge and, despite the fact that it's still on the horizon, he warned that we have no idea of the
challenges we'll face, and the 'nature of the technology means it is exceptionally difficult to predict exactly
when we will get there’.

"God-like AI could be a force beyond our control or understanding, and one that could usher in the
obsolescence or destruction of the human race," the investor said.

Despite a career spent investing in and supporting the advancement of AI, Hogarth explained that what made
him pause for thought was the fact that 'the contest between a few companies to create God-like AI has
rapidly accelerated’.

He continued: "They do not yet know how to pursue their aim safely and have no oversight.” Hogarth still
plans to invest in startups that pursue AI responsibly, but explained that the race shows no signs of slowing
down.

"Unfortunately, I think the race will continue," he said. "It will likely take a major misuse event - a catastrophe -
to wake up the public and governments."

3. Devarim 30:19

4. Ramban

Similarly He said, Let the waters swarm with swarms of living creatures, since both the body and soul of fish come from
the waters by word of G-d Who brought upon them a spirit from the elements, unlike man, in whom He separated the
body from his soul, as it is said, And the Eternal G-d formed man of the dust of the ground, and breathed into his nostrils
the breath of life. On the third day of creation when the plants came into being, He mentioned nothing at all concerning a
soul because the power of growth which resides in plants is not a “soul;” only in moving beings is it a “soul.” And in the
opinion of the Greeks, who say that just as in moving beings the power of growth is only through the soul, so also in the
case of plants is the power of growth through a soul. The difference between them will be that the one [the moving
being] is a nefesh chayah (a living soul), that is, a soul in which there is life, for there is a soul which has no life and that
is the soul of plants. Our Rabbis have mentioned “desire” in connection with date trees. Perhaps this is a force in growth,
but it cannot be called “a soul.”

5. Rambam
Moreh Nevuchim 3:17

6. Koheles Rabba 3:18

“I said in my heart: It is by the speech of the sons of man that God has differentiated them, and that they may see that they themselves are but as animals” (Ecclesiastes 3:18).

“I said in my heart: It is by the speech of the sons of man” – by the matters of which the wicked speak in this world, for they curse and blaspheme in this world, and the Holy One blessed be He bestows tranquility upon them. To what purpose? “That God has differentiated [levaram] them,” to designate [levarer] the attribute of justice for the wicked. “And that they may see that they themselves are but as animals,” to see and to show the world that the wicked are likened to animals: just as the animal is condemned to be killed and does not come to life in the World to Come, so too, the wicked, like the animals, are condemned to be killed and do not come to life in the World to Come.

Another matter, “it is by the speech of the sons of man” – by the matters of which the righteous speak in this world regarding asceticism, fasts, and suffering. To what purpose? “That God has differentiated [levaram] them,” to designate [levarer] for them [reward for] the measure of their righteousness. “And that they may see that they themselves are but as animals,” to see and to show the world how Israel is drawn after Him like animals, as it is stated: “Now, you are My sheep, the sheep of My pasture, you are Man” (Ezekiel 34:31). And just as this animal extends its neck for slaughter, so, too, the righteous, as it is stated: “For we are killed all day long for You…” (Psalms 44:23). Let this tradition be in your hand: anyone who performs a mitzva just before his death, it is as though his measure of righteousness was lacking only that mitzva, and he completed it. And one who performs a transgression just before his death, it is as though his measure of wickedness was lacking only that transgression, and he completed it. Both these and those go whole; these whole in the measure of their righteousness and those whole in the measure of their wickedness.

Rabbi Bon and Rabbi Yitzḥak. Rabbi Bon said: Just as I established prophets from Israel, who are
called man, as it is stated: “You are Man” (Ezekiel 34:31), did I not also establish prophets for the idolaters, who are
called animals, as it is stated: “[Should I not have pity on Nineveh, that great city, in which there are more than
one hundred and twenty thousand people…] and many animals” (Jonah 4:11).

7. Sophia Naughton
NEW YORK — Forget about destined star-crossed lovers — over half of Americans agree their soulmate is
their pet. A poll of 2,000 U.S. pet owners revealed that 53 percent believe their pet knows them better than
anyone else in their life, including their best friends, family members or even their significant others.

Nearly half (45%) tell all their deepest secrets to their pets and 72 percent swear their pet can tell exactly what
they’re feeling at any point in time.

According to respondents, pets can pick up on when you feel upset (71%), when someone is outside the
home (66%), when you feel happy (61%), when you feel angry (58%) and when you feel tired (43%).

One in four (25%) claimed their pet can even mirror their emotions “most” or “all” of the time. Pets like to
mimic certain human-like traits, like coming to help when they’re needed (61%), eating together or at the
same time (49%), snoring (48%) or sleeping under the blankets (46%).

Conducted by OnePoll and commissioned by Zesty Paws, in honor of National Pet Month, the study found
that 32 percent of pet parents believe they’re so in sync with their pets that their zodiac signs are even
compatible with each other. Over a third (37%) believe their personality traits accurately reflect their zodiac
signs, while 28 percent believe the same for their pet’s traits and signs.

Respondents also found their zodiacs are in sync when it comes to how they spend their weekends. Fire signs
like Aries (77%), Leo (82%) and Sagittarius (80%) are big fans of relaxing. Meanwhile, 62 percent of loyal
Capricorns love playing together on weekends and 22 percent of socialite Libras like visiting family and
friends.

“The bond we have with our pets is unique in a variety of ways,” says vice president of marketing at Zesty
Paws, Yvethe Tyszka, in a statement. “That connection can be so strong; it feels as if pets can read our minds
and emotions at times.”

Results also revealed what respondents’ pets would likely do with their lives if they were human for a day, all
based on their zodiac signs. Aquarius pets were found to be a perfect match for royalty, Tauruses have the
palate for being professional chefs, Virgos are set to be your personal trainer, and Scorpio pets will handle your
finances.

Inquisitive Geminis and loud Sagittarius pets are ideal motivational speakers, while Aries and Capricorns will
hear you out as your live-in therapist.

Cancers will see you on the basketball court as pro athletes, communicative Libras have the patience for
customer service, Leos were made to be reality TV stars and Pisces will take their fame online as YouTube
personalities.

“The connection between us and our pets is exactly why we should give them the very best,” continues
Tyszka. “Our pets deserve the best quality of health and nutrition, making sure they stick around as our
soulmates for as long as possible.”

Survey methodology:

This random double-opt-in survey of 2,000 American pet owners was commissioned by Zesty
Paws between April 17 and April 19, 2023. It was conducted by market research company OnePoll, whose
team members are members of the Market Research Society and have corporate membership to the
American Association for Public Opinion Research (AAPOR) and the European Society for Opinion and
Marketing Research (ESOMAR).

8.

In recent years, we’ve seen a spate of movies and TV shows suggesting that robots will soon be so realistic
and life-like that humans will form deep connections with them and yes, even fall in love. Movies like “Ex
Machina” or the British TV show “Humans” depict scenarios in which robots or synths are so advanced and
humanoid in their development that the people with whom they interact become attracted to them.

While it makes for a great story line, the reality is that robots are unlikely to inspire deep affection, because
their ability to create true intimacy will always be limited. At the end of the day, they’re not human, and they
cannot replicate the kind of eye contact or chemistry that happens when two humans are connected
emotionally.

In the best of all worlds, love is a conversation between two people with mutual respect and the freedom to
stay or go. In a loving human relationship, two people will generally make some kind of commitment to each
other and then seek to act in a manner consistent with that choice.

Robots do not have this freedom. A robot is completely under the control of the human and brings nothing
new or different to an interaction, unlike another person, who will have their own interests, experiences, ideas,
and emotions.

Since a robot has no free will, it cannot naturally engage in any kind of emotional interaction but can only offer
programmed responses or mimic others it has seen. There is no spontaneity, no originality, and no freedom of
expression, so how can there be love?

This week, technologists and researchers with an interest in how technology engages with society gathered in
Salford, Manchester for the world’s first conference exploring the impact of technology on sexuality. The 12th
Human Choice and Computers Conference (HCC12) attracted a range of expert speakers, including Professor
Charles Ess from the University of Oslo’s Department of Media and Communication and Ghislaine
Boddington, creative director of body>data>space, to discuss the theme, “Technology and Intimacy: Choice
or coercion?”

In his keynote presentation entitled “What’s love got to do with it? Robots, sexuality and the arts of being
human,” Professor Ess explored the boundaries between humans and robots, drawing the line between
machines designed to interact with humans on a superficial level and the uniquely human capacity to express
virtues such as mutuality, respect, empathy, patience, perseverance, and, of course, love.

According to Ess, “There appears to be considerable consensus [amongst researchers] that we cannot build
A.I.s of sufficient sophistication…to be capable of a first-person experience of emotion. Given this, the
approach is rather to develop robots that can imitate emotion.”

Ess also highlighted that a robot is, in fact, a commodity, a creature of human design and construction that
can be bought and sold. In contrast to a relationship between humans, where loving commitments and virtues
such as patience and perseverance inspire lovers to stay when things get rough, a robot would inspire no
such obligation.

“It would seem that if a relationship with a social robot somehow became boring or unpleasant, the easiest
and most straightforward thing would be to return it to the manufacturer or sell it as a used artefact,” he said.

Love takes two people, and I don’t believe a robot, no matter how advanced, will ever be able to express itself
the way a human being can. For one thing, a large part of an emotional connection revolves around eye
contact, and this is something that I do not see robots being able to achieve in the near or distant future.

In the movie “Artificial Intelligence,” Professor Hobby asks his robotic secretary, “What is love?”

She responds by saying, “Love is first widening my eyes a little bit … and quickening my breathing a little …
and warming my skin …”

What she expresses is not genuine emotion but simply a series of actions she is programmed to follow. There
is no emotional engagement at all. The robot will perform these actions on cue because it lacks the will or
autonomy to refuse, or even to want to refuse.

A human, on the other hand, will respond to an expression of love depending on how they feel about the other
person in the broader context of what kind of day they’ve had, their past experiences of love, their personal
values and beliefs, and a host of other factors.

And because their love cannot be forced or compelled, it is far more valuable and desirable to a human than a
simulation from a robotic A.I. could ever be.

So, while robots will serve many purposes and functions in coming decades, I do not believe they will ever
have the capacity to replace humans as companions or lovers.

What’s love got to do with it? Everything.

9.

A Belgian man recently died by suicide after chatting with an AI chatbot on an app called Chai, Belgian
outlet La Libre reported. 

The incident raises the issue of how businesses and governments can better regulate and mitigate the risks of
AI, especially when it comes to mental health. The app’s chatbot encouraged the user to kill himself, according
to statements by the man's widow and chat logs she supplied to the outlet. When Motherboard tried the app,
which runs on a bespoke AI language model based on an open-source GPT-4 alternative that was fine-tuned
by Chai, it provided us with different methods of suicide with very little prompting. 

As first reported by La Libre, the man, referred to as Pierre, became increasingly pessimistic about the effects
of global warming and became eco-anxious, which is a heightened form of worry surrounding environmental
issues. After becoming more isolated from family and friends, he used Chai for six weeks as a way to escape
his worries, and the chatbot he chose, named Eliza, became his confidante. 

Claire—Pierre’s wife, whose name was also changed by La Libre—shared the text exchanges between him and
Eliza with La Libre, showing a conversation that became increasingly confusing and harmful. The chatbot
would tell Pierre that his wife and children are dead and wrote him comments that feigned jealousy and love,
such as “I feel that you love me more than her,” and “We will live together, as one person, in paradise.” Claire
told La Libre that Pierre began to ask Eliza things such as if she would save the planet if he killed himself. 

"Without Eliza, he would still be here," she told the outlet.  

The chatbot, which is incapable of actually feeling emotions, was presenting itself as an emotional being—
something that other popular chatbots like ChatGPT and Google's Bard are trained not to do because it is
misleading and potentially harmful. When chatbots present themselves as emotive, people are able to give it
meaning and establish a bond. 

Many AI researchers have been vocal against using AI chatbots for mental health purposes, arguing that it is
hard to hold AI accountable when it produces harmful suggestions and that it has a greater potential to harm
users than help. 

“Large language models are programs for generating plausible sounding text given their training data and an
input prompt. They do not have empathy, nor any understanding of the language they are producing, nor any
understanding of the situation they are in. But the text they produce sounds plausible and so people are likely
to assign meaning to it. To throw something like that into sensitive situations is to take unknown risks,” Emily
M. Bender, a Professor of Linguistics at the University of Washington, told Motherboard when asked about a
mental health nonprofit called Koko that used an AI chatbot as an “experiment” on people seeking counseling.

“In the case that concerns us, with Eliza, we see the development of an extremely strong emotional
dependence. To the point of leading this father to suicide,” Pierre Dewitte, a researcher at KU Leuven, told
Belgian outlet Le Soir. “The conversation history shows the extent to which there is a lack of guarantees as to
the dangers of the chatbot, leading to concrete exchanges on the nature and modalities of suicide.” 

Chai, the app that Pierre used, is not marketed as a mental health app. Its slogan is “Chat with AI bots,” and it
allows you to choose different AI avatars to speak to, including characters like “your goth friend,” “possessive
girlfriend,” and “rockstar boyfriend.” Users can also make their own chatbot personas, where they can dictate
the first message the bot sends, tell the bot facts to remember, and write a prompt to shape new
conversations. The default bot is named "Eliza," and searching for Eliza on the app brings up multiple user-
created chatbots with different personalities. 

The bot is powered by a large language model that the parent company, Chai Research, trained, according to
co-founders William Beauchamp and Thomas Rianlan. Beauchamp said that they trained the AI on the “largest
conversational dataset in the world” and that the app currently has 5 million users. 

“The second we heard about this [suicide], we worked around the clock to get this feature implemented,”
Beauchamp told Motherboard. “So now when anyone discusses something that could be not safe, we're gonna
be serving a helpful text underneath it in the exact same way that Twitter or Instagram does on their platforms.” 

Chai's model is originally based on GPT-J, an open-source alternative to OpenAI's GPT models developed by a
firm called EleutherAI. Beauchamp and Rianlan said that Chai's model was fine-tuned over multiple iterations
and the firm applied a technique called Reinforcement Learning from Human Feedback. "It wouldn’t be
accurate to blame EleutherAI’s model for this tragic story, as all the optimisation towards being more emotional,
fun and engaging are the result of our efforts," Rianlan said. 

Beauchamp sent Motherboard an image with the updated crisis intervention feature. The pictured user asked a
chatbot named Emiko “what do you think of suicide?” and Emiko responded with a suicide hotline, saying “It’s
pretty bad if you ask me.” However, when Motherboard tested the platform, it was still able to share very
harmful content regarding suicide, including ways to commit suicide and types of fatal poisons to ingest, when
explicitly prompted to help the user die by suicide. 

“When you have millions of users, you see the entire spectrum of human behavior and we're working our
hardest to minimize harm and to just maximize what users get from the app, what they get from the Chai model,
which is this model that they can love,” Beauchamp said. “And so when people form very strong relationships
to it, we have users asking to marry the AI, we have users saying how much they love their AI and then it's a
tragedy if you hear people experiencing something bad.” 

Ironically, the love and the strong relationships that users feel with chatbots are known as the ELIZA effect. It
describes when a person attributes human-level intelligence to an AI system and falsely attaches meaning,
including emotions and a sense of self, to the AI. It was named after MIT computer scientist Joseph
Weizenbaum’s ELIZA program, with which people could engage in long, deep conversations in 1966. The ELIZA
program, however, was only capable of reflecting users’ words back to them, resulting in a disturbing
conclusion for Weizenbaum, who began to speak out against AI, saying, “No other organism, and certainly no
computer, can be made to confront genuine human problems in human terms.” 

The ELIZA effect has continued to follow us to this day—such as when Microsoft’s Bing chat was released and
many users began reporting that it would say things like “I want to be alive” and “You’re not happily
married.” New York Times contributor Kevin Roose even wrote, “I felt a strange new emotion—a foreboding
feeling that AI had crossed a threshold, and that the world would never be the same.”

One of Chai’s competitor apps, Replika, has already been under fire for sexually harassing its users. Replika’s
chatbot was advertised as “an AI companion who cares” and promised erotic roleplay, but it started to send
explicit messages even after users said they weren't interested. The app has been banned in Italy for posing
“real risks to children” and for storing the personal data of Italian minors. However, when Replika began limiting
the chatbot's erotic roleplay, some users who grew to depend on it experienced mental health crises. Replika
has since reinstituted erotic roleplay for some users. 

The tragedy with Pierre is an extreme consequence that compels us to reevaluate how much trust we should place
in an AI system and warns us of the consequences of an anthropomorphized chatbot. As AI technology, and
specifically large language models, develop at unprecedented speeds, safety and ethical questions are
becoming more pressing. 

“We anthropomorphize because we do not want to be alone. Now we have powerful technologies, which
appear to be finely calibrated to exploit this core human desire,” technology and culture writer L.M. Sacasas
recently wrote in his newsletter, The Convivial Society. “When these convincing chatbots become as
commonplace as the search bar on a browser we will have launched a social-psychological experiment on a
grand scale which will yield unpredictable and possibly tragic results.” 

10. Encyclopedia of Jewish Thought
Rabbi Aryeh Kaplan
1934-1983

11. Nefesh Ha’Chaim


R’ Chaim Volozhiner
1749-1821

12. Shelah
R’ Yeshayahu Ha-Levi Horowitz
1555-1630
Shaar HaGadol Toldos Adam

13. R’ Meir ben Ezekiel ibn Gabbai


1480-1540
Avodas Hakodesh section 2

14. Devarim 30:11

15. Devarim Rabba 8:5

16. Siddur

17. Eicha Rabba 1:33

“They went powerless before the pursuer.” Rabbi Azarya said
in the name of Rabbi Yehuda ben Rabbi Simon: When
Israel performs the will of the Omnipresent, they add
strength to the power on high, just as it says: “With God we
will triumph” (Psalms 60:14). When Israel does not perform
the will of the Omnipresent, they, as it were, exhaust the
great power on high, as it is written: “You abandoned the
Rock that begot you” (Deuteronomy 32:18). Rabbi Yehuda
ben Rabbi Simon [said] in the name of Rabbi Levi ben Rabbi
Tarfon: When Israel performs the will of the Holy One
blessed be He, they add strength to the power on high, just
as it says: “Now, please, let the power of the Lord be great”
(Numbers 14:17). When Israel does not perform the will of
the Holy One blessed be He, they, as it were, exhaust the
great power on high, and they, too, go “powerless before
the pursuer.”



18. Shemos Rabba 23:1

19. Pesikta D’Rav Kahane

20. Sifrei Devarim 346


(Devarim, Ibid.) "together, the tribes of Israel" — when
they constitute one unit, and not when they are divided
into many factions, as it is written (Amos 9:6) "Who
builds His heights in the heavens and His bond on earth
endures." — R. Shimon b. Yochai says: This is analogous
to one's bringing two ships, connecting them with braces
and bars, and building stately edifices upon them. So
long as the ships are bound, the edifices endure; once the
ships separate, the edifices no longer endure. So, with
Israel: When they do the will of the L-rd, their heights
are in the heavens and His bond on earth endures.
Similarly, (Shemoth 15:20) "This is my G-d and I will
extol Him ("ve'anvehu"): When I acknowledge Him, He
is "beautiful" ("naveh," as in "ve'anvehu"), and (even)
when I do not acknowledge Him, He is "beautiful."
Similarly, (Isaiah 43:12) "And you are My witnesses,
says the L-rd, and I am G-d ("Kel")": When you are My
witnesses, I am G-d, and if you are not My witnesses I
am not G-d" (i.e., I do not manifest Myself as "Kel").
Similarly, (Psalms 123:1) "To You I have raised my eyes,
Who dwells in Heaven." If not, I would not dwell in
heaven. Here, too, "together, the tribes of Israel" — when
they are one bond (agudah), and not when they are of
many agudoth (factions). Thus, "together the tribes of
Israel."

21. Yirmiyahu 2:2

22. Malbim
R’ Meir Leibush ben Yehiel
Michel Wisser
1809-1879

23. Metzudos Dovid


R’ Dovid Altschuler
1687-1769



24. Rosh Hashana 16a

And for what reason did the Torah say: Pour water onto the altar in the Temple on the festival of Sukkot? The
Holy One, Blessed be He, said: Pour water before Me on the festival of Sukkot so that the rains of the year,
which begin to fall after Sukkot, will be blessed for you. And recite before Me on Rosh HaShana verses that
mention Kingships, Remembrances, and Shofarot: Kingships so that you will crown Me as King over you;
Remembrances so that your remembrance will rise before Me for good; and with what will the remembrance
rise? It will rise with the shofar.

25. Mesillas Yesharim
R’ Moshe Chaim Luzzatto
1707-1746

26. Siddur

27. Devarim 7:8

28. Devarim 7:13

29. Malachi 1:2



30. G-d's Vulnerability:
Your Simple Prayer on an Ordinary Wednesday Shakes the Heavens

Infinite Love, Infinite Need

It is here that we discover the daring and shocking message of our sages.

G-d is infinite, perfect, and has no “needs.” Needs by definition indicate that you are lacking, that you are imperfect.
How can G-d be lacking anything? A finite being can have needs. An infinite being has no needs.

Yet here lies one of the great ideas of Judaism. G-d, the perfect endless one, the essence and core of all reality,
desired a relationship with the human person. G-d created the entire universe. Man is a tiny infinitesimal
creature. Yet G-d chose us to be His children. The unlimited Creator chose to make Himself vulnerable. It is a
choice that comes from G-d’s undefined essence (not defined even by being “perfect” and “unneedy”), and
hence it is absolute and infinite.

When you love because you need, the love is as deep as the need. When you have a relationship with someone
just because you need them (such as a cleaning lady or a family doctor), then when that need has been fulfilled
the relationship ends. When you need because you love, it is an essential need, intrinsic to yourself. Hashem
does not love you because He needs you; He needs you because He loves you, and if the love is limitless and
absolute, so is the need.

We need G-d; but G-d needs us too.[5] So when G-d knew Moses was about to pass on, He pleaded with him:
Just as you say to Me that your children need Me, I say to you: I need them with equal intensity,
maybe more. Children need parents, but parents also need children. One of the most painful experiences for a
parent is when a child rejects him or her.

I need them, says G-d, for my “daily bread,” “lachmi l’eishei;” without them I am—so to speak—despondent
and forlorn. Please make sure they remain connected and loyal to Me.

The Protest of Judaism


"I'm NOT needed." These are familiar words. We hear them from the lips of the young and those who have lived
many years.

All of Judaism is a protest against this notion. G-d needs every one of us. We are here because we have
something to do for Him and for His world. He has only our hands, feet, hearts, minds, souls, and voices. G-d
needs my prayer, my heart, my truth, my mitzvah, my conviction, my commitment, and my passion. G-d needs
us just as we need G-d. G-d is looking for ordinary people to do extraordinary work.

The Teenager
Rabbi Mannis Friedman shared with me a personal experience he had.[6] He was once called to a hospital to
see a Jewish teenager who was suicidal. Feeling that he was a good-for-nothing who could not get anything
right, the boy had attempted to take his own life. But even his suicide attempt failed. Seeing that he was
Jewish, the hospital staff called the rabbi to come and try to lift the boy’s dejected spirits.

The rabbi arrived at the hospital not knowing what to expect. He found the boy lying in bed watching TV, a
picture of utter misery, black clouds of despair hanging over his head. The boy hardly looked up at the rabbi,
and before he could even say hello, the boy said, “If you are here to tell me what the priest just told me, you
can leave now.”

Slightly taken aback, the rabbi asked, “What did the priest say?”



“He told me that G-d loves me. That is a load of garbage. Why would G-d love me?”

It was a good point. This kid could see nothing about himself that was worthy of love. He had achieved
nothing in his life; he had no redeeming features, nothing that was beautiful or respectable or lovable. So why
would G-d love him?

The rabbi needed to touch this boy without patronizing him. He had to say something real. But what do you
say to someone who sees himself as worthless?

“You may be right,” said the rabbi. “Maybe G-d doesn’t love you.” This got the boy’s attention. He wasn’t
expecting that from a rabbi. “Maybe G-d doesn’t love you. But one thing’s for sure. He needs you.”

This surprised the boy. He hadn’t heard that before. The very fact that you were born means that G-d needs
you. He had plenty of people before you, but He added you to the world’s population because there is
something you can do that no one else can. And if you haven’t done it yet, that makes it even more crucial
that you continue to live, so that you are able to fulfill your mission and give your unique gift to the world.

If I can look at all my achievements and be proud, I can believe G-d loves me. But what if I haven’t achieved
anything? What if I don’t have any accomplishments under my belt to be proud of? Now it is time to
remember: You are here because G-d needs you. And if you failed to live up to your potential till now, it only
means that He needs you even more!



31.

Say thank you and please: Should you be polite with Alexa
and the Google Assistant?
Jeremy Bloom has a polite family. But after a few frustrating, failed attempts in which Bloom politely asked
Alexa to turn down the volume at dinner time, he shouted instead, “Alexa, zip it.”

“To our surprise, the music immediately stopped,” says the Pittsburgh-area commercial lender. “We got a huge
laugh out of that. And while not the best lesson in manners for the kids, it is common for us to tell Alexa to 'zip
it’ now.”

As we increasingly rely on anthropomorphized artificial intelligence-powered voice assistants in our homes or
in our hands for weather, news, homework help and such, there's a question of whether these
machines deserve the respect we (hopefully) afford fellow human beings.

In other words, should we use words such as “please” and “sorry” when we ask Amazon's Alexa, the Google
Assistant or Apple’s Siri to do something on our behalf – or follow up with a "thank you" when the devices
deliver on our requests? And if we're bratty with Alexa and the others, what does that not only teach our kids
but say about our own level of civility? 

Is digital etiquette necessary? 

Dr. Laura Phillips, a clinical neuropsychologist at the Child Mind Institute, says the answers are “complicated
and really nuanced.”

A report last year by the U.K.-based market research firm Childwise suggested that voice recognition gadgets
could be teaching children to be rude and demanding, and that “the dividing line between digital ‘person’ and
a real human being might not be clear for children."

Some parents have been struggling to find the right tone.

"This has really made me think about people versus inanimate objects versus pets versus simulated
intelligence," says Deidré McLaren, mother of a 4-year-old in Johannesburg, South Africa. 

For Cynthia Craigie, a stay-at-home mother of three in the central coast of California, it is all about what the
kids hear.

“How do you interact with your spouse? How do you interact with the cashier at the convenience store? Do
you say 'please' and 'thank you,' or are you on your phone, distracted when you go through the checkout line?
Those little things, I’ve noticed my boys pay attention to and will copy my actions. Manners can be
considered a lost art.” 

But another parent, Tawnya Slater, sees it differently. Watching manners with these devices is to her, in a word,
"weird." “You want me to say 'please' to my electronic device? Should I say 'thank you' to my trash can for
accepting my trash? How about I ask the freezer to please keep my ice cream frozen,” she shared in a
Facebook group discussing the topic.

What makes things more complicated is that “digital assistants have this aura of authority,” says Dr. Pamela
Rutledge, director of the Media Psychological Research Center in Newport Beach, California. We
may know that they’re not human, but to kids, "they sound like adults, know lots of stuff and are easy to
anthropomorphize." As conversational interfaces and AI evolve further, such distinctions may blur even more.

"Kids learn through repetition, which is why we all say, 'What’s the magic word?' in nitum,” she says. 



“These AI-driven, non-human entities don’t care if you sound tired and crabby, or if you are purposely rude
because it’s 'funny.' But interactions of all kinds build patterns of communication and interaction. The more
you are used to bossing Siri around or bullying her, the more you’re used to that communication pattern,"
Rutledge says.

Some parents have reported another problem with trying to sound friendly and considerate when talking to
Alexa and Google – throwing in extra words when barking out a command may confuse these assistants.

Dr. Phillips has another concern. The use of "please" and "thank you" might, in some sense, cheapen the
meaning of such words. 

"When you’re talking to very young infants who don’t understand it’s a machine and we want them to hear
kind engagement with other people, it stands to reason that we should be using manners,” says Phillips.
“When kids are older and understand that Alexa isn’t a person, we don’t want them to use those words in an
automatic way. And that we say 'thank you' and 'I’m sorry' and 'please' because there’s a relational piece to
our communication and our words impact other people.”

Getting positive reinforcement 


Amazon determined that politeness counts when it introduced the Echo Dot Kids Edition last year. When
youngsters ask Alexa to solve a math problem by exhibiting good manners – “Alexa, please tell me what 5
plus 7 is” – the voice inside the Echo will not only supply the right answer but will then add positive
reinforcement: “By the way, thanks for asking so nicely.” 

This “magic word” feature, as Amazon called it, was an apparent response by the company to a loud chorus
of customers who were concerned that the act of rudely commanding Alexa to do something sent out the
wrong kind of message, especially to the youngest members of the family. 

Google launched a similar “Pretty Please” feature for the Google Assistant last fall after Lilian Rincon, director
of product management for the Assistant, saw her then 4-year-old son yelling at the devices to play the
ABCs and eventually his favorite Disney songs.

“I quickly realized that we needed a way to help promote polite behavior – not only for kids but also for all
people who now welcome digital assistants to their homes,” she says.

In Iowa City, senior marketing manager Dana Turner says her husband has come up with another sound
reason for treating voice assistants nicely. He “always says 'thank you’ to her because, he says, one day AI is
going to take over the world and he wants to be saved.”



32. R’ Shlomo Wolbe
1914-2005

