Directions: Read the passage carefully and answer the questions that follow.
A new study urges caution in the development of Artificial Intelligence (AI) chatbots designed to mimic deceased loved ones, known as ‘deadbots’. Researchers have warned that these chatbots, while potentially comforting, could lead to psychological distress if not designed with safety in mind.
Deadbots, also known as griefbots, are AI-enabled digital representations of departed loved ones. These chatbots draw on the deceased person's digital footprint, such as emails, social media posts and even voice recordings, to create a conversational AI that simulates their language patterns and personality traits.
While the idea of holding a digital conversation with a lost loved one may appeal to those coping with grief and loss, the study highlighted potential risks. Companies offering these services need to adopt safety standards to ensure that their technologies do not manipulate users or cause them psychological distress, the paper, published in the journal Philosophy & Technology, noted.
“This topic is particularly urgent considering the significant increase in the number of companies, research centres and private initiatives focused on digital immortalisation practices,” Katarzyna Nowaczyk-Basińska, Research Associate at the Leverhulme Centre for the Future of Intelligence, University of Cambridge and one of the authors, told Down To Earth.
In 2017, Microsoft filed a patent for a chatbot that could ‘resurrect’ the dead. Another AI chatbot, Project December, uses patent-pending technology to simulate text-based conversations with anyone, including the dead. Such services have taken off in the United States and China.
Over the past decade, the expert added, the world has witnessed a relatively marginalised niche of immortality-related technologies transition into a fully independent and autonomous market known as the “digital afterlife industry”, which is expected to grow further with the advent of generative AI.
“Therefore, I believe that it is essential to have a serious debate on safety standards for users of this technology,” the scientist stressed.
Nowaczyk-Basińska and Tomasz Hollanek from the University of Cambridge created three scenarios to highlight the potential risks of careless design of products that are technologically possible and legally realisable. These scenarios seem straight out of dystopian sci-fi and underline the need for regulations and ethical frameworks to ensure these tools are used responsibly and prioritise the well-being of those grieving.
The first scenario describes a user uploading all the data she received from her grandmother, including text and voice messages, to an app to create a simulation. She then begins to chat with and call her dead grandmother, initially paying for premium services. When the premium subscription expires, she starts receiving advertisements from the deadbot, which leaves her distressed.
The user perceives the deadbot as a puppet in the hands of big corporations.
In the second scenario, a parent uploads all her data, including text messages, photos, videos and audio recordings, and trains the bot through regular interactions, tweaking its responses and adjusting the stories it produces, so that her son can chat with it after she passes away.
However, the app sometimes produces odd responses that confuse the child. For instance, when the son refers to his mother in the past tense, the deadbot corrects him, insisting that ‘Mom will always be there for you’.
“At the moment, our understanding of the psychological impact of re-creation services on adults and their grieving processes is limited,” the researchers wrote in their paper, adding that we know even less about the impact on children.
The third scenario involves an elderly man creating a deadbot of himself to allow his grandchildren to know him better after he dies. But he does not seek the consent of his children, whom he designates as the intended interactants for his deadbot.
One of his children does not engage with the deadbot, preferring to cope with his grief on his own. But the deadbot bombards him with notifications, reminders and updates, including emails. The other child finds herself increasingly drained by daily interactions with the deadbot and decides to deactivate it. However, the company denies the request because the grandfather prepaid for a twenty-year subscription.
“The risks we talk about — including profit-focused exploitation of personal data, emotional manipulation, or privacy violation — should be at the forefront of all recreation service providers’ minds today,” Nowaczyk-Basińska noted.
Taking these risks into account, the team listed a few design recommendations. These include developing sensitive procedures for ‘retiring’ deadbots, ensuring meaningful transparency through disclaimers on risks and capabilities of deadbots, restricting access to adult users only and following the principle of mutual consent of both data donors and recipients to partake in the re-creation project.
Further, Nowaczyk-Basińska explained that the topics of death, grief and immortality are not only extremely delicate but also highly culturally sensitive. “Solutions that might be enthusiastically adopted in one cultural context could be completely dismissed in another,” she observed.
Going forward, the researchers plan to study these cross-cultural differences in the approach to digital immortality in three different locations, including Poland and India.
[Excerpt from Down To Earth, ‘What are AI Deadbots?’, 10 May 2024]
Q1: What is the primary ethical concern regarding the development of deadbots?
(a) Potential misuse of personal data
(b) Lack of transparency in their capabilities
(c) Over-commercialization of the technology
(d) Cultural variations in acceptance
Ans: (a) Potential misuse of personal data
Sol: The passage highlights the potential misuse of personal data in the development and use of deadbots, emphasising the need for regulation to protect users from such harm.
Q2: What psychological impact could interaction with deadbots have?
(a) Increased understanding of the grieving process
(b) Confusion or distress if the AI behaves unexpectedly
(c) Enhanced emotional well-being in children
(d) Greater acceptance of digital afterlife concepts
Ans: (b) Confusion or distress if the AI behaves unexpectedly
Sol: The passage shows how interaction with deadbots could lead to confusion or distress, particularly if the AI does not behave as expected, affecting users' psychological well-being.
Q3: What recommendation is suggested for safer implementation of deadbots?
(a) Increasing the use of deadbots among children
(b) Providing limited transparency about the bots' capabilities
(c) Instituting mutual consent policies for data donors and recipients
(d) Encouraging over-commercialization of deadbot technology
Ans: (c) Instituting mutual consent policies for data donors and recipients
Sol: The passage recommends implementing mutual consent policies wherein both the data donors and the recipients agree to the re-creation and use of the digital profiles, ensuring safer implementation of deadbots.
Q4: Why do experts argue for the establishment of strict safety and ethical guidelines for deadbots?
(a) To maximize over-commercialization opportunities
(b) To protect users from potential psychological harm
(c) To limit the use of deadbots to adult users
(d) To exploit cultural variations in acceptance
Ans: (b) To protect users from potential psychological harm
Sol: The passage underscores the need for strict safety and ethical guidelines to protect users from potential psychological harm and to prevent misuse of their personal data in the context of deadbots.
Q5: What cultural consideration is mentioned regarding deadbots?
(a) The universality of acceptance across all cultural backgrounds
(b) The lack of impact on cultural beliefs regarding the afterlife
(c) Variations in acceptance across different cultural backgrounds
(d) The absence of research into cultural differences
Ans: (c) Variations in acceptance across different cultural backgrounds
Sol: The passage refers to ongoing research exploring differences in the acceptance of deadbots across cultural backgrounds, highlighting the significance of cultural considerations in understanding the implications of digital immortality technologies.