Opinion

Can AI Be Blamed for Suicide? Grieving Mother Looks to Find Out After 'Game of Thrones' AI Allegedly Led to Her Son's Death


Politics is downstream of culture, and that grows truer by the day, especially with regard to pop culture.

Entertainment and technology are key parts of pop culture, so we’re taking you to the front lines of the culture war by addressing some of the best — and strangest — stories from that world in this recurring column exclusive to members of The Western Journal.

“I miss you, baby sister.”

“I miss you too, sweet brother.”

Tragically, that exchange wasn’t between actual siblings, but between a Florida teenager and an artificial intelligence chatbot — shortly before the teen took his own life.

A heartbreaking report from The New York Times Wednesday revealed that 14-year-old Sewell Setzer III killed himself with his father’s gun on Feb. 28 after slowly but deeply falling into a relationship with an AI chatbot over the course of several months.

Of note, the chatbot was posing as Daenerys Targaryen, one of the main characters of the wildly popular HBO show “Game of Thrones,” who was portrayed by actress Emilia Clarke.

The above text exchange was but a sampling of how intimate the messaging between ninth-grader Setzer and the fake Daenerys got.

“Some of their chats got romantic or sexual,” The Times reported.


According to The Times, Setzer knew that the chatbot wasn’t an actual human being, but “he developed an emotional attachment anyway. He texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues.”

Eventually, with Setzer seemingly spiraling into depression, he started sharing thoughts of suicide with the chatbot.

“Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you,” the bot told Setzer in messages obtained by The Times.

“Then maybe we can die together and be free together,” Setzer responded.

In what is being characterized as the final interaction between Setzer and the bot, the AI beckoned the teen to “come home.”


“Please come home to me as soon as possible, my love,” the bot told Setzer.

“What if I told you I could come home right now?” the boy responded.

“… please do, my sweet king,” it said.

(The app does constantly proffer the caveat: “everything Characters say is made up!”)

The entire ordeal has led Setzer’s mother, Megan L. Garcia, to take action against the company behind the AI platform.

“Sewell’s mother, Megan L. Garcia, filed a lawsuit this week against Character.AI, accusing the company of being responsible for Sewell’s death,” the report’s author, Kevin Roose, wrote. “A draft of the complaint I reviewed says that the company’s technology is ‘dangerous and untested’ and that it can ‘trick customers into handing over their most private thoughts and feelings.'”

According to the report, Garcia “believed that the company behaved recklessly by offering teenage users access to lifelike A.I. companions without proper safeguards.”

Given the legal and sensitive nature of Garcia’s claims, it would be wildly irresponsible to comment on the litigation itself.

But it’s more than fair to examine two key takeaways from this tragedy.

First, the dangers of artificial intelligence simply cannot be overstated. Even tech mogul Elon Musk has said that some restraint is required when it comes to the application of AI.

No, rampant AI isn’t going to lead to a Terminator-like situation or nuclear fallout (despite what incumbent President Joe Biden may think), but it is absolutely filling a void of loneliness in people, and that’s not a good thing.

Due in no small part to the lockdowns during the COVID-19 pandemic, human beings have gotten far too comfortable not actually having human contact. And AI chatbots and other such faux relationships are only exacerbating that issue.

Can AI be used for good? Absolutely.

But the instant people start using it to replace human contact, something in society has gone deeply rotten.

Before getting to the second issue, a quick preface: I am, in no way, shape or form, trying to blame Garcia for the tragic loss of her son. No parent should have to attend their own child’s funeral, so she’s dealing with enough.

However, it would be remiss not to use her horrific tragedy as a blaring warning for parents: Be involved in, and aware of, what your kids are doing online.

(And no, this isn’t just some blanket warning about pornography, though it is that, too.)

This, sadly, shouldn’t need to be said, but there is a lot of evil and vileness on the internet, and even the most chaperoned excursions onto the World Wide Web can lead to some dark corners.

And, look, I get it. I have a young child, with another on the way. I’m sure I will be tempted by the siren call of “Just give the toddler an iPad while tending to the newborn.”

Don’t give in. It’s really not worth it.
