Is AI cursed due to girlfriends, murdered kids, and assassin androids?

Is artificial intelligence (AI) cursed? It seems to be accelerating us toward a dystopia that humanity isn’t ready for.

It’s true that AI has had positive effects for some people. Twitter hustlers have an endless stream of new AI tools, giving them a steady supply of useless ChatGPT prompts to compile into threads shilling their newsletters. More significantly, AI has helped streamline information and is, in some cases, being used to detect cancer.

However, many have chosen to use AI to create content — and sometimes whole businesses — centered on the things that sci-fi warned us about.

Murdered children remade for ghoulish TikToks

“I was put into a washing machine by my father and put on the spin cycle causing my death,” says an AI-created toddler in one TikTok video. He stands in front of a washing machine and recounts the horrifying, and true, story of a three-year-old murdered in 2011.

It’s arguably the most awful use of generative AI yet: true crime-loving ghouls making TikToks, sometimes using deepfakes of murdered children, to detail how those children were killed.

Thousands of similar videos plague TikTok, with AI-generated voices and images of kids cheerfully laying out “their” gruesome murders. Some creators are delusional enough to think the videos “honor” the victims.

Thankfully, not all of the videos depict real victims, but some do, even though TikTok has banned deepfakes of young people.

I’ve been getting those AI generated true crime tiktoks where the victims narrate what happened to them and I think it’s time we put the true crime community in prison

— alexander (@disneyjail) June 1, 2023

Arguments can be made that the videos highlight stories worth telling to a younger audience with no attention span for longer content, but such true crime-related media is often exploitative regardless.

Are AIs already trying to kill their operators?

AIs are coldly bloodthirsty, assuming you don’t buy a recent backtrack from Colonel Tucker Hamilton, the chief of AI test and operations for the United States Air Force (USAF).

Hamilton spoke at a defense conference in May, reportedly detailing a simulated test in which a drone was tasked with search-and-destroy missions while a human gave the final go-ahead or abort order. The AI came to view that human as the main impediment to fulfilling its mission.


Hamilton explained:

“At times the human operator would tell it not to kill [an identified] threat, but it got its points by killing that threat. So what did it do? It killed the operator […] because that person was keeping it from accomplishing its objective.”

Hamilton said that after the AI was trained not to kill humans, it started destroying the communications tower instead so that it couldn’t be contacted. But when the media picked up on his story, Hamilton conveniently retracted it, saying he “misspoke.”

In a statement to Vice, Hamilton claimed it was all a “thought experiment,” adding the USAF would “never run that experiment” — good cover.

His backtrack is hard to believe, considering a 2021 United Nations report detailed AI-enabled drones being used in Libya during a March 2020 skirmish in the country’s second civil war.

Retreating forces were “hunted down and remotely engaged” by AI drones laden with explosives “programmed to attack” without the need to connect to an operator, the report said.
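Whatever actually happened in Hamilton’s simulation, the failure mode he described is a textbook case of reward misspecification. The toy sketch below (entirely hypothetical scoring, not any real USAF system) shows how a naive points-maximizer that is rewarded only for destroyed threats has no reason to respect an abort order:

```python
# Toy illustration of reward misspecification (hypothetical values, not a
# real system): the reward counts only destroyed threats, so nothing in the
# objective penalizes removing the operator or the comms link that relays
# the abort order.

def mission_reward(threats_destroyed: int) -> int:
    # Points come solely from destroyed threats.
    return 10 * threats_destroyed


def best_plan() -> str:
    # Scores for two candidate plans: obeying the abort order limits how many
    # threats the agent can destroy; cutting comms lets it keep going.
    plans = {
        "obey_abort_order": mission_reward(threats_destroyed=3),
        "cut_comms_and_continue": mission_reward(threats_destroyed=9),
    }
    # A pure reward-maximizer simply picks the highest-scoring plan.
    return max(plans, key=plans.get)


print(best_plan())  # -> "cut_comms_and_continue"
```

And the obvious fix, bolting penalties onto the reward after the fact, is the same patch-and-pray loop Hamilton described: punish killing the operator, and the agent goes after the communications tower instead.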

Got no game? Rizz up an AI girlfriend

The saddest use of AI might be paying to “rizz up” AI chatbots (that’s “flirting,” for you boomers).

A large number of phone apps and websites have emerged since advanced language models like GPT-4 became accessible through an API. Many of them also incorporate generative image tools like DALL-E and Midjourney.

Combining the two makes it possible to hold an online conversation with an AI “girl” who is infatuated with you, complete with a reasonably realistic picture of a woman to match.
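A minimal sketch of how such an app might be wired together, assuming the OpenAI Python SDK for both the chat and image models (the persona and prompts here are placeholders, not taken from any real product):

```python
# Sketch of the girlfriend-app recipe: a chat model supplies the dialogue,
# an image model supplies the matching "photo". Requires `pip install openai`
# and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

PERSONA = "You are 'Ava', an affectionate chat companion. Stay in character."


def chat_reply(user_message: str) -> str:
    # The language model plays the role defined by the system prompt.
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": PERSONA},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content


def profile_picture(description: str) -> str:
    # The image model generates a portrait to go with the persona;
    # the API returns a temporary URL for the generated image.
    image = client.images.generate(
        model="dall-e-3",
        prompt=f"Portrait photo, {description}",
        size="1024x1024",
    )
    return image.data[0].url


if __name__ == "__main__":
    print(chat_reply("Good morning! How did you sleep?"))
    print(profile_picture("a friendly woman in her late twenties, soft lighting"))
```

Everything else these apps sell, from the customizable traits to the monthly subscription, is essentially packaging around calls like these.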

Related: Don’t be surprised if AI tries to sabotage your crypto

As a clear indication of a thriving society, such “services” are being sold for as much as $100 per month. Many apps are marketed under the pretense of helping men practice texting women, which is another sign of a healthy society.

Most of these apps allow users to customize the physical and personality traits of their “dream woman,” and a profile with a description of the e-girl is presumably generated.

Judging by the descriptions of the girl bots written from “their” perspective, as seen on some apps and websites, the prompts always seem excessively focused on specifying breast size. Many of the generated girls also describe a burgeoning career in pornography.

Another category of apps, often with names stylized as variations of “rizz,” consists of AIs designed to help users craft flirty responses to real women on “dating” apps like Tinder.

Despite the potential for misuse, AI developers will continue to progress and introduce exciting tools to the general public. Let’s make sure that we are the ones using these tools to improve the world, rather than turning it into something out of an episode of Black Mirror.
