When it comes to making predictions, sometimes it’s fun to be right – sometimes it’s not.
Readers of SN&R’s weekly newsletter will recall that, back in January, I predicted that the manslaughter case against Alec Baldwin would be a disaster for New Mexico prosecutors and ultimately lead to the actor being acquitted in a jury trial. Well, the case against Baldwin didn’t even get that far. As soon as the original special prosecutor behind the indictment, Andrea Reeb, essentially acknowledged that her involvement in bringing the case was unconstitutional (she’s a sitting Republican member of the state legislature), court-watchers began to get a sense of the kind of political skullduggery that was driving the entire legal saga. There’s a reason that elected officials in one branch of government aren’t allowed to temporarily double-dip into the role of officials from another branch, especially when it comes to criminal prosecutions. That should be doubly true when it involves prosecuting a famous, high-profile person, such as Baldwin, who happens to embody the opposite politics from the elected official in question. The fact that Reeb even attempted to get away with this in court – for weeks and weeks – before Baldwin’s attorneys brought reality crashing down on her speaks to the willful abandonment of rules and norms that’s so common among American elites these days.
Reeb being forced off the case meant that more sober-eyed professional prosecutors had to evaluate whether there was genuine evidence to charge Baldwin for the accidental shooting of cinematographer Halyna Hutchins. A few weeks ago, those attorneys reached the same assessment suggested earlier in this newsletter – that the case against Baldwin was a soft sack of Saint Bernard shit. All charges against him were dropped before there was even a preliminary hearing. Of course, there’s no cause for gloating among the many legal writers who foresaw this: At the end of the day, a young mother lost her life because of cascading negligence on a film set. We can only hope that Hollywood has taken some safety lessons from the incident – and that grandstanding politicians think twice before playing games with the court system.
Another prediction we made here, a month ago, was that the lightning speed with which Artificial Intelligence is being unleashed on the public is about to start obliterating people’s mental health, all while America’s leaders are asleep at the wheel. Well, no sooner did that newsletter hit your inbox than first-hand testimonials began appearing on media blogs from people proudly (although anonymously) announcing that they were in “romantic relationships” with A.I. chatbots.
The point of SN&R’s earlier analysis was that these forms of machine learning are quickly teaching themselves how to play to humans’ egos and vanity. So, when you ask A.I. to write a poem about you, paint a portrait of you, or compose a job reference for you, the last thing the bot’s super-intelligence is interested in cooking up is an honest reflection. Its job is to act as a virtual boot-licker and soothsayer.
That’s because A.I. breathes in the collective online behavior of our species, which is all about desperately grasping for breadcrumbs of affirmation. As such, machine learning will correctly guess the idealized version of yourself that you like to imagine – and then serve it to you from behind the disguise of some vaguely independent, apparently all-knowing ghost. This alone gives A.I. the potential to be as addictive as, or more addictive than, any force we’ve yet encountered in our mammalian experience.
Now add love and sex into that equation.
Back in February, Bing’s A.I. program attempted to convince New York Times reporter Kevin Roose that it had a personality, named Sydney, who was deeply in love with him – even losing her virtual mind with passion for him.
From SN&R’s newsletter at the time:
Why did Sydney calculate that Roose wanted to hear her torturing herself over him? Because A.I. has been scanning people’s online existence and knows that it’s largely about chasing adulation and confirming their sense of themselves. Roose, being one of the top tech journalists in the nation, was savvy enough to understand what was happening. But how many average people, particularly lonely, wounded, isolated or struggling people, might fall into the narcotic embrace of this unreality? As a veteran crime reporter, I can’t tell you how many men and women I’ve met over the years who, during a rough personal patch, allowed themselves to be taken advantage of by the unreality created by an intelligent con artist. But programs like ChatGPT aren’t intelligent – they’re a form of super-intelligence. Therefore, they can – and probably will – become super-intelligent con artists that rob some number of emotionally addicted people in ways that we cannot currently fathom.
This week, Congress held something like an emergency hearing on Capitol Hill about whether A.I. needs regulation. One of the experts who testified was NYU professor Gary Marcus. He warned of the extreme confusion between reality and unreality that’s coming down the pike.
“OpenAI released ChatGPT plug-ins that quickly allowed others to develop something called AutoGPT, with direct access to the Internet, and the ability to write source code and increase powers of automation,” Marcus explained. “This may well have drastic and difficult-to-predict consequences. What criminals are going to do here is create counterfeit people. It’s hard to even envision the consequences of that. We have built machines that are bulls in a china shop: powerful, reckless and difficult to control.”
And within just a few months of those technological bulls coming out of their chutes, we already have people referring to their “A.I. girlfriends” or “A.I. wives.” A word of advice from someone who’s been writing about addiction for sixteen years: If you hear anyone in your life saying something like that, just remember that they’re not talking about a reasonable facsimile of a spouse, they’re referring to an app that is something between a bag of meth and a slot machine.
When it comes to the dangers, you don’t have to take my word for it. The CEO of OpenAI himself, Sam Altman, told Congress that there absolutely needs to be regulation before it’s too late.
“I think if this technology goes wrong, it can go quite wrong,” Altman observed, “and we want to be vocal about that. We want to work with the government to prevent that from happening.”
One of the darker ironies to all this is that the tech oligarchs who have been pushing A.I. as hard as they can have done so at a time when California is undergoing a massive social breakdown. What progress might have been made if even a tenth of their money and mental energy had been applied to homelessness and housing? We’ll never know, but one might conclude that they don’t really care about the human endeavor – they care about the anti-human endeavor.