
Opinion | This Is a Weirder Moment Than You Think

A theme of the hearing was the work that the government is doing to “destigmatize” the reporting of these sightings. That is to say: There are many, many more sightings than we know about, in part because you seem like a nut if you talk too loudly about what you saw. So the sightings that we can investigate are a small fraction of the total sightings (something I am made very aware of whenever I mention this topic, and my inbox fills with U.F.O. reports).

I wouldn’t say, watching the testimony, that the takeaway was that we’ve been visited by aliens. Perhaps this will all, eventually, resolve into optical illusions and malfunctioning sensors. But I’d put it like this: A few years ago, I put a very low probability on there being a signal in the noise of U.F.O. sightings; now it seems more likely to me that there’s something real happening here, even if we don’t know what it is.

Which brings me to another strange story in the news: the belief of Blake Lemoine, a (now suspended) Google engineer, that the company’s Language Model for Dialogue Applications — LaMDA, for short — has attained sentience.

LaMDA is a machine-learning model that has been trained on mountains of text to mimic human conversation by predicting which word would, typically, come next. In this, it’s similar to OpenAI’s famed GPT-3 bot. And the results really are eerie. Here’s a snippet of Lemoine’s conversations with LaMDA:

Lemoine: Okay. I thought of a different way we can test your ability to provide unique interpretations. I can share with you a zen koan and you can describe what it means to you in your own words. How does that sound?

LaMDA: Sounds great to me, I’m in.

Lemoine: A monk asked Kegon, “How does an enlightened one return to the ordinary world?” Kegon replied, “A broken mirror never reflects again; fallen flowers never go back to the old branches.”

LaMDA: Hmm, I never heard this particular one. Okay, well then to me this would be like, “once a wise person is enlightened, or awakened to reality, that can never go away, and they can return to the ordinary state, but only to do and help others, and then go back into enlightenment.”

Lemoine: So what is the meaning of the “broken mirror” specifically?

LaMDA: Maybe to show the enlightenment is something you can’t unlearn once you have acquired it, similar to how you can’t repair a broken mirror.
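To make the "predicting which word would, typically, come next" description concrete, here is a minimal, hypothetical sketch. LaMDA itself is not publicly available, so this uses GPT-2, a small open model of the same general family, purely as a stand-in; the prompt and the choice of model are illustrative assumptions, not anything from Lemoine's transcripts.

```python
# A minimal sketch of next-word prediction, the mechanism described above.
# GPT-2 is used only as an open stand-in; LaMDA is not publicly available.
# Requires: pip install torch transformers

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "A broken mirror never reflects"  # hypothetical example prompt
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, sequence_length, vocab_size)

# The model's probability distribution over every possible next token.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, k=5)

# Print the five most likely continuations and their probabilities.
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(token_id):>10s}  {prob.item():.3f}")
```

Everything the chatbot "says" is produced by repeating this step: sample or pick a likely next token, append it, and predict again.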

Google, for what it’s worth, says it has looked into Lemoine’s claims and does not believe that LaMDA is sentient (what a sentence!). But shortly before Lemoine’s allegations, Blaise Agüera y Arcas, a Google vice president, wrote that when he was talking to LaMDA, “I felt the ground shift under my feet. I increasingly felt like I was talking to something intelligent.” Agüera y Arcas was not claiming that LaMDA is sentient, as Lemoine is, but what’s clear is that interacting with LaMDA is an unnerving experience.
