I saw mention of "Perplexity.ai" today as a new AI-powered internet search engine that is supposedly going to threaten Google.
I thought I'd try it out (I'm trying to stay somewhat current on AI stuff for the day job).
I asked it a slightly tricky CJ-8 question: "What rear axles were available in the 1984 Jeep Scrambler?"
As most here know, there was only one: the AMC 20. Perplexity got that correct. But wait, there's more!
Below the reply to my initial query were suggested/related search queries. I did nothing to generate these; Perplexity offered them on its own.
One of the PERPLEXITY SUGGESTED QUERIES is shown in the screen grab... I clicked through.
Four issues:
1) As the first query's response correctly points out, only the AMC 20 rear axle was available in this model/year Jeep.
2) There is no such thing as the Dana 30 rear axle it describes; the Dana 30 is a front axle only.
3) Not only is the response to the second, suggested query an utter fabrication, but the correct answer to the first query should have precluded it from suggesting the second at all! It had JUST told me that the second query and its resulting answer are logically impossible.
4) Let's assume someone who doesn't know any better takes this incorrect information and puts it out on the web as content. They've just created another "source" reaffirming incorrect information for the next AI large language model to "learn" from. (I'm honestly a little worried even writing it out here, which is why I left all the incorrect text in the screen grab rather than typing it out.)
This "knowledge death spiral" is what scares me to death, and it is perfectly summed up in this one screen shot.
Until AI is smart enough to prevent itself from making errors like this, we are in very, very dangerous territory.
