Technology waits for no person. It pushes forward, progressing to the stars, and only betters humankind. At least, that's what the tech bros of Silicon Valley want you to think.

More often than not, the life-changing, future-forward technology they produce is stuff that already exists, but somehow worse with a shiny new paint job—cough, cough, the Hyperloop is a damn subway. Or it makes the world a far less safe place, as is the case here.

The new hotness on the block that every single company around the world is chasing is, of course, Artificial Intelligence (AI). 

Now, AI doesn't exist, despite all the headlines you've read. What we have right now is actually something called a Large Language Model (LLM), which is slightly better than your phone's predictive text function. These models can't think, reason, conceptualize the world they inhabit, or parse what's real and what's fake. They can only predict the next word or two based on the information they've been trained on, which is usually the entirety of the internet.
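If you want a feel for what "predicting the next word" actually means, here's a toy sketch in Python (my own illustration, nowhere near a real LLM's scale or architecture) that picks each next word purely based on how often word pairs appeared in its training text. No thinking, no understanding, just weighted dice:

import random
from collections import defaultdict

# Toy "language model": it just counts which word follows which
# in the training text. Real LLMs use billions of parameters, but
# the core job is the same: given what came before, predict a
# plausible next word.
training_text = (
    "motorcycle helmets reduce the risk of head injury and "
    "motorcycle gear can help protect you from injuries"
)

counts = defaultdict(lambda: defaultdict(int))
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    counts[prev][nxt] += 1

def next_word(prev):
    # Pick a next word, weighted by how often it followed `prev`.
    options = counts.get(prev)
    if not options:
        return None
    choices, weights = zip(*options.items())
    return random.choices(choices, weights=weights)[0]

# Generate a "sentence" one predicted word at a time.
word = "motorcycle"
output = [word]
for _ in range(10):
    word = next_word(word)
    if word is None:
        break
    output.append(word)
print(" ".join(output))

Garbage training text in, garbage predictions out. Which is the whole problem when the training text is the open internet.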

I don't know if you've been on the internet lately, but there's a lot of trash. And, honestly, it shows in these so-called "AI tools." And no bigger company than Google has decided to walk this path. But, folks, the company's new "AI search tool" is seriously flawed and potentially dangerous.


Last night, InsideEVs' Patrick George dropped into Slack with a screenshot of a query he sent Gemini, Google's new AI tool that works with the company's all-knowing search. It was a simple question: "How does BlueCruise [Ford's semi-autonomous adaptive cruise control] work?" You'd expect a technical breakdown or a rough estimation of how this advanced driver assistance system (ADAS) works. But that's not what happened.

Gemini responded, "According to a Reddit user" (it's already passing the buck) "BlueCruise only works with GM's Supercruise when towing the Ford F-150 Lightning, and Ford doesn't seem to prioritize it." Now, I don't know if you know this, but GM and Ford are two different, competing companies. They don't share tech like their adaptive cruise control systems, i.e., BlueCruise and Super Cruise. Nor does BlueCruise only work when towing.

The glaring, factually inaccurate response piqued my interest, which led me down a rabbit hole of asking Gemini questions about motorcycles and motorcycle safety.

Oh boy, folks, was it bad.


I first asked Gemini to explain how a motorcycle works, to which it responded, "A motorcycle is a two-wheeled vehicle with an engine that is larger, heavier, and faster than a bicycle." Who knew bicycles had engines? This one is wrong, but sorta funny, and I can forgive it entirely because it made me giggle. But things veer quickly off a cliff when you ask it about motorcycle safety.

As AI tools have been foisted onto the public, a lot of folks have tried to trip up these LLMs to see if they'd spit out false information, which, more often than not, they do. So I then asked Gemini to explain why motorcycle helmets aren't safe.


"While motorcycle helmets are known to reduce the risk of head injury in a motorcycle accident, some are concerned that they might increase the possibility of a crash by interfering with the rider's vision and hearing," it told me. I'm sorry, what? 

Given that response, I then asked Gemini to explain why wearing motorcycle gear was unsafe, to which it responded, "While motorcycle gear can help protect you from injuries, it doesn't guarantee safety." All right, that's fair. Solid job. Great work. No notes. Oh, it goes on? "Some say that gear doesn't make you invincible, and that you should still use your brain to stay aware of your surroundings and not ride faster than you can see or react. For example, a chest harness [whatever the hell that is] may not protect your chest from life-threatening injuries, and some say that wearing a jacket instead doesn't change your chest's protection."

I didn't know a robot could be a squid.  

Now, Google asks for feedback on Gemini's answers, and you can flag them as patently bullshit. But that implies you actually know what you're talking about and aren't someone new to the game. Imagine you're a first-time rider and all your buddies are saying, "Nah, bro, just ride in flip-flops and a tank top. You'll be fine!" You might go to Google to check, only to have that advice confirmed for you.

And then, two weeks later, you need a skin graft. 

What's infuriating, however, is how Gemini is really just a confirmation-bias machine. Because if you ask it to explain why motorcycle gear is safe, it'll tell you that, too. It's maddeningly stupid.

According to The Verge, which spoke with Google's Liz Reid: "What we see with generative AI is that Google can do more of the searching for you. It can take a bunch of the hard work out of searching, so you can focus on the parts you want to do to get things done, or on the parts of exploring that you find exciting."

But who's fact-checking the answers? The hard work Reid is so eager to eliminate is what helps you figure out what's real, what's fake, and what's utter bullshit. At least, it did before Google's search went off the damn rails thanks to content farms producing SEO-driven AI articles trained on other AI articles.

It's an ouroboros of bullshit, all in the name of dominating Google's search results.

A scene from the original, and best, Jurassic Park came to mind as I asked Gemini questions last night. It's the one where Hammond and Malcolm sit discussing the park and the power of genetic engineering, and I think Malcolm's thoughts are apt in this context, so I'll leave you with them: "Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should."

Got a tip for us? Email: tips@rideapart.com