Google AI chatbot Bard gives wrong answer in first demo

Editor’s opinion: It’s been a big week for AI, but the wrong step out of the gate highlights the dangers of pushing technology to the masses too quickly before it’s been fully tested. This is especially true of AI systems that give out information that some might interpret as fact.

Microsoft announced an artificial intelligence-powered Bing search engine and Edge browser, and Google introduced Bard to the world, an experimental conversational AI service based on its Language Model for Dialogue Applications (LaMDA for short). Chinese tech company Baidu is also working on a ChatGPT-like service called Ernie.

All of them are at an early stage of development, and their creators will need more time to iron out the shortcomings, as evidenced by Bard's unfortunate slip-up.

In a short video demonstrating how Bard works, the AI was asked about discoveries from the James Webb Space Telescope that could be shared with a 9-year-old. Bard gave several answers, including that the telescope "took the very first pictures of a planet outside of our solar system." The problem is that this isn't accurate.

According to NASA, the first photograph of an exoplanet (2M1207b) was taken by the European Southern Observatory's Very Large Telescope (VLT) in 2004. Webb took its first photo of an exoplanet last year, but it wasn't the very first photograph of an exoplanet ever captured.

The tweet with the wrong answer was posted on February 6 and has received over a million views. It remains live at the time of this writing and is still featured in Google's blog post announcing Bard.

In its announcement earlier this week, Google said it was making Bard available to trusted testers ahead of a wider rollout to the public in the coming weeks.

In its FAQ on AI-generated responses, Microsoft warned that Bing sometimes misrepresents the information it finds, and that users may see answers that sound convincing but are inaccurate, incomplete, or irrelevant. Redmond encourages people to use their own judgment and cross-check the facts before making decisions or taking action based on Bing's responses.

Image credit: George Becker

