Cycle News
Reduce AI Hallucinations With This Neat Software Trick

By cycle · June 14, 2024 · 3 Mins Read


To start, not all RAG systems are of the same caliber. The accuracy of the content in the custom database is critical for solid outputs, but that isn’t the only variable. “It’s not just the quality of the content itself,” says Joel Hron, a global head of AI at Thomson Reuters. “It’s the quality of the search, and retrieval of the right content based on the question.” Mastering each step in the process is critical, since one misstep can throw the model completely off.
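The multi-step pipeline Hron describes can be sketched in a few lines. This is a minimal, illustrative sketch in plain Python, using a toy bag-of-words similarity as a stand-in for a real embedding model; the function names and the `k` parameter are assumptions for illustration, not any vendor's API:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; production RAG uses dense vector models.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(count * b[term] for term, count in a.items())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank the custom database by similarity to the question; keep the top k.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str], k: int = 2) -> str:
    # Ground the model: instruct it to answer only from retrieved context.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs, k))
    return (
        "Answer using ONLY the context below; say 'not found' otherwise.\n"
        f"Context:\n{context}\nQuestion: {query}"
    )
```

Each stage here, embedding, ranking, and prompt construction, is a separate point of failure: if `retrieve` surfaces the wrong passages, the generation step is grounded in the wrong material no matter how good the model is.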

“Any lawyer who’s ever tried to use a natural language search within one of the research engines will see that there are often instances where semantic similarity leads you to completely irrelevant materials,” says Daniel Ho, a Stanford professor and senior fellow at the Institute for Human-Centered AI. Ho’s research into AI legal tools that rely on RAG found a higher rate of mistakes in their outputs than the companies building the models reported.

Which brings us to the thorniest question in the discussion: how do you define hallucinations within a RAG implementation? Is it only when the chatbot generates a citation-less output and makes up information? Is it also when the tool may overlook relevant data or misinterpret aspects of a citation?

According to Patrick Lewis, who led the team that coined the term RAG, hallucinations in a RAG system boil down to whether the output is consistent with what the model found during data retrieval. The Stanford research into AI tools for lawyers broadens this definition a bit, though, examining both whether the output is grounded in the provided data and whether it’s factually correct, a high bar for legal professionals who are often parsing complicated cases and navigating complex hierarchies of precedent.
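The grounded-versus-correct distinction can be made concrete with a crude check. A minimal sketch, assuming a simple lexical-overlap heuristic; the `threshold` value and function name are illustrative, and real evaluations like the Stanford study rely on far more careful human and entailment-based review:

```python
def ungrounded_sentences(answer: str, context: str,
                         threshold: float = 0.5) -> list[str]:
    # Flag answer sentences whose tokens are mostly absent from the
    # retrieved context -- a crude proxy for "not grounded in the data".
    ctx_tokens = set(context.lower().split())
    flagged = []
    for sentence in (s.strip() for s in answer.split(".")):
        tokens = set(sentence.lower().split())
        if not tokens:
            continue
        support = len(tokens & ctx_tokens) / len(tokens)
        if support < threshold:
            flagged.append(sentence)
    return flagged
```

Note the limit of any groundedness check: an answer faithfully grounded in the retrieved documents can still be factually wrong if those documents are wrong, which is exactly the second half of the Stanford bar.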

While a RAG system attuned to legal issues is clearly better at answering questions on case law than OpenAI’s ChatGPT or Google’s Gemini, it can still overlook the finer details and make random mistakes. All of the AI experts I spoke with emphasized the continued need for thoughtful human interaction throughout the process to double-check citations and verify the overall accuracy of the results.

Law is an area where there’s a lot of activity around RAG-based AI tools, but the process’s potential is not limited to a single white-collar job. “Take any profession or any business. You need to get answers that are anchored on real documents,” says Pablo Arredondo. “So, I think RAG is going to become the staple that is used across basically every professional application, at least in the near to mid-term.” Risk-averse executives seem excited about the prospect of using AI tools to better understand their proprietary data without having to upload sensitive info to a standard, public chatbot.

It’s critical, though, for users to understand the limitations of these tools, and for AI-focused companies to refrain from overpromising the accuracy of their answers. Anyone using an AI tool should still avoid trusting the output entirely, approaching its answers with a healthy sense of skepticism even when they’re improved through RAG.

“Hallucinations are here to stay,” says Ho. “We do not yet have ready ways to really eliminate hallucinations.” Even when RAG reduces the prevalence of errors, human judgment reigns paramount. And that’s no lie.



© 2025 cyclenews.blog