Dogs can also be better at detecting cancer than humans. And dogs tend to hallucinate less.
Hallucinations aren’t a problem with the actually medically useful tools he’s talking about. Machine learning is being used to draw extra attention to abnormalities that humans may miss.
It’s completely unrelated to LLM nonsense.
Perhaps we should consider not calling all of them AI. Machine learning is a useful tool.
“AI” long predates LLM bullshit.
You are right. My pet peeve is that it is now used as a marketing term without actual meaning. It used to be the word “smart.” Now instead of “buy this smart toaster,” it’s “buy this AI-powered toaster.” Sorry if this reply was too verbose for your liking.
They’re better at smelling cancer than humans.
I’m not sure we can definitively say they hallucinate less.