News

But three of those FDA employees told CNN that Elsa just makes up nonexistent studies, something commonly referred to in AI ...
Insiders tell CNN the FDA’s AI is “hallucinating” studies and can’t access key documents. Agency leaders insist the AI is getting better, and use is not mandatory.
Insiders at the Food and Drug Administration are ringing alarm bells over the agency's use of an AI to fast-track drug ...
FDA officials say the assistant is flawed, just as the Trump administration stresses AI adoption in healthcare.
The federal agency introduced Elsa last month, boasting about the AI tool's ability to increase efficiency at the FDA.
With reports that FDA’s AI Elsa is “confidently hallucinating” studies that don’t exist, the use of AI to streamline drug ...
The FDA's generative AI, Elsa, has a massive hallucination problem, according to the agency's employees themselves.
Despite ambitions to streamline regulatory review, FDA’s Elsa platform has been prone to hallucinations, prompting internal scrutiny and questions about AI reliability and governance.
Marcel Botha discusses ways the agency can ensure it gets the most accurate results possible from AI.
But according to three current FDA employees, "Elsa" is creating studies that don't exist. They say that makes it hard to ...
Following FDA’s first AI-assisted medical product review, the agency has committed to expanding the use of generative AI across all centers by June 30, 2025.
FDA said it intends to gather additional input for identifying types of information that FDA would recommend a manufacturer include in the labeling of AI/ML-based medical devices.