
Articles Tagged with medical malpractice

As AI technology advances, the medical field has seen promising integrations, from diagnostic support to patient care management. However, recent reports highlight a significant risk: AI-powered transcription tools, such as OpenAI’s Whisper, are making alarming errors, particularly in medical settings. Whisper’s tendency to produce fabricated content—what experts call “hallucinations”—poses serious risks when used in doctor-patient consultations. These inaccuracies have far-reaching consequences for patients and healthcare providers alike, particularly if faulty transcriptions lead to misunderstandings, misdiagnoses, or even medical malpractice.

AI Hallucinations: What Are They?

Unlike traditional transcription errors, hallucinations involve the AI generating statements that were never said. For instance, in routine testing, Whisper reportedly added imaginary medical treatments and racial commentary that were absent from the original audio. While hallucinations are concerning in any field, they are particularly dangerous in healthcare, where accurate record-keeping is critical. Errors or fabricated statements in medical notes can misinform other medical professionals, lead to incorrect patient care, or give rise to legal claims rooted in miscommunication.