NEJM retracts case image after AI manipulation admitted

The New England Journal of Medicine retracted an "Images in Clinical Medicine" item after the authors acknowledged using AI to alter a submitted photograph, Retraction Watch reports. The short piece, published on April 18, 2026, described an 87-year-old man with lung injury following exposure to a forest fire and included a bronchoscopic image showing black casts alongside a measuring tape. According to Retraction Watch, authors Yuling Wang and Xiangdong Mu acknowledged using an AI tool to move a ruler in the image and submitted a notice on April 29, 2026, requesting retraction. The image drew widespread attention online: an anonymous commenter flagged irregularities in the tape's numbering, and Mu posted a response defending the clinical conclusions while apologizing for not disclosing the image processing, per Retraction Watch.
What happened
According to Retraction Watch, the retracted item was an "Images in Clinical Medicine" short piece published on April 18, 2026. It reported the case of an 87-year-old man with lung damage after exposure to a forest fire and included a bronchoscopic photograph with a measuring tape visible. Authors Yuling Wang and Xiangdong Mu acknowledged using an AI tool to alter the image and on April 29, 2026 submitted a notice requesting withdrawal of the image and case report.
Reported statements
Retraction Watch quotes the authors' notice: "We were unaware of Journal policies on image manipulation and had altered our submission by using an artificial intelligence (AI) tool to move the ruler to the top of the image. We therefore wish to retract our image and case report." Retraction Watch also reproduces an online commenter noting irregular numbering on the measuring tape, and reports a reply from Mu defending the clinical findings while apologizing for not disclosing the processing.
Editorial analysis: technical context
Generative image tools can make localized edits, such as moving or straightening an object in a photograph, without leaving traces that are obvious to non-expert reviewers at the pixel level. A broader industry pattern follows: reviewers and journals face growing difficulty detecting AI-mediated edits in clinical photography in the absence of rigorous provenance requirements, metadata preservation, or routine forensic checks.
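To illustrate what a "routine forensic check" might look like in its simplest form, here is a minimal sketch of a metadata-screening heuristic: it scans an image file's raw bytes for editing-software signatures that some tools embed in EXIF/XMP metadata. The marker list and sample bytes are illustrative assumptions, not a real detection standard; metadata is trivially stripped, so the absence of a marker proves nothing.

```python
# Crude provenance heuristic (illustrative only): look for known
# editor signatures in an image file's raw bytes. Real forensic
# pipelines combine metadata analysis with pixel-level methods.
EDITOR_MARKERS = [b"Adobe Photoshop", b"GIMP", b"Adobe Firefly"]  # illustrative list

def flag_editor_metadata(image_bytes: bytes) -> list[str]:
    """Return the names of any known editor signatures found in the file."""
    return [m.decode() for m in EDITOR_MARKERS if m in image_bytes]

# Synthetic byte string standing in for a JPEG with XMP metadata intact.
sample = b"\xff\xd8 ...<xmp:CreatorTool>Adobe Photoshop</xmp:CreatorTool>..."
print(flag_editor_metadata(sample))  # -> ['Adobe Photoshop']
```

A positive hit only shows the file passed through a given tool, not that anything was falsified; conversely, a clean scan is weak evidence, which is why the editorial discussion above emphasizes requiring original-image submission rather than relying on after-the-fact screening.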
Context and significance
Editorial analysis: This incident intersects clinical publishing integrity and the practical challenges of verifying visual evidence in medicine. For practitioners and publishers, the event highlights that AI-altered imagery can influence peer and public perception of clinical cases and that existing editorial workflows may not consistently detect modest, clinically framed edits.
What to watch
Editorial analysis: Observers should follow whether major medical journals update image-handling policies, require original-image submission and metadata, or adopt automated forensic tools. Reporting to date (Retraction Watch) indicates the authors acknowledged the edit; the journal's public response was not reported in the source at the time of Retraction Watch's article.
Scoring rationale
The story is notable because it ties AI-mediated image manipulation directly to a retraction at a high-profile medical journal, raising integrity and verification concerns for practitioners and publishers. It is not a paradigm-shifting technical development, however, so its impact is notable rather than industry-shaking.