OpenEvidence Withdraws AI Medical App From EU, UK

The HIStalk RSS item reports that OpenEvidence withdrew its AI-powered medical evidence app from the EU and UK, citing regulatory uncertainty that includes the EU Artificial Intelligence Act. Reuters reported that OpenEvidence raised $250 million in January at a $12 billion valuation and that its platform is used daily by more than 40% of U.S. physicians across over 10,000 hospitals and medical centers, supporting about 18 million clinical consultations in December. Editorial analysis: Companies offering AI tools for clinical decision support face an uneven regulatory landscape across jurisdictions, one that can force market exits even for well-funded startups and complicate deployment plans for health systems and practitioners.
What happened
The HIStalk RSS item reports that OpenEvidence withdrew its AI-powered medical evidence app from the EU and UK, citing regulatory uncertainty that includes the EU Artificial Intelligence Act. Reuters reported earlier that OpenEvidence raised $250 million at a $12 billion valuation in January 2026 and that the company said its platform is used daily by more than 40% of U.S. physicians across over 10,000 hospitals, supporting about 18 million clinical consultations in December.
Technical details
Reuters describes OpenEvidence as a specialized medical search engine that limits its training data to trusted medical sources and lists formal partnerships with organizations such as the New England Journal of Medicine and the American Medical Association. The HIStalk RSS item attributes the withdrawal to regulatory uncertainty; it does not quote OpenEvidence executives or include a company statement.
Industry context
Editorial analysis: Regulatory uncertainty in the EU and UK has emerged as a material operational constraint for clinical-AI vendors. Observers following the sector have noted that compliance requirements under the EU AI Act and associated medical-device frameworks can differ substantially from U.S. oversight, creating a higher barrier to market entry in Europe. Even well-funded companies with strong U.S. adoption, such as the Reuters-profiled OpenEvidence, may choose to restrict geographic availability when compliance costs or legal risk are unclear.
What to watch
Watch reporting outlets and regulators for any formal statements from OpenEvidence clarifying the scope and duration of the withdrawal; whether the company files regulatory correspondence in the EU or seeks notified-body classification; and whether other clinical-AI vendors announce similar market restrictions. Industry watchers will also track whether EU or UK regulators issue clarifications or guidance that change the practical compliance burden for evidence-synthesis and clinical decision support tools.
Implications for practitioners
Editorial analysis: Healthcare providers deploying or evaluating clinical-AI tools should treat geographic availability and compliance posture as part of procurement risk assessments. Data scientists and ML engineers working on regulated clinical workflows should budget for regulatory engineering, such as auditable data provenance, model documentation, and performance monitoring, all of which are commonly required in high-assurance healthcare applications.
Scoring Rationale
The story matters because regulatory decisions affect deployment and availability of clinical-AI tools for practitioners; it is notable but not a sector-defining event. The presence of high-profile funding and large user numbers increases relevance for ML and healthcare engineering teams.