AI Research

AI Integration with Epic EHR: Promise and Practicalities



Mike Hale, Principal Solutions Engineer at EchoStor

The integration of artificial intelligence into healthcare environments represents one of the most significant technological shifts in the industry today. For organizations running Epic, which powers the electronic health records of approximately 250 million patients across major health systems, the question is no longer if AI will transform their operations, but how and when. As healthcare IT leaders navigate this landscape, they face complex decisions balancing technical feasibility, clinical utility, and operational sustainability.

Understanding the Inflection Point

Healthcare organizations find themselves at a critical inflection point. The maturation of AI technologies coincides with increasing demands on health systems to improve clinical outcomes, operational efficiency, and patient experience. Epic environments, which traditionally focused on stability and reliability above all else, must now accommodate emerging AI capabilities without compromising their core functions.

This transition introduces unprecedented complexity. Epic systems were designed as comprehensive but largely self-contained ecosystems. Now, they must interface with AI technologies that may reside in different computing environments, rely on different data models, and operate according to different processing paradigms.

The Infrastructure Imperative

Perhaps the most immediate challenge organizations face involves infrastructure requirements. Epic systems already demand significant computational resources, with recent versions requiring substantially more compute and storage performance than earlier implementations. Adding AI functionality compounds these demands substantially.

Consider the infrastructure implications: machine learning models, particularly those analyzing medical imaging or unstructured clinical notes, require specialized hardware configurations. Organizations must determine whether to expand their existing on-premises infrastructure or develop hybrid architectures that extend into public cloud environments.

This decision carries significant financial implications. Health systems have already invested millions in Epic infrastructure and continue to allocate substantial operational budgets to maintain these environments. Implementing AI may require additional capital expenditures, revisions to refresh cycles, and new staffing expertise.

The shift toward AMD processors in some Epic environments further complicates planning. Healthcare organizations must now balance processor architecture decisions with their AI implementation roadmap, determining whether traditional CPU-centric environments will suffice or if specialized GPU resources become necessary as AI workloads increase.

Data Governance Foundations

Beyond infrastructure considerations, data governance represents a foundational element of AI integration with Epic. Successful AI implementations require not just access to data, but consistent, controlled access to high-quality clinical information that maintains patient privacy while enabling analytic insights.

Health systems must establish comprehensive data governance frameworks that address:

  • Data quality standards for AI training and operation
  • Policies controlling which data elements can be processed by AI systems
  • Mechanisms to ensure AI outputs remain traceable to source data
  • Processes to identify and mitigate algorithmic bias
  • Procedures for managing data provenance across systems

These governance frameworks must function within existing regulatory constraints, including HIPAA and emerging AI-specific regulations, while maintaining operational flexibility. The governance challenge extends beyond technical implementation to include clinical and administrative stakeholders who must understand how patient data flows through AI systems.
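The traceability requirement above can be made concrete with a minimal sketch of a provenance record that ties an AI output back to the source data it was derived from. The field names and identifiers here are hypothetical, not an Epic or FHIR schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class ProvenanceRecord:
    """Links an AI-generated insight back to the source records it read."""
    model_id: str                 # version-pinned model identifier
    source_record_ids: List[str]  # EHR record identifiers the model consumed
    output_summary: str           # the insight delivered to the clinician
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def is_traceable(self) -> bool:
        # An output with no source records cannot be audited.
        return bool(self.source_record_ids)

record = ProvenanceRecord(
    model_id="sepsis-risk-v2.1",
    source_record_ids=["obs-1001", "obs-1002"],
    output_summary="Elevated sepsis risk flagged for review",
)
print(record.is_traceable())
```

Persisting a record like this alongside every AI output gives auditors a concrete artifact to review when a recommendation is questioned.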

Interoperability Challenges

Interoperability represents another critical consideration. Epic has made significant strides in supporting standards like FHIR (Fast Healthcare Interoperability Resources), but AI integration introduces new interfaces that must be carefully designed and maintained.

Healthcare organizations must determine how AI systems will access Epic data and how AI-generated insights will flow back into clinical workflows. Options include leveraging Epic’s APIs, implementing dedicated integration services, or utilizing third-party middleware designed specifically for healthcare AI implementations.

Each approach presents distinct advantages and limitations regarding real-time access, data transformation capabilities, and long-term sustainability. Organizations that have invested heavily in Epic extension capabilities may prefer native integration approaches, while those with broader technology portfolios might implement integration platforms that serve multiple systems beyond Epic.
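For the API route, a hedged sketch of what a read against a FHIR R4 endpoint might look like. The base URL, token, and patient ID are placeholders; real Epic integrations require registered applications and OAuth 2.0 flows per the vendor's documentation. The code only constructs the request, it does not send it:

```python
from urllib.parse import urlencode

# Hypothetical values for illustration only.
FHIR_BASE = "https://ehr.example.org/fhir/R4"
ACCESS_TOKEN = "example-token"

def build_observation_query(patient_id: str, loinc_code: str):
    """Construct (but do not send) a FHIR Observation search request."""
    params = urlencode({"patient": patient_id, "code": loinc_code})
    url = f"{FHIR_BASE}/Observation?{params}"
    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/fhir+json",
    }
    return url, headers

url, headers = build_observation_query("12345", "8867-4")  # 8867-4 = heart rate (LOINC)
print(url)
```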

Strategic Pathways

Healthcare IT leaders face three primary strategic pathways when integrating AI with Epic environments:

  • Epic-native AI capabilities – Leveraging functionality developed by Epic itself, which provides tight integration but may lag specialized solutions in cutting-edge capability
  • Hyperscale cloud provider partnerships – Implementing AI services from major cloud providers, which offer advanced capabilities but require careful integration planning
  • Custom AI development – Building organization-specific AI solutions tailored to particular clinical or operational needs, which can address unique requirements but demands specialized expertise

Most organizations will ultimately pursue a hybrid approach, selecting different strategies for different use cases based on clinical priority, technical complexity, and resource availability. Strategic success requires continual alignment between technical and clinical leadership to ensure AI capabilities address genuine organizational needs rather than pursuing technology for its own sake.

Clinical Adoption and Workflow Integration

Even technically successful AI implementations fail without meaningful clinical adoption. Healthcare organizations must carefully consider how AI-generated insights appear within Epic workflows, ensuring they enhance rather than disrupt clinical processes.

AI capabilities should augment clinical judgment rather than attempting to replace it, providing decision support that fits naturally within established workflows. This requires careful attention to user interface design, alert fatigue mitigation, and transparency regarding how AI generates its recommendations.

Organizations should implement structured feedback mechanisms allowing clinicians to report AI performance issues, creating a continuous improvement cycle that enhances both the technical performance and clinical utility of these systems.

Security Implications

AI integration introduces new security considerations for Epic environments. Organizations must evaluate how AI systems impact their security posture, particularly when these systems cross traditional infrastructure boundaries.

Key security considerations include:

  • Authentication mechanisms between Epic and AI systems
  • Data encryption requirements during processing
  • Vulnerability management across expanded technology surfaces
  • Monitoring requirements for AI-specific threats
  • Incident response procedures for AI-related security events

Security planning must address not just traditional threats but emerging concerns specific to AI, such as model poisoning attacks or adversarial inputs designed to manipulate AI outputs.

Measuring Success

Ultimately, healthcare organizations must establish clear metrics for evaluating AI integration success. These metrics should span technical performance, clinical outcomes, and financial impact, creating a comprehensive view of implementation effectiveness.

Rather than pursuing AI adoption as an end in itself, organizations should identify specific problems AI can meaningfully address, establish baseline measurements, implement targeted solutions, and rigorously assess outcomes. This measured approach ensures AI investments deliver tangible benefits rather than merely introducing additional complexity.
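The baseline-then-assess discipline above reduces to simple arithmetic; the metric names and figures below are invented for illustration:

```python
def percent_change(baseline: float, current: float) -> float:
    """Relative change from a pre-implementation baseline, in percent."""
    return (current - baseline) / baseline * 100.0

# Hypothetical metrics captured before and after an AI rollout.
metrics = {
    "avg_chart_review_minutes": (14.0, 11.2),   # lower is better
    "sepsis_alert_precision":   (0.42, 0.55),   # higher is better
}

for name, (before, after) in metrics.items():
    print(f"{name}: {percent_change(before, after):+.1f}%")
```

The point is less the arithmetic than the habit: no baseline captured before go-live means no defensible claim of improvement after it.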

As AI in healthcare transitions from experimental to essential, organizations running Epic must develop coherent implementation roadmaps that balance innovation with the fundamental reliability requirements of clinical systems. Those that successfully navigate this transition will position themselves to deliver higher quality care while managing operational costs more effectively.


About Mike Hale

Mike Hale is a Principal Solutions Engineer at EchoStor, where he leads the company’s healthcare initiatives. He has nearly 20 years of executive leadership experience in the health technology sector. 




If I Could Only Buy 1 Artificial Intelligence (AI) Chip Stock Over The Next 10 Years, This Would Be It (Hint: It’s Not Nvidia)



While Nvidia continues to capture headlines, a critical enabler of the artificial intelligence (AI) infrastructure boom may be better positioned for long-term gains.

When investors debate the future of the artificial intelligence (AI) trade, the conversation generally finds its way back to the usual suspects: Nvidia, Advanced Micro Devices, and cloud hyperscalers like Microsoft, Amazon, and Alphabet.

Each of these companies is racing to design GPUs or develop custom accelerators in-house. But behind this hardware, there’s a company that benefits no matter which chip brand comes out ahead: Taiwan Semiconductor Manufacturing (TSM).

Let’s unpack why Taiwan Semi is my top AI chip stock over the next 10 years, and assess whether now is an opportune time to scoop up some shares.

Agnostic to the winner, leveraged to the trend

As the world’s leading semiconductor foundry, TSMC manufactures chips for nearly every major AI developer — from Nvidia and AMD to Amazon’s custom silicon initiatives, dubbed Trainium and Inferentia.

Unlike many of its peers in the chip space that rely on new product cycles to spur demand, Taiwan Semi’s business model is fundamentally agnostic. Whether demand is allocated toward GPUs, accelerators, or specialized cloud silicon, all roads lead back to TSMC’s fabrication capabilities.

With nearly 70% market share in the global foundry space, Taiwan Semi’s dominance is hard to ignore. Such a commanding lead over the competition provides the company with unmatched structural demand visibility — a trend that appears to be accelerating as AI infrastructure spend remains on the rise.


Scaling with more sophisticated AI applications

At the moment, AI development is still concentrated on training and refining large language models (LLMs) and embedding them into downstream software applications.

The next wave of AI will expand into far more diverse and demanding use cases: autonomous systems, robotics, and quantum computing, all still in their infancy. At scale, these workloads will place greater demands on silicon than today’s chips can support.

Meeting these demands doesn’t simply require additional investments in chips. Rather, it requires chips engineered for new levels of efficiency, performance, and power management. This is where TSMC’s competitive advantages begin to compound.

With each successive generation of process technology, the company has a unique opportunity to widen the performance gap between itself and rivals like Samsung or Intel.

Since Taiwan Semi already has such a large footprint in the foundry landscape, next-generation design complexities give the company a chance to further lock in deeper, stickier customer relationships.

TSMC’s valuation and the case for expansion

Taiwan Semi may trade at a forward price-to-earnings (P/E) ratio of 24, but dismissing the stock as “expensive” overlooks the company’s extraordinary positioning in the AI realm. To me, the company’s valuation reflects a robust growth outlook, improving earnings prospects, and a declining risk premium.

[Chart: TSM forward P/E ratio; data by YCharts]
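For context, forward P/E is simply share price divided by expected next-twelve-month earnings per share. The figures below are hypothetical, chosen only to reproduce the roughly 24x multiple cited in the text, not current quotes:

```python
def forward_pe(price: float, forward_eps: float) -> float:
    """Forward price-to-earnings ratio: price over expected EPS."""
    return price / forward_eps

# Illustrative numbers, not a live quote.
print(round(forward_pe(240.0, 10.0), 1))
```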

Unlike many of its semiconductor peers, which are vulnerable to cyclicality headwinds, TSMC has become an indispensable utility for many of the world’s largest AI developers, evolving into one of the backbones of the ongoing infrastructure boom.

The scale of investment behind current AI infrastructure is jaw-dropping. Hyperscalers are investing staggering sums to expand and modernize data centers, and at the heart of each new buildout is an unrelenting demand for more chips. Moreover, each of these companies is exploring more advanced use cases that will, at some point, require next-generation processing capabilities.

These dynamics position Taiwan Semi at the crossroads of immediate growth and enduring long-term expansion, as AI infrastructure evolves from a powerful growth driver today into a multidecade secular theme.

TSMC’s manufacturing dominance ensures that its services will continue to witness robust demand for years to come. For this reason, I think Taiwan Semi is positioned to experience further valuation expansion over the next decade as the infrastructure chapter of the AI story continues to unfold.

While there are many great opportunities in the chip space, TSMC stands alone. I see it as perhaps the most unique, durable semiconductor stock to own amid a volatile technology landscape over the next several years.

Adam Spatacco has positions in Alphabet, Amazon, Microsoft, and Nvidia. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Amazon, Intel, Microsoft, Nvidia, and Taiwan Semiconductor Manufacturing. The Motley Fool recommends the following options: long January 2026 $395 calls on Microsoft, short August 2025 $24 calls on Intel, short January 2026 $405 calls on Microsoft, and short November 2025 $21 puts on Intel. The Motley Fool has a disclosure policy.




Researchers train AI to diagnose heart failure in rural patients using low-tech electrocardiograms



WVU computer scientists are training AI models to diagnose heart failure using data generated by low-tech equipment widely available in rural Appalachian medical practices. Credit: WVU/Micaela Morrissette

Concerned about the ability of artificial intelligence models trained on data from urban demographics to make the right medical diagnoses for rural populations, West Virginia University computer scientists have developed several AI models that can identify signs of heart failure in patients from Appalachia.

Prashnna Gyawali, assistant professor in the Lane Department of Computer Science and Electrical Engineering at the WVU Benjamin M. Statler College of Engineering and Mineral Resources, said heart failure—a chronic, persistent condition in which the heart cannot pump enough blood to meet the body’s need for oxygen—is one of the most pressing national and global health issues, and one that hits rural regions of the U.S. especially hard.

Despite the outsized impact of heart failure on rural populations, AI models are currently being trained to diagnose the disease using data representing patients from urban and suburban areas like Stanford, California, Gyawali said.

“Imagine Jane Doe, a 62-year-old woman living in a rural Appalachian community,” he suggested. “She has limited access to specialty care, relies on a small local clinic, and her lifestyle, diet and health history reflect the realities of her environment: high physical labor, minimal preventive care, and increased exposure to environmental risk factors like coal dust or poor air quality. Jane begins to experience fatigue and shortness of breath—symptoms that could point to heart failure.

“An AI system, trained primarily on data from urban hospitals in more affluent, coastal areas, evaluates Jane’s lab results. But because the system was not trained on patients who share Jane’s socioeconomic and environmental context, it fails to recognize her condition as urgent or abnormal,” Gyawali said. “This is why this work matters. By training AI models on data from West Virginia patients, we aim to ensure people like Jane receive accurate diagnoses, no matter where they live or how their lives differ from national averages.”

The researchers identified the AI models that were most accurate at diagnosing heart failure in an anonymized sample of more than 55,000 patients who received medical care in West Virginia. They also pinpointed the exact parameters for providing the AI models with data that most enhanced diagnostic accuracy. The findings appear in Scientific Reports, a Nature portfolio journal.

Doctoral student Alina Devkota emphasized they trained the AI models to work from patients’ electrocardiogram results, rather than the echocardiogram readings typical for patient data from urban areas.

Electrocardiograms rely on round electrodes stuck to the patient’s torso to record electrical signals from the heart. According to Devkota, they don’t require specialized equipment or training to operate, but they still provide valuable insights into heart function.

“One of the criteria to diagnose heart failure is by measuring the ‘ejection fraction,’ or how much blood is pumped out of the heart with every beat, and the gold standard for doing that is with echocardiography, which uses ultrasound to create images of the heart and the blood flowing through its valves,” she said.

“But echocardiography is expensive, time-consuming and often unavailable to patients in the very same rural Appalachian states that have the highest prevalence of heart failure across the nation. West Virginia, for example, ranks first in the U.S. for the prevalence of heart attack, but many West Virginians don’t have local access to high-tech echocardiograms. They do have access to inexpensive electrocardiograms, so we tested whether AI models could use electrocardiogram readings to predict a patient’s ejection fraction.”

Devkota, Gyawali and their colleagues trained several AI models on patient records from 28 hospitals across West Virginia. The AI models used either “deep learning,” which relies on multilayered neural networks, or “non-deep learning,” which relies on simpler algorithms, to analyze the patient records and draw conclusions.

The researchers found the models, particularly one called ResNet, did best at correctly predicting a patient’s ejection fraction based on data from 12-lead electrocardiograms, with the results suggesting that a larger dataset for training would yield even better results. They also found that providing the AI models with specific “leads,” or combinations of data from different electrode pairs, affected how accurate the models’ ejection fraction predictions were.
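The lead-selection experiments described above can be illustrated with a toy sketch. The lead names are the standard 12-lead labels and the 40% cutoff for reduced ejection fraction is a common clinical convention, but the selection logic and data here are illustrative inventions, not the study's ResNet model:

```python
# Standard 12-lead ECG labels.
ALL_LEADS = ["I", "II", "III", "aVR", "aVL", "aVF",
             "V1", "V2", "V3", "V4", "V5", "V6"]

def select_leads(ecg: dict, leads: list) -> dict:
    """Keep only the requested lead subset, as in a lead-ablation experiment."""
    missing = [l for l in leads if l not in ecg]
    if missing:
        raise KeyError(f"Missing leads: {missing}")
    return {l: ecg[l] for l in leads}

def ef_category(ef_percent: float) -> str:
    """Classify an estimated ejection fraction (<40% = reduced, a common cutoff)."""
    return "reduced" if ef_percent < 40.0 else "preserved"

# Toy signal: each lead is a short list of voltage samples.
ecg = {lead: [0.0, 0.1, 0.0] for lead in ALL_LEADS}
subset = select_leads(ecg, ["I", "II", "V2"])
print(sorted(subset))
print(ef_category(35.0))
```

Feeding different lead subsets through a fixed model and comparing accuracy is the same experimental shape the researchers report, applied to a real predictive network instead of a toy classifier.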

Gyawali said while AI models are not yet being used in clinical practice due to reliability concerns, training an AI to successfully estimate ejection fraction from electrocardiogram signals could soon give clinicians an edge in protecting patients’ cardiac health.

“Heart failure affects more than six million Americans today, and factors like our aging population mean the risk is growing rapidly—approximately 1 in 4 people alive today will experience heart failure during their lifetimes. The prevalence is even higher in rural Appalachia, so it’s critical the people here do not continue to be overlooked.”

Additional WVU contributors to the research included Rukesh Prajapati, graduate research assistant; Amr El-Wakeel, assistant professor; Donald Adjeroh, professor and chair for computer science; and Brijesh Patel, assistant professor in the WVU Health Sciences School of Medicine.

More information:
AI analysis for ejection fraction estimation from 12-lead ECG, Scientific Reports (2025). DOI: 10.1038/s41598-025-97113-0

Citation:
Researchers train AI to diagnose heart failure in rural patients using low-tech electrocardiograms (2025, August 31)
retrieved 31 August 2025
from https://medicalxpress.com/news/2025-08-ai-heart-failure-rural-patients.html







Should artificial intelligence be embraced in the classroom? – CBS News
