Spend :01 of your time each Monday morning as Twelve:01 delivers timely tools, trends, strategies, and/or compliance insights for the CME/CE enterprise.
Platforms like HeyGen use AI avatars to help educators and healthcare organizations develop training and patient education videos without traditional production resources. These tools can support consistency in messaging, make complex topics easier to explain, and save time in content creation. Similar platforms (e.g., Synthesia) are also emerging, giving medical education providers more options to integrate AI-driven video into their education strategies.
The Emergency Care Research Institute’s (ECRI) 2025 safety forecast names “medical gaslighting,” the dismissal of patient or caregiver concerns, as the top patient safety threat of the year, with 94% of patients reporting having had symptoms overlooked. Consequences include delayed diagnoses, poorer outcomes, and lost trust. ECRI emphasizes that this is a systemic patient safety issue, driven by time pressures, bias, and fragmented care, and urges reforms such as empathy-focused communication, scheduling improvements, and clinician education on commonly minimized conditions.
The AMA has released a new policy addressing the development, deployment, and use of healthcare artificial intelligence, which it terms augmented intelligence (AI). The AMA deliberately frames AI in healthcare as “augmented intelligence” to emphasize its role in assisting and enhancing human intelligence, not replacing it. The AMA’s evolving policy framework centers on ethical oversight, transparency, equity, liability, and data protection, while also addressing insurer use and regulatory guardrails. With physician adoption of AI surging from 38% in 2023 to 66% in 2024, the AMA is positioning augmented intelligence as a driver of clinical care, education, and practice management, provided it is implemented responsibly.