Emerging technologies are transforming how healthcare is delivered. In physical therapy, AI-powered tools such as ambient listening software, predictive analytics, and workflow automation are increasingly used to streamline documentation, support clinical decisions, and improve patient outcomes.

But with innovation comes professional responsibility—and risk. 

This article focuses specifically on ambient listening tools, which passively record conversations during treatment sessions and automatically generate clinical notes. As these tools become more common, ethical and legal concerns are rising, particularly around patient privacy, informed consent, and documentation risks. Here, we examine the challenges, share one patient’s real-world experience, and outline practical steps physical therapists, physical therapist assistants, and clinic owners can take to adopt AI responsibly.


The Promise and the Pitfalls 

There is no denying the promise of these tools. Ambient AI scribes are already being used in hospitals, medical offices, and some physical therapy clinics to reduce documentation time and ease administrative burdens. 

Yet the same tools that promise efficiency also raise complex ethical and legal concerns, especially when used without patients’ knowledge or consent. We must ask: What risks do these tools pose for providers, and what rights do patients have when such recordings occur?

Regulators certainly have an essential role to play in establishing safeguards, but technology is advancing at a rate that far outpaces the regulatory guardrails designed to ensure its safe implementation. Regulations tend to be written broadly so they can adapt over time, and existing codes of ethics emphasize general principles like beneficence and accountability; both often fall short in explaining how to approach new practices and technologies. Practitioners are left to navigate the tension between employer expectations and professional obligations.


Patient-Provider Trust: One Patient’s Story from Louisiana 

We recently heard from a Louisiana physical therapy patient who learned after the fact that her therapy sessions had been recorded by an AI tool without her knowledge. Her story underscores the real-world emotional, ethical, and legal consequences that can arise when patients are not informed about passive data collection in a healthcare setting: 

“I’ve been seeing my physical therapist for years—60-minute sessions, once or twice a week for over five years. We built a rapport. We talked not just about my recovery, but about life—family disagreements, inside jokes, vulnerable moments I shared because I believed those conversations were private. 

“It wasn’t until recently that I discovered—on my own—that our sessions had been recorded by an AI scribe tool without my knowledge or consent. Not just clinical notes, but full audio recordings. Every personal comment, every joke, stored for a decade. And because this particular company does not deidentify the data to protect me, and my name and personal information haven’t been redacted, it can be associated with my medical record and even subpoenaed. I imagined sitting in a courtroom, hearing my own voice telling stories I never intended to be shared beyond that treatment room.”

The hardest part, she said, was not the breach of privacy itself but the loss of trust in a provider who had helped her tremendously. She shared more about her experience: “What makes this even harder is that my physical therapist was excellent—truly the best I’ve had. I’ve seen multiple providers over the years, and no one else has helped my condition the way he did. I was finally making progress. I was genuinely happy. And now I’ve lost that.

“I’ve stopped going to physical therapy—not because I want to, but because I’ve lost trust. This isn’t just bad for my health or my progressive genetic condition, which my neurologist has told me will require ongoing therapy so I can keep walking as long as possible—it’s taken a serious emotional toll and has left a lasting impact on my sense of trust and safety.”

As AI tools become more embedded in practice, Louisiana’s licensed physical therapists and physical therapist assistants must address critical ethical and legal considerations: How do we protect the privacy of individuals receiving care? How can physical therapists adopt AI in ways that maintain transparency, trust, and professional integrity?


National Guidance and Ethical Obligations 

This patient’s story highlights a challenge for healthcare professionals and regulators alike. In 2024, the Federation of State Medical Boards (FSMB) issued guidance to support state medical boards and physicians navigating this evolving space. To better understand how these issues are being addressed at the national level, I reached out to Frank Meyers, Deputy Legal Counsel of the FSMB, and he kindly agreed to speak with me. At the beginning of our call, he asked if I would be comfortable with him recording our conversation for note-taking purposes, explaining that he uses AI to generate notes from his recordings. I appreciated the disclosure and agreed to be recorded.

Frank and I discussed the FSMB report, and he emphasized the importance of healthcare providers proactively and transparently communicating with patients. The FSMB report provides this guidance: “Because data received during a patient encounter may be input into AI tools, physicians should receive a patient’s consent prior to application of a tool to a patient’s care...A lack of transparency...can undermine trust and may serve to highlight the physician’s lack of understanding of how the AI tool works.”  

Although the FSMB’s guidance was developed for physicians, its core principles apply across healthcare professions, including physical therapy. As Frank put it, “The guidance is the same whether AI is being deployed in the clinic or not. Any time a patient is being recorded, they must be aware, whether it is AI or an old-school tape recorder. And the patient must have the opportunity to opt out.”

When sharing her experience with the Louisiana Physical Therapy Board (LPTB), the patient echoed this expectation: “At a minimum, patients should have the right to know they are being recorded behind closed doors and should be given the option to opt out. That didn’t happen for me.”


Preserving Trust and Mental Wellbeing 

The emotional and psychological impact on patients must not be overlooked. Candice Sorapuru, DSW, LCSW-BACS, a Louisiana licensed clinical social worker and behavioral health expert, emphasizes the importance of informed consent in preserving the therapeutic relationship. Here is what Candice said when I asked her about potential patient harms: “When patients discover they've been recorded or monitored without their knowledge, especially through AI technologies, it can lead to profound feelings of betrayal and trauma. Informed consent isn't merely a procedural formality; it's a fundamental aspect of patient autonomy and trust. Breaches in this trust can have lasting psychological effects, causing patients to feel violated and hesitant to seek future care. It's imperative that healthcare providers prioritize transparency and ethical considerations when integrating AI into patient care.”


From Punitive to Preventative: A Patient-Driven Call to Action 

This Louisiana patient lost trust, and the experience led her to discontinue physical therapy despite her ongoing medical need. While the harm she experienced was significant, she ultimately chose not to file a complaint against her physical therapist with the Louisiana Physical Therapy Board. Instead, she chose to share her story publicly, not to assign blame but to call for greater awareness, offering a powerful patient voice and urging the Board to provide guidance to licensed PTs and PTAs.

Reflecting on her experience, the patient’s focus shifted from what went wrong to what can be done better moving forward: “My therapist is a great guy who was always trying to do the right thing. It made me realize how fast things are changing with new technology and tools to help him and other physical therapists. I really think that therapists need guidance, especially smaller clinics that do not have a compliance team or IT division to help them understand the terms and conditions of the agreement with AI companies moving into the healthcare space offering these services.”


Other Legal Considerations 

As AI tools increasingly involve the collection, storage, and transmission of protected health information (PHI), clinicians must be aware of their obligations under the Health Insurance Portability and Accountability Act (HIPAA), including ensuring proper patient consent, secure data handling, and compliance with Business Associate Agreements (BAAs). Failure to address these issues may result in significant privacy violations, loss of patient trust, and legal liability. 

If a vendor, well-intentioned or not, is storing, analyzing, or using recordings for purposes beyond the generation of clinical documentation, you may face HIPAA compliance risks. The U.S. Department of Health and Human Services (HHS) Office for Civil Rights has issued fines in the millions for improper handling of patient data, particularly when unauthorized recordings or insecure data storage are involved. While HIPAA is enforced at the federal level, state boards may initiate investigations if a violation suggests unethical conduct or a failure to safeguard patient confidentiality. And if those records are subpoenaed, it is not only your patient’s conversation but also your own words and clinical impressions that become discoverable in the courts or by the state licensing board.


Deploying Digital Tools with Patient-First Values 

An upcoming panel discussion, “The AI-Led Evolution of the Patient Experience,” will take place at the Healthcare Information and Management Systems Society (HIMSS) AI in Healthcare Forum this July in New York. The panel will feature health leaders, informaticists, and patient advocates exploring how AI is reshaping care delivery, patient trust, and perceptions of safety and transparency.

Dr. Laura Cooley, Editor in Chief of the Journal of Patient Experience, will moderate the discussion. After learning about the experience of the Louisiana physical therapy patient, Dr. Cooley emphasized that the physical therapy setting is uniquely vulnerable to ethical missteps when AI tools are implemented without clear safeguards. Here is what Dr. Cooley had to say: “Physical therapists often develop close, trusting relationships with their patients through frequent visits, extended sessions, and natural, informal conversations. This makes the physical therapy setting particularly vulnerable when loosely regulated AI tools are introduced, tools that may capture sensitive disclosures not only from patients, but also from clinicians themselves. This concern is especially pronounced in regions like Louisiana and across the South, where cultural norms often encourage openness and storytelling. These technologies don’t just capture patient disclosures; they capture the words, tone, reactions, and sometimes deeply human conversations that neither party expected to be documented.

“Until clearer regulations and guidance are in place, both the patient and the physical therapist may become unintentional victims. Smaller clinics may be at higher risk of adopting tools from vendors who use vague or misleading language—promising administrative relief while obscuring how data is captured, stored, or used. Regulators have an essential role to play in establishing safeguards to protect both the care experience and the privacy of everyone involved.”


The Responsible Adoption of AI 

If you are interested in adopting AI in your practice, or already have, it is your responsibility to properly vet vendors for ethical business practices. To help physical therapists navigate that responsibility, Dr. Gabriela Mustata Wilson, a globally recognized leader in health informatics and digital health innovation, offers the following insight on adopting AI responsibly while safeguarding patient rights and upholding professional ethics:

“The Louisiana physical therapy patient’s story has sparked urgent conversations among regulators, providers, and national stakeholders about the ethical boundaries of AI in care. Her experience, particularly the lack of transparency surrounding ambient listening technology, underscores a national gap in consent, oversight, and accountability that extends far beyond a single clinic or tool. Recent findings from the American Medical Informatics Association (AMIA) indicate that while 100% of surveyed U.S. health systems are exploring AI tools like ambient notes, only 53% report high success, most citing immature tools and regulatory uncertainty as key barriers. That gap is not just technical; it’s ethical. The National Academy of Medicine’s (NAM) 2025 AI Code of Conduct calls on all health stakeholders to advance humanity, engage impacted individuals, and monitor AI performance. This means going beyond compliance to center trust, explainability, and meaningful consent. Providers need more than a checkbox; they require clear frameworks, organizational support, and shared accountability for how AI is implemented.

“AI can absolutely be transformative across all areas of healthcare, including physical therapy, nursing, behavioral health, and primary care, but only when deployed with patient-first values, full disclosure, and ongoing ethical oversight. Consent must be explicit. Vendor contracts must be scrutinized. And patients must retain agency over how their stories are heard, recorded, and remembered.”


Practical, Easy-to-Implement Resources 

Innovation in physical therapy should elevate care, not compromise trust. By adopting AI tools thoughtfully, transparently, and ethically, we can protect patients, support providers, and shape a future where technology serves the values at the heart of the profession. Begin by assessing how your clinic currently uses AI and emerging technologies. Have open conversations with your team about transparency and patient consent. Print and use the AI Integration Checklist created by Dr. Gabriela Wilson as a practical guide.

AI Integration Checklist

Get Involved

The LPTB is committed to continuing its efforts to provide practical, easy-to-implement resources for clinicians, which will be made available on our website. We invite you to share your concerns, experiences, or questions using the QR code below to help us identify and address your most pressing needs.

[QR code]

References & Resources 

National Academy of Medicine. 2025. An Artificial Intelligence Code of Conduct for Health and Medicine: Essential Guidance for Aligned Action. L. Adams, E. Fontaine, M. Matheny, and S. Krishnan, editors. NAM Special Publication. Washington, DC: National Academies Press. https://doi.org/10.17226/29087 

Eric G. Poon, Christy Harris Lemak, Juan C. Rojas, Janet Guptill, and David Classen. Adoption of artificial intelligence in healthcare: survey of health system priorities, successes, and challenges. Journal of the American Medical Informatics Association, 2025, ocaf065. https://doi.org/10.1093/jamia/ocaf065

Federation of State Medical Boards (FSMB). 2024. FSMB Guidance Paper.