Will AI replace my physician? Navigating the legal and regulatory landscape of AI in healthcare
The rapid rise of generative AI (Gen AI) has led to intense speculation about which jobs AI will supplement and which it will replace1. Gen AI holds significant potential for the medical industry, offering healthcare professionals tools to facilitate more effective healthcare delivery. But what are the limits of these technologies for medical purposes, and what legal and ethical implications arise from incorporating these tools into medical practice?
As Gen AI continues to evolve, the question becomes more complex still. Given the unique combination of expert problem solving and human empathy required by medical professionals, is AI capable of replacing our physicians?
Software as a medical device: Blurring the line between “supporting” and “replacing” clinical judgment
While AI holds the potential to drive innovative medical advancements, its integration into the delivery of healthcare services raises significant legal and regulatory considerations. Stakeholders are often surprised to learn that software itself can be considered a medical device if its intended use is for a “medical purpose”2. This designation is relevant because medical devices can require pre-market approval from Health Canada and the collection of safety and efficacy data, which can be a significant commercialization hurdle for product manufacturers to overcome.
Software that is purely administrative (such as software intended for clinical communication, electronic records, or general wellness) is typically not considered to have a medical purpose and is not regulated as a medical device in Canada3. Further, most software that is only intended to support a healthcare professional or patient/caregiver in making decisions about the prevention, diagnosis, or treatment of a disease or condition is typically not considered a medical device—as long as it is not intended to replace the clinical judgment of a healthcare professional4.
In practice, however, it can be difficult to neatly distinguish between software that “supports” clinical judgment and software that “replaces” it. For example, AI-based tools such as chatbots are becoming more common in virtual care settings as triaging tools. Where a chatbot replaces part of the patient’s interaction with a healthcare professional, the line between supporting and replacing a clinical decision begins to blur.
While Health Canada has provided some guidance on the matter, its approach to regulation continues to evolve5. We expect more guidance on software as a medical device as AI continues to develop.
AI’s impact on the healthcare system and clinical practice
As AI-based software continues to revolutionize healthcare delivery, we can expect to see changes to Canada’s current healthcare reimbursement models. In fact, a significant obstacle to integrating AI-based software into Canada’s public healthcare system is the lack of a clear reimbursement model associated with the use of AI tools. While private insurers may cover such technologies, there is no overarching reimbursement process for such devices and technology in the public model.
Physicians in Canada generally work on a billing-by-task model, and AI may be incompatible in some respects with the current public billing system. For example, an AI tool capable of answering patients’ questions could reduce the amount physicians can bill under the current system, and some critics suggest that this could create a disincentive for the adoption of AI in healthcare settings.
In determining whether to integrate AI into patient care, institutions will likely weigh the efficiency gains against any impacts on reimbursement. Manufacturers of AI-enabled healthcare products will also need to take such impacts into account when commercializing their technology and selecting key customer bases. To address this misalignment, provincial governments will need to revisit reimbursement regulations and billing policies, for example by creating new billing codes, so that Canada does not become an outlier compared to other jurisdictions (for more on regulatory considerations for artificial intelligence, read “What’s new with artificial intelligence regulation in Canada and abroad?”).
Liability risks of AI-based medical software
Like any other tool in healthcare, AI-based software carries risks and can expose healthcare professionals to liability. Depending on a medical practice’s risk tolerance, this could be another roadblock to the integration of new AI technologies into Canada’s healthcare system.
