Innovation in regulatory approaches to AI

Modifications to regulatory approaches for AI-based medical device software will depend on the type and nature of the algorithm and its associated risks. Existing principles for categorizing Software as a Medical Device (SaMD) should form the basis for considering these different approaches.

The International Medical Device Regulators Forum (IMDRF) software classification depends on the state of the healthcare condition (critical, serious, or non-serious) and the significance of the information provided by the software (to treat or diagnose, to drive clinical management, or to inform clinical management). In addition, the international standard IEC 62304[1] introduces three classes of software (A, B, and C), based on whether a hazardous situation could arise from failure of the software and the severity of injury that could result.
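The IMDRF categorization described above is, in effect, a two-dimensional lookup: the condition state and the information significance together determine a risk category from I (lowest) to IV (highest). The sketch below encodes that matrix as published in the IMDRF risk categorization framework; the function and dictionary names are illustrative, not part of any regulation.

```python
# Illustrative sketch of the IMDRF SaMD risk categorization matrix.
# Category IV is the highest risk, Category I the lowest.
# (Names here are hypothetical; the category assignments follow the
# IMDRF "Possible Framework for Risk Categorization" document.)

SAMD_CATEGORY = {
    ("critical",    "treat/diagnose"): "IV",
    ("critical",    "drive"):          "III",
    ("critical",    "inform"):         "II",
    ("serious",     "treat/diagnose"): "III",
    ("serious",     "drive"):          "II",
    ("serious",     "inform"):         "I",
    ("non-serious", "treat/diagnose"): "II",
    ("non-serious", "drive"):          "I",
    ("non-serious", "inform"):         "I",
}

def samd_category(condition_state: str, info_significance: str) -> str:
    """Look up the IMDRF SaMD category for a condition/significance pair."""
    return SAMD_CATEGORY[(condition_state.lower(), info_significance.lower())]

# e.g. software that diagnoses a critical condition falls in Category IV:
print(samd_category("critical", "treat/diagnose"))  # IV
```

A regulator's actual determination involves judgment beyond this lookup, but the matrix shows why the same algorithm can attract very different scrutiny depending on its intended use.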

The level of adaptation of an AI solution will also be important when considering the regulatory approach. Rules-based AI systems can generally be treated in the same way as traditional software[2], whereas locked or continuously learning data-driven AI systems will need innovative treatment. The FDA discussion document, Proposed Regulatory Framework for Modifications to Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD), notes that all currently approved AI solutions are locked while providing patient care, but that there is an ambition to utilize continuous-learning systems within the healthcare sector in the future.
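The locked vs. continuously learning distinction can be made concrete with a minimal sketch: a locked model's parameters are frozen at approval, while a continuously learning model updates its parameters from real-world data after deployment, so its behaviour can drift from the version that was evaluated. The class names and the simple linear model below are hypothetical illustrations, not taken from any regulatory text.

```python
# Minimal sketch (assumed, illustrative): locked vs. continuously
# learning models. A locked model's outputs are fixed at approval;
# a continuous-learning model's outputs can change post-deployment.
from dataclasses import dataclass

@dataclass
class LockedModel:
    """Parameters frozen at approval; predictions never change in use."""
    weights: list

    def predict(self, x):
        return sum(w * xi for w, xi in zip(self.weights, x))

@dataclass
class ContinuousLearningModel:
    """Updates parameters from real-world data after deployment."""
    weights: list
    lr: float = 0.01

    def predict(self, x):
        return sum(w * xi for w, xi in zip(self.weights, x))

    def update(self, x, target):
        # One online gradient step on squared error: the model the
        # patient sees tomorrow is no longer the model that was approved.
        err = self.predict(x) - target
        self.weights = [w - self.lr * err * xi
                        for w, xi in zip(self.weights, x)]
```

The regulatory difficulty follows directly: a locked model can be evaluated once, whereas a continuously learning model requires some mechanism for monitoring and bounding its post-deployment change.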

Collaboration and coproduction between developers, healthcare providers, academia, patients, governments, and statutory bodies across the AI life cycle will be essential for maximizing the deployment of AI. A recent article in Harvard Business Review (July 2019) discussed the concept of "AI marketplaces" for radiology. These aim to allow the discovery, distribution, and monetization of AI models, as well as providing feedback between users and developers. Similar collaborations could support the life-cycle requirements for AI models, and we therefore recommend working with the IMDRF to develop standardized terminologies, guidance, and good regulatory practices.

The FDA is currently collaborating with stakeholders to build a U.S. National Evaluation System for health Technology (NEST).[3] NEST aims to generate better evidence for medical devices more efficiently, utilizing real-world evidence and advanced analytics of data gathered from different sources.

Similarly, in the UK, new evidence standards have been developed to ensure digital health technologies are clinically effective and offer economic value.[4] These standards help innovators and commissioners understand what good levels of evidence should look like.

The impact of AI beyond the traditional boundaries of medical device regulation will also be an important factor; particularly where AI is applied in research, health administration, and general wellness scenarios. Alignment with other regulators, e.g., for professional practice, clinical services, research, and privacy will be critical to ensure successful deployment across the healthcare system. The IMDRF is well-suited as the venue to host such discussions and develop related potential regulatory approaches.

Due to the potential for AI solutions to learn and adapt in real time, organization-based approaches that establish the capability of software developers to respond to real-world AI performance could become crucial. Such approaches are already being considered by the U.S. FDA, although they may not necessarily align with the EU Medical Device Regulation.


[1] IEC 62304:2006, Medical device software – Software life cycle processes. 2006.

[2] See clause 2 of Machine learning AI in medical devices: adapting regulatory frameworks and standards to ensure safety and performance https://pages.bsigroup.com/l/35972/2020-05-06/2dkr8q4

[3] https://www.fda.gov/about-fda/cdrh-reports/national-evaluation-system-health-technology-nest

[4] https://www.nice.org.uk/Media/Default/About/what-we-do/our-programmes/evidence-standards-framework/digital-evidence-standards-framework.pdf




BSI enables people and organizations to perform better. We share knowledge, innovation and best practice to make excellence a habit – all over the world, every day.

© Copyright nasscom. All Rights Reserved.