Computer System Validation a Hot Topic at DIA 2012
In March I wrote a blog post previewing the DIA Annual Meeting session on innovative computer systems validation (CSV) that I was honored to be chairing. Unfortunately a nasty bug knocked me out of commission for a few weeks in June and I was unable to make it to the meeting at all.
Fortunately, Ron Fitzmartin of the FDA stepped in to moderate our panel of participants from Covance, Foresight Group and Court Square Group. After the session, I had the opportunity to speak with Ron and all of the panelists, and I was also able to listen to DIA’s audio posting. What follows is my attempt to summarize some highlights of the discussion that took place that day.
Ron and the panelists first offered their views on the current state of industry and regulatory expectations and challenges related to computerized systems. The audience was then given the opportunity to add their thoughts and opinions. The session lived up to expectations and reflected some truly enlightened (albeit controversial) views on CSV, software development, cloud computing, supplier audits and end user responsibilities.
Some of the panelists questioned the sensibility of sponsors independently revalidating commercial off-the-shelf (COTS) systems and SaaS solutions over and over again, after the vendor has already performed extensive testing. There were mixed views on the value of sponsors executing vendor-supplied test scripts.
There was a lot of discussion related to vendor audits. Vendors (both on the panel and from the audience) complained about the burdensome number of audits and the “quality” of those actually conducting them. On the other hand, several participants commented on the lack of regulatory knowledge among newer technology vendors, as well as vendors in the electronic health record arena, underscoring the need for continued audits.
Everyone acknowledged the dominant move to SaaS solutions, the dependence on cloud computing (private and public) and even the adoption of agile development techniques. But there was still no consensus on how best to deal with this “brave new world” of technology development and deployment.
In the end, a few recommendations were generally accepted by all: greater use of automated testing, a more sensible risk-based approach to testing, end users taking more responsibility for and becoming much more involved in User Acceptance Testing, and the need to assess and adopt alternative ways of auditing technology suppliers.
Given the hearty interest in the topic of CSV, we are already discussing plans for a two-part session at next year’s DIA Annual Meeting. The first session will likely reprise this year’s open forum format so the conversation can continue. The second session will attempt to generate some solid recommendations on how to address the industry and regulatory challenges associated with computerized systems. Looking forward to the ongoing dialogue. What do you think?
More about Fran Nolan