Computerised System Validation - An Understanding and Approach
The pharmaceutical industry is facing ever-tightening regulatory constraints, increased pressure to shorten new product development times, escalating R&D costs, shortening product life cycles, and reforms from governments eager to constrain healthcare budgets. The race to discover, develop, and market new drugs is a fiercely competitive one, requiring significant investments in both time and money.
Automation remains critical to optimizing the drug manufacturing process. With the latest advances in automation, information, and business systems, pharmaceutical enterprise architectures continue to evolve.
Computer technology has changed the framework of business in every industry, transforming the way businesses operate internally as well as how they interface with customers and external businesses. Many companies today see computer technology as vital to delivering products and services to the marketplace. The pharmaceutical industry is one of the many transformed by computers and software. Computer technology fulfills automation and control functions and facilitates information management.
It is in this area that the rapid pace of computer technology has presented challenges.
In general, GAMP (Good Automated Manufacturing Practice) recognizes automated systems in a broad sense, including automated manufacturing equipment, control systems, automated laboratory systems, manufacturing execution systems, and computers running laboratory or database systems. An automated system generally consists of the hardware, software, and network components, together with the controlled functions and associated documentation. GAMP has recently released its latest guideline, GAMP 4, which specifically encompasses 21 CFR Part 11 issues as well.
We do not cover everything in this paper; we have briefly attempted to address basic issues pertaining to the following systems.
a) Programmable Logic Controller Validation
b) Computer System Validation
c) Software Validation
Programmable Logic Controller (PLC) Validation
Up until the not-too-distant past, industrial equipment was generally controlled with relays. Electricians would wire them together, people would push buttons, and machines would operate. During the '60s or thereabouts, companies began to implement some of the functionality of the relays with software. In doing so, they invented a programming language called Ladder Logic. It was designed so electricians of that day could understand it; it mirrored nomenclature used at the time. The PLC, though, is a general-purpose computer that has been dedicated to input and output functions. Input/Output (IO) operates on simple algorithms, and it is dedicated to control applications.
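The relay origins of the PLC can be seen in how it executes: each scan cycle reads the inputs, evaluates the ladder "rungs" in order, and writes the outputs. The following is a minimal illustrative sketch of one such rung (the classic motor seal-in circuit); the names start_pb, stop_pb, and motor are assumptions for illustration, not from any particular PLC.

```python
# Hypothetical sketch of a PLC scan cycle: read inputs, evaluate the
# ladder rungs in order, then write outputs. All signal names here
# (start_pb, stop_pb, motor) are illustrative only.

def scan_cycle(inputs, outputs):
    """One pass of the read-evaluate-write PLC loop."""
    start_pb = inputs["start_pb"]   # normally-open start push-button
    stop_pb = inputs["stop_pb"]     # stop push-button
    motor = outputs["motor"]        # current state of the motor contactor

    # Rung: motor runs if (start pressed OR already running) AND stop not pressed.
    # The OR branch models the "seal-in" contact borrowed from relay wiring.
    outputs["motor"] = (start_pb or motor) and not stop_pb
    return outputs

outputs = {"motor": False}
outputs = scan_cycle({"start_pb": True, "stop_pb": False}, outputs)   # start pressed
outputs = scan_cycle({"start_pb": False, "stop_pb": False}, outputs)  # seal-in holds
print(outputs["motor"])  # motor stays latched until stop is pressed
```

The point of the sketch is the mindset shift it implies for validation: the "components" under test are rungs and addresses, not gears and bearings.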
My first experience with this was probably typical of what many people have experienced. Engineering developed a system, and the specifications were not available. The equipment was there for validation. The system needed to be validated right away, but there was a shortage of time and resources. There was also no data available, and my team had to persuade and cajole the equipment vendor to provide some semblance of design documentation. Most of the time was spent 'reading' the program and 'deciphering' it against the process needs. Further time was spent translating this 'data' into validation documentation.
PLC validation could be hard, but it doesn't have to be painful. It can, at times, be tedious or frustrating, but it doesn't have to be hard. If you organize your validation program and have the proper things in place before you get started, you can get through it. And, when you're done, you can actually have a validation package that people can look at, understand, and get something from. It can be done.
What Makes PLC Validation Unique?
There are some differences between PLC and "standard" validation, and these differences apply equally well to software validation in general. What makes software validation so different is the complete absence of physical components. Normally, with equipment or machines, you can walk up, put your hands on the parts, identify them physically, and measure them by some means.
That isn't the case with software, PLC software included. You can never actually touch, see, sense, or perceive the true software. Any time you're looking at a screen display of the software, or a printout of the software, all you are seeing is an interpretation, by another device, of what the software actually looks like.
So the big challenge in PLC validation, and validation in general, is to somehow give reality to these components. Your challenge as validation professionals is to document that these components are valid and responsive to their user's needs.
Another significant difference in the validation of PLCs is that validation starts at the beginning of development. With traditional equipment, it is conceivable that the engineering department could design something, the facilities department could buy it and install it, and then it could be validated.
That won't work with PLC validation. Validation has to be part of the program from the beginning, or you're going to have a huge amount of work at the end. And you probably won't be happy with the results.
Even a simple PLC program and system is probably more complex than the most complex piece of equipment you'll ever validate. A tablet press may have a few thousand moving components, but a PLC program with many thousands of lines of Ladder Logic - if you consider each line to be a component - is immensely more complex.
That is another challenge for the people in validation. You can't be expected to look at every component of the PLC. It certainly isn't done when you look at a piece of equipment. With equipment, you look at its functionality, and that's what we need to do with PLCs. We need to look at the functionality of the software.
Don't forget the expertise of the developer. The person who programs the PLC typically knows very little about the process, current Good Manufacturing Practice (cGMP), or anything else that matters to validation personnel. The developer is an engineer, an electrician, or a software specialist working from a set of specifications, and as he executes that work, points are going to be missed, some of them very important.
The Wrong Way to do PLC Validation
Though awareness has started spreading in the industry, most of the time there is no link between the DQ and the IQ, or, as is more usual, there is no DQ available for the PLC system at all!
The typical development cycle goes like this. The end user asks the machine vendor to provide an automated machine, without defining what he wants in the automation portion, and wants it real fast. The machine vendor in turn provides generic information to his automation vendor to automate certain parameters. This vendor is typically a seller of some multinational-branded PLC hardware and software. The job is then entrusted to a local system integrator with limited or no exposure to cGMP norms or GDP (Good Documentation Practices).
The system, which doesn’t perform as desired, gets tweaked during the installation/commissioning phase to make it do what the user originally wanted. Then the system gets handed over to validation, which is told to get it validated “real quick” to take care of production needs.
When you do it the wrong way, you discover problems during validation. That is the worst possible time to discover something amiss because, as usual, the validation function sits at the tail end of the timeline and everybody is interested in getting the job over with and getting production started.
Changes are now required, because you have discovered problems and, worst of all, there are defects hidden in the system. If the right development work hasn’t been done up front, you can’t avoid the fact that there will be hidden problems that you won’t find until much further down the road. It’s an inconvenience for validation, but it’s a potential disaster for the business (especially if it requires a recall).
Worst of all, the system will fail to meet the user’s needs. The opportunity is there, during the development phase, to make these software control systems meet the user’s needs. It only requires a little bit of paperwork up front and communication. It’s truly unfortunate when all of the effort to put a system in place is expended, and it doesn’t do what anybody really needed it to do.
One of the biggest challenges with validation of PLCs is to really understand what each piece of code means; it’s not like a conventional computer program. With PLCs, the developer has to sit down and explain the addresses, alarms, inputs/outputs, interfaces, and so forth so you can understand them.
Involving validation experts early in the process will result in fewer changes to the system during development, and that’s always preferable. Hopefully, you’ll have no hidden defects. If the validation and testing plan has been carried out properly, there shouldn’t be any, and the system will meet the user’s needs.
It costs no more to do it the right way than to do it the wrong way. The challenge is getting the people to make the investment up front rather than correcting things later on.
This way of doing things requires the validation personnel to be pro-active in talking to the people who develop and create these things in their facilities. These people need to be convinced that it is in their best interests to be open with you, to tell you what’s going on, and spend time with you explaining the system.
Computer Validation
Computer validation wasn’t as much “introduced” as “formalized.” The necessity to work in a disciplined, common sense way to deliver the end user’s requirements has always existed. New company policies, procedures, and standards however have been introduced.
Questions abound about how to interpret the regulatory requirements for computerized systems. What constitutes an electronic record? Which systems need validation? To what level does a particular system need validation?
Let’s first put aside the mandatory requirements (two-token access, electronic audit trail, etc.) and answer the following question:
“Are you very confident that at any point in time your computerized system is doing exactly the job it was intended to do in an acceptable way and can you prove this to an independent person?”
By picking random times throughout the specification, design, implementation, and usage of a computerized system, this question raises many more questions that force the need for control onto a project.
An interesting question to ponder is:
“Does more documented evidence mean a higher degree of assurance?”
If the validation is being performed to satisfy an external regulatory authority rather than to improve the implemented system, then in all probability important points are missed and maybe you are spending your money unwisely.
There is always a fixed amount of money available for any project, and both the project and validation costs must come out of it.
The premise that validation will save money is true. For the first few projects (for some, this may be quite a few), the validation costs also include the additional training costs, both direct and indirect, of instilling the company’s validation policies.
Is Computer Validation Really Needed?
Computer validation as a regulatory requirement is necessary as it provides the impetus for companies to continually review the quality of software solutions they implement and use.
There is no guarantee of a successful computerized system implementation. If there were, the emphasis on computer validation would not be where it is today.
Very few of us are at a stage where computer validation requirements are performed as second nature. As project teams, we are all in a transitional phase where we need to discover what approach to validation works best for our team and the project. During this phase there will be mistakes. Documents will be created (sometimes with little benefit to the project) because procedures say that they are required. Some documents will be reworked countless times and others will be signed off in haste. Activities will be performed before they have been formally approved to occur and signatures will be missed.
In this phase, validation is a stricter policing exercise because examples need to be set.
None of this means that the concepts of validation do not work. It just means that we have a learning curve to get over and experience to be gained to ensure that future projects benefit from the current mistakes.
During this transitional phase, it is important to note that what may be classified a successful validation may not automatically guarantee a successful computerized system.
We sometimes lose focus that these deliverables alone do not guarantee a successful computerized system implementation. They only guarantee a consistent approach within and between projects. It is actually the content of the “project related” deliverables that determines the success of a system.
A validation stage can be deemed successful if it finds the errors or anomalies (i.e., incorrect requirements, wrong design approaches, and coding errors, etc.) introduced during that stage. If it takes many iterations to fix them (or worse, they never get fixed properly) then that project stage cannot be deemed a success.
Key Components of all Computerized Systems
The main features required from any system are reliability and consistency. A computerized system consists mainly of the following components:
· Software - the most difficult component to understand since it cannot be touched, heard, seen, or smelled.
· Hardware - computers, printers, cabling, modems, scanners, etc.
· People - users, support staff, and developers
· Procedures - ways of integrating the other components to achieve a desired result
What Effect do People Have on These Systems?
The biggest factor in a successful computerized system is trained personnel. The importance of this point in any project cannot be overstated.
Individual training and experience records need to be maintained. For each major project, every person working on that project needs to be assessed as to their current skill and experience for the tasks they are to perform.
Requirements are both explicit and implicit and written to varying degrees of detail. Thought needs to be given to the readers of these documents and their assumed knowledge of the requirements. The skill of being able to write clear, concise requirement documents that are easy to read is now as important as being able to write good code.
Although forms of automatic code generator exist, on the whole, people develop software. Just as we’ve all read books by good and bad authors, we’ve all used software developed by good and bad developers. Good software can be recognized by its ease of use, its ability to perform the required process, and its ease of support and maintenance.
Just as books can be written in many languages and styles, so can software. Good software can only be written when the developer is skilled in the software language, the development style (development standards), and the intent of the process that the software will automate.
Very few computerized systems will only be used by a single user. Often these people will have different needs from the same process. It is critical to get as many requirements from all the affected user groups early on in a project. This stage is often missed or thought unnecessary when a company adopts a corporate software solution, or Commercial Off-the-Shelf Software (COTS) is implemented.
It is also important that users are adequately trained in the use of the system. As we live in a world where computerized systems are abundant and staff turnovers are typically high, regular personnel retraining needs to be assessed.
A computerized system’s end users make the best software testers. It is always easier to test a system for what it is designed to do than to test it for the variety of activities it may encounter for which it was not designed.
Note that it isn’t an auditor’s job to go into the technical depth as to the level of testing performed, just that the qualified personnel have approved the documentation.
As software projects usually involve a team of people, standards across this team are very important to maintain a consistent approach to all tasks. Standards such as document conventions and coding practices may seem obvious, but issues such as defining levels for categorizing testing anomalies can be very subjective.
It is common to define these anomalies into “minor,” “major,” and “critical” deficiencies and have predefined approaches to continuing the testing should these occur.
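Such predefined approaches can be captured as a simple lookup so that every tester reacts to an anomaly the same way. The sketch below is illustrative only; the severity names follow the text, but the continuation rules shown are assumptions a project team would define for itself.

```python
# Illustrative sketch of predefined continuation rules for testing
# anomalies. The rule texts are assumptions, not from any standard;
# each project team would agree on its own before testing begins.

SEVERITY_RULES = {
    "minor":    "log the anomaly and continue the current test script",
    "major":    "halt the affected test script; continue unrelated scripts",
    "critical": "stop all testing until the defect is resolved and re-approved",
}

def continuation_rule(severity):
    """Return the agreed response for an anomaly of the given severity."""
    try:
        return SEVERITY_RULES[severity.lower()]
    except KeyError:
        # An undefined severity is itself a documentation defect.
        raise ValueError(f"undefined severity level: {severity!r}")
```

Writing the rules down up front removes the subjectivity the text warns about: the decision to continue or halt testing is made before the pressure of a live test run.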
On the whole, software operates in a consistent manner whereas people do not. Variance not only exists between individuals but within individual people. How I react to a situation today may not be the same way I react to the same situation tomorrow.
This should be kept in mind when designing software solutions. If an application requires a certain process flow (that is, screen one must be completed before screen two, etc.), why should the software allow any other ordering of operations? Allowing it not only invites end users to make mistakes, but also makes validation of the available (as opposed to acceptable) process paths almost impossible.
All such features make the system easier to use, reducing user variability to some degree. They also make the system easier to validate, as testing, training, and on-going support will all be easier.
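A minimal sketch of the idea, assuming a simple three-screen workflow (the screen names are hypothetical): the software itself refuses any ordering other than the required one, so the only available process paths are the acceptable ones.

```python
# Minimal sketch of enforcing a fixed process flow so that a later
# screen cannot be completed before the earlier ones. The workflow
# and screen names are assumptions for illustration.

class OrderedWorkflow:
    def __init__(self, screens):
        self.screens = list(screens)   # required completion order
        self.completed = set()

    def complete(self, screen):
        idx = self.screens.index(screen)
        # Every earlier screen must already have been completed.
        missing = [s for s in self.screens[:idx] if s not in self.completed]
        if missing:
            raise RuntimeError(f"cannot complete {screen}: {missing} still pending")
        self.completed.add(screen)

wf = OrderedWorkflow(["screen_1", "screen_2", "screen_3"])
wf.complete("screen_1")
wf.complete("screen_2")   # allowed: screen_1 is already done
```

Because the invalid orderings simply cannot occur, the test plan only has to cover the paths the design permits.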
We all berate software that does not do what we want it to, or what it was developed or purchased to do. Unfortunately, the software doesn’t care and continues to do the wrong thing every time we perform that operation. It is far easier to criticize the software and hardware than the people who designed, developed, implemented, and tested the finished product.
For a successful implementation, an open, honest approach should be encouraged and properly managed between all members of the project team. It is far better to ask questions during the requirements and design stages of a project, than after a system has been developed and implemented.
In short, the key to implementing successful computerized system solutions is not in the software or hardware. These days, software and hardware can be purchased, configured, and developed to do virtually anything you can define. The key lies in the selection of the correctly qualified people to perform every task from inception, through development and implementation and continuing with on-going maintenance. To put it simply, “getting the right people for the job.”
Unfortunately, there isn’t a huge pool of people who are trained in good software project management that intimately understand your requirements.
There seem to be two ways to rectify this:
1. Hire or contract IT people skilled in software project management and train them on your business requirements.
2. Take knowledgeable business users with a working knowledge of computerized systems and train them in good software project management.
A combination of both of the above would seem the most logical approach.
Developing/Implementing Computerized Systems in the Regulatory Industry
For most manufacturers of automated equipment, software is a small part of a wider development effort. The mechanical element is often perceived as the most important, followed by the electrical parts, and only then by the controlling software.
For many IT professionals, describing IT concepts in a non-technical manner is not as easy as it sounds.
By its nature, most actual software development involves non-communicative activities. Between hours of coding and debugging, and struggling with the restrictions of the software tools, the actual user requirements may become less focused as the project expands.
Consider the typical V-model defined by GAMP that describes the accepted approach to software development and superimpose the key personnel groups and their input percentage of effort at each stage. The point to this is that we typically do not involve our programmers at the requirements stage or our users at the coding stage.
This assumes that the users know exactly what they want and that the programmers know exactly how to deliver it. These are two very big assumptions that, if not adequately mediated, could easily result in doing the wrong job the wrong way.
Users feel they cannot request different features because it’s not in the specifications. Developers feel they must focus on delivering what’s in the spec, and are thus reluctant to make their own suggestions for improved functionality.
This highlights the importance of the Project Leader in coordinating the correct information to the entire team. It also highlights the importance of having accurate and easy-to-read project documentation to pass from one phase of the project to the next.
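The V-model pairing the text refers to can be sketched as a simple mapping: each specification stage on the left arm of the V is verified by a corresponding test stage on the right arm. The stage names below follow common GAMP usage but are an assumption here, since the original figure is not reproduced.

```python
# Sketch of the GAMP-style V-model as specification/verification pairs.
# Stage names follow common industry usage and are assumptions here.

V_MODEL = [
    ("User Requirements Specification", "Performance Qualification (PQ)"),
    ("Functional Specification",        "Operational Qualification (OQ)"),
    ("Design Specification",            "Installation Qualification (IQ)"),
    ("Module/Code Implementation",      "Code Review / Module Testing"),
]

def verifying_stage(spec_stage):
    """Return the test stage that verifies a given specification stage."""
    for spec, test in V_MODEL:
        if spec == spec_stage:
            return test
    raise KeyError(spec_stage)
```

Seen this way, the staffing problem is obvious: the people who write the left-arm documents rarely sit with the people who execute the matching right-arm tests, which is exactly the gap the Project Leader must bridge.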
Conclusion
The concepts of computer validation are sound and need to be followed. However, rather than just implementing a set of quality policies and procedures and then policing them, the following considerations should be addressed:
· Scrutinize the “roles and responsibilities” section in the quality/validation plan
· Consider additional training in the non-project areas (communications, document writing, and presentation skills)
· Ensure the personnel leading the requirements development have the necessary skills.
· Encourage openness and honesty between all project members
· Foster relationships between users and developers
If every IT project team member is given clear instructions, has the skills to perform the tasks they have been allocated, and can successfully pass this information onto the next person in the project, then this will surely raise the confidence that the computerized system will reliably and consistently perform the tasks it has been documented to perform.
21 CFR Part 11- The Regulation
The FDA rule relating to the use of Electronic Records and Electronic Signatures is one of the most significant pieces of new US legislation to affect the pharmaceutical manufacturing industry in recent times.
With ever-greater use of information technology and computer systems at all stages of manufacturing, more and more operating processes are being automated. As a result, key decisions and actions are being taken through electronic interfaces, with regulatory records being generated electronically.
While recognizing the long-term benefits that 21 CFR Part 11 will bring in permitting technological advances, industry is also faced with applying the rule to existing systems and current projects. With this comes an urgent need to improve understanding of the rule, its interpretation, and its application. But that is a topic for another time.
The pharmaceutical industry is facing ever-tightening regulatory constraints, increased pressure to shorten new product development times, escalating R&D costs, shortening product life cycles, and reforms from governments eager to constrain healthcare budgets. The race to discover, develop, and market new drugs is a fiercely competitive one, requiring significant investments in both time and money.
Automation remains critical to optimizing the drug manufacturing process. With the latest advances in automation, information, and business systems, pharmaceutical enterprise architectures continue to evolve.
Computer technology has changed the framework of business in every industry, transforming the way the businesses operate internally, as well as they interface with customers and external businesses. Many companies today see computer technology as vital to delivering products and services to the marketplace. The Pharmaceutical industry is one of the many transformed by computer and software. Computer technology fulfills functions of automation & control as well as facilitates information management.
It is in this area that the rapid pace of computer technology has presented challenges.
In general, GAMP (Good Automated Manufacturing Practices) recognizes automated system in a broad form, including automated manufacturing equipment, control systems, automated laboratory systems, manufacturing execution systems and computers running laboratory or database systems. The automated system generally consists of the hardware, software and network components, together with the controlled functions and associated documentation. GAMP has recently released their latest guidelines named GAMP 4 specifically encompassing 21 CFR Part 11 issues as well.
We are not covering all in this paper and have briefly attempted to address basic issues pertaining to the following systems.
a) Programmable Logic Controller Validation
b) Computer System Validation
c) Software Validation
Programmable Logic Controller (PLC) Validation
Up until the not-too-distant past, industrial equipment was generally controlled with relays. Electricians would wire them together, people would push buttons, and machines would operate. During the '60s or thereabouts, companies began to implement some of the functionality of the relays with software. In doing so, they invented a programming language called Ladder Logic. It was designed so electricians of that day could understand it; it mirrored nomenclature used at the time. The PLC, though, is a general-purpose computer that has been dedicated to input and output functions. Input/Output (IO) operates on simple algorithms, and it is dedicated to control applications.
My first experience with this was probably typical to what many people have experienced. Engineering developed a system, and the specifications were not available. The equipment was there for validation. The system needed to be validated right away, but there was a shortage of time and resources. Also there was no data available and my team has to persuade/cajole equipment vendor to provide some semblance of design documentation. Most of the time was spent in ‘reading’ the program and ‘deciphering’ it as per process needs. Further time was spent in translating this ‘data’ into validation documentation.
PLC validation could be hard, but it doesn't have to be painful. It can, at times, be tedious or frustrating, but it doesn't have to be hard. If you organize your validation program and have the proper things in place before you get started, you can get through it. And, when you're done, you can actually have a validation package that people can look at, understand, and get something from. It can be done.
What Makes PLC Validation Unique?
There are some differences between PLC and "standard" validation, and these differences apply equally well to software validation in general. What makes software validation so different is the complete absence of components. Normally in equipments or machines you can walk out, out your hands on the parts, identify them physically, measure it with some means.
That isn't the case with software of PLC software. You can never actually touch, see, sense, or perceive the true software. Anytime you're looking at a screen display of the software, or a printout of the software, all you are seeing is an interpretation, by another device, of what the software actually looks like.
So the big challenge in PLC validation, and validation in general, is to somehow give reality to these components. Your challenge as validation professionals is to document that these components are valid and responsive to their user's needs.
Another significant difference in the validation of PLCs is that validation starts at the beginning of development. With traditional equipment, it is conceivable that the engineering department could design something. The facilities department could buy it, install it, and then same can be validated.
That won't work with PLC validation. Validation has to be part of the program from the beginning, or you're going to have a huge amount of work at the end. And you probably won't be happy with the results.
Even a simple PLC program and system is probably more complex than the most complex piece of equipment you'll ever validate. A tablet press may have a few thousand moving components, but a PLC program with many thousands of lines of Ladder Logic - if you consider each line to be a component - is immensely more complex.
That is another challenge for the people in validation. You can't be expected to look at every component of the PLC. It certainly isn't done when you look at a piece of equipment. With equipment, you look at its functionality, and that's what we need to do with PLCs. We need to look at the functionality of the software.
Don't forget the expertise of the developer. The person who programs the PLC knows very little about the process, current Good Manufacturing Practice (cGMP), or anything else that matters as validation personnel. The developer is an engineer, an electrician, or a software guy looking at a set of specifications and as he is executing that software, points are going to be missed, very important points.
The Wrong Way to do PLC Validation
Though awareness has started spreading in the industry, most of the times there is no link between DQ and IQ or as its normally, there are no DQ available for PLC system!
The typical development cycle is like this. The end-user asks machine vendor to provide automated machine without defining what does he want in automation portion and that too real fast. Machine vendor in turn provides generic information to his automation vendor to automate certain parameters. This guy is typically seller of some MNC branded PLC hardware & software. The job is entrusted to some local system integrator with limited or no exposure to cGMP norms as well as GDP (Good Documentation Practices).
The system, which doesn’t perform as desired, gets tweaked to make it do what the user originally wanted during installation/commissioning phase. Then the system gets handed over to validation, who is told to get it validated “real quick” to take care of production needs.
When you do it the wrong way, you discover problems during validation. That is the worst possible time to discover something amiss, because, as usual, the validation function is at the tail end of the time line and everybody is interested in getting over with the job and get the production started.
Now, changes are required though, because you discover problems and, worst of all, the defects are hidden in the system. If the right development work hasn’t been done up front, you can’t avoid the fact that there are going to be hidden problems that you won’t find until much further down the road. It’s an inconvenience for validation, but it’s a potential disaster for the business (especially if it requires a recall).
Worst of all, the system will fail to meet the user’s needs. The opportunity is there, during the development phase, to make these software control systems meet the user’s needs. It only requires a little bit of paperwork up front and communication. It’s truly unfortunate when all of the effort to put a system in place is expended, and it doesn’t do what anybody really needed it to do.
One of the biggest challenges with validation of PLCs is to really understand what each piece of code means; it’s not like a computer program. With PLCs, the developer has to sit down and explain address, alarms, input/output, the interfaces, and so forth so you can understand.
Involving validation experts early in the process will result in few changes to the system during development, and that’s always preferable. Hopefully, you’ll have no hidden defects. If the validation and testing plan has been carried out properly, there shouldn’t be any, and the system will meet the user’s needs.
It costs no more to do it the right way than to do it the wrong way. The challenge is getting the people to make the investment up front rather than correcting things later on.
This way of doing things requires the validation personnel to be pro-active in talking to the people who develop and create these things in their facilities. These people need to be convinced that it is in their best interests to be open with you, to tell you what’s going on, and spend time with you explaining the system.
Computer Validation
Computer validation wasn’t so much “introduced” as “formalized.” The necessity to work in a disciplined, common-sense way to deliver the end user’s requirements has always existed. What has been introduced, however, are new company policies, procedures, and standards.
Questions abound about how to interpret the regulatory requirements for computerized systems. What constitutes an electronic record? Which systems need validation? To what level does a particular system need validation?
Let’s first put aside the mandatory requirements (two-token access, electronic audit trail, etc.) and answer the following question:
“Are you very confident that at any point in time your computerized system is doing exactly the job it was intended to do in an acceptable way and can you prove this to an independent person?”
By picking random times throughout the specification, design, implementation, and usage of a computerized system, this question raises many more questions that force the need for control onto a project.
An interesting question to ponder is:
“Does more documented evidence mean a higher degree of assurance?”
If the validation is being performed to satisfy an external regulatory authority rather than to improve the implemented system, then in all probability important points are missed and maybe you are spending your money unwisely.
There is always a fixed amount of money available for any project, out of which both the project and validation costs must be met.
The premise that validation will save money is true. For the first few projects (for some, this may be quite a few), however, the validation costs also include the additional direct and indirect training costs of instilling the company’s validation policies.
Is Computer Validation Really Needed?
Computer validation as a regulatory requirement is necessary as it provides the impetus for companies to continually review the quality of software solutions they implement and use.
There is no guarantee for a successful computerized system implementation. If there were, the emphasis on computer validation would not be where it is today.
Very few of us are at a stage where computer validation requirements are performed as second nature. As project teams, we are all in a transitional phase where we need to discover what approach to validation works best for our team and the project. During this phase there will be mistakes. Documents will be created (sometimes with little benefit to the project) because procedures say that they are required. Some documents will be reworked countless times and others will be signed off in haste. Activities will be performed before they have been formally approved to occur and signatures will be missed.
In this phase, validation is a stricter policing exercise because examples need to be set.
None of this means that the concepts of validation do not work. It just means that we have a learning curve to get over and experience to be gained to ensure that future projects benefit from the current mistakes.
During this transitional phase, it is important to note that what may be classified a successful validation may not automatically guarantee a successful computerized system.
We sometimes lose sight of the fact that these deliverables alone do not guarantee a successful computerized system implementation. They only guarantee a consistent approach within and between projects. It is actually the content of the “project-related” deliverables that determines the success of a system.
A validation stage can be deemed successful if it finds the errors or anomalies (i.e., incorrect requirements, wrong design approaches, and coding errors, etc.) introduced during that stage. If it takes many iterations to fix them (or worse, they never get fixed properly) then that project stage cannot be deemed a success.
Key Components of all Computerized Systems
The main features required from any system are reliability and consistency. A computerized system consists mainly of the following components:
· Software - the most difficult component to understand since it cannot be touched, heard, seen, or smelled.
· Hardware - computers, printers, cabling, modems, scanners, etc.
· People - users, support staff, and developers
· Procedures - ways of integrating the other components to achieve a desired result
What Effect do People Have on These Systems?
The biggest factor of a successful computerized system is trained personnel. The importance of this point in any project cannot be overstated.
Individual training and experience records need to be maintained. For each major project, every person working on that project needs to be assessed as to their current skill and experience for the tasks they are to perform.
Requirements are both explicit and implicit and written to varying degrees of detail. Thought needs to be given to the readers of these documents and their assumed knowledge of the requirements. The skill of being able to write clear, concise requirement documents that are easy to read is now as important as being able to write good code.
Although forms of automatic code generators exist, on the whole, people develop software. Just as we’ve all read books by good and bad authors, we’ve all used software developed by good and bad developers. Good software can be recognized by its ease of use, its ability to perform the required process, and its ease of support and maintenance.
Just as books can be written in many languages and styles, so can software. Good software can only be written when the developer is skilled in the software language, the development style (development standards), and the intent of the process that the software will automate.
Very few computerized systems will only be used by a single user. Often these people will have different needs from the same process. It is critical to get as many requirements from all the affected user groups early on in a project. This stage is often missed or thought unnecessary when a company adopts a corporate software solution, or Commercial Off-the-Shelf Software (COTS) is implemented.
It is also important that users are adequately trained in the use of the system. As we live in a world where computerized systems are abundant and staff turnover is typically high, the need for regular personnel retraining should be assessed.
A computerized system’s end users make the best software testers. It is always easier to test a system for what it was designed to do than for the variety of activities it may encounter for which it was not designed.
Note that it isn’t an auditor’s job to go into the technical depth as to the level of testing performed, just that the qualified personnel have approved the documentation.
As software projects usually involve a team of people, standards across this team are very important to maintain a consistent approach to all tasks. Standards such as document conventions and coding practices may seem obvious, but issues such as defining levels for categorizing testing anomalies can be very subjective.
It is common to define these anomalies into “minor,” “major,” and “critical” deficiencies and have predefined approaches to continuing the testing should these occur.
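As an illustration of predefining these responses (the category names follow the text; the actions themselves are hypothetical, not drawn from any particular standard), such rules can be captured in a simple lookup so that the reaction to an anomaly is decided before testing begins rather than debated on the spot:

```python
# Hypothetical sketch: mapping testing-anomaly severities to predefined
# continuation rules, agreed before test execution starts.

SEVERITY_RULES = {
    "minor": "log the anomaly and continue testing",
    "major": "log, raise a deviation report, and continue with caution",
    "critical": "halt testing until the anomaly is resolved and retested",
}

def action_for(severity: str) -> str:
    """Return the predefined action for a given anomaly severity."""
    try:
        return SEVERITY_RULES[severity.lower()]
    except KeyError:
        raise ValueError(f"Unknown severity: {severity!r}")

print(action_for("critical"))
```

The point of such a table is precisely the subjectivity the text mentions: by fixing the categories and responses in advance, the project team removes individual judgment from the moment an anomaly is found.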
On the whole, software operates in a consistent manner whereas people do not. Variance not only exists between individuals but within individual people. How I react to a situation today may not be the same way I react to the same situation tomorrow.
This should be kept in mind when designing software solutions. If an application requires a certain process flow (that is, screen one must be completed before screen two, etc.), why should the software allow any ordering of operations? Such freedom not only enables end users to make mistakes, but makes the validation of the available (as opposed to acceptable) process paths almost impossible.
All such features make the system easier to use, reducing to some degree user variability. They also make the system easier to validate as testing, training, and on-going support will all be easier.
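A minimal sketch of how software can enforce such a required process flow (the screen names and the simple linear workflow are hypothetical, chosen only to mirror the screen-one-before-screen-two example above):

```python
# Hypothetical sketch: a linear workflow that refuses to open a screen
# until every preceding screen has been completed, so only the intended
# process paths are available to the user (and to the validator).

class Workflow:
    def __init__(self, screens):
        self.screens = list(screens)
        self.completed = set()

    def open(self, screen: str) -> None:
        """Raise PermissionError unless all earlier screens are complete."""
        idx = self.screens.index(screen)
        for earlier in self.screens[:idx]:
            if earlier not in self.completed:
                raise PermissionError(
                    f"Complete {earlier!r} before opening {screen!r}")

    def complete(self, screen: str) -> None:
        self.open(screen)          # a screen must be reachable to complete it
        self.completed.add(screen)

wf = Workflow(["screen one", "screen two", "screen three"])
wf.complete("screen one")
wf.complete("screen two")          # allowed: screen one is already done
```

Because the software itself rejects out-of-order operation, the set of process paths that must be tested collapses to the set that is actually acceptable.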
We all berate software that does not do what we want it to, or what it was developed or purchased to do. Unfortunately, the software doesn’t care and continues to do the wrong thing every time we perform that operation. It is far easier to criticize the software or hardware than the people who designed, developed, implemented, and tested the finished product.
For a successful implementation, an open, honest approach should be encouraged and properly managed between all members of the project team. It is far better to ask questions during the requirements and design stages of a project, than after a system has been developed and implemented.
In short, the key to implementing successful computerized system solutions is not in the software or hardware. These days, software and hardware can be purchased, configured, and developed to do virtually anything you can define. The key lies in the selection of the correctly qualified people to perform every task from inception, through development and implementation and continuing with on-going maintenance. To put it simply, “getting the right people for the job.”
Unfortunately, there isn’t a huge pool of people who are trained in good software project management that intimately understand your requirements.
There seem to be two ways to rectify this:
1. Hire or contract IT people skilled in software project management and train them on your business requirements.
2. Take knowledgeable business users with a working knowledge of computerized systems and train them in good software project management.
A combination of both of the above would seem the most logical approach.
Developing/Implementing Computerized Systems in the Regulatory Industry
For most manufacturers of automated equipment, software is a small part of a wider development effort. The mechanical element is often perceived as the most important, followed by the electrical parts, and only then by the controlling software.
For many IT professionals, describing IT concepts in a non-technical manner is not as easy as it sounds.
By its nature, most actual software development involves non-communicative activities. Between hours of coding and debugging, and struggling with the restrictions of the software tools, the actual user requirements can lose focus as the project expands.
Consider the typical V-model defined by GAMP that describes the accepted approach to software development and superimpose the key personnel groups and their input percentage of effort at each stage. The point to this is that we typically do not involve our programmers at the requirements stage or our users at the coding stage.
This assumes that the users know exactly what they want and that the programmers know exactly how to deliver it. These are two very big assumptions that, if not adequately mediated, can easily lead to the wrong job being implemented the wrong way.
Users feel they cannot request different features because it’s not in the specifications. Developers feel they must focus on delivering what’s in the spec, and are thus reluctant to make their own suggestions for improved functionality.
This highlights the importance of the Project Leader in coordinating the correct information to the entire team. It also highlights the importance of having accurate and easy-to-read project documentation to pass from one phase of the project to the next.
Conclusion
The concepts of computer validation are sound and need to be followed. However, rather than just implementing a set of quality policies and procedures and then policing them, the following considerations should be addressed:
· Scrutinize the “roles and responsibilities” section in the quality/validation plan
· Consider additional training in the non-project areas (communications, document writing, and presentation skills)
· Ensure the personnel leading the requirements development have the necessary skills
· Encourage openness and honesty between all project members
· Foster relationships between users and developers
If every IT project team member is given clear instructions, has the skills to perform the tasks they have been allocated, and can successfully pass this information on to the next person in the project, then confidence will surely rise that the computerized system will reliably and consistently perform the tasks it has been documented to perform.
21 CFR Part 11- The Regulation
The FDA rule governing the use of Electronic Records and Electronic Signatures is one of the most significant pieces of new US legislation to affect the pharmaceutical manufacturing industry in recent times.
With the ever-greater use of information technology and computer systems at all stages of manufacturing, more and more operating processes are being automated. As a result, key decisions and actions are taken through electronic interfaces, with regulatory records being generated electronically.
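As a simple illustration of such an electronically generated record (a generic sketch with invented field names, not a statement of what Part 11 compliance requires), a record can carry an append-only audit trail recording who changed what, and when:

```python
# Hypothetical sketch: an electronic record where every change, including
# creation, is appended to an audit trail (who, when, old value, new value).
from datetime import datetime, timezone

class ElectronicRecord:
    def __init__(self, record_id: str, value: str, user: str):
        self.record_id = record_id
        self.value = value
        self.audit_trail = []          # append-only; entries are never edited
        self._log(user, None, value)   # creation is itself an audited event

    def _log(self, user, old, new):
        self.audit_trail.append({
            "user": user,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "old_value": old,
            "new_value": new,
        })

    def update(self, new_value: str, user: str) -> None:
        self._log(user, self.value, new_value)
        self.value = new_value

rec = ElectronicRecord("BATCH-001", "in process", user="operator1")
rec.update("released", user="qa_reviewer")
```

The design point is that the trail is written by the system, not the user: each entry is captured as a side effect of the change itself, so the history of a record cannot silently diverge from the record.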
While recognizing the long-term benefits that 21 CFR Part 11 will bring in permitting technological advances, industry is also faced with applying the rule to existing systems and current projects. With this comes an urgent need to improve understanding of the rule, its interpretation, and its application. But that is a topic for another time.