Guidance on corrective action and preventive action and related QMS processes GHTF/SG3/N18:2010
Annex A: Examples of Phase Activities
List of possible activities corresponding to the phases in Figure 1.
The following is an outline/aide-memoire of the main points described in this document. It is not intended as a "box ticking" exercise and should not be used as such; it serves purely to summarize and align the steps in the process described in this document. The activity numbers do not imply sequential steps – some steps may take place in parallel.
The references in this Annex refer to the sections in this document.
Phase: Planning
1. Identify all data sources (internal/external) by product type (4.1)
2. Identify resources required and individual personnel responsibilities for measuring each data source (4.1)
3. Define the requirements for each data source and the data elements within each data source that will be measured and analysed (4.1)
4. Define requirements for escalation to the improvement phase (4.1)
5. Define requirements for monitoring the measurements in the data sources (5.1)
6. Establish data sources (4.2)

Phase: Measurement and Analysis within and across Data Sources
7. Measure and analyse all data sources for nonconformities and potential nonconformities (5.0, 5.1 and 5.2)
8. Have reports of nonconformity or potential nonconformity come from more than one data source?
9. Is the nonconformity or potential nonconformity systemic?

Phase: Improvement
10. Determine scope and required outcome of investigation (6.1)
11. Investigate nonconformity or potential nonconformity (6.1)
12. Analyse nonconformity or potential nonconformity for root cause(s) (6.2)
13. Identify actions (correction, corrective action or preventive action) (6.3)
14. Verify proposed actions before implementation (6.4)
15. Implement proposed actions (6.5)
16. Determine effectiveness of actions (validate if possible) (6.6)

Phase: Input to Management
17. Report investigation and outcome to management (7.1)
18. Review investigation, analysis and outcome (6.6, 7.2)
19. If not satisfied, return to step 10
20. If required, report to regulator (note: reporting may be required earlier depending on severity)*
21. Audit system at determined intervals*
22. If the number of nonconformities or potential nonconformities exceeds targets, review all QMS processes*

*Steps 20 to 22 are not described in this document but are added as reminders of general management responsibilities in this area of the QMS.
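The improvement and management-review activities above form a loop: steps 10–16 are carried out, management reviews the outcome, and if the review is not satisfied the process returns to step 10. A minimal Python sketch of that loop is shown below; the step labels, function names, and `max_cycles` cap are illustrative assumptions, not terminology or requirements from this guidance.

```python
# Hypothetical sketch of the Annex A improvement loop (steps 10-19).
# Step labels and the review callback are illustrative assumptions.

IMPROVEMENT_STEPS = [
    "determine scope and required outcome of investigation",  # step 10 (6.1)
    "investigate nonconformity",                              # step 11 (6.1)
    "analyse for root cause(s)",                              # step 12 (6.2)
    "identify actions",                                       # step 13 (6.3)
    "verify proposed actions",                                # step 14 (6.4)
    "implement proposed actions",                             # step 15 (6.5)
    "determine effectiveness of actions",                     # step 16 (6.6)
]

def run_improvement_phase(execute_step, management_review, max_cycles=3):
    """Run steps 10-16, then the management review (steps 17-18).

    If the review is not satisfied, return to step 10 (step 19).
    Returns the number of cycles taken to reach a satisfied review.
    """
    for cycle in range(1, max_cycles + 1):
        results = [execute_step(step) for step in IMPROVEMENT_STEPS]
        if management_review(results):  # steps 17-18 (7.1, 7.2)
            return cycle
    raise RuntimeError("management review not satisfied after max_cycles")

# Example: a review that is satisfied only on the second cycle.
reviews = iter([False, True])
print(run_improvement_phase(lambda s: s, lambda r: next(reviews)))  # 2
```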
November 4, 2010
Page 21 of 26
Annex B: Examples of Data Sources and Data Elements
Examples of data sources and their data elements include, but are not restricted to, the following:
Regulatory Requirements
- Result of a regulatory inspection
- New or revised regulatory requirements

Management Review
- Management review output

Supplier Performance/Controls
- Number of batches received
- Batch and/or shipment
- Inspection and test records
- Quantity of rejects or deviations
- Reason for rejection
- By supplier, if more than one supplier
- Use in which product or service
- Supplier problems

Complaint Handling
- Quantity
- By product family
- By customer (physician, healthcare facility, patient, etc.)
- Reason for complaint
- Complaint codes
- Severity
- Component involved

Adverse Event Reporting
- Event
- Quantity
- By product family
- By customer (physician, healthcare facility, patient, etc.)
- Type of event (death or serious injury, etc.)
- Component involved

Process Controls
- By product
- Operator
- Work shift
- Equipment and/or instruments used
- Inspection and test records
- In-process control results
- Process control parameters
- Inspection process
- Final acceptance
- Rejects
- Special process
- Validation study results
- Process monitoring observations

Finished Product
- Inspection and test records

Quality Audits (internal/external)
- Observations (number, category, corporate policy, regulatory requirements, significance, etc.)
- Repeat observations (indicative of effectiveness)
- Closure times
- Overall acceptability of contractor or supplier
- Compliance to audit schedule
- Audit personnel

Product Recall
- Timeliness of recall communication
- Classification of recall
- Recall effectiveness checks

Spare Parts Usage
- Frequency of replacement
- Batch number of spare part
- By supplier of spare part, if more than one supplier
- By customer
- By location or area of customer

Service Reports
- Installation
- First use of equipment
- Frequency of maintenance visits
- Types of repairs
- Frequency of repairs
- Usage frequency
- Parts replaced
- Service personnel

Returned Product
- Quantity
- Reason for returning product
- By customer
- Types of defects identified on returned product

Market/Customer Surveys
- Customer preferences
- Customer service response time
- Solicited information on new or modified products

Scientific Literature
- Research papers

Media Sources
- Articles in trade journals

Product Realization (Design, Purchasing, Production and Service, and Customer information)
- Design and development review results
- Design and development verification results
- Design and development validation results
- Design and development changes (reason or cause for change, effectiveness of change, etc.)
- Controls on purchased products or services (see above: Supplier Performance/Controls)
- Verification results of purchased product
- Inspection and testing data of purchased product
- Production and service processes: cleaning operations of product and facilities
- Sterilization
- Installation results
- Servicing and maintenance if required (see also: Service Reports)
- Verification and validation results of processes used in production and service, including approval of equipment and qualification of personnel
- Traceability data
- Controls of monitoring and measuring devices
- Calibration and maintenance of equipment
- Customer information: new or repeat customer
- Customer feedback, which may be in forms other than complaints or returned product (customer service call data, repeat sales, delivery/distribution data)

Risk Management
- Published reports/literature of failures of similar products
- Stakeholder concerns and generally accepted state of the art
- Risk acceptability criteria
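Data elements such as these are what the planning phase (4.1) identifies for measurement, with escalation targets defined per data source. A minimal sketch of trending one data source (complaint handling) by a data element and flagging values that exceed a predefined target is shown below; the record fields and the threshold value are illustrative assumptions, not requirements from the guidance.

```python
# Hypothetical sketch: trending complaint records by product family (a
# data element above) and escalating when a planning-phase target is
# exceeded. Field names and the target value are illustrative.
from collections import Counter

def complaints_by_family(complaints):
    """Count complaint records per product family."""
    return Counter(c["product_family"] for c in complaints)

def needs_escalation(counts, target_per_family):
    """Return product families whose complaint count exceeds the target,
    i.e. candidates for escalation to the improvement phase."""
    return sorted(f for f, n in counts.items() if n > target_per_family)

complaints = [
    {"product_family": "A", "reason": "surface finish"},
    {"product_family": "A", "reason": "labeling"},
    {"product_family": "B", "reason": "packaging"},
]
counts = complaints_by_family(complaints)
print(needs_escalation(counts, target_per_family=1))  # ['A']
```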
Annex C: Examples of Contributing Factors
Examples of possible contributing factors to be considered when performing root cause analysis:
Materials
- Defective raw material (does material meet specification?)
- Batch related problem
- Design problem (wrong material for product, wrong specifications)
- Supplier problem (lack of control at supplier, alternative supplier)
- Lack of raw material

Machine / Equipment
- Incorrect tool selection – suitability
- Inadequate maintenance or design – calibration?
- Equipment used as intended by the manufacturer?
- Defective equipment or tool
- End of life?
- Human error – inadequate training?

Environment
- Orderly workplace
- Properly controlled – temperature, humidity, pressure, cleanliness
- Job design/layout of work

Management
- Inadequate management involvement
- Stress demands
- Human factors
- Hazards not properly guarded
- Were management informed / did they take action?

Methods
- Procedures not adequately defined
- Practice does not follow prescribed methodology
- Poor communications

Management system
- Training or education lacking
- Poor employee involvement
- Poor recognition of hazard
- Previous hazards not eliminated

Measurement, monitoring and improvement
- Inadequate measuring and improvement
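When a root cause analysis is documented, tagging it against a fixed set of contributing-factor categories keeps investigation records consistent and trendable. The sketch below represents the categories above as a structured checklist; the tagging helper and record shape are illustrative assumptions, not part of the guidance.

```python
# Hypothetical sketch: the Annex C categories as a fixed checklist so
# root cause records can be tagged consistently. The category names
# follow the annex; the helper and record format are illustrative.

CONTRIBUTING_FACTOR_CATEGORIES = (
    "Materials",
    "Machine / Equipment",
    "Environment",
    "Management",
    "Methods",
    "Management system",
    "Measurement, monitoring and improvement",
)

def tag_root_cause(description, categories):
    """Attach recognised contributing-factor categories to a record,
    rejecting any category not in the Annex C list."""
    unknown = [c for c in categories if c not in CONTRIBUTING_FACTOR_CATEGORIES]
    if unknown:
        raise ValueError(f"not Annex C categories: {unknown}")
    return {"root_cause": description, "contributing_factors": list(categories)}

record = tag_root_cause(
    "inadequate line clearance procedures at the supplier",
    ["Methods", "Management system"],
)
print(record["contributing_factors"])  # ['Methods', 'Management system']
```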
Annex D: Examples for Documentation of the Improvement Processes
The table below includes guidance for documenting various requirements of the improvement processes.

Problem Statement
Guidance:
- Clearly defined problem statement. State how the issue was discovered and the process/procedure that was not followed.
- Provide evidence: What, When, Who, Where and How much (as applicable).
Example documentation:
During in-process testing of Product A finished product on [date], two devices out of 30 were found to be nonconforming per Design Document 123456, revision A. Note 2.1 in Design Document 123456 requires that the surface finish be 32 μinch maximum on all exterior surfaces. The two nonconforming devices had a surface finish above the maximum 32 μinch finish as follows:
- Serial Number 54321 had a surface finish of 67 μinch
- Serial Number 65432 had a surface finish of up to 38 μinch

Correction
General examples:
- Containment
- Stop of shipment/supply
- Issuance of advisory notice
- Incident awareness / training
- Change or suspend production process
Example documentation:
The supplier was notified of the issue on [date]. The supplier conducted an operator awareness training of the incident on [date]. Initial extent of the issue is restricted to supplier lot #678. All unused components and product built with components from this lot were controlled on [date]. No product built with this lot had been distributed.

Investigate
Guidance:
- Clearly defined problem statement (update/refine if new information is determined)
- What information was gathered, reviewed and/or evaluated
- Results of the reviews/evaluations of the information
- Identification of cause(s) or contributing factors
Example documentation:
See initial problem statement. Subsequent investigation confirmed that the issue was limited to lot #678. All additional available lots of this component were inspected with a 95/95 inspection plan and no additional lots were confirmed to have the issue. The incoming inspection process and component FMEA were reviewed and determined to be adequate and accurate, respectively. Review of finished product reject data over the past year revealed no other rejects for surface finish of this component. The following problem-solving tools and methods were used during the course of the investigation of the surface finish issue:
- Fishbone analysis – see the attached file labeled "Surface Finish Analysis"
- Conference calls and documentation reviews with the Supplier – see the attached file containing the minutes from the conference calls
Results of the investigation were the following: two different raw tubing lots were mixed at the Supplier's finishing process. One raw tubing lot was intended for customer A's products (Lot number 10000-100, requiring a surface finish of 32 μinch maximum) and the other was intended for customer B's product, which had a surface finish above the 32 μinch maximum.

Identify Root Cause
Guidance:
- The output of the root cause analysis should be a clear statement of the most fundamental cause(s) resulting in the nonconformity.
Example documentation:
It has been concluded that the root cause of the tubing surface finish issue is inadequate line clearance procedures established at the supplier.

Planned actions
Guidance – specify:
- What the action is
- Who will do it
- When it should be done
Example documentation:
Corrective action: Supplier to add line clearance requirements to documented procedures by [date].
Preventive action: Not applicable.

Verification of actions
Guidance:
- Verification activities are to ensure that all the elements of the proposed action (documentation, training, etc.) will satisfy the requirements of the proposed action.
- Validation activities generate data and information that confirm the likelihood of the effectiveness of the corrective action to eliminate the nonconformity or potential nonconformity.
General examples (actual documentation would need to be more specific):
- Review and approval of the procedural changes prior to use
- Conduct a pilot of the new procedure on a specific project/department/time frame prior to full scale implementation
- Verification that the updated supplier procedure addresses the process that caused the nonconformity
- Verification that the training materials address the specific process that caused the nonconformity
- Comparing a new design specification with a similar proven design specification
- Performing calculations using an alternative method
- Perform validation of equipment, software, production processes, test method, component, etc.
Specific example:
Review and approval of supplier procedure XXX by the supplier and the customer to ensure adequacy of the updated line clearance process.

Verification of effectiveness
Guidance – method or data for the determination of effectiveness, with acceptance criteria:
- The improvement goal
- The evidence (data sources) that will be used to support effectiveness (e.g., a data source could be where the problem was initially found)
- The time frame over which effectiveness will be monitored (e.g., upon completion of actions, or three months, six months, as appropriate), OR
- Sample size required to demonstrate effectiveness
Example documentation:
X months after implementation:
- Conduct a query of the electronic manufacturing data system to verify there are zero surface finish rejects for this component at finished Product A final inspection.
- Supplier Quality Engineer to conduct an on-site review at the supplier to confirm the procedures are in place, are known to the operators, and there is evidence that the procedures are being followed.
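The example documentation above refers to a "95/95 inspection plan", i.e. inspecting enough units with zero failures to demonstrate 95% conformance at 95% confidence. Under the standard zero-failure acceptance model, the required sample size is the smallest n with (reliability)^n ≤ 1 − confidence. The sketch below shows that arithmetic; the function name is illustrative, and the guidance itself does not prescribe this formula.

```python
# Hypothetical sketch of the sample-size arithmetic behind a zero-failure
# "95/95" inspection plan: find the smallest n such that
# reliability**n <= 1 - confidence.
import math

def zero_failure_sample_size(confidence=0.95, reliability=0.95):
    """Smallest n such that n passes with zero failures demonstrates
    `reliability` conforming product at `confidence` level."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

print(zero_failure_sample_size())  # 59
```

For a 95/95 plan this gives 59 units: 0.95^59 ≈ 0.048 < 0.05, while 0.95^58 ≈ 0.051 > 0.05.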