- original data should be reviewed;
- original data and/or certified true and exact copies that preserve the content and meaning of the original data should be retained;
- as such, original records should be complete, enduring and readily retrievable and readable throughout the records retention period.
Examples of original data include:

- original electronic data and metadata in stand-alone computerized laboratory instrument systems (e.g. UV/Vis, FT-IR, ECG, LC/MS/MS, haematology and chemistry analysers, etc.);
- original electronic data and metadata in automated production systems (e.g. automated filter integrity testers, SCADA, DCS, etc.);
- original electronic data and metadata in network database systems (e.g. LIMS, ERP, MES, eCRF/EDC, toxicology databases, deviation and CAPA databases, etc.);
- handwritten sample preparation information in paper notebooks;
- printed recordings of balance readings;
- electronic health records;
- paper batch records.
Review of original records

Expectations for paper

Controls for review of original paper records include, but are not limited to:

- written procedures and training and review and audit and self-inspection controls that ensure personnel conduct an adequate review and approval of original paper records, including papers used to record the contemporaneous capture of information;
- data review procedures should describe review of relevant metadata. For example, written procedures for review should require that persons evaluate changes made to original information on paper records (such as changes documented as "cross out" or "data correction") to ensure these changes are appropriately documented and justified with substantiating evidence and investigated when required;
- data review should be documented. On paper records this is typically signified by signing the paper records that have been reviewed. Where record approval is a separate process this should also be similarly signed. Written procedures for data review should clarify the meaning of the review and approval signatures to ensure persons understand their responsibility as reviewers and approvers to assure the integrity, accuracy, consistency and compliance with established standards of the paper records subject to review and approval;
- a procedure should describe the actions to be taken if data review identifies an error or omission. This procedure should enable data corrections or clarifications to be made in a GxP compliant manner, providing visibility of the original record and audit trailed traceability of the correction, using ALCOA principles.

Expectations for electronic

Controls for review of original electronic records include, but are not limited to:

- written procedures and training and review and audit and self-inspection controls that ensure personnel conduct an adequate review and approval of original electronic records, including human readable source records of electronic data;
- data review procedures should describe review of original electronic data and relevant metadata. For example, written procedures for review should require that persons evaluate changes made to original information in electronic records (such as changes documented in audit trails or history fields or found in other meaningful metadata) to ensure these changes are appropriately documented and justified with substantiating evidence and investigated when required;
- data review should be documented. For electronic records, this is typically signified by electronically signing the electronic data set that has been reviewed and approved. Written procedures for data review should clarify the meaning of the review and approval signatures to ensure persons understand their responsibility as reviewers and approvers to assure the integrity, accuracy, consistency and compliance with established standards of the electronic data and metadata subject to review and approval;
- a procedure should describe the actions to be taken if data review identifies an error or omission. This procedure should enable data corrections or clarifications to be made in a GxP compliant manner, providing visibility of the original record and audit trailed traceability of the correction, using ALCOA principles.
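The correction control described in the last bullet of each column, keeping the original value visible while recording who changed what, when and why, can be sketched in code. The following Python sketch is illustrative only; the record structure and field names (e.g. `assay_result`) are hypothetical, not a prescribed implementation:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditEntry:
    """One audit-trail entry: who changed what, when, and why."""
    timestamp: str
    user: str
    field_name: str
    old_value: str
    new_value: str
    reason: str

@dataclass
class Record:
    """A record whose corrections never overwrite the original value."""
    values: dict
    audit_trail: list = field(default_factory=list)

    def correct(self, user: str, field_name: str, new_value: str, reason: str):
        # The original value remains visible in the audit trail,
        # preserving attributable, traceable history of the change.
        old_value = self.values[field_name]
        self.audit_trail.append(AuditEntry(
            timestamp=datetime.now(timezone.utc).isoformat(),
            user=user,
            field_name=field_name,
            old_value=old_value,
            new_value=new_value,
            reason=reason,
        ))
        self.values[field_name] = new_value

rec = Record(values={"assay_result": "98.2"})
rec.correct("j.smith", "assay_result", "98.7",
            "transcription error vs. raw chromatogram")
```

A reviewer evaluating this record would see both the current value and, via the audit trail entry, the original value, the person who made the correction and the documented justification.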
Special risk management considerations for review of original records

Data integrity risks may occur when persons choose to rely solely upon paper printouts or PDF reports from computerized systems without meeting applicable regulatory expectations for original records. Original records should be reviewed, and this includes electronic records. If the reviewer only reviews the subset of data provided as a printout or PDF, these risks may go undetected and harm may occur.
Original records should be reviewed, and persons remain fully accountable for the integrity and reliability of the subsequent decisions made based upon them; within that obligation, a risk-based review of the content of original records is recommended.
A risk-based approach to reviewing data requires process understanding and knowledge of the key quality risks in the given process that may impact patient, product, compliance and the overall accuracy, consistency and reliability of GxP decision-making. When original records are electronic, a risk-based approach to reviewing original electronic data also requires understanding of the computerized system, the data and metadata and data flows.
When determining a risk-based approach to reviewing audit trails in GxP computerized systems, it is important to note that some software developers may track user actions on the most critical GxP data using metadata features that are not named audit trails, while reserving the name "audit trail" for tracking other computer system and file maintenance activities. For example, changes to scientific data may sometimes be most readily viewed by running various database queries, by viewing metadata fields labelled "history files" or by review of designed and validated system reports, and the files designated by the software developer as audit trails may alone be of limited value for an effective review. The risk-based review of electronic data and metadata, such as audit trails, requires an understanding of the system and the scientific process governing the data life cycle so that the meaningful metadata is subject to review, regardless of the naming conventions used by the software developer.
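To illustrate the point about naming conventions, the following Python/SQLite sketch shows a reviewer querying a table a vendor happened to name `history` to surface changes to critical result fields, rather than relying on whatever the vendor labelled "audit trail". The schema, table name and field names are assumptions for illustration, not any real vendor's design:

```python
import sqlite3

# Hypothetical schema: the vendor logs data changes in a table named
# "history", while its "audit_trail" table only records file maintenance.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE history (
    ts TEXT, user TEXT, record_id TEXT, field TEXT,
    old_value TEXT, new_value TEXT, reason TEXT)""")
conn.executemany(
    "INSERT INTO history VALUES (?, ?, ?, ?, ?, ?, ?)",
    [
        ("2024-05-01T10:02:00Z", "analyst1", "S-101", "peak_area",
         "1520", "1518", "reintegration"),
        ("2024-05-01T10:05:00Z", "analyst1", "S-101", "comment",
         "", "baseline noise", "annotation"),
    ],
)

# The reviewer targets changes to scientific result fields, wherever
# the vendor happens to store them.
critical_fields = ("peak_area", "retention_time", "assay_result")
placeholders = ",".join("?" * len(critical_fields))
rows = conn.execute(
    f"SELECT ts, user, field, old_value, new_value, reason FROM history "
    f"WHERE field IN ({placeholders}) ORDER BY ts",
    critical_fields,
).fetchall()
for row in rows:
    print(row)
```

Here only the `peak_area` change surfaces for review; the non-critical annotation does not, which is the kind of targeted query the text describes.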
Systems typically include many metadata fields and audit trails. It is expected that during validation of the system the organization will establish, based upon a documented and justified risk assessment, the frequency, roles and responsibilities, and approach to review of the various types of meaningful metadata, such as audit trails. For example, under some circumstances, an organization may justify periodic review of audit trails that track system maintenance activities, whereas audit trails that track changes to critical GxP data with direct impact on patient safety or product quality would be expected to be reviewed each and every time the associated data set is reviewed and approved, and prior to decision-making.
Systems may be designed to facilitate audit trail review via varied means; for example, the system design may permit audit trails to be reviewed as a list of relevant data or by a validated exception reporting process.
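Conceptually, an exception reporting process of the kind mentioned above filters the audit trail down to the entries a reviewer must see before approval. The Python sketch below is illustrative only; the entry structure, field names and criticality list are assumptions:

```python
# Hypothetical audit-trail entries mixing data changes and maintenance events.
entries = [
    {"action": "login", "user": "admin", "target": "system"},
    {"action": "modify", "user": "analyst1", "target": "assay_result",
     "old": "98.2", "new": "98.7", "reason": "transcription error"},
    {"action": "backup", "user": "system", "target": "database"},
    {"action": "modify", "user": "analyst2", "target": "sample_weight",
     "old": "250.1", "new": "250.3", "reason": ""},
]

# Fields judged critical by a (hypothetical) documented risk assessment.
CRITICAL_FIELDS = {"assay_result", "sample_weight", "expiry_date"}

def exceptions_for_review(entries):
    """Return only the entries a reviewer must see before approval:
    modifications to critical fields, flagging any change that lacks
    a documented reason."""
    flagged = []
    for e in entries:
        if e["action"] == "modify" and e["target"] in CRITICAL_FIELDS:
            if not e.get("reason"):
                e = {**e, "flag": "missing justification"}
            flagged.append(e)
    return flagged

for e in exceptions_for_review(entries):
    print(e)
```

Routine login and backup events fall out of the per-approval review (they would instead be covered by the periodic review justified in the risk assessment), while the undocumented change to a critical field is flagged for investigation.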
Written procedures on data review should define the frequency, roles and responsibilities, and approach to review of meaningful metadata, such as audit trails. These procedures should also describe how aberrant data is handled if found during