Quality

Quality, in pathology, has received a lot of attention lately because of high-profile irregularities that led to significant harm.[1][2]

General

The keys to quality are:

  1. Understanding the needs of the stakeholders (surgeons, oncologists, patients, other pathologists, the public at large).
  2. Understanding the processes.
  3. Developing measures of quality.
  4. Tracking the measures of quality & assessing their validity.
  5. Understanding the causes of failures/adverse events in the context of the processes.
  6. Continually doing all of the above with the aim of improving outcomes - continuous quality improvement.

Definitions

System documentation and description

Quality Management Program-Laboratory Services (QMP-LS) defines a hierarchy of documentation:[3]

  • Policy.
  • Process.
  • Procedure.

Policy

  • High-level document.
  • Describes the rationale for processes and defines goals/objectives - includes parameters that can be measured.

Process

  • Intermediate-level document.
  • Defines inputs and outputs and outlines the steps taken to achieve an objective - should not be overly detailed.

Procedure

  • Low-level document.
  • Detailed line-by-line instructions - description of the workflow.

Other

Quality control

  • Examines whether a process is hitting its target(s) for its measure(s) of quality.

In short: Does it hit the targets?

Quality assurance

  • Program to ensure that a process is yielding the desired output(s).

In short: Does it produce the desired output?
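
To make the quality control question above ("does it hit the targets?") concrete, here is a minimal sketch in Python; the measure (monthly mean turnaround time in days) and the target are hypothetical placeholders, not values from the source:

  # Quality control sketch: compare a measured value to its target.
  # The measure, target and monthly data are hypothetical examples.

  def hits_target(measured_value: float, target: float) -> bool:
      """Return True if the measured value meets or beats the target."""
      return measured_value <= target

  # Example: monthly mean turnaround time (days) against a hypothetical 2-day target.
  monthly_mean_tat_days = [1.8, 2.1, 1.9, 2.4]
  target_days = 2.0

  for month, tat in enumerate(monthly_mean_tat_days, start=1):
      status = "on target" if hits_target(tat, target_days) else "off target"
      print(f"Month {month}: mean TAT {tat} days -> {status}")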

Analysis

Overview

Quality issues can be examined in a number of different ways.

Finding a problem:

  • Root cause analysis.

Anticipating problems:

  • Failure mode and effects analysis (FMEA).

General error analysis

Pathology errors can happen at any time, from when the lab receives the specimen until after the report is issued.

When errors happen:

  • Work-up the problem.
    • Where did the error occur? Pathologist error?
  • Talk to the clinician.
    • If it is a critical diagnosis, contact the most responsible physician immediately; if they are unreachable, call the physician on-call for the most responsible physician; if the patient is out of town, you may have to coordinate with the local emergency department.
  • Talk to the chief of pathology.
  • Incident report.
  • Reconstruct error.
    • Was it a specimen mix-up?
      • Is there another error?
  • Amend the report(s).
  • Remedy the source of error.

The classic structural breakdown

A classic structural breakdown for error analysis is:

  • Pre-analytical errors.
  • Analytical errors.
  • Post-analytical errors.

Note:

  • This breakdown is arbitrary and, in and of itself, most useful for answering exam questions.
  • In a practical context, it is a framework for classifying errors; it is not useful for understanding the source of an error or addressing it.

Pre-analytic errors

  • Container mix-up - pre-lab & in-lab.
  • Block mix-up.
  • Slide mix-up - labels wrong.
  • Poor quality slides (fixation, processing, staining).
  • Lost specimen - can be potentially anywhere in the process.

Analytic errors

  • Interpretation wrong.
    • Factors:
      • Difficult case.
      • Technical factors (quality of slides).
      • Lack of clinical history.

Post-analytic errors

  • Wrong case signed-out.
  • Filing problem/lost report.
  • Report interpretation problem (poorly written report, misinterpretation).

Sources of error

  • "Human error".
    • Training.
    • Work flow.
  • Process gaps.
    • Process control.
    • Lack of redundancy.

Types of errors

Can be subdivided into the following groups:[4]

  • False-negative - missed diagnosis.
  • False-positive - a diagnosis made that, on review, is considered not to be present.
  • Threshold - difference of opinion regarding a diagnostic threshold.
  • Type and grade.
  • Missed margin.
  • Other.

Grading of errors

May be subdivided into three grades (a minimal recording sketch follows the list):

  • Grade 1: no consequence.
  • Grade 2: possible consequence.
  • Grade 3: definite consequence.
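
As a minimal illustration of how the error types and grades above could be recorded for tracking, here is a sketch in Python; the enum values mirror the lists above, while the logged cases are hypothetical:

  # Sketch: recording error type and grade for tracking purposes.
  # Categories mirror the lists above; the logged cases are hypothetical.
  from enum import Enum
  from collections import Counter

  class ErrorType(Enum):
      FALSE_NEGATIVE = "false-negative"
      FALSE_POSITIVE = "false-positive"
      THRESHOLD = "threshold"
      TYPE_AND_GRADE = "type and grade"
      MISSED_MARGIN = "missed margin"
      OTHER = "other"

  class ErrorGrade(Enum):
      NO_CONSEQUENCE = 1
      POSSIBLE_CONSEQUENCE = 2
      DEFINITE_CONSEQUENCE = 3

  # Hypothetical log of reviewed errors: (case ID, type, grade).
  error_log = [
      ("S12-001", ErrorType.FALSE_NEGATIVE, ErrorGrade.POSSIBLE_CONSEQUENCE),
      ("S12-047", ErrorType.THRESHOLD, ErrorGrade.NO_CONSEQUENCE),
      ("S12-103", ErrorType.MISSED_MARGIN, ErrorGrade.DEFINITE_CONSEQUENCE),
  ]

  # Tally errors by grade, e.g. to focus follow-up on grade 3 events.
  by_grade = Counter(grade for _, _, grade in error_log)
  for grade in ErrorGrade:
      print(f"{grade.name}: {by_grade.get(grade, 0)}")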

Error reduction

Various strategies can be employed:[5]

  • Training of staff - on error handling.
  • Computer order entry.
    • Avoid duplication fatigue.
    • Quick correlation with several identifying features.
      • Full name, sex, date of birth -- these all appear when one opens a case.
  • Barcode use.
    • Avoid transcription errors (see the identifier-matching sketch after this list).
  • Clinical information entry required.
    • Allow correlation with test.
      • The interpretation may differ if the history says "screening colonoscopy" versus "large cecal mass, anemia and weight loss" versus "breast cancer".
  • The use of algorithms to guide decisions where applicable.[6]
    • Remove subjectivity.
    • Increase objectivity, reproducibility.
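
As a minimal sketch of the identifier correlation mentioned above (computer order entry and barcode use), in Python; the field names and records are hypothetical, and a real system would compare scanned barcode identifiers rather than typed strings:

  # Sketch: correlate identifying features between the requisition and the
  # case opened in the computer system. All records below are hypothetical.

  def identifiers_match(requisition: dict, case_record: dict) -> bool:
      """Return True only if all identifying features agree."""
      fields = ("full_name", "sex", "date_of_birth")
      return all(requisition[f] == case_record[f] for f in fields)

  requisition = {"full_name": "Doe, Jane", "sex": "F", "date_of_birth": "1970-01-01"}
  case_record = {"full_name": "Doe, Jane", "sex": "F", "date_of_birth": "1970-01-01"}

  if identifiers_match(requisition, case_record):
      print("Identifiers match.")
  else:
      print("Identifier mismatch - stop and investigate before proceeding.")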

Dealing with diagnostic errors

  • Opinion is split on whether reports should be amended or addended - see sign out article.

Measures of quality

Any number of parameters can be used to measure quality. When, where and how often something is measured depends on the value added.

General measures of quality

There are really only two:

  1. Timeliness, i.e. turn-around time (TAT).
  2. Error rate.

Note:

  • 1 and 2 can be examined/quantified in any number of ways (a minimal sketch follows below).
  • Error, in the context of a measurement, has to be defined.
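
As a minimal sketch of how the two general measures could be quantified, in Python; the case records, the TAT definition (accession to sign-out, in days) and the error definition (any amended report) are hypothetical choices, not ones given in the source:

  # Sketch: turnaround time (TAT) and error rate from a small list of cases.
  # The data and the definitions of TAT and "error" are hypothetical choices.
  from datetime import date

  # (case ID, accessioned, signed out, amended?) -- hypothetical records.
  cases = [
      ("S12-201", date(2012, 5, 1), date(2012, 5, 3), False),
      ("S12-202", date(2012, 5, 1), date(2012, 5, 2), False),
      ("S12-203", date(2012, 5, 2), date(2012, 5, 7), True),
  ]

  tats = [(signed_out - accessioned).days for _, accessioned, signed_out, _ in cases]
  mean_tat = sum(tats) / len(tats)

  # Here "error" is defined, for illustration, as any case requiring an amended report.
  error_rate = sum(1 for *_, amended in cases if amended) / len(cases)

  print(f"Mean TAT: {mean_tat:.1f} days")
  print(f"Error rate: {error_rate:.1%}")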

Internal measures of quality

Smaller categories

Smaller categories - errors:[7]

  • Preanalytic: specimen identification & transport.
  • Preanalytic/analytic: tissue processing, e.g. fixation, blocking, embedding, sectioning, staining.
  • Analytic: interpretation.
  • Postanalytic: reporting/report integrity.

Individual measures

Specific measures:[7]

  • Preanalytic:
    • Identification - numbers match requisition.
    • Appropriate container.
  • Analytic:
    • Mislabeling.
    • Interpretation errors - based on:
      • Internal review.
        • Cytology-histology correlation (see the sketch after this list).
        • Biopsy-resection correlation.
        • Frozen section-permanent section correlation.
        • Internal comparisons, e.g. ASCUS/LSIL between pathologists.
      • External review.
        • External standards/expected rate.
    • Amended reports - captures several of the above.
  • Postanalytic:
    • Completeness of report.
    • Critical diagnosis timely?
    • Report delivered to appropriate person?
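
As a minimal sketch of one internal comparison mentioned above (cytology-histology correlation), in Python; the paired diagnoses and the simple exact-match rule for concordance are hypothetical simplifications, as real programs use graded categories of agreement:

  # Sketch: cytology-histology correlation expressed as a discordance rate.
  # Paired diagnoses and the exact-match rule are hypothetical simplifications.

  pairs = [
      # (case ID, cytology diagnosis, follow-up histology diagnosis) -- hypothetical.
      ("C12-010", "HSIL", "HSIL"),
      ("C12-015", "NILM", "HSIL"),
      ("C12-021", "LSIL", "LSIL"),
  ]

  discordant = [case for case, cyto, histo in pairs if cyto != histo]
  rate = len(discordant) / len(pairs)

  print(f"Discordant cases: {discordant}")
  print(f"Cytology-histology discordance rate: {rate:.1%}")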

External measures of quality

Benchmark

  • An external quality measure, i.e. a comparison to an outside group or agency.
    • Slides are sent around from an external source:
      • Lab has to stain 'em and send 'em back for an assessment.
      • Pathologists render diagnoses on 'em and are given the (externally rendered) consensus diagnosis.

Immunohistochemistry

Classification of IHC tests

IHC tests are classified in a paper by Torlakovic et al.:[8]

  • Class I:
    • Results used by pathologists.
    • Adjunct to histomorphology.
    • Examples: CD45, S-100.
  • Class II:
    • Used by clinicians for treatment decisions.
    • Considered independent of the other information in the pathology report; thus, cannot be derived from other information in the report.
    • Examples: ER, PR, HER2, Ki-67, CD117, CD20.

The implications of irregularities in the different classes are different. Problems in Class II tests are potentially more severe, as there is no internal control.

Work-up of suspected IHC problems

  • Review controls (internal and external).
    • Isolated to case vs. larger problem?
      • Discuss with lab/make other pathologists aware of the issue.
  • Repeat test - to identify the cause.

IHC process:

  1. Ischemia time - warm ischemia, preparation of specimen.
  2. Fixation - underfixation, overfixation, defective fixative, insufficient fixative volume.
  3. Processing prior to antibody binding, usually heating (antigen retrieval).
  4. Antibody-antigen binding.
  5. Reporter molecule binding.
  6. Counterstaining.
  7. Interpretation problem.
    • Known/expected epitope cross-reactions, e.g. CMV & HSV.[9]
    • Unknown/unexpected epitope cross-reactions.

Notes:

  • Problems can arise at any step.

Other

Data retention standards

  • Data retention standards specify how long results and materials have to be retained.

College of American Pathologists

  • In the United States, there are standards from the College of American Pathologists (CAP) and the Clinical Laboratory Improvement Amendments (CLIA).[10]

Selected CAP and CLIA standards:[11]

  • Cytology slide (non-fine needle aspiration): 5 years from the exam date.
  • Fine needle aspiration: 10 years from the exam date.
  • Histopathology slides: 10 years from the exam date.

Canadian Association of Pathologists

The Canadian retention standards are generally longer than the US ones.

Summary of selected suggestions (a lookup sketch follows the table):[12]

  Material         Origin    Suggested retention period   Additional notes
  Wet tissue       surgical  4 weeks after final report   -
  Paraffin blocks  surgical  20 years                     50 years for paediatric cases
  Slides           surgical  20 years                     -
  Wet tissue       autopsy   3 months after final report  Coroners'/medical examiner cases may be longer
  Paraffin blocks  autopsy   10 years                     Coroners'/medical examiner cases may be longer
  Slides           autopsy   10 years                     Coroners'/medical examiner cases may be longer
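
As a minimal sketch, the suggested retention periods above can be encoded as a lookup in Python; the dictionary only restates the table, and the query at the end is a hypothetical example:

  # Sketch: CAP-ACP suggested retention periods (table above) as a lookup table.
  RETENTION = {
      ("wet tissue", "surgical"): "4 weeks after final report",
      ("paraffin blocks", "surgical"): "20 years (50 years for paediatric cases)",
      ("slides", "surgical"): "20 years",
      ("wet tissue", "autopsy"): "3 months after final report",
      ("paraffin blocks", "autopsy"): "10 years",
      ("slides", "autopsy"): "10 years",
  }

  # Hypothetical query.
  print(RETENTION[("paraffin blocks", "surgical")])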

Failure-potential analysis

Adapted from Ullman (a worksheet sketch follows the list):[13]

  1. Identify potential individual failures.
  2. Identify the consequences of those failures.
  3. Identify how the individual failures can arise.
  4. Identify the corrective action.
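
As a minimal sketch of the four steps recorded as a worksheet row, in Python; the example failure mode and its entries are hypothetical:

  # Sketch: a failure-potential worksheet row following the four steps above.
  # The example failure mode and its entries are hypothetical.
  from dataclasses import dataclass

  @dataclass
  class FailureMode:
      failure: str            # step 1: the potential individual failure
      consequences: str       # step 2: what happens if it occurs
      causes: list[str]       # step 3: how the failure can arise
      corrective_action: str  # step 4: what is done to prevent or catch it

  example = FailureMode(
      failure="Specimen container mislabelled at accessioning",
      consequences="Diagnosis attributed to the wrong patient",
      causes=["Two requisitions open at once", "Manual transcription of the label"],
      corrective_action="Barcode labelling and one-case-at-a-time accessioning",
  )

  print(example.failure, "->", example.corrective_action)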

Biopsy size

Very small tissue fragments are associated with a decreased diagnostic yield and an increased diagnostic uncertainty.

Quality standards organizations

There are a large number of organizations that have written standards for quality in laboratory medicine.

International

International Organization for Standardization

  • Abbreviated ISO.

Standard:

  • ISO 15189:2007.[14]
    • Published in 2007. Supersedes a standard published in 2003.

Note:

  • Unfortunately one has to shell out money to get a peek at 'em.

United States of America

Clinical Laboratory Improvement Amendments

  • Abbreviated CLIA.
  • Published a multitude of standards & guidelines.[15]

College of American Pathologists

  • Performs laboratory accreditation.[16]

Canada

Canadian immunohistochemistry quality control

Ontario

United Kingdom

  • National Pathology Benchmarking Service (NPBS).[17]

References

  1. URL: http://www.attorneygeneral.jus.gov.on.ca/inquiries/goudge/index.html. Accessed on: 1 March 2011.
  2. Judicial inquiry probes faulty breast cancer tests. CBC website. URL: http://www.cbc.ca/news/background/cancer/inquiry.html. Accessed on: 30 January 2012.
  3. URL: http://www.qmpls.org/LaboratoryAccreditation/OLAActivitiesEducationalTools/OLAPresentations/tabid/111/id/11/Default.aspx. Accessed on: 18 April 2012.
  4. Renshaw, AA. (Mar 2001). "Measuring and reporting errors in surgical pathology. Lessons from gynecologic cytology.". Am J Clin Pathol 115 (3): 338-41. doi:10.1309/M2XP-3YJA-V6E2-QD9P. PMID 11242788.
  5. Fabbretti, G. (Jun 2010). "Risk management: correct patient and specimen identification in a surgical pathology laboratory. The experience of Infermi Hospital, Rimini, Italy.". Pathologica 102 (3): 96-101. PMID 21171512.
  6. Kahneman D. ["Als wären wir gespalten": Der Psychologe und Nobelpreisträger Daniel Kahneman über die angeborenen Schwächen des Denkens, trügerische Erinnerungen und die irreführende Macht der Intuition]. Der Spiegel. Nr. 21. 2012. URL: http://www.spiegel.de/spiegel/print/index-2012-21.html.
  7. 7.0 7.1 Nakhleh, RE. (Nov 2009). "Core components of a comprehensive quality assurance program in anatomic pathology.". Adv Anat Pathol 16 (6): 418-23. doi:10.1097/PAP.0b013e3181bb6bf7. PMID 19851132.
  8. Torlakovic, EE.; Riddell, R.; Banerjee, D.; El-Zimaity, H.; Pilavdzic, D.; Dawe, P.; Magliocco, A.; Barnes, P. et al. (Mar 2010). "Canadian Association of Pathologists-Association canadienne des pathologistes National Standards Committee/Immunohistochemistry: best practice recommendations for standardization of immunohistochemistry tests.". Am J Clin Pathol 133 (3): 354-65. doi:10.1309/AJCPDYZ1XMF4HJWK. PMID 20154273.
  9. Balachandran, N.; Oba, DE.; Hutt-Fletcher, LM. (Apr 1987). "Antigenic cross-reactions among herpes simplex virus types 1 and 2, Epstein-Barr virus, and cytomegalovirus.". J Virol 61 (4): 1125-35. PMC 254073. PMID 3029407. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC254073/.
  10. URL: http://www.cms.gov/clia/. Accessed on: 1 April 2012.
  11. URL: http://home.ccr.cancer.gov/lop/intranet/policymanual/generalpolicy/CAPCLIA.asp. Accessed on: 1 April 2012.
  12. URL: http://cap-acp.org/guide_retention-human-biologic-material.cfm. Accessed on: 6 May 2012.
  13. Ullman, David G. (1997). The mechanical design process. Toronto: McGraw-Hill Companies Inc.. ISBN 0-07-065756-4.
  14. URL: http://www.iso.org/iso/iso_catalogue/catalogue_ics/catalogue_detail_ics.htm?csnumber=42641. Accessed on: 18 April 2012.
  15. URL: http://www.cms.hhs.gov/Regulations-and-Guidance/Legislation/CLIA/index.html?redirect=/clia/. Accessed on: 18 April 2012.
  16. URL: http://www.cap.org/apps/cap.portal?_nfpb=true&cntvwrPtlt_actionOverride=%2Fportlets%2FcontentViewer%2Fshow&_windowLabel=cntvwrPtlt&cntvwrPtlt{actionForm.contentReference}=laboratory_accreditation%2Faboutlap.html&_state=maximized&_pageLabel=cntvwr. Accessed on: 18 April 2012.
  17. URL: http://www.keele.ac.uk/pharmacy/general/npbs/. Accessed on: 18 April 2012.
