Quality

From Libre Pathology
Revision as of 14:40, 1 April 2012 by Michael

Quality, in pathology, has received considerable attention lately because of high-profile failures that led to significant harm.[1][2]

General

The keys to quality are:

  1. Understanding the needs of the stakeholders (surgeons, oncologists, patients, other pathologists, the public at large).
  2. Understanding the processes.
  3. Developing measures of quality.
  4. Tracking the measures of quality & assessing their validity.
  5. Understanding the causes of failures/adverse events in the context of the processes.
  6. Continually doing all of the above with the aim of improving outcomes.

Analysis

Overview

Quality issues can be examined in a number of different ways.

Finding a problem:

  • Root cause analysis.

Anticipating problems:

  • Failure mode and effects analysis (FMEA).

General error analysis

Pathology errors can happen at any point from when the laboratory receives the specimen until after the report is issued.

When errors happen:

  • Work-up the problem.
    • Where did the error occur? Pathologist error?
  • Talk to the clinician.
    • If it is a critical diagnosis, contact the most-responsible physician immediately. If they are unreachable, call the physician on-call for the most-responsible physician. If the patient is out-of-town, it may be necessary to coordinate with the local emergency department.
  • Talk to the chief of pathology.
  • Incident report.
  • Reconstruct error.
    • Was it a specimen mix-up?
      • Is there another error?
  • Amend the report(s).
  • Remedy the source of error.

The classic structural breakdown

A classic structural breakdown for error analysis is:

Errors in pathology are divided into:

  • Pre-analytical errors.
  • Analytical errors.
  • Post-analytical errors.

Note:

  • This breakdown is arbitrary and, in and of itself, most useful for answering exam questions.
  • In a practical context, it is a framework for classifying errors. It is not useful for understanding the source of an error or addressing it.

Pre-analytic errors

  • Container mix-up - pre-lab & in-lab.
  • Block mix-up.
  • Slide mix-up - labels wrong.
  • Poor quality slides (fixation, processing, staining).
  • Lost specimen - can be potentially anywhere in the process.

Analytic errors

  • Interpretation wrong.
    • Factors:
      • Difficult case.
      • Technical factors (quality of slides).
      • Lack of clinical history.

Post-analytic errors

  • Wrong case signed-out.
  • Filing problem/lost report.
  • Interpretation of report problem (poorly written report, misinterpretation).

Sources of error

  • "Human error".
    • Training.
    • Work flow.
  • Process gaps.
    • Process control.
    • Lack of redundancy.

Types of errors

Errors can be subdivided into the following groups:[3]

  • False-negative - missed diagnosis.
  • False-positive - a diagnosis made that, on review, is considered not to be present.
  • Threshold - difference of opinion regarding a diagnostic threshold.
  • Type and grade.
  • Missed margin.
  • Other.

Grading of errors

Errors may be subdivided into three grades:

  • Grade 1: no consequence.
  • Grade 2: possible consequence.
  • Grade 3: definite consequence.

Error reduction

Various strategies can be employed:[4]

  • Training of staff - on error handling.
  • Computer order entry.
    • Avoid duplication fatigue.
    • Quick correlation with several identifying features.
      • Full name, sex, date of birth -- these all appear when one opens a case.
  • Barcode use.
    • Avoid transcription errors.
  • Clinical information entry required.
    • Allow correlation with test.
      • The interpretation may differ if the history says "screening colonoscopy" versus "large cecal mass, anemia and weight loss" versus "breast cancer".

Other strategies:

  • Statistical process control.
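As an illustration of statistical process control, the sketch below applies a simple p-chart to a hypothetical monthly amended-report rate (all counts are made up); months whose rate falls outside the 3-sigma control limits are flagged for investigation.

```python
# Hypothetical sketch: a p-chart for a lab's monthly amended-report rate.
# All sample sizes and counts are invented for illustration.

cases_per_month = [410, 395, 420, 401, 388, 415]   # cases signed out
amended_per_month = [3, 5, 2, 4, 12, 3]            # amended reports

n_bar = sum(cases_per_month) / len(cases_per_month)
p_bar = sum(amended_per_month) / sum(cases_per_month)  # overall amended rate

# 3-sigma control limits for a proportion (binomial approximation).
sigma = (p_bar * (1 - p_bar) / n_bar) ** 0.5
ucl = p_bar + 3 * sigma
lcl = max(0.0, p_bar - 3 * sigma)

for cases, amended in zip(cases_per_month, amended_per_month):
    rate = amended / cases
    flag = "OUT OF CONTROL" if (rate > ucl or rate < lcl) else "ok"
    print(f"rate={rate:.4f}  {flag}")
```

A point outside the limits does not by itself identify the cause; it only signals that the process has likely shifted and warrants a work-up.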

Measures of quality

Any number of parameters can be used to measure quality. When, where and how often something is measured depends on the value added.

General measures of quality

There are really only two:

  1. Timeliness, i.e. turn-around time (TAT).
  2. Error rate.

Note:

  • 1 and 2 can be examined/quantified in any number of ways.
  • Error, in the context of a measurement, has to be defined.
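As one way of quantifying the two general measures, the sketch below computes a median turn-around time and a crude error rate from hypothetical case records; the field names (`received`, `reported`, `amended`) and the use of amended reports as an error proxy are assumptions for illustration.

```python
# Hypothetical sketch: TAT and error rate from made-up case records.
from datetime import datetime
from statistics import median

cases = [
    {"received": "2012-03-01 09:00", "reported": "2012-03-02 15:00", "amended": False},
    {"received": "2012-03-01 10:30", "reported": "2012-03-05 11:00", "amended": True},
    {"received": "2012-03-02 08:15", "reported": "2012-03-03 16:45", "amended": False},
]

fmt = "%Y-%m-%d %H:%M"
tats_hours = [
    (datetime.strptime(c["reported"], fmt)
     - datetime.strptime(c["received"], fmt)).total_seconds() / 3600
    for c in cases
]

median_tat = median(tats_hours)  # hours from specimen receipt to report
# "Error" must be defined; here, crudely, any amended report counts as one.
error_rate = sum(c["amended"] for c in cases) / len(cases)

print(f"median TAT: {median_tat:.1f} h, error rate: {error_rate:.1%}")
```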

Smaller categories

Smaller categories - errors:[5]

  • Preanalytic: specimen identification & transport.
  • Preanalytic/analytic: tissue processing, e.g. fixation, blocking, embedding, sectioning, staining.
  • Analytic: interpretation.
  • Postanalytic: reporting/report integrity.

Individual measures

Specific measures:[5]

  • Preanalytic:
    • Identification - numbers match requisition.
    • Appropriate container.
  • Analytic:
    • Mislabeling.
    • Interpretation errors - based on:
      • Internal review.
        • Cytology-histology correlation.
        • Biopsy-resection correlation.
        • Frozen section-permanent section correlation.
        • Internal comparisons, e.g. ASCUS/LSIL between pathologists.
      • External review.
        • External standards/expected rate.
    • Amended reports - captures several of the above.
  • Postanalytic:
    • Completeness of report.
    • Critical diagnosis timely?
    • Report delivered to appropriate person?
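For internal comparisons between pathologists, a common statistic is Cohen's kappa, which corrects raw agreement for chance. A minimal sketch with invented ASCUS/LSIL calls on the same set of slides:

```python
# Hypothetical sketch: Cohen's kappa for two pathologists reading the same
# slides. The diagnoses below are made up for illustration.
from collections import Counter

rater_a = ["ASCUS", "LSIL", "ASCUS", "ASCUS", "LSIL", "LSIL", "ASCUS", "LSIL"]
rater_b = ["ASCUS", "LSIL", "LSIL",  "ASCUS", "LSIL", "ASCUS", "ASCUS", "LSIL"]

n = len(rater_a)
p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement: product of each rater's marginal frequency, per category.
count_a, count_b = Counter(rater_a), Counter(rater_b)
p_chance = sum((count_a[c] / n) * (count_b[c] / n) for c in count_a | count_b)

kappa = (p_observed - p_chance) / (1 - p_chance)
print(f"observed agreement {p_observed:.2f}, kappa {kappa:.2f}")
```

Kappa near 1 indicates strong agreement beyond chance; values near 0 suggest the two pathologists apply the diagnostic threshold differently.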

Types of quality measures

Benchmarking

  • An external quality standard.

Immunohistochemistry

Classification of IHC tests

IHC tests are classified in a paper by Torlakovic et al.:[6]

  • Class I:
    • Results used by pathologists.
    • Adjunct to histomorphology.
    • Examples: CD45, S-100.
  • Class II:
    • Used by clinicians for treatment decisions.
    • Considered independent of the other information in the pathology report; thus, cannot be derived from other information in the report.
    • Examples: ER, PR, HER2, Ki-67, CD117, CD20.

The implications of irregularities in the different classes are different. Problems in Class II tests are potentially more severe, as there is no internal control.

Work-up of suspected IHC problems

  • Review controls (internal and external).
    • Isolated to case vs. larger problem?
      • Discuss with lab/make other pathologists aware of the issue.
  • Repeat test - to identify the cause.

IHC process:

  1. Ischemia time - warm ischemia, preparation of specimen.
  2. Fixation - under, over, defective fixative, not enough fixative.
  3. Processing prior to antibody binding, usu. heating (antigen retrieval).
  4. Antibody-antigen binding.
  5. Reporter molecule binding.
  6. Counterstaining.
  7. Interpretation problem.
    • Known/expected epitope cross-reactions, e.g. CMV & HSV.[7]
    • Unknown/unexpected epitope cross-reactions.

Notes:

  • Problems can arise at any step.

Other

Failure-potential analysis

Adapted from Ullman:[8]

  1. Identify potential individual failures.
  2. Identify the consequences of those failures.
  3. Identify how the individual failures can arise.
  4. Identify the corrective action.
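A common FMEA-style elaboration of the four steps above (not part of Ullman's list) scores each failure mode for severity, occurrence and detectability, and ranks them by the product, the risk priority number (RPN). A minimal sketch with invented failure modes and scores:

```python
# Hypothetical sketch: ranking failure modes by risk priority number (RPN).
# Failure modes and 1-10 scores below are invented for illustration.

failure_modes = [
    # (failure, consequence, cause, corrective action, severity, occurrence, detection)
    ("container mix-up", "wrong-patient diagnosis", "manual labeling", "barcoding", 9, 3, 5),
    ("under-fixation", "weak IHC staining", "large specimen not opened", "grossing protocol", 5, 4, 3),
    ("lost specimen", "no diagnosis possible", "transport gap", "chain-of-custody log", 8, 2, 7),
]

# RPN = severity * occurrence * detection; higher RPN = address first.
ranked = sorted(failure_modes, key=lambda m: m[4] * m[5] * m[6], reverse=True)
for failure, consequence, cause, action, s, o, d in ranked:
    print(f"RPN {s * o * d:3d}  {failure}: {consequence} (cause: {cause}) -> {action}")
```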

Biopsy size

Very small tissue fragments are associated with a decreased diagnostic yield and an increased diagnostic uncertainty.

References

  1. URL: http://www.attorneygeneral.jus.gov.on.ca/inquiries/goudge/index.html. Accessed on: 1 March 2011.
  2. Judicial inquiry probes faulty breast cancer tests. CBC website. URL: http://www.cbc.ca/news/background/cancer/inquiry.html. Accessed on: 30 January 2012.
  3. Renshaw, AA. (Mar 2001). "Measuring and reporting errors in surgical pathology. Lessons from gynecologic cytology.". Am J Clin Pathol 115 (3): 338-41. doi:10.1309/M2XP-3YJA-V6E2-QD9P. PMID 11242788.
  4. Fabbretti, G. (Jun 2010). "Risk management: correct patient and specimen identification in a surgical pathology laboratory. The experience of Infermi Hospital, Rimini, Italy.". Pathologica 102 (3): 96-101. PMID 21171512.
  5. Nakhleh, RE. (Nov 2009). "Core components of a comprehensive quality assurance program in anatomic pathology.". Adv Anat Pathol 16 (6): 418-23. doi:10.1097/PAP.0b013e3181bb6bf7. PMID 19851132.
  6. Torlakovic, EE.; Riddell, R.; Banerjee, D.; El-Zimaity, H.; Pilavdzic, D.; Dawe, P.; Magliocco, A.; Barnes, P. et al. (Mar 2010). "Canadian Association of Pathologists-Association canadienne des pathologistes National Standards Committee/Immunohistochemistry: best practice recommendations for standardization of immunohistochemistry tests.". Am J Clin Pathol 133 (3): 354-65. doi:10.1309/AJCPDYZ1XMF4HJWK. PMID 20154273.
  7. Balachandran, N.; Oba, DE.; Hutt-Fletcher, LM. (Apr 1987). "Antigenic cross-reactions among herpes simplex virus types 1 and 2, Epstein-Barr virus, and cytomegalovirus.". J Virol 61 (4): 1125-35. PMC 254073. PMID 3029407. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC254073/.
  8. Ullman, David G. (1997). The mechanical design process. Toronto: McGraw-Hill Companies Inc.. ISBN 0-07-065756-4.
