

Laboratory Quality Assurance



Philosophy and design

Simply put, the product or raison d’être of clinical laboratory practice is the acquisition and conversion of analytic data to information that clients (most often physicians) can use in contributing to the effective and efficient management of patient care. “Quality Assurance” is a broad term typically applied to the system of processes, procedures and programs organized to provide a means of monitoring, evaluating and improving that product.

The DBQ Pathology Associates Quality Assurance Program can be functionally defined through a description of operational directives detailed either in the analytical procedures themselves or in the dedicated operational documents listed below:

Backup; Analytical Systems

Calibration Verification; CLIA Specifications

Clinical Laboratory Testing Manual (CLTM)

Competency Assurance

Critical Value Chart & Procedure

Important Called Results (ICR)

Incidence Recording

Information Technology Problem Report (ITPR)

Inoperable or “down” Test Systems

Instrument Problem Reports (IPR)

Linearity Checks

Maintenance, Function Verification Protocol & Logs

Method Matching and Monitoring; Intersite

Method Verification Reports (MVR)

Micropipettor Technique

Outdated Materials, Use of

Package Insert File

Patient Parallel Testing

Performance Improvement Plan

Phoned Results Protocol

Precision Checks

Procedure Review Program

Product Notices and Safety Recalls

Proficiency Testing

Quality Control; General

Quality Control; defining control action limits

Quality Control; Equivalent QC with Unit Dose Testing

Quality Control; Hematology, “City Control”: Preparation, Application and Management

Quality Control; Histology/Cytology

Quality Control; Microbiology

Reagent Labeling and Storage

Reagent Overlap

Reference Range Evaluation

Reportable Result (RR Breach)

Reported Erroneous Results Log & Procedure (RERL)

Sample and Control Dilution Protocol

Special Maintenance Reports (SMR)

Specimen Handling: Identification, Integrity & Rejection

Specimen Integrity

Temperature Monitoring

Unit-dose or Unit-use testing

Unlikely Results

Quality Control; General

Quality Control can be conceptually, if not practically, segmented from Quality Assurance. For our purposes it can be thought of as that component of Quality Assurance which concerns itself directly and immediately with the analytical process. Extended discussions attempting to dissect the difference between Quality Assurance and Quality Control are, however, largely low-level cerebral exercises not particularly beneficial in terms of the effective implementation of either.

A more fruitful discussion might center on two common, misguided notions that actually do affect the application of Quality Control:

Because it is considerably easier to transcribe data than it is to think about it, and because the former is frequently substituted for the latter, it is necessary to emphasize that, while one of its fundamental aspects is indeed clerical, the purpose of Quality Control is not the transcription itself.

Neither is it the principal object of Quality Control to demonstrate how well things are going but, more nearly the opposite, to highlight anomaly and draw attention to any error that might be occurring. The focus of a well-staged program will be on early problem detection, definition and resolution; it should result ultimately in the systematic reduction of failure rate.

Quality Control; purpose/design:

The Quality Control program of DBQ Pathology Associates provides a means of detecting variance from expected analytical behavior and, if variance is detected, furnishes a straightforward framework or clear path of action which will:

  1. enable Testing Personnel to determine if corrective action, either immediate or deferred, is warranted,
  2. provide supporting documentation which will lend direction in terms of problem-solving if required, and
  3. neither delay the release of clinically correct results nor allow the release of clinically erroneous results.

Control Material:

Endogenous or “internal” controls are the best means of routine quality assessment on unit-dose testing.
Endogenous controls are built into the assay method itself and they provide an ostensible means of assessing integrity of reagent and proper execution of the procedure with each and every analytical trial. When available, endogenous controls will be formally integrated into testing verification and documentation.

Our Rapid Strep A procedure with custom log is a good example of effective endogenous control application.

[See excerpt from Strep A screen procedure, section X and its log template.]

     
Unit-dose or Unit-use testing:

Advances in biochemistry, microelectronics and computing have led to the development of many “Unit-dose” or “Unit-use” clinical laboratory methodologies.
Common to each is that different phases of the measurement process, e.g., matrix separation, reagent addition, incubation, tag, signal generation, etc. are all built into a discrete testing module. Design of these modules tends to be intricate, fairly clever and generally results in a one test – one trial, self-contained analytical system.

Advantages of this engineering genre are that it tends to be relatively portable and, even more importantly, it substantially insulates the operator from typical variables so that the expertise which has to be brought to bear in order to get consistent results is significantly reduced. (Unfortunately, along with the decreased technical expertise also comes a lack of training and discipline in terms of managing the data.)

Additional disadvantages need to be considered when administering this type of testing, including much higher disposable costs per reportable result, typically underestimated training and ongoing competency review costs, and batch or “run” control logistics. Exogenous controls cannot be included in a run to directly prove the integrity of the batch because every test is isolated from every other test; every test is a run of one. Traditional quality control mechanisms are simply not feasible. Hence the evolution of the notion and practice of “Equivalent QC”.
Although cost and control can be problematic, there are a number of applications where Unit-use testing may be the best fit: “bedside” applications where turnaround is critical to patient care; remote screening applications where trained laboratorians are not on board or available, as with “walk-in” cholesterol, strep or Hgb testing at some physician offices; and some very low volume applications where larger, multi-analyte instruments are not feasible.

Exogenous controls:

Traditional, bilevel control material with a substantial delta will be assayed when available and when appropriate. QC testing is routinely rotated among the personnel who perform the test. (See next paragraph for frequency.)
Physical properties of control material such as viscosity, optical effect, dissolved and formed elements, etc. will, where technologically feasible, mimic or parallel the matrix and content of the specimen types typically analyzed. Limit numbers (or target values when the results being generated are non-numerical) will be clearly defined along with explicit instructions for Testing Personnel to follow when a Q.C. attempt “fails” or falls outside of what has been determined to be acceptable.

Routine frequency: Unless described differently in the individual analytical procedure, bilevel, exogenous or “external” controls will be run on the default schedule; i.e.:
  once every 24 hours on any day the test is performed
  when there is a reagent lot change
  immediately following calibration
  when stipulated in specific maintenance protocol.
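The default schedule above can be sketched as a simple decision function (the function and parameter names are illustrative, not taken from any LIS):

```python
from datetime import datetime, timedelta

def external_qc_due(last_qc: datetime, now: datetime,
                    reagent_lot_changed: bool = False,
                    just_calibrated: bool = False,
                    maintenance_requires_qc: bool = False) -> bool:
    """Return True if bilevel external controls are due before patient
    testing: once every 24 hours on any day the test is performed, on a
    reagent lot change, immediately following calibration, or when
    stipulated in specific maintenance protocol."""
    if reagent_lot_changed or just_calibrated or maintenance_requires_qc:
        return True
    return (now - last_qc) >= timedelta(hours=24)
```

Any variance from this default described in an individual analytical procedure would, of course, override such a rule.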

The process of generating a series of replicate analyses on this traditional, exogenous control material can play a significant role in monitoring the “correctness” of an assay; it does not, however, directly address the issue of accuracy. The statistical leverage provided by the effort is a numerical representation of repeatability or consistency; i.e., precision, and as such it can be an effective indicator of deviation from performance history.

An assay can of course be historically incorrect, consistently wrong or precisely inaccurate.
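The "numerical representation of repeatability" referred to above is commonly summarized as the coefficient of variation (%CV): the sample standard deviation expressed as a percentage of the mean. A minimal sketch:

```python
import statistics

def percent_cv(trials: list[float]) -> float:
    """Coefficient of variation: sample SD as a percentage of the mean,
    the usual single-number index of repeatability (precision)."""
    return 100.0 * statistics.stdev(trials) / statistics.mean(trials)
```

A falling %CV across replicate control trials indicates improving repeatability; as noted above, it says nothing about accuracy.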

Defining Control Action Limits

Action limits for quality control material are routinely based on two elements: statistical and cognitive. The statistical base is derived from locally established system performance of typically no fewer than 60 consecutive trials over at least one and a half months after baseline studies and initially set at +/- 3SD. This statistical range is then evaluated and possibly adjusted to strike a balance between two basic dynamics - annoyance alerts and the clinical ramifications of being wrong. Any variance from this general approach is detailed in individual analytical or operational procedure.
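Treating control results as a simple list of numeric trials, the statistical base described above amounts to the following (a sketch only; the subsequent cognitive adjustment for annoyance alerts versus clinical ramifications is a human judgment and is not modeled here):

```python
import statistics

def action_limits(trials: list[float], k: float = 3.0) -> tuple[float, float]:
    """Derive initial statistical action limits from locally established
    performance: no fewer than 60 consecutive control trials, with limits
    initially set at mean +/- 3 SD (k = 3). The resulting range is then
    reviewed and may be tightened or widened before being posted."""
    if len(trials) < 60:
        raise ValueError("need at least 60 consecutive trials")
    mean = statistics.mean(trials)
    sd = statistics.stdev(trials)  # sample standard deviation
    return (mean - k * sd, mean + k * sd)
```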

Many, but certainly not all, of our automated systems now produce analytical precision which is significantly beyond what is required or even helpful clinically. A simple reagent lot change can generate a minor, uncompensated shift in a control mean that will easily increase the incidence of breach in the traditional 2SD limit while not producing an “error” that translates to anything like clinical significance. On the other hand, there are some testing systems that still struggle to generate the kind of precision or stability required by clinical decisions potentially triggered by the results they produce. Ca++ at <3%, very low end troponins and ACTs come to mind. In these cases, limits may need to be tightened from the statistical margin as well as other precautions taken to ensure early warning.

Prior to releasing new control lots, means for each constituent at each level are derived from trials overlapped as unknowns against active control material. The number of trials necessary to establish reliable target means is directly dependent on the inherent precision of the analytical process; i.e., good precision requires fewer trials and vice versa. (A substantially abbreviated overlap is run on hematology controls due to the short shelf life of the material.)

Once defined, action limits as described above are stable and transfer from lot to lot of control typically over very long periods of time, often the life of the instrument, provided manufacture of the control product remains unchanged. Constituent means will vary a bit but the action limits or allowable ranges about constituent means stay the same.

Equivalent Quality Control with Unit Dose Testing

In response to the development and widespread deployment of Unit Dose (single trial) analytical testing and the material costs associated with each test trial, the Centers for Medicare & Medicaid Services (CMS) has modified regulation to allow decreased frequency of traditional control testing under specific conditions. This relaxation of rules with respect to control testing frequency is referred to as “Equivalent Quality Control Procedures” (EQC).

As described earlier, Unit Dose testing typically has a control mechanism designed into, or integral to, the testing unit; i.e., the cartridge, testpack or card. There are a variety of “flavors” or design applications for resident control processes, and the frequency of external control testing under the EQC rules is determined in large part by the adequacy or completeness with which a particular built-in control process challenges its testing system.

There are essentially three levels of EQC application with respect to external control testing frequency:
  Complete (First Level EQC): bi-level material assayed once per month on Unit Dose systems that challenge their complete analytic process.
  Partial (Second Level EQC): bi-level material assayed once per week on Unit Dose systems that challenge their analytic process only partially.
  None: bi-level material assayed once per week on Unit Dose systems that do not have internal control mechanism resident on the testpack but have demonstrated acceptable stability (as defined by the manufacturer and accepted by the Technical Director) over a time period of at least 60 consecutive days.
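The three levels reduce to a small lookup; the mapping below simply restates the frequencies listed above (the dictionary and key names are illustrative):

```python
# External bilevel control frequency by EQC designation, per the
# three levels described above (names assumed, not from any LIS).
EQC_EXTERNAL_QC_FREQUENCY = {
    "complete": "once per month",  # internal control challenges the complete analytic process
    "partial": "once per week",    # internal control challenges the process only partially
    "none": "once per week",       # no resident internal control; >= 60-day stability shown
}
```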

As with routine QC, if either trial from the bi-level control pair falls outside acceptable limits as defined in the written procedure, the variant control is repeated once only on a new testing unit. If the repeat attempt is successful, patient specimen testing may proceed. If the repeat trial is unacceptable, patient testing on the system is suspended, patient test results are not reported, an entry is made in the Incidence Log and the supervisor is informed.

Note: “Successful” or not, all repeat control trials along with the initial results – both levels – are recorded in the appropriate Incidence Log for pattern review by the Dept/Site manager.

As new Unit Dose testing comes on board, an assessment of which EQC level applies, if any, will be made. The EQC designation and the rationale for it will be stated specifically in the written procedure.

Baseline function checks on this testing modality will include but not necessarily be limited to precision and stability evaluation. These initial QC challenges will be made up of external, bi-level control assays over a minimum of 10, 30 or 60 consecutive testing days depending on the thoroughness with which the internal control mechanism monitors the analytic components of the system; i.e., Complete, Partial or None.

Raw data and summary of this baseline function check along with any pertinent comment, will be recorded in the MVR (Method Verification Report) format, reviewed by the Technical Director and then scanned and indexed for rapid retrieval.

Frequency reduction in external control iteration afforded by EQC status for any test is automatically suspended if a failed control attempt does not recover on initial repeat. Once the problem has been defined, remediation applied and baseline function checks re-established, the EQC status may be reinstated.
The Technical Director may suspend EQC status for any reason:

“Since the purpose of control testing is to detect immediate errors and monitor performance over time, increasing the interval between control testing (i.e., weekly, or monthly) will require more extensive evaluation of patient test results when a control failure occurs. The director must consider the laboratory's clinical and legal responsibility for providing accurate and reliable patient results versus the cost implications of reducing the quality control testing frequency.”
CLIA; Interpretive guidelines for Laboratories and Laboratory Services. D5445, 20040527


Control “out”; response:

Control performance tolerances or control limits are positioned to trigger early recognition of possible error before it translates to clinically significant effect. (See Defining Control Action Limits)

Single constituent (test) control failure:

Routinely, if a control trial is aberrant and the “problem” isn’t readily apparent, Testing Personnel are instructed to, before doing anything else, check the Incidence Log for evidence of increased imprecision in the recent performance history of the test in question. Early recognition of imprecision can often escape notice if it depends on isolated control trials.

If there is evidence of increased imprecision, the department or site Supervisor must be consulted in order to coordinate considerations concerning degree of potential error and determine whether trouble-shooting needs to take place immediately or if it may be deferred somewhat.

Should the Incidence Log provide no evidence of current increased imprecision, the tech will attempt to change nothing with respect to the analytical system and repeat the failed control one time only. If the repeat trial generates results that are within stated tolerances, patient results associated with that control may be released. If the repeat control trial again fails to fall within defined tolerances, testing results of “unknowns” are recorded but not reported and the analytical procedure in question is suspended until the on-site Supervisor is informed and/or the Pathology Associates Technical Director or designee orchestrates investigation and resolution of the problem.
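The repeat-once decision path in the paragraph above can be sketched as follows (the function name and return strings are illustrative, not drawn from any procedure document):

```python
def handle_control_failure(repeat_result: float,
                           low_limit: float, high_limit: float) -> str:
    """Given an aberrant control with no obvious cause and no recent
    imprecision in the Incidence Log: change nothing, repeat the failed
    control once only, then act on the repeat result."""
    if low_limit <= repeat_result <= high_limit:
        # Repeat within tolerance: associated patient results may be released.
        return "release patient results; document both control trials"
    # Repeat also out: record but do not report unknowns; suspend the assay.
    return "suspend testing; withhold results; notify Supervisor/Technical Director"
```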

On the other hand, if the “problem” is obvious, fix it; rerun the control to verify the solution and make a succinct entry in the Incidence Log.

Both initial and repeat control results are documented in the appropriate log. If the initial response to control trials that are “out” appears to resolve the issue, observations and supporting data are recorded in the Incidence Log. Observations and supporting data relevant to extended problem solving are further documented in either Special Maintenance or Method Verification reporting format depending upon the path of investigation.

Any authorized variation in this initial response to aberrant control trials is explicitly detailed in the individual written procedures that have been sanctioned for use on-site. Initial response to hematology controls that are “out” is detailed in the Beckman Coulter LH750 Operate and Beckman Coulter LH500 Operate procedures.

Batch constituents (tests) control failure:

On routine batch control runs in which three or more trials on multiple constituents at any level are outside of their limits, cease testing on the analyzer. Consider the system “down”. Do not report patient results. Contact your site supervisor. The site supervisor will assess the situation, contact an Instrument Specialist for consultation and corroboration, notify a site pathologist and determine if Communication to Affected Services is required.

Accuracy:

Unless controls are going to be used as standards, questions about accuracy have to be resolved by some other route. The ability of an analytical method to recover the actual or “true” target can be assessed by direct assay of a standard when the analyte is mass-measured. Accuracy can also be evaluated indirectly, even for activity-measured analytes, by comparison to group specific means extracted from databases built on the results of proficiency testing challenges. The latter is our most common method for routinely proving accuracy for many tests.

Standards are also brought to bear in the routine calibration of mass-measured constituents and, occasionally, when trouble-shooting a test in which matrix effect of proficiency testing material may obfuscate the target.

Control trials; primary documentation:

Staff are instructed that it is absolutely essential to record the results of all analytical trials, whether the values are “in” or “out”, “acceptable” or “unacceptable”, “expected” or “unexpected”. Custom logs have been designed to receive this primary record chronologically and by method so that judgments regarding the validity of individual results can be made in the context of proximal analyses, both known and unknown.

Reagent and control lots are recorded at the bench when they are activated. Provision for documenting this information may be designed right into a Test Result Log if the procedure requires primary transcription of test results; it may appear on Levey-Jennings charts if they are being used to track test performance; or it may be recorded in a special, separate log or in the test Incidence Log. The nature of the specific analytical method will dictate the medium.


3/7/77, S. Raymond; Technical Director, DBQ Pathology Associates
revised 5/27, 7/19/83 S. Raymond, J. Miller
revised 4/20/95 S. Raymond; QA format
revised 19991108 S. Raymond; clarification of routine control frequency
revised 20040614 S.Raymond; EQC and Unit-dose or Unit-use defined
revised 20051111 S.Raymond; Defining Control Action Limits

Number Rounding:

Rounding of a numeric value is sometimes necessary in the laboratory to conform to the reporting format stated in the analytical procedure. Some laboratory analyzers display or print results to tenths, hundredths or thousandths and, in order to report these results, it may be necessary to round them to the reporting significant digit.

There are two basic rules to follow when rounding numbers:

  1. When the digit to the right of the significant digit is less than 5, always round down.
  2. When the digit to the right of the significant digit is greater than or equal to 5, always round up.
    Examples:
      1. INR: The analyzer reports an INR to the hundredth, however we report only to the tenth:
        2.75 is rounded to 2.8
        2.74 is rounded to 2.7
      2. APTT: The analyzer reports an APTT to the tenth, however we report as a whole number:
        25.5 is rounded to 26
        25.4 is rounded to 25

Refer to the individual analytical procedures to determine the significant digit.
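Note that the half-up rule above differs from the default in many programming environments, which round halves to the nearest even digit; if rounding is ever scripted rather than done at the bench, the rule must be made explicit. A sketch using Python's decimal module (an illustration only, not part of any analyzer or LIS software):

```python
from decimal import Decimal, ROUND_HALF_UP

def round_half_up(value: str, quantum: str) -> Decimal:
    """Round per the two rules above: a digit below 5 rounds down,
    5 or above rounds up. 'quantum' is the reporting precision,
    e.g. '0.1' for an INR or '1' for an APTT."""
    return Decimal(value).quantize(Decimal(quantum), rounding=ROUND_HALF_UP)
```

This reproduces the INR and APTT examples above: 2.75 rounds to 2.8, 2.74 to 2.7, 25.5 to 26 and 25.4 to 25.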

 


July 1994 S. Hosch
revised 20060911 M. English; I., II.2.Examples

Result(s) Review:

Results produced by the Laboratory are reviewed and checked for clerical errors, absurd or unlikely results and critical values prior to reporting.

  1. Results produced by the laboratory are reported via the CLICS information system. Technical staff performing tests review results for incongruent, absurd (unlikely) or critical values and, where manual result entry occurs, transcription errors before releasing the results, whether using the Interfaced Instrument Result Review/Release or Transcription Result Review/Release screens. The technical staff releasing the result(s) records his/her initials in the “Performed by” and/or “Review by” documentation area on the requisition.
  2. Supervisors are responsible for the designation of laboratory technical personnel qualified by training and/or job experience to prepare, perform and review/report analytical testing. A list, organized by laboratory department and by tests within that department, is maintained to document the qualification of technical staff. The list of qualified personnel is reviewed and signed by a supervisor and pathologist. See sample page attached to this policy.

 


January 1997 J. Mueller
revised July 2001 J. Mueller
revised 20060911 M. English; sections 1 & 2

Procedure Review

Standard format:

Acceptable laboratory procedures will be written, practiced, referred to and reviewed from the standard format or template that has been developed by Pathology Associates for clinical laboratories under their direction. This established standardization will accommodate site-specific application while maintaining an important system-specific consistency.

[Click here to launch the Analytical Procedure Format template. (Gray text specifies appropriate pre-defined text editor styles).]

Package inserts are not acceptable substitutes for our written procedures.

Our laboratory policies and procedures have been written to be used. All of them have been converted to electronic medium so that they can be readily standardized, accessed, reviewed, improved and distributed throughout the system. They are subjected to a rigorous, formalized, bilevel review process.

Interim review:

The Analytical Procedure Review Program of DBQ Pathology Associates is a process operating continuously on two levels: Comprehensive and Interim.

Interim Review occurs on an annual cycle (calendar year) and is carried out by either Managers, Site/Department Supervisors or technical staff designated by them. It can be thought of as a polishing phase for established analytical procedures that have been driven through the much more rigorous Comprehensive Review process at least once.

At the Interim Review level focus is on those things that might have drifted from the procedure as it was authorized following Comprehensive Review:

  1. Procedural steps should describe the analytical process as it is being executed in the department.
  2. Language defining result recording and reporting should continue to be an accurate description of the actual process.
  3. Reagents, controls and materials being used need to be correctly represented in the written text.
  4. Reference values, units of measure and critical values must match the CLTM (Clinical Laboratory Testing Manual) and Critical Value Chart.
  5. Links within the procedure html document (webpage) are in working order and display the intended document or location.

The Manager or Site/Dept. Supervisor will determine whether discrepancies discovered are minor in nature or whether they should trigger the Comprehensive Review process for the procedure in question. (The DBQ PA Technical Director will be available to resolve any doubt.) If, for example, the written procedure should be modified to reflect practice rather than the other way around or if language in the written procedure should be reworked to clarify steps that can potentially impact analytical outcome, then the review should be raised right up to the Comprehensive level.

Interim Review procedures that require minor changes are handled in the following way:

  1. No ad hoc minor changes will be made, i.e., any minor change will trigger the full Interim Review process.
  2. Minor changes are indicated on a copy of the current procedure.
  3. The copy of the current procedure, with changes indicated, is sent to the LIS Project Coordinator.
  4. The electronic copy is revised by the LIS Project Coordinator. The revision date will indicate when the procedure last went through the Comprehensive Review cycle. An Interim Review date, the name of the reviewer and any changes made will be noted below the signature section of the procedure to track the reviews that occur. When more than one Interim Review is performed on a procedure within a calendar year, only the latest review of that year will be recorded; up to three successive (annual) Interim Review dates will be listed below the signature section.
    1. The HTML document is re-translated to update the web page and the LIS Project Coordinator notifies the appropriate supervisors when the new version is available on the UCL web page via e-mail.

Comprehensive review:

The Comprehensive Review process is cyclic but doesn’t conform to a preset time interval; some procedures are reviewed and revised multiple times within a cycle while others, because of their straightforward, stable nature, may undergo scrutiny but once in the same cycle. A thumbnail sketch of the Comprehensive Review follows:

  1. A complete list of all policies and procedures is subdivided into groups and the groups are then assigned to Reviewers who will manage the process at its initial level and, in general, keep it moving from there.
  2. Reviewers for analytical procedures must come from the pool of Testing Personnel who are actually performing the test on the bench.
  3. The Reviewer reads the current procedure and performs the test exactly as written, carefully noting any differences that may have crept in between the written instruction and the way the test has actually come to be performed.
  4. Discrepancies are then evaluated and resolved; i.e., either the text of the procedure is modified to reflect the practice or related staff behavior is readjusted to match the written protocol. In this process of reconciliation, observations, ideas and rewrites may bounce back and forth between the Reviewer and the PA Technical Director before a final draft emerges. All significant changes are reviewed and sanctioned by the Technical Director.
  5. Remaining idiosyncrasies related to site-specific applications are ironed out between the laboratory Site Supervisor/Manager and the Technical Director. Philosophical pressure to establish and maintain uniformity between sites usually prevails; site-specific modifications are kept to a minimum.
  6. The copy of the current procedure, with changes indicated, is sent to the LIS Project Coordinator for revision of the electronic source document.
  7. The revised procedure is sent to a Laboratory Medical Director. The working copy of all analytical procedures is reviewed and signed by a Medical Director.
  8. The procedure is forwarded to the PA Technical Director for final review and signature.
  9. The signed copy is returned to the LIS Project Coordinator; hard copies are distributed to appropriate testing sites and the original of the signed procedure is filed. The LIS Project Coordinator maintains a distribution record of procedures.
  10. The HTML document is re-translated to update the web by the LIS Project Coordinator.
  11. Communications of policies and procedures to the laboratory staff proceeds by one of three methods:
      1. *In-house staff member’s individual mailbox.
      2. Posting of the policy or procedure.
      3. Laboratory meetings
        (*Principal method of communication: Each staff member is responsible for checking their mailbox each scheduled shift; they are responsible for all communications delivered to them via the mailbox.)
  12. Mailbox communications are kept in a monthly file by the laboratory supervisor. This file will serve as a reference source of written communications. There are times when the management will deem it necessary to have documentation that lab personnel received the written communication. Date and initial sheets serve as this documentation and are kept in the same monthly file.

The Procedure Manual TOC (table of contents) will appear at the beginning of each analytical procedure binder. Procedures will be grouped by Department and listed alphabetically. [See the following Procedure Manual TOC model.]


POCT procedure review:

The director named on the CLIA certificate or a qualified designee approves procedures before initial use of the test for patient testing, and then once every three years. (JCAHO Standard PC.16.40)


4/9/92 S. Raymond; Technical Director, DBQ Pathology Associates
revised: 11/2/93 S. Raymond
revised: 12/13/94 S. Raymond: Interim review, TOC
revised: 4/30/96 L. McGovern, S. Raymond: Interim review
revised: July 2002 M. English; S. Raymond: Interim & Comprehensive review, procedure format
revised: January 2007 L. McGovern: Added POCT procedure review
revised: September 2009 M. English, S. Raymond: Interim Review recording, procedure format, TOC, define annual review


Incidence Recording

The formal documentation of aberrance encountered in the course of performing clinical laboratory analyses is “Incidence Recording”.

Incidence Recording; purpose/design:

Because testing procedures are rotated from tech to tech with some frequency and because technologists are generally performing more than one procedure at any given time, problems that arise often appear isolated or spurious and many times go unattended.

As it turns out, most problems are not manifested by a single event but are accompanied by additional evidence that something is going, or has gone, awry.

The Incidence Recording System of DBQ Pathology Associates provides for the consolidation of apparently disparate events; the collective and chronological properties of the system can prompt early recognition of problems, accent their sometimes-chronic nature, and often lend direction in terms of solution.

What should be recorded:

Essentially, ANY unexpected, test-related abnormality observed by a technologist while performing an analysis should be documented in the Incidence Log; e.g.:

  • control performance outside of posted limits,
  • drift or shift within posted limits,
  • duplicate imprecision,
  • replicate imprecision,
  • overt reagent condition,
  • unusual pattern in patient results,
  • isolated, bizarre patient result,
  • any aberration in the test that is not obviously instrument related.

If a problem has already been noted by a Supervisor or the Technical Director (or designee) and formal measures toward a solution are underway, it may not be necessary to document additional observation. When in doubt, however, an entry must be made.

How to record:

  • Write a brief description of the abnormality. (Include control data if pertinent and, if it is, note performance history on both controls.)
  • Review recent Incidence Log entries for related history.
  • Comment briefly about possible cause(s). If the cause is unclear, simply state this fact.
  • Describe any action taken. If you repeat the analysis, specify any changes that may have been made.
  • Initial and date the entry.

An example of a reasonably good Incidence Log entry:

9/12/79: “Osmolalities: 750 control came in at 770 w/a posted mean of 747 and a range of 727 - 767. Opened fresh 750 and it came in at 732. Patient values reported. In 10 trials, we have crossed the mean on the upper control only once. The high control appears to have deteriorated.” MJB

“The problem is actually worse than it looks. It is affecting patient results. The Lo control is drifting below its target and the Hi control has deteriorated more than recent values indicate since the standard is undergoing parallel deterioration. We need to get both controls and the standards into much smaller bottles w/ better sealing caps.” SR

Review:

At some point, preferably early in the shift, Testing Personnel should skim through Work Area Incidence Logs, looking for recent entries in sections that cover analyses they are likely to perform on their “watch”. This alerts them to problems they might encounter and can save time while reducing confusion and frustration in the long run.

The appropriate section of the relevant Incidence Log very definitely needs to be reviewed by testing personnel in their initial reaction to aberrant control results.

Once per day, Department Managers need to go over recent entries in the Incidence Log(s) covering their Work Areas. They should be looking for patterns to the problems that are logged and they need to satisfy themselves that reasonable progress is being made towards resolution. Overview by Site Supervisors or Managers can occur at lesser frequency; e.g., weekly or monthly, but it is very important that management at this level maintain periodic and personal re-assurance that entries are being made properly and that chronic problems are not going unattended. Problem solving can be ramped up to the Technical Director or Assistant to the Technical Director at any time by the Site Supervisor or Manager.


S. Raymond; 12/2/79
revised: 12/12/88 S. Raymond, format
revised: 4/26/95 S. Raymond, review responsibilities


Temperature Monitoring

Temperature Monitoring; purpose/design:

There are a variety of specific storage or reaction thermal conditions that need to be created and sustained in the typical clinical laboratory setting. Tolerances or the amount of allowable deviation within these specific thermal conditions need to be defined and, in each case, procedures are needed which provide both a method for tracking the on-going effectiveness of each temperature-control system as well as directive for reacting to breaches in the defined tolerances.

Common refrigerators/freezers/incubators:

Standard laboratory refrigerators, freezers and incubators will be fitted with solid-state, LED min-max thermometers having remote probes (Sentry Hi/Lo®: 020-934). The display of these monitors shows the current temperature along with the lowest and highest readings captured since the unit was last reset; display measurements are updated every 10 seconds.

Probes of these monitors will be immersed in an aqueous solution of ethylene glycol and located within the refrigerator/freezer/incubator compartment, away from any circulating air fans. This arrangement provides measurements that are easy to read and considerably more representative of the temperature of the stored contents than the artifactual, transient swings in ambient air temperature that occur when the door is opened briefly to obtain or stock chilled contents.

Immersion Solution will typically be provided "ready to go" in an Erlenmeyer flask by the Materials Management department.

Preparation of the probe solution:

  1. Materials used:
      1. Erlenmeyer flask: Scientific Products - PN# F4253-50

      2. Bored rubber stopper; Scientific Products - PN# R5145-2

      3. Ethylene Glycol; CAS: 107-21-1

      4. Type I water

      5. Thymol crystals; CAS: 89-83-8

      6. Tire plugs: Western Auto

  2. Decant ~25 ml of Ethylene Glycol into a 50 ml Erlenmeyer flask; fill to the 50 ml mark with Type I water. Add two to three crystals of Thymol. Plug the flask with the provided stopper. Label it; insert the thermometer probe and close the opening using a tire plug. Place the flask in the refrigerator/freezer/incubator compartment. This solution is stable indefinitely and does not need to be changed unless it shows signs of microbial growth or has evaporated below 90% of its original volume.
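The arithmetic in the step above (a roughly 50% v/v glycol solution, replaced once evaporation passes the 90% mark) can be sketched as follows; the helper names are illustrative only and are not part of any laboratory system:

```python
# Probe immersion solution per the steps above: ~25 ml ethylene glycol
# brought to 50 ml with Type I water (roughly 50% v/v).
GLYCOL_ML = 25.0
FINAL_ML = 50.0

def glycol_fraction(glycol_ml: float, final_ml: float) -> float:
    """Volume fraction of ethylene glycol in the finished solution."""
    return glycol_ml / final_ml

def needs_replacement(current_ml: float, original_ml: float = FINAL_ML) -> bool:
    """True once the solution has evaporated below 90% of its original volume."""
    return current_ml < 0.90 * original_ml

print(glycol_fraction(GLYCOL_ML, FINAL_ML))  # 0.5
print(needs_replacement(44.0))               # True: below the 45 ml threshold
```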

Temperature monitor verification:

All temperature monitors must be verified against an N.B.S. standard thermometer over their range of use before being put into service and annually thereafter.

  1. Obtain the N.B.S. thermometer from the Instrument Specialist Department at the UCL Cathedral Square Site.
  2. Place the N.B.S. thermometer and the temperature monitors that are being verified into the same medium for 30-60 minutes.
  3. Record the temperatures of both the N.B.S. thermometer and the temperature monitors being verified on the Temperature Monitor Verification Log.
    [See template of Temperature Monitor Verification Log.]

  4. If the temperature monitor being verified has a temperature bias, indicate this on the temperature monitor label and on the Temperature Monitor Verification Log.
  5. Temperature monitors may vary from the N.B.S. thermometer by no more than ±1.5°C.
  6. All spirit glass thermometers should be checked at this time for clarity of numbers, chips, etc. and discarded if damaged.
  7. Label the temperature monitor. The label must follow this template:

General storage refrigerators and freezers:

  1. All refrigerators and freezers must be numbered or named in a manner that will clearly identify them on their Temperature Recording Log. Temperature tolerance limits will be posted on the Temperature Recording Log for each unit.
  2. All general storage refrigerators and freezers will be monitored using a minimum - maximum temperature monitor (obtained from the UCL Materials Management Supervisor).
  3. Temperature readings will be taken and recorded in the appropriate Temperature Recording Log once per day on each day of regularly scheduled work. The display will be reset right after the readings are documented by pressing the reset button below the display. [See Standard Temperature Recording Log.]

  4. Ranges for allowable temperature fluctuation will be clearly marked on each log. Deviation from nominal will be handled in the following manner:
      1. If the current and/or minimum and/or maximum temperature reading is ≤ 5°C outside of the posted limits, fill out an IPR and notify the Instrument Specialists.

      2. If the current and/or minimum and/or maximum temperature reading is > 5°C outside of the posted limits, fill out an IPR and contact the Technical Director, the Assistant to the Technical Director or the Pathologist on site before attempting to use reagent or control from the refrigerator/freezer.

  5. At the end of the month the Temperature Recording Logs must be placed on the Supervisor's desk for review.
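The two-tier escalation above can be expressed as a small decision helper. This is a hypothetical sketch (the function and message strings are not part of the procedure), with the escalation band parameterized since incubators and baths use a 2°C band rather than 5°C:

```python
def escalation_action(reading_c: float, lo_limit_c: float, hi_limit_c: float,
                      band_c: float = 5.0) -> str:
    """Classify a current/min/max reading against posted limits.

    band_c is the escalation band: 5 degrees C for general storage
    refrigerators/freezers, 2 degrees C for incubators and baths.
    """
    if lo_limit_c <= reading_c <= hi_limit_c:
        return "in range"
    # Distance outside the posted limits, on whichever side was breached.
    excursion = (lo_limit_c - reading_c) if reading_c < lo_limit_c \
        else (reading_c - hi_limit_c)
    if excursion <= band_c:
        return "fill out an IPR; notify the Instrument Specialists"
    return "fill out an IPR; contact the Technical Director or designee"

# A refrigerator posted at 2-8 C that reads 12 C is 4 C out: first tier.
print(escalation_action(12.0, 2.0, 8.0))
```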

Incubators:

  1. All incubators must be numbered or named in a manner that will clearly identify them on the Temperature Recording Log. Temperature tolerance limits will be posted on the Temperature Recording Log for each unit.
  2. All general storage incubators will be monitored using a minimum - maximum temperature monitor (obtained from the UCL Materials Management Supervisor).
  3. Temperature readings will be taken and recorded in the appropriate Temperature Recording Log once per day on each day of regularly scheduled work. The display will be reset right after the readings are documented by pressing the reset button below the display.
  4. Ranges for allowable temperature fluctuation will be clearly marked on each log. Deviation from nominal will be handled in the following manner:
      1. If the current and/or minimum and/or maximum temperature reading is ≤ 2°C outside of the posted limits, fill out an IPR and notify the Instrument Specialists.

      2. If the current and/or minimum and/or maximum temperature reading is > 2°C outside of the posted limits, fill out an IPR and contact the Technical Director, the Assistant to the Technical Director or the Pathologist on site.

  5. At the end of the month the Temperature Recording Logs must be placed on the Supervisor's desk for review.

Waterbaths, Drybaths, Heating blocks, Temperature-controlled cuvettes, Automated chemistry analyzers with reaction baths:

  1. All waterbaths, dry-baths, heating blocks, temperature-controlled cuvettes and automated chemistry analyzers with reaction baths must be numbered or named in a manner which will clearly identify them on the Temperature Recording Log. Temperature tolerance limits will be posted on the Temperature Recording Log for each unit.
  2. Temperature readings will be taken and recorded in the appropriate Temperature Recording Log once per day on each day of regularly scheduled work.
  3. Ranges for allowable temperature fluctuation will be clearly marked on each log. Deviation from nominal will be handled in the following manner:
      1. If the current and/or minimum and/or maximum temperature reading is ≤ 2°C outside of the posted limits, fill out an IPR and notify the Instrument Specialists.

      2. If the current and/or minimum and/or maximum temperature reading is > 2°C outside of the posted limits, fill out an IPR and contact the Technical Director, the Assistant to the Technical Director or the Pathologist on site before attempting to use the unit.

  4. At the end of the month the Temperature Recording Logs must be placed on the Supervisor's desk for review.

Room temperature:

  1. Room Temperature is monitored in Hematology departments to verify acceptable tolerance limits for WSR testing, and also in Blood Bank departments to verify acceptable storage conditions for platelet packs. All Room Temperature Recording Logs must be clearly labeled to identify the area monitored. Temperature tolerance limits will be posted on each Room Temperature Recording Log based on the application.
  2. All areas to be monitored will use a minimum - maximum temperature monitor (obtained from the UCL Materials Management Supervisor).
  3. Temperature readings will be taken and recorded in the appropriate Room Temperature Recording Log once per day on each day of regularly scheduled work. The display will be reset right after the readings are documented by pressing the reset button below the display.
  4. Ranges for allowable temperature fluctuation will be clearly marked on each log. Deviation from nominal will be handled in the following manner:
      1. If the current and/or minimum and/or maximum temperature reading is outside of the posted limits, notify a supervisor immediately and contact the Technical Director, the Assistant to the Technical Director or the Pathologist on site.

  5. At the end of the month the Temperature Recording Logs must be placed on the Supervisor's desk for review.

Blood bank refrigerators:

  1. All refrigerators must be numbered or named in a manner that will clearly identify them on the Temperature Recording Log/Blood Bank Refrigerator Temperature Recording Log. Temperature tolerance limits will be posted on the Temperature Recording Log/Blood Bank Refrigerator Temperature Recording Log for each unit. [See template of Blood Bank Refrigerator Recording Log.]

  2. Temperatures are to be checked and recorded daily on the applicable log. (The Blood Bank Refrigerator Temperature Recording Log is used for banked blood storage refrigerators and the Temperature Recording Log is used for blood bank reagent storage refrigerators.) If the temperature is not within the acceptable range, refer to the Blood Bank Alarm; Operational Policy. At the end of the month the Temperature Recording Log must be placed on the Supervisor's desk for review.

Banked Blood storage refrigerators:

Since these refrigerators have a recording chart, the Sentry Hi/Lo® temperature devices are not used.

  1. All refrigerators in which donor blood is stored are to have recording thermometers with visual and audible alarms. The sensors for the temperature recording and alarm systems are placed in immersion solution in a container with a volume similar to the blood containers being used. The alarm is activated when the temperature falls outside the acceptable 1°C to 6°C range. The electrical source of the alarm system is separate from the refrigerator. The alarm has a back-up battery to power it in the event that the electrical power is cut off.
  2. Two spirit-type thermometers placed in immersion solution are also used to monitor temperatures. One of these is placed on the top shelf of the refrigerator in the container with the recording sensor; the other is on the lowest shelf containing blood.
    The temperature of the thermometer on the top shelf (in the container with the recording sensor) is to be within 1°C of that shown on the recorder chart. These temperatures are to be recorded on the Blood Bank Refrigerator Temperature Recording Log.

Note: For those refrigerators having an electronic temperature monitor, the monitor may be used in place of one of the spirit thermometers. These monitors must be verified with an N.B.S. thermometer prior to use and on an annual basis as is done with all Sentry Hi/Lo temperature devices and spirit type thermometers.

  3. Temperature charts from the seven-day mechanical recording devices are to be changed weekly, dated, initialed and labeled for proper identification (refrigerator ID/site). For recorders that require winding, the recording device must be wound weekly when the chart is changed. All markings showing a departure from nominal temperature are to be explained on the chart. The individual changing the chart must sign the chart and place it on the Supervisor's desk for review.
  4. The alarm power light is checked to verify that the alarm is operational. This is documented on the Blood Bank Refrigerator Temperature Recording Log.
    The temperatures of alarm activation are checked monthly. Refer to the "Blood Bank Alarm Check Procedure".
  5. If the Blood Banking Department is not continuously staffed, the alarm must be wired into a remote station that is staffed twenty-four hours a day:

        Finley: Switchboard
        MMC-DBQ: Switchboard
        MMC-DV: Nursing Station

Blood Bank reagent storage refrigerator:

  1. All refrigerators in which Blood Bank reagents are stored (with the exception of those units that have a recording thermometer) are to be monitored using a minimum - maximum temperature monitor (obtained from the UCL Materials Management Supervisor).
  2. Temperature readings will be taken and recorded in the appropriate Temperature Recording Log daily. The display will be reset right after the readings are documented by pressing the reset button below the display.
  3. The blood bank reagents are to be stored at 2°C - 8°C. If the temperature is not within the acceptable range refer to the Blood Bank Alarm; Operational Policy. At the end of the month the Temperature Recording Log must be placed on the Supervisor's desk for review.

Blood bank freezers:

All freezers in which blood components for infusion are stored need to have recording thermometers with visual and audible alarms.

  1. All freezers must be numbered or named in a manner that will clearly identify them on the Blood Bank Freezer Temperature Recording Log. Temperature tolerance limits will be posted on the Blood Bank Freezer Temperature Recording Log for each unit. [See template of Blood Bank Freezer Temperature Recording Log; enclosure #7.]
  2. MMC and Finley freezers: the sensors for the temperature recording and alarm systems are mounted on the inside wall of the freezers.
    SMU: the sensors for the temperature recording and alarm systems are immersed in a container of immersion solution.
    The alarm is activated when the temperature falls outside of the acceptable range. (MMC-DBQ freezer -50°C to -60°C, Finley freezer -18°C to -25°C, MMC-DV freezer < -18°C.)
    The electrical source of the alarm system is separate from the freezer. The alarm has a back-up battery to power it in the event that the electrical power is cut off.
  3. The temperatures from the recorder and display temperature, if applicable, are to be checked and recorded daily on the Blood Bank Freezer Temperature Recording Log. If the temperature is not within the acceptable range refer to the Blood Bank Alarm; Operational Policy. At the end of the month the Blood Bank Freezer Temperature Recording Log must be placed on the Supervisor's desk for review.
  4. Temperature charts from the seven-day recording devices are to be changed weekly, dated, initialed and labeled for proper identification (freezer ID/site). All markings showing a departure from nominal temperature are to be explained on the chart. The individual changing the chart must sign the chart and place it on the Supervisor's desk for review.
  5. The alarm is checked daily to verify that the alarm is operational. This is documented on the Blood Bank Freezer Temperature Recording Log.
    To verify operation the following are checked:

        Finley: alarm power light
        MMC-DV: alarm power light
        MMC-DBQ: alarm power light and audible alarm (If the audible alarm does not sound, the 6-volt battery needs to be replaced. A replacement battery is obtained from the UCL Instrument Specialist office.)

  6. The temperatures of alarm activation are checked every six months. Refer to the "Blood Bank Freezer Alarm Check Procedure".
  7. If the Blood Banking Department is not continuously staffed, the freezer alarm must be wired into a remote station that is staffed twenty-four hours a day:

        Finley: Switchboard
        MMC-DBQ: Switchboard
        MMC-DV: Nursing Station

Blood warmers:

It is the responsibility of the hospital’s Biomedical Department to ensure that all blood warmers in use at their facility are functioning properly.

  1. They must keep an accurate inventory of blood warmers used in their facility.
  2. Each blood warmer must have a unique number attached to identify it.
  3. The Biomedical Department will perform function verification checks according to procedures established by the Biomedical Department for the various types of blood warmers their facility has in operation.
  4. Function verification checks will be documented by the Biomedical Department.
  5. Function verification documentation will be reviewed by the Laboratory Site Supervisor at least annually.
  6. After review, the function verification records are returned to the Biomedical Department for in-house storage.

Function Verification:

The following areas will be verified on a quarterly basis.

  1. Plate/water bath temperature
  2. Effluent temperature
  3. Alarm activation [For some units it is not possible or desirable to raise the temperature of the heating units in order to test the alarm. If alarm activation is not possible for a particular type of blood warmer, NA (not applicable) must be indicated for the alarm check.]

References:

  1. AABB Technical Manual, 11th Edition. AABB, 8101 Glenbrook Road, Bethesda, Maryland 20814; 1993.
  2. JCAHO 1996 Comprehensive Accreditation Manual for Pathology and Clinical Laboratory Services.
  3. CAP Guidelines.

1980; S. Raymond: General Policy: Temperature Verification and Temperature Recording
1987; M. English: Blood Bank Refrigerator Alarm procedure
2/20/95; S. Raymond: Temperature Monitoring; Standard Laboratory Refrigerators and Freezers (Sentry Hi/Lo®)
5/23/95; S. Raymond, L. McGovern: revised to QA format
2/8/96; L. McGovern: note in Banked Blood storage refrigerators section.
4/22/96; L. McGovern: revised Room Temperature section
8/29/96; L. McGovern: added Blood warmer section

Instrument Problem Report/ IPR

IPR Purpose/design:

The Instrument Problem Reporting program, with its IPR template [see IPR template following this paragraph], provides a standardized framework for the sensitivity laboratorians typically develop to the nominal operational characteristics of the instrumentation they are charged with running. Against that framework they can recognize, compare and record deviation, so that as many instrument-related problems as possible are trapped early and resolved while they are still relatively minor, before they pose serious interruption of services and consumption of resources.

The IPR articulates closely with the routine Maintenance & Function Verification Logs/Procedures and the SMR (Special Maintenance Report).

IPR Content/layout:

The IPR is divided into 8 sections:

  1. Demographic: This is the header section and includes the name and location of the instrument along with the date, time and name of the tech filing the report.
  2. Symptoms: This section requires that the operator attempt to register the instrument performance deviation as essentially associated with control aberrance, increased imprecision, a physically apparent malfunction, or some other broad category in an effort to trigger consideration of the nature of the error.
  3. Initial Attempts at Resolution: Laboratorians are typically encouraged to make initial attempts at resolving instrument problems they encounter rather than just passing them off. This section asks for a brief description of any of those initial attempts.
  4. Evidence of Resolution: In order to obviate assumptions in terms of problem resolution, Section 4 of the IPR requires some evidence that initial or front-line remedial action was indeed effective. This evidence could be simply a description of a physical phenomenon, such as resumption of target pneumatic pressure, or it could be a presentation of data such as control values or precision replicates. Evidence for resolution will, of course, be dependent upon the nature of the problem and the symptoms that triggered its recognition.
  5. Call Activation: In those cases when initial attempts at problem-solving are not successful or if the results of those attempts are uncertain, Section 5 is followed. Recognition and action recorded on the IPR to this point is communicated to subsequent instrument operators by posting the IPR on or near the instrument and a specific call list is then activated in order to bring the problem to the attention of someone in management who will be responsible to assess the situation and bring additional resources to bear if warranted. If an Instrument Specialist is contacted at this point, the SMR (Special Maintenance Reporting) system is triggered.
  6. Status: Shutdown/Monitor: The operator, management personnel or an Instrument Specialist can shut the instrument down or elect to allow the analyzer to continue in use while formally calling for additional, specific observation. The choice is indicated by a check mark in the box provided at Section 6 of the IPR. Space has been provided on the back of the Instrument Problem Report for documentation of continued observation.
  7. Instrument Replaced/Major Component Replaced: It is indicated on the IPR if an instrument or a major component is replaced. In either case, calibration verification must be performed before the instrument can be put into service. Three levels of control or calibration verification material on hand are run. If only two levels of control material are on hand, a third level is prepared by making a 1:1 dilution of the two controls and calculating a target value. Results obtained on the 1:1 dilution must be within ±10% of the calculated target. The calibration verification results are recorded in the SMR.
  8. Review: As with most of the QA programs of DBQ Pathology Associates, the Instrument Problem Reporting system is designed to disseminate information and then close the circle; i.e., bring information, observation, pattern recognition, resolution, conclusion, etc. back to the individual(s) who initiated the process. There are several levels of review incorporated into the IPR:

        It is always reviewed by an Instrument Specialist.
        It is frequently reviewed by the Technical Director.
        It is always returned to the originating site for review by the site Supervisor and staff in the relevant department.

      Section 8 allows space for formal comment from Instrument Specialists and/or the Technical Director before the IPR is cycled back to its place of origin.
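The third-level calibration verification check described in Section 7 (a 1:1 dilution of the two controls, with the result accepted when it falls within ±10% of the calculated target) amounts to the following; a minimal sketch with illustrative names only:

```python
def third_level_target(lo_control: float, hi_control: float) -> float:
    """Target value for a 1:1 dilution of the low and high controls
    (the mean of the two assigned values)."""
    return (lo_control + hi_control) / 2.0

def dilution_acceptable(measured: float, target: float,
                        tolerance_pct: float = 10.0) -> bool:
    """True when the measured 1:1 dilution falls within the stated
    percentage of the calculated target."""
    return abs(measured - target) <= (tolerance_pct / 100.0) * abs(target)

target = third_level_target(100.0, 300.0)
print(target)                              # 200.0
print(dilution_acceptable(215.0, target))  # True: 7.5% from target
print(dilution_acceptable(225.0, target))  # False: 12.5% from target
```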

IPR and SMR Storage

The IPRs and SMRs are filed together chronologically and alphabetically by instrument where readily accessible to the technical staff.


1974; S. Raymond: Instrument Specialist Program and IBM template
1986; S. Raymond: Macintosh template
9/24/95; S. Raymond; written description revised to QA format
August 2008; L. McGovern, S. Raymond: revised; Section 7 added, IPR form revised


Method Verification Report/MVR

Method Verification Report; purpose/design:

The Method Verification Report/MVR is used to document baseline studies on new assays as well as performance verification as determined by the Technical Director or the Assistant to the Technical Director on assays in use.
Any test method along with its performance specifications is available through the office of the Technical Director at any time to any client who requests it. The information will typically be transferred formally by means of the Method Verification Report.

Method Verification Report; content:

The MVR will include the assay/instrument/location/date/method verifier. The symptom or issue that triggered the report will be defined. All action taken and data collected will be listed step by step in chronological order. [see following MVR example.]

Method Verification Report; review:

MVRs will be reviewed and signed by the Technical Director and then routinely sent to the applicable Site Supervisor for review.

Method Verification Report; storage:

MVRs are to be stored on site in the applicable assay file for 2 years.


5/30/95; S. Raymond, L. McGovern
revised 1/18/98; method and performance spec. availability; S. Raymond


Competency Assurance

Inservice/Initial Competency Assessment:

A structured program of Competency Assessment accompanies the release of new instrument systems and of new or substantially changed procedures, and is integrated into the orientation of new employees.

Inservice documentation is prepared by the instructor and consists of an Inservice Outline, Inservice Report, pre- and post-testing (intermittent) and Competency Evaluation. These formatted documents are used to affirm inservice content, participation and participants' ability to understand and perform the material covered during each inservice.

Inservice Outline:

An outline lists the material to be covered in the inservice. The inservice outline is reviewed by the Technical Director and distributed by the instructor to the staff prior to the inservice.

Inservice Report; content/review:

  1. The Inservice Report includes the subject/location/date/participants/instructor's name/material covered.
  2. Inservice Reports are reviewed and signed by the UCL Technical Director.

Inservice Report; storage:

  1. For staff at UCL sites, the Inservice Reports are sent to the Education Specialist and are placed in each staff member’s education file.
  2. At Ancillary Testing sites; e.g., Medical Associates, MMC Surgery, MMC Cardiac Cath Lab, etc., the Inservice Reports are sent to the site Manager/Supervisor for review and are placed in each staff member’s personnel file.

Competency Evaluation; content/review:

  1. Competency Evaluations are check-off lists itemizing key elements and documenting a working knowledge of procedure principle, technique, associated maintenance/function verification and data management relevant to the functional execution of the protocol.
  2. Competency Evaluations are reviewed and signed by the UCL Technical Director.

Competency Evaluation; storage:

  1. For staff at UCL sites, the Competency Evaluations are sent to the Education Specialist and are placed in each staff member’s education file.
  2. At Ancillary Testing sites; e.g., Medical Associates, Dubuque Pediatrics, MMC Surgery, MMC Cardiac Cath Lab, etc., the Competency Evaluations are sent to the Supervisor for review and are placed in each staff member’s personnel file.

Recurring Competency Review:

Competency Review is performed on an annual basis to comply with Joint Commission Standard HR.01.06.01.

The staff member’s competency assessment includes the following:
  • Direct observations of routine patient test performance, including patient preparation, if applicable, and specimen collection, handling, processing, and testing.
  • Monitoring the recording and reporting of test results.
  • Review of intermediate test results or worksheets, quality control, proficiency testing, and preventative maintenance performance.
  • Direct observation of performance of instrument maintenance function checks and calibration.
  • Test performance as defined by laboratory policy (for example, testing previously analyzed specimens, internal blind testing samples, external proficiency, or testing patient samples).
  • Problem-solving skills as appropriate to the job.
While the goal of the program is serious, there is no reason whatever for implementation to be oppressive in nature or punitive in any way. The idea is to provide a platform for technical staff to refresh memory, fine-tune proficiency, learn something and feel good about the process. The framework is self-instructional with deliberate margin for personalized scheduling and implementation at a pace that is not at odds with an already pressured profession. It will be managed largely at the site/department level and, as indicated earlier, cycled annually.

Recurring Competency Review program specifics:

  1. Laboratory procedures have been sorted into different Test Systems.
    Fortunately, almost all of the procedures in the clinical laboratory can be grouped by specific method. That is, the analytical system, whether it be in instrument operation, reagent manipulation, specimen handling or data management, is essentially common to the individual procedures within the group. If a laboratorian is well-trained initially, proficiency verification against a procedure representative of a specific method group can safely be extended to the other procedures within the group. In other words, if you can “get it right” on gentamicin analyses for example, you can “get it right” on digoxin, carbamazepine, phenobarbital, phenytoin, theophylline, tobramycin, valproic acid and so on since they are, in our environment, identical in terms of order processing, sample acquisition/handling, reagent management, instrument operation, critical value recognition/results reporting, LIS interface, biochemical method and so on.
    A procedure from each category is singled out for specific attention in a given competency review cycle. Different procedures are selected in each cycle.
  2. Cover sheets are prepared to include Test Systems appropriate for each Site and or Department. Cover sheets list the procedure selected for each Test System and the requirements that must be fulfilled to comply with Competency Review.
    UCL Competency Review includes the following six assessments as defined by the Joint Commission. For ease of reference, all documentation collected to fulfill these assessments is labeled with the assigned assessment number.
      1. Assessment #1 Direct observations of routine patient test performance, including patient preparation, if applicable, and specimen collection, handling, processing, testing, safety and exposure control. Direct observation includes wearing proper Personal Protective Equipment (PPE) and using designated engineering controls.
      2. Assessment #2 Monitoring the recording and reporting of test results.
      3. Assessment #3 Review of intermediate test results or worksheets, quality control, proficiency testing, and preventive maintenance performance.
      4. Assessment #4 Direct observation of performance of instrument maintenance function checks and calibration. Staff choose which maintenance procedure to use; it is completed as part of scheduled maintenance. An attended maintenance inservice or an impromptu inservice can be used if it included directly observed, hands-on experience. Staff cannot choose the same maintenance procedure two years in a row.
      5. Assessment #5 Test performance as defined by laboratory policy (for example, testing previously analyzed specimens, internal blind testing samples, external proficiency testing samples, or patient samples).
      6. Assessment #6 Problem-solving skills as appropriate to the job.
  3. Design Nature of Multiple Choice Questions and Answers:
    Test questions are prepared for each procedure selected for each Test System.
    One question must pertain to the written procedure, one must pertain to quality control and one must evaluate problem solving skills. The multiple-choice element is designed to be straightforward with correct answers that are obvious.

    Test Procedure Name
    (Fill in the circle that represents the correct answer.)

 Question?
 o  answer option 1
 o  answer option 2

 Question?
 o  answer option 1
 o  answer option 2

 Question?
 o  answer option 1
 o  answer option 2

  4. Technical staff receive a self-instructional Competency Review Program Packet and are responsible for completing it within the subsequent 5 months. The packet includes a cover sheet that lists the procedure for each Test System included in the review cycle and test questions for each Test System. Any staff member who has successfully completed the previous cycle of Competency Review is authorized to perform Direct Observation (Assessment #1).
  5. Supervisors perform the final review of each completed Competency Packet. This includes making sure the review is complete, all documentation is appropriate and labeled, and the test questions have been corrected. A date, initials and the word “Reviewed” are written on the cover page. The supervisor reviews and explains any test questions answered incorrectly and documents on the cover page that the staff person understands what was reviewed.

5/30/95; S. Raymond, L. McGovern: Inservice/Initial Competency Assessment
10/13/96; S. Raymond: Recurring Competency Assessment
January 2009 R. Schaefer: Design Nature of Multiple Choice Questions and Answers
September 2009 L. McGovern: updated for current procedures
January 2012 L. McGovern/R. Schaefer Revised for Joint Commission Standard HR.01.06.01


Proficiency Testing

Process and design:

In our context, the common Proficiency Testing Program (Survey) is a subscription service created to provide large-scale, comparative assessment of clinical laboratory testing by constituent (analyte). These programs usually begin with big batches of commercially prepared material which have been made to mimic the human specimens typically encountered in the clinical setting. These preparations are aliquoted and distributed by the manufacturer to participating laboratories across the land where they are assayed for those constituents included in the menus of the assaying laboratories. Testing results are then submitted to a central data processing facility to be collated.

Handling:

Unless specific directive attending the proficiency testing material contraindicates, the samples themselves are handled within the context of established "Universal Precautions" (See Infection Control Policy manual); i.e., as common biologics capable of harboring and transmitting disease.

    Pre-analytical preparation and precaution:

Most survey material is both extrinsically and intrinsically quite different from the specimens routinely being assayed, and it requires peculiar pre-analytical processing such as rehydration, reconstitution, equilibration, etc. Because the majority of Proficiency Testing results are graded, because variance from nominal accuracy triggers punitive action, and because the government and certifying agencies are sometimes not sophisticated enough to differentiate artifactual error from error actually associated with patient testing, it is absolutely essential that survey material be prepared very carefully and in exact accordance with its package instructions. If there is a problem with the packaging that might affect the properties of the testing material, or if there is any kind of problem in the pre-analytical preparation of the testing material, do not proceed with testing. Contact the Technical Director or Assistant to the Technical Director.

    Assay conditions:

Once survey specimens are ready to assay, they are not handled substantively differently from the way patient specimens are handled. It is our policy, in as much as possible, to integrate survey samples in a routine fashion into the regular workload; e.g., we don’t want to run them in duplicate or surround them by “known” trials if we’re not doing the same by written procedure to patient samples on which the target constituent(s) is being measured. The material is analyzed on site and there is no kibitzing between sites with respect to the results obtained prior to reporting to the subscription provider. Proficiency Testing is rotated among the personnel who perform the testing. In accordance with Subpart H, §493.801(b)(1) of the Federal Register (February 28, 1992), individuals performing analyses, as well as the laboratory director or designee, will bear witness to the execution of this protocol by filling in and filing an Attestation form with data submission on each survey subscription event.

    Primary Records, data submission forms and attestation forms:

Raw data is copied from the laboratory’s primary record to the data submission forms provided. This occasionally requires some kind of unit conversion and almost always involves several transcription steps unrelated to patient testing; it is, therefore, another very troublesome source of artifactual error. Transcriptions to the report form must be checked by someone other than the transcriptionist before the data form is either forwarded directly or the data is submitted electronically to the provider of the survey subscription.
A copy of the data submission form along with any relevant primary records is retained for two years. The form can be helpful in terms of tracking down clerical errors from the primary record onto the data submission form itself and from the data submission form into the proficiency data processing system. Once the Evaluation Summary has been received back from the survey facility and reviewed, the data submission form is of little, if any, use; retention at this point is purely a matter of compliance with the certifying agency.

Attestation Forms:

CAP: Technical personnel directly contributing to the assay of survey samples will sign the form provided with the data submission sheets. The form will be forwarded to the Office of the Technical Director where it will be counter-signed, scanned and indexed for rapid retrieval throughout the enterprise. Do not keep a copy on site.
API: Before testing data is submitted electronically, names of technical staff that participated in the testing are transcribed from the Attestation signature form onto the appropriate API e-form. Using the button built into the web page, print a copy. Send this copy along with the original Attestation signature form to the Office of the Technical Director where they will be counter-signed by the Technical Director, scanned and indexed for rapid retrieval throughout the enterprise. Do not keep copies of these documents on site.

Data Summary; evaluation and review:

Once the raw data has been received, entered, collated and computer-graded at the central data processing facility, an Evaluation Summary is printed and mailed back to the participants. One copy goes to the submitting laboratory site and another copy is forwarded to the Technical Director for review. The site/department Supervisor/Manager and the Technical Director are particularly sensitive to any survey trials or challenges which deviate from the group specific (method/reagent/instrument) performance pattern either in a glaring, punctate fashion or in a more subtle, general bias configuration.

Results which do not “fit” with their comparison group are assessed in terms of whether or not they are “wrong” and if so why. Obvious error like clerical mistakes can usually be recognized and resolved quickly by referencing the relevant primary record and data submission forms. Other deviations from nominal can be substantially more difficult to decipher; additional testing may be part of the investigation. If remedial action or adjustment is warranted, it is triggered by the Technical Director along with the appropriate follow-up.
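As a hypothetical illustration of how a result that does not “fit” its comparison group might be screened quantitatively, the sketch below computes a standard deviation index (SDI) against the peer-group statistics. Neither the SDI terminology nor the 2.0 action limit is prescribed by this program, and the trial names and values are fabricated:

```python
# Sketch: flag proficiency results that deviate from the peer-group
# (method/reagent/instrument) statistics using a standard deviation index.
# The 2.0 action limit is illustrative, not a value taken from this policy.

def sdi(result, peer_mean, peer_sd):
    """Standard deviation index: how many peer-group SDs from the peer mean."""
    return (result - peer_mean) / peer_sd

def flag_outliers(results, peer_mean, peer_sd, limit=2.0):
    """Return (trial, result, sdi) for each trial exceeding the limit."""
    return [(trial, value, round(sdi(value, peer_mean, peer_sd), 2))
            for trial, value in results.items()
            if abs(sdi(value, peer_mean, peer_sd)) > limit]

# Five survey trials for one analyte against a peer mean of 100, SD of 4.
trials = {"PT-01": 101.0, "PT-02": 97.5, "PT-03": 112.0,
          "PT-04": 99.0, "PT-05": 91.0}
print(flag_outliers(trials, peer_mean=100.0, peer_sd=4.0))
```

In this fabricated run, PT-03 (SDI +3.0) and PT-05 (SDI -2.25) would be the trials checked against the primary record and data submission forms for clerical error before any deeper investigation.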

The Site/Department Supervisor completes a pdf “Proficiency Testing – initial scripted response to ‘unsuccessful’ survey challenges” for each result that does not “fit” with the comparison group. The completed form is printed and forwarded to the Technical Director for review and completion of section VIII. The form is signed by the Technical Director and returned to the enrolled sites. The Proficiency Testing Scripted Response is attached to the Proficiency Testing Evaluation Summary. The results are reviewed and signed by the Site/Department Supervisor(s) and a Pathologist. The document is then posted for a period of time allowing the staff opportunity for review.

A single addendum is prepared by the Technical Director evaluating survey trials that are “not graded”. This includes testing events with fewer than 10 participants and those that do not obtain the agreement required for scoring.

Upon notification by the Joint Commission of an unsuccessful proficiency testing status, the Site Supervisor will notify the Technical Director and submit an appropriate plan of action within ten calendar days.

Evaluation Summaries are, of course, kept for a minimum of two years.

Analytes not covered by traditional proficiency testing programs:

There are a number of laboratory tests which, because of properties associated with either the analytical target or the matrix, do not lend themselves to the logistics of commercial Proficiency Testing and are not included in Surveys offered by “sanctioned” agencies. CLIA ‘88 regulations mandate the twice-yearly “accuracy verification” of these procedures. This, of course, is not completely possible either but, where it is, a formal program exists. Refer to Intersite Analytical Method Parallel.


9/20/95; S. Raymond; Technical Director; DBQ Pathology Associates

2/24/04; S. Raymond; revised for handling Attestation sheets
9/18/07; S. Hosch/S. Raymond; revised Data Summary eval & review “Proficiency Testing-initial scripted response…”

Inoperable or “down” Test Systems & Backup

Purpose/design:

When analytical systems are not working properly, tension typically mounts. Staff will immediately focus on local cause, effect and remediation and sometimes forget to warn recipients that their routine laboratory service has been compromised. The purpose of this policy is to help assure that accountable Management is formally involved, that compensatory systems work smoothly and that practitioners dependent upon the affected service can adjust their expectations in accordance with the situation.

Furthermore, because circumstances surrounding a “down” analytical system vary greatly from one event to the next, it is difficult, if not altogether counterproductive, to attempt pre-configuration of response in terms of backup instrumentation, testing location, timing, sample handling, results reporting logistics, etc. A general policy, on the other hand, which sets the stage for defining when a test system is inoperable and then describes guidelines for the design and implementation of the proper course of action in the face of “down” status is meaningful.

Backup instruments and methods:

Written test procedures generally will not include reference to backup methods because the laboratory system will not typically maintain methods dedicated or designated specifically as “backup”. Any exception to this general posture will be covered in the appropriate test procedure; i.e., cross-references to particular backup method/instruments will be made.

Procedure:

    Recognition:

Sometimes recognition that a test system is inoperable is painfully straightforward; the fact that there’s just no available reagent or that the system has by itself gone into shutdown/fatal error mode or that the analyzer is stone cold to touch could certainly be enough to settle any doubt. More often, symptoms will be subtler as with insidious imprecision or trace carryover and the realization that a system is inoperable can be considerably less explicit.

    Initial action:

  1. Cease all testing; do not report patient results.
  2. Notify the supervisor. The site supervisor will assess the situation and then contact an Instrument Specialist for consultation and corroboration, notify a site pathologist and determine if Communication to Affected Services is required.

    “down” status; declaration:

An analytical system will be considered “down” if a physical malfunction renders it obviously inoperable, when system status has gone to auto shutdown (fatal error) and/or when one of the following personnel has determined that the analytical outcome, controls or otherwise, is unacceptable.
List of personnel who can officially declare an analytical system “down”:

  1. Instrument Specialists
  2. Technical Director or Assistant to the Technical Director
  3. Pathologist
  4. Supervisor or Technician in concert with any of the above.

If the “down” status of an analytical system is instrument-related, an Instrument Specialist, if not already aware of the situation, is to be notified immediately.

    Management intervention; response:

The appropriate response for Management will be largely ad hoc; i.e., dictated by prevailing circumstance and fabricated in real time. A number of variables will have to be defined and many details may have to be attended to in order to effect an orderly, efficient process. The management group will sort through the following considerations and orchestrate the warranted activity.

Communication to affected Services:

  1. Estimate the length of time the system is likely to be down. (The situation may require constant monitoring; this estimate may very well have to be adjusted a number of times over the course of the event.)
  2. Determine the clinical urgency typically associated with the affected tests.
  3. Specifically appoint technician(s) on-site to phone those critical areas served by the laboratory that will be affected by the “down” system.

      a. The communication must include identification of those tests the contacted department is likely to become anxious about, an estimate of how long they can expect to wait for results on the affected tests and a projection to recovery time; i.e., when they can expect the system to be back “up”.
      b. In the Incidence Log of the affected test system record the time of the communication, the name(s) of the department(s) and client(s) that have been informed along with the name(s) of the person(s) contacted.

  4. List of stations and personnel that need to be formally considered for direct contact depending on the location and the influenced tests:

      a. Pathologist on-site or on-call
      b. Emergency Room (hospital)
      c. Intensive Care (hospital)
      d. House Supervisor, Nursing Service (hospital)
      e. Radiology
      f. Acute Care, Pediatrics, Oncology (Medical Associates)

Alternate Testing sites and re-routing of specimens:

  1. Decide if some or all of the specimens on which affected tests have been ordered need to be routed to alternate testing sites.
    Most of the time the alternate testing site resides in a local laboratory under the direction of Dubuque Pathology Associates where matched analytical systems have been a long-standing, deliberate feature of the Quality Assurance Plan. A “matched analytical system” in this context usually connotes the same instrumentation, the same reagents (including lot numbers), the same calibrators or standards (including lot numbers), the same control material (including lot numbers) and the same written protocol covering both analytical aspects and support.
  2. If re-routing of specimens is deemed necessary, specify which will be sent and where they will be sent.

      a. Contact supervisory staff at the alternate site(s) to let them know that specimens will or may be routed to them.
      b. Determine if the alternate site(s) can absorb the extra workload with existing staff configurations.
      c. If additional staff or extended hours will be required to manage the modified workflow at either the original testing site or the alternate testing site(s), make those arrangements.
      d. Bolster the courier staff if necessary; e.g., rearrange routine work assignments of courier staff to cover the additional trafficking of specimens, retain the local cab company, commandeer testing, clerical or management personnel, contract with a professional courier service, etc.

Re-release:

The Technical Director or Assistant to the Technical Director will determine when an analytical system can be released from a “down” status. If the “down” status is instrument specific, an Instrument Specialist can make this determination in real time and then communicate the specifics subsequently to the office of the Technical Director.

A Technician on site will be appointed on an ad hoc basis to re-contact appropriate stations and personnel in order to inform them that the system is back in operation.


5/12/95; S. Raymond: Technical Director; DBQ Pathology Associates
10/10/95; revised to include “matched analytical system” definition; S. Raymond
1/27/96; revised specific language: Purpose/design, “down” declaration & stations/personnel to be contacted; S. Raymond
July 2010; revised: added Radiology; L. McGovern


Reagent Overlap, “serology” kits

Purpose:

New reagent lots must be checked against current reagent lots or with selected reference materials before being placed into service.
While not as prevalent as in the past, there can still be some significant variance in the avidity and affinity between batches of "serology" reagent due to the nature of manufacture and application of the biologic components inherent in the method group. "Overlapping" controls and/or reagents is a technique that can provide some modicum of assurance that the new material will behave comparably to the material that is already in use. It is noted that exogenous controls can vary independently of the reagent lot, and that while the reagent lot may change with a new kit lot, the controls may remain the same. In these circumstances, the value of the overlap exercise is diminished. We will, nevertheless, for the sake of simplicity, run the routine overlap protocol anyway.
“Serology Testing” has been divided into categories based on kit methodology, and packaging of reagents and controls.

Immunoassays with endogenous controls:

These assays have endogenous controls in every test cartridge to verify that the cartridge functioned properly and that the test was performed correctly.
QuickVue Flu A+B, Rapid Strep A, ßhCG, Mono, RSV, Rotavirus.
Overlap testing is not required on waived test kits (Rapid Strep A, ßhCG, QuickVue Influenza A+B, Mono, RSV). We have elected not to perform overlaps on waived test kits since requirements state it is not necessary and there is no added value in doing so.

Latex Agglutination Assays/Other Assays:

    Control/Reagent Kit packaged together:

Rubella, RA, FDP, Directigen Meningitis Panel, Teichoic Acid, Cryptococcal Murex, Immunocard H. pylori, Immunocard Clostridium difficile Toxins A & B, Mycoplasma pneumoniae, RPR.

Standard Overlap Protocol:

  1. When the current kit lot is within several runs of depletion, open the new kit lot and assay the current exogenous controls against the new reagent lot and controls.
  2. Results of the control sets and the new kit lot number are recorded in the designated test log.
  3. Reaction of the current controls against the new reagent lot is compared against their recorded reactivity with the current reagent lot. Results of the control sets should be equal in strength. If they are not, contact the site supervisor and/or the office of the Technical Director before placing the new kit lot into service.
  4. If the overlap is acceptable, mark the date "opened and overlapped" along with initials of the person performing the overlap on the new kit.

1983; J. Schultz: Policy Regarding Overlapping of Controls for any Serological Procedure
9/19/96; L. McGovern, S. Raymond: Policy revised for QA language, explanation and forma
2/18/98; S. Raymond: standardized routine overlap protocol between serology packaging configurations


Micropipettor Pipetting Techniques

Micropipettor Pipetting Techniques; principle:

There are three basic techniques that will be allowed when using two-stop micropipettors. All test procedures will clearly specify which technique is to be used when micropipetting is called for, and the language used to refer to the technique will conform to this protocol.
The Blood Bank Multiple Dispense Micropipettor (ID-TipMaster) is designed to deliver multiple volumes of 12.5, 25, and 50µl. The ID-TipMaster pipettor is intended for blood bank "Gel" procedure use only.

Micropipettor Pipetting Techniques; Two-Stop Micropipettors:

    "Reverse" or "To deliver" mode:

This technique will usually be specified when delivering a sample into a dry receptacle.

  1. Attach a pipette tip firmly to the instrument.
  2. Depress the plunger all the way through the first stop to the second stop.
  3. Immerse the pipette tip a few millimeters below the meniscus of the sample and then allow the plunger to return slowly and smoothly to its original, or rest, position. Remove it from the sample.
  4. Wipe the pipette tip with a soft tissue being careful not to wick any sample from the bore of the tip.
  5. Touch the tip of the pipette to the side of the receiving vessel and, with one even motion, depress the plunger to the first stop.
  6. While holding at the first stop, withdraw the pipette.
  7. Once the pipette has been withdrawn, the plunger may be allowed to return to its original, or rest, position.

Note: In the "Reverse" mode several deliveries of the same sample may be made with a single pipette tip. The actual number of deliveries that can be obtained from one tip will vary with the pipette size. Care must be taken to prevent the increasing residual sample from reaching the pipette tip filter.

    "Forward Rinse" mode:

This technique is used when delivering sample into solution.

  1. Attach a pipette tip firmly to the instrument.
  2. Depress the plunger to the first stop.
  3. Immerse the pipette tip a few millimeters below the meniscus of the sample and then allow the plunger to return slowly and smoothly to its original, or rest, position. Remove it from the sample.
  4. Wipe the pipette tip with a soft tissue being careful not to wick any sample from the bore of the tip.
  5. Immerse the pipette tip a few millimeters below the meniscus of the receiving solution and then in an unhurried and continuous motion deliver the specimen in the following manner:

      a. Depress the plunger all the way to the second stop and then while keeping slight pressure on the plunger allow it to return to its original, or rest, position.
      b. Repeat step "a" four more times (total of five rinses), all the while keeping the pipette tip in the solution.
      c. Depress the plunger all the way to the second stop and then while holding at this position, remove the pipette tip from the solution.

  6. Allow the plunger to return to its original, or rest, position.
  7. Eject the tip.

Note: Use a new tip for each sample delivery.

    "Forward" or "To contain" mode:

This technique is rarely used.

  1. Attach a pipette tip firmly to the instrument.
  2. Depress the plunger to the first stop.
  3. Immerse the pipette tip a few millimeters below the meniscus of the sample and then allow the plunger to return slowly and smoothly to its original, or rest, position.
  4. Wipe the pipette tip with a soft tissue being careful not to wick any sample from the bore of the tip.
  5. Touch the tip to the inner wall of the receiving container and, with one even motion, depress the plunger fully to the second stop.
  6. With the plunger fully depressed, withdraw the pipette and discard the tip.
  7. Allow the plunger to return to its original, or rest, position.

Note: Use a new tip for each sample delivery.


Blood Bank Multiple Dispense Micropipettor (ID-TipMaster):

    Modified "Reverse" or "To Deliver" mode:

  1. Set the intended volume by turning the Volume Selector.
  2. Attach a pipette tip firmly to the instrument.
  3. Immerse the pipette tip approximately 3mm into the sample.
  4. Pull up on the filling lever to its uppermost position to fill the tip with sample.
  5. Withdraw the tip from the sample, touching the edge of the sample container to remove any excess sample. Do not wipe the tip.
  6. Dispense by pressing the dispenser knob down until it stops. Dispense the first stroke back into the sample container in order to prime the tip properly and to reach maximum accuracy.
  7. Align the tip with the receiving Gel card microtube, being careful not to touch the tip to the side of the microtube.
  8. Press the dispenser knob down. This operation dispenses the preselected volume of sample. Allow the dispensing lever to return to its original position.
  9. Continue to dispense sample from the pipette tip into each Gel card microtube that requires that sample. When a full dispense stroke is no longer possible, refill the pipette tip by repeating steps 3 through 6.
  10. Eject the tip.

Note: With the ID-TipMaster pipettor several deliveries of the same sample may be made with a single pipette tip.


4/19/78; S. Raymond
8/24/91; S. Hafenbredl
11/26/96; S. Wallace, S. Rodriguez: added FP-2 micropipettor
1/28/97; L. McGovern: revised to QA format
August 2001; S. Wallace: added ID-TipMaster micropipettor


Reference Range Evaluation/Validation

Reference Ranges; kinds:

Within the context of clinical laboratory analytical testing, there are four major kinds of “reference range”.

  1. There are reference ranges based upon the presence of the analyte as it is distributed within a defined population; these are largely the ranges that have traditionally been known as “normal ranges”.
  2. We also use ranges that hinge upon the biologic effect of the test target either directly as with drug intervention or indirectly as with Protime or APTT for coumadin or heparin effect respectively in coagulation management.
  3. There are some reference ranges that are not statistically defined; they can best be thought of as simply experiential and “expected” as the gross appearance of normal urine or the RBC surface antigen possibilities in routine blood bank group and typing.
  4. And then there are reference ranges that are really a matter of decree by some recognized legal or public health authority such as the National Lipid Research Center out of the CDC or the National Glycohemoglobin Standardization Program (NGSP).

The nature of the reference range can affect both its initial and on-going evaluation processes. For example, if the reference range is decreed, as with cholesterol at less than 200 mg/dl, it’s important to prove the in-house method against the analytical system that was used to establish the target and probably inappropriate to launch some sort of local study either in a mechanical attempt to comply with regulation or in some misguided effort to validate the target rather than the method. Likewise, grouping and typing of RBCs in routine blood banking is limited to combinations of A, B, AB and O with positive, negative or weak D for Rh antigen. This is the “range” of expected values. Should it be checked periodically? No.

Reference Ranges; new tests:

For all new analytical methods or procedures in the clinical laboratory system under the direction of DBQ Pathology Associates, reference range considerations are made prior to installation of the test. The Technical Director or designee and a Medical Director sign off on the written procedure. Generally, factors such as our specific geographic location and population mix have no clinically significant impact on the disposition of the manufacturer’s statistical assessment, and reference ranges are adopted directly from the supporting literature. With the “normal range” type of reference interval, if there is some reason to suspect a valid bias between results produced out of the local population and the manufacturer’s projection, we perform an expected range check, generally using a Gaussian model initially and then adopting NonParametric Percentile Estimate statistics if the early data seems to warrant it. [See HbA1c example following.]
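The two statistical approaches just named can be sketched side by side. The data below is fabricated for illustration, and the central 95% interval convention (mean ± 1.96 SD for the Gaussian model; 2.5th/97.5th percentiles for the nonparametric estimate) is an assumption about the intended computation, not a figure taken from this manual:

```python
# Sketch: a 95% central reference interval two ways, on fabricated data.
# Gaussian model: mean +/- 1.96 SD. Nonparametric: 2.5th / 97.5th percentiles.
import statistics

def gaussian_interval(values):
    m = statistics.mean(values)
    s = statistics.stdev(values)
    return (m - 1.96 * s, m + 1.96 * s)

def percentile(values, p):
    """Simple linear-interpolation percentile (p in 0..100)."""
    xs = sorted(values)
    k = (len(xs) - 1) * p / 100
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

def nonparametric_interval(values):
    return (percentile(values, 2.5), percentile(values, 97.5))

# Fabricated healthy-population results for an HbA1c-like analyte (%).
data = [4.8, 5.0, 5.1, 5.2, 5.3, 5.3, 5.4, 5.5, 5.6, 5.9]
print(gaussian_interval(data))       # symmetric interval around the mean
print(nonparametric_interval(data))  # rank-based, robust to skewed data
```

In practice the nonparametric estimate needs a substantially larger sample than the ten values shown here; the point of the sketch is only the shift from a distributional assumption to a rank-based one when the early data warrants it.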


Reference Ranges; established tests:

If the analytical method is well-established in our laboratories, if it remains stable and, if the data generated by the test is providing unchallenged, expected and effective diagnostic leverage for our clients (the clinicians), existing reference ranges are tacitly validated.

On-going, formal assessment of posted reference ranges, including Critical Values and Phoned Result triggers, takes place on a number of levels both periodically and on a situational basis.

Periodic formal validation:

In the Procedure Review cycles, Interim as well as Comprehensive, the Incidence Logs are checked for problem patterns associated with the method. If there have been unresolved complaints of a nature that would implicate the posted reference range, an investigation is initiated by the site manager or supervisor in concert with the office of the Technical Director in response to observations noted in the Procedure Review Report Form.

The Procedure Review program also provides a planned occasion for the re-alignment of reference range postings on each test. The number and variety of places within the laboratory enterprise where ranges on a given analyte are strewn is surprising and it tends to foster a certain amount of drift from nominal that has to be countered.

A check-off protocol embedded within the Procedure Review process is designed to chase down the ranges of each analyte and make sure they match between written procedures, the Clinical Laboratory Testing Manual (CLTM), internal instrument parameters, LIS filters, interim and final reports, etc., at all laboratory locations under the direction of DBQ Pathology Associates.

Situational formal validation:

Occasionally, manufacturer/suppliers will make adjustments in their analytical methods that are designed to either optimize a particular system element such as sensitivity, specificity, dynamic range, etc. or simply offset their system so that its results are brought into alignment with some recognized standard or group. The former will sometimes impact the reference range of the revised method; the latter will always affect the reference range. In either case, the change is usually announced in advance by the manufacturer/supplier in a bulletin or a flagged package insert.

Our response to these announcements varies with the degree of anticipated impact. Generally, if the change is product-related rather than procedural, revised material will be acquired prior to its official release and the expected shift will be confirmed with a brief patient parallel. From there, the Technical Director in concert with Medical Directors will determine how the issue will be managed internally. It may or may not be appropriate, for example, to write a memo to all or a group of targeted physicians and then coordinate the memo mailing with:

      a. revision to all of the analyte's reference range postings,
      b. launch date of the revised method and
      c. timing of computer result entry so that analytical results generated before the change will be matched with the previous range and results generated after the method change will appear with the new reference range. [Example: MVR re. Triglyceride offset to CDC reference method.]

      d. In some instances it has been necessary to sequester and store specimen aliquots and maintain the original method conditions for a period of time so that either the serial testing being conducted on some patients can be completed in context or so that analyses from patients being serially tested can still be actively paralleled with the method that has been replaced until the physician is comfortable with the transition.

There are also occasions when reference range validation is prompted by physician query based on a feeling that test results appear to be running high or low to average expectation on a given analyte. We encourage physician feedback and this type of question almost always elicits a rapid, open response. Assuming normal patterns in routine control data, we would very likely pull a representative number of sequentially reported results on the “suspect” analyte from recent archive and compare them to the reference range. If that comparison is unclear or if it tends to support the notion that there’s been departure from nominal, additional reported results are recovered from the record and statistically reviewed. While it’s not at all common, the investigation might lead to a more elaborate patient population distribution study and eventuate in some adjustment in the method or the posted reference range or both. [I.Phos example.]

Analytical Range vs. Linear Limits

“Analytical Range” is a set of numbers that comes from the manufacturer and describes the outer markers at which a test method is reliable under optimum conditions. It has its roots in the K-1, FDA approval criteria for the specific analytical procedure. A wide range of variables, including chemical kinetics, wavelengths, sample-to-reagent ratio, timing and so on, are adjusted in order to effect a repeatable, discrete instrument response that is directly attributable to the target constituent and that will cover the kind of values likely to be encountered in the clinical setting with any frequency. Analytical Ranges are primarily of technical interest.

    Linear Limits

“Linear Limits”, on the other hand, are numbers that have been derived by us. They define the lowest and highest constituent levels that we can routinely and independently prove will generate an instrument response mathematically identical to the response generated by the analyte standard(s) per unit of measure; i.e., concentration, activity, number, etc. These are the lowest and highest test values we can report without effecting some manipulation of the sample or the system or both. We look to the “Linear Limits” as stated in our written procedures to determine whether we will have to dilute the sample, either directly or indirectly, and rerun it.
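The reporting decision the Linear Limits drive can be sketched as a simple range check. The analyte, its limits, and the messages below are invented examples, not values from any DBQ written procedure:

```python
# Illustrative sketch only: the analyte, its limits, and the dilution advice
# below are invented examples, not values from any DBQ written procedure.
LINEAR_LIMITS = {"glucose": (10.0, 700.0)}  # mg/dL, hypothetical limits

def reporting_decision(analyte, result):
    """Decide whether a result can be reported directly or needs rework."""
    low, high = LINEAR_LIMITS[analyte]
    if result > high:
        return "dilute and rerun"      # above the highest provable response
    if result < low:
        return "cannot report; handle per procedure"
    return "report directly"

print(reporting_decision("glucose", 950.0))  # prints "dilute and rerun"
```

In practice the limits, and the dilution protocol invoked when they are exceeded, come from the written procedure for each analyte.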


8/9/97, S. Raymond; Technical Director, DBQ Pathology Associates
1/16/98 S. Raymond; revised
September 2003 S. Raymond; revised

Intersite Analytical Results Comparability Verification:

Comparing Test Results between sites; purpose/principle:

Occasionally, it becomes expedient or even necessary to temporarily route samples for testing that is routinely done at the site of initial processing to another laboratory site within the enterprise doing the same testing. Any number of circumstances might trigger this transfer, but typically it is driven by an effort to optimize workload, to trouble-shoot some element of the analytical system or data capture process, or to compensate for a short-term staffing imbalance. Whatever the cause, it is important to ensure that results from the testing protocols used for this kind of backup are comparable.

The analytical systems of laboratory sites under the direction of Dubuque Pathology Associates; i.e., Mercy Dubuque, Finley, Cathedral Square, Mercy Dyersville, Medical Associates East and West campuses, are carefully matched and monitored under the design and direction of the group Technical Director.

Distributed Standardization:

Not only analytical methods, but also the instruments, collection/storage systems, written policies, procedures, reagent lots, control lots, calibrator lots and documentation formats are the same for testing that is common between sites.

This program of Distributed Standardization provides considerable stability in the local clinical laboratory community. Routine method bias is virtually eliminated, reference ranges and reports are essentially identical, and rapid problem recognition and remedy are enabled.

Monitoring:

Proficiency Testing subscriptions are also matched. Consultant copies of the Tri-annual Summary reports are forwarded to the Technical Director where the results of these challenges are formally evaluated primarily for deviation from group-specific means as well as bias throughout clinically appropriate ranges within our own group.

Investigation of any statistically significant discrepancy can take the form of control or reagent swapping, instrument component exchange, linearity checks, patient parallels or any combination of the above depending upon the nature of the apparent departure from the expected.

Intersite Analytical Method Parallel (JCAHO “Method Validation”)

The Analytical Method Parallel is performed twice per year at the direction of the office of the Technical Director, to comply with Joint Commission regulations that state:

      ● “The laboratory performs correlations to evaluate the results of the same test performed with different methodologies or instruments or at different locations. (QSA.02.08.01).”
      ● “The laboratory verifies the accuracy and reliability obtained for nonregulated analytes and for those regulated analytes for which compatible proficiency testing samples are not available. (QSA.01.05.01)”

Daily Hematology Review:
The City Control (Refer to “Hematology City Control: Preparation, Application and Management” procedure) is reviewed daily to monitor the performance of the different Hematology instruments in use at sites under the direction of Pathology Associates.

Survey Review:

      ● Where the same constituents are assayed by more than one method within the system, results from third party proficiency testing, if common to the methods, will be formally compared (e.g., ACT, Blood Gases, HbA1c, Hematology).
      ● Where the same constituents are assayed by the same method at different sites, results from third party proficiency testing are formally compared. (Since this is done automatically, these tests are not listed in the table.)

Site Parallels:

      ● Where the same constituents are assayed by different methods within the system and different third party proficiency testing is performed, two samples will be assayed at the sites performing the testing and the results will be formally compared (e.g., [Gem Premier Hematocrit / LH750 Hematocrit], [PXP Glucose / DxC Glucose], [DxC Sodium, Potassium, Glucose / Gem Premier Sodium, Potassium, Glucose], [Access Troponin / i-STAT1 Troponin], [Xpand Plus Creatinine / i-STAT Creatinine], [i-STAT1 iCa / Gem Premier iCa], [DxC HbA1c / A1cNow], [ABO Rh gel method / ABO Rh tube method], [Antibody Screen gel method / Antibody Screen tube method]). This parallel is not required for waived testing; however, it is performed for internal purposes where possible.
      ● Where same constituents are assayed by the same method at different sites within the system and third party proficiency testing is not available, five samples are split and assayed at the sites performing the testing and the results are formally compared. Since all sites performing this testing are CLIA certified, this fulfills requirement QSA.01.05.01 for the tests in this category (e.g., Cryoglobulin, Methylene Blue Stain, MPV).

Other:

      ● For MRSA/SA-BC, MRSA/SA-SSTI and VanA, third party proficiency testing is not available. According to Cepheid, the Microbiologics controls can be used for proficiency testing. The Microbiologics controls are routinely run with each new lot number and each shipment.

 

Intersite Analytical Method Parallel (JCAHO “Method Validation”)

| Test | Method/Instrument | Site/Dept | Sample Type | Action |
|---|---|---|---|---|
| ACT | PCL | MMC-DBQ Lab; MMC-DBQ Cardiac Cath | ACT survey | Formally compare survey results |
| ACT | Hepcon | MMC-DBQ Surgery | ACT survey | Formally compare survey results |
| Blood Gases | i-STAT | MMC-DBQ; MMC-DV | Aqueous blood gas survey | Formally compare survey results |
| Blood Gases | Gem Premier 3000 | MMC-DBQ Surgery | Aqueous blood gas survey | Formally compare survey results |
| HbA1c | DxC | Finley | EDTA whole blood | Formally compare survey results (GH2 survey) |
| HbA1c | DCA2000+ HbA1c | MAE | EDTA whole blood | Formally compare survey results (GH2 survey) |
| HbA1c | Xpand Plus | MAE/MAW | EDTA whole blood | Formally compare survey results (GH2 survey) |
| HbA1c | Metrika A1c Now | Finley Diabetes Center | EDTA whole blood | 2 samples run on DxC and A1c Now |
| Hematocrit / Hemoglobin | Beckman Coulter LH750 | MMC-DBQ; Finley | City control | City control run daily |
| Hematocrit / Hemoglobin | Beckman Coulter LH500 | MMC-DV; MAE/MAW | City control | City control run daily |
| Hematocrit / Hemoglobin | Hematocrit Centrifuge | MMC-DBQ; MMC-DV | City control | City control run daily |
| Hematocrit / Hemoglobin | Gem Premier HCT | MMC-DBQ Surgery | Heparin whole blood collected at different times during bypass procedure | Two samples will be run on the Gem Premier 3000 and then taken to the MMC-DBQ Lab to be run on the LH750 |
| Hematocrit / Hemoglobin | Hemocue Hgb | MA Satellites/Pediatrics | City control | City control run weekly |
| Hematocrit / Hemoglobin | Hemocue 201 Hgb | MMC Respiratory Care; Clayton County VNA (parallel not possible at CCVNA) | EDTA whole blood | Two samples are run on the MMC LH750 & Hemocue |
| Glucose | CX/DxC/Xpand Plus | MMC-DBQ; MMC-DV; Finley; MAE/MAW | Heparin plasma | Formally compare survey results; two samples run on DxC and a representative PXP meter |
| Glucose | Precision PXP | MMC Nursing Service; Finley Nursing Service; Cascade; Tri-State Surgery | Heparin whole blood | Formally compare survey results; two samples run on DxC and a representative PXP meter |
| Na/K/Glucose/Chloride | CX/DxC/Xpand Plus | MMC-DBQ; MMC-DV; Finley; MAE/MAW | Multiqual | Formally compare survey results; two samples run on MMC DxC & Gem Premier |
| Na/K/Glucose/Chloride | Gem Premier | MMC Surgery | Multiqual | Formally compare survey results; two samples run on MMC DxC & Gem Premier |
| Troponin I | Access | MMC-DBQ/MMC-DV/Finley | Lithium heparin plasma | Two samples are run on the Access and i-STAT1 at each site |
| Troponin I | i-STAT1 | MMC-DBQ; MMC-DV; Finley | Lithium heparin whole blood | Two samples are run on the Access and i-STAT1 at each site |
| Creatinine | Xpand Plus | MAE/MAW | Lithium heparin plasma | Formally compare survey results; two samples run on Xpand Plus and i-STAT at MAW |
| Creatinine | i-STAT | MAE | Lithium heparin whole blood | Formally compare survey results; two samples run on Xpand Plus and i-STAT at MAW |
| Cryoglobulin | Wintrobe | MMC-DBQ; Finley | Serum | Site parallel, 5 samples |
| Methylene Blue Stain | Difco TB Methylene Blue | CS-Micro; MMC-DBQ; MMC-DV; Finley | CSF | Site parallel, 5 samples |
| Ionized Calcium | i-STAT1 Chem 8+ | MMC-DBQ; Finley | Lithium heparin whole blood | Site parallel, 2 samples |
| Ionized Calcium | Gem Premier | MMC Surgery | Lithium heparin whole blood | Site parallel, 2 samples |
| Na/K/Cl/tCO2/BUN/Glu/Creat | i-STAT1 Chem 8+ | MMC-DV | Lithium heparin whole blood/plasma | Site parallel, 2 samples |
| Na/K/Cl/tCO2/BUN/Glu/Creat | CX5 | MMC-DV | Lithium heparin plasma | Site parallel, 2 samples |
| MPV | Beckman Coulter LH750/LH500 | MMC-DBQ; MMC-DV; Finley; MAE/MAW | EDTA whole blood | Site parallel, 5 samples |
| MRSA/SA-BC | Cepheid | MMC-DBQ; Finley | Microbiologics Kwik Stik controls | Assayed controls run with every new lot number and every shipment |
| MRSA/SA-SSTI | Cepheid | MMC-DBQ; Finley | Microbiologics Kwik Stik controls | Assayed controls run with every new lot number and every shipment |
| VanA | Cepheid | MMC-DBQ; Finley | Microbiologics Kwik Stik controls | Assayed controls run with every new lot number and every shipment |

1. All results are recorded on the Data Acquisition Log and sent to the office of the Technical Director at United Clinical Labs-Cathedral Square.
2. Results will be evaluated on a case-by-case basis by the UCL Technical Director to determine if method match is adequate in the specific clinical setting.
3. Records will be scanned and indexed for recovery.

References:

  1. Proficiency Testing and Control Material available for Xpert MRSA, Xpert GBS, Smart GBS and Xpert EV. Cepheid. 4/8/2009.

2/16/00; S. Raymond: Technical Director; DBQ Pathology Associates
December 2002, S. Raymond/L. McGovern (Revised: Intersite Analytical Method Parallel added)
July 2004 L. McGovern (Revised: Intersite Analytical Method Parallel)
February 2005 L. McGovern (Revised: Intersite Analytical Method Parallel)
July 2005 L. McGovern (Revised: Intersite Analytical Method Parallel)
September 2005 L. McGovern (Revised: Intersite Analytical Method Parallel)
October 2006 L. McGovern (Revised: Intersite Analytical Method Parallel) *1 Megan Sawchuk, MT JCAHO agent in phone conference follow-up to UCL PPR, Oct. 2006
July 2007 L. McGovern (Revised: Intersite Analytical Method Parallel)
October 2007 L. McGovern (Revised: Intersite Analytical Method Parallel)
February 2008 L. McGovern (Revised: Intersite Analytical Method Parallel)
August 2008 L. McGovern (Revised: removed Primidone)
October 2008 L. McGovern (Revised: added Acetone)
April 2009 L. McGovern (Revised: added MRSA/SA-BC, MRSA/SA-SSTI)
July 2009 L. McGovern (Revised: added APT)
September 2009 L. McGovern (Revised: MRSA/SA-BC, MRSA-SA-SSTI, added iCa)
March 2010 L. McGovern (Revised: Intersite Analytical Method Parallel)
September 2010 L. McGovern (Revised: Intersite Analytical Method Parallel)
September 2011 L. McGovern (Revised: Intersite Analytical Method Parallel)
March 2012 L. McGovern (Revised: Intersite Analytical Method Parallel)


Phoned Result and/or Order

Protocol For Phoning A Result

  1. Identify yourself and the laboratory (This is Sue at United Clinical Laboratories - Mercy).
  2. Identify the test and the patient; use two patient identifiers for patient identification, the patient’s name and either their date of birth or a unique identifying number (I have a Glucose result on Martha Smith, birthdate February 22, 1922).
  3. State the date and time the test was collected. (The specimen was drawn at 0815, on September 16th).
  4. Give the result including the units (The Glucose is 210 mg/dl)
  5. Ask the person to repeat the entire result back to you, including the date and time of collection.
  6. Ask the person to identify himself or herself.
  7. Document in the CLICS “Called Results” function. Refer to “Phoned Results Documentation; CLICS” protocol. When CLICS is not operational document on the primary record.

          ● Who called the result.
          ● The name of the person who is recording/reading back the result and their location.
          ● The date and time the result was called.
          Example: SW to BF (ER) 9/16/05 0900

  8. For critical values, follow the “Protocol for Reporting Critical Values” located on the UCL web page.

Protocol For Taking A Phoned Result

  1. Phoned results should be written on the special Laboratory Results form.
  2. Space is provided for the following; make sure all information is complete:

          ● Patient name and unique identifier number or date of birth
          ● Date and time of call
          ● Test name
          ● Date and time of test collection
          ● Result
          ● Called by
          ● Called to
          ● Special notes

  3. Repeat the entire result back to the caller. Record any action taken.
  4. For critical values, follow the “Protocol for Reporting Critical Values” located on the UCL web page.

Protocol for Phoned Microbiology Result:

Call the following test results to the nurse’s station (if inpatient or nursing home resident) or Doctor’s office (if outpatient):

  1. Any gram stain ordered "STAT"
  2. Positive direct fluorescent antibody and/or culture for Bordetella pertussis from the reference laboratory.
  3. Positive Penicillin Binding Protein (PBP2’) test; also contact the appropriate Epidemiologist (if inpatient).
  4. MRSA (methicillin-resistant S. aureus), VRE (vancomycin-resistant Enterococcus), VRSA (vancomycin-resistant S. aureus) and penicillin-resistant or penicillin-intermediate (by MIC) S. pneumoniae isolates, as well as Neisseria meningitidis isolates from CSF and blood. Contact the appropriate Epidemiologist, nursing station and Doctor's office.
  5. Positive Stool Culture.
  6. Positive Shiga Toxin test.
  7. Positive Ova and Parasite
  8. Positive Giardia/Cryptosporidium Screen
  9. On Sundays and holidays call only positive Strep Screen and positive Urine Culture results to the physician on call.
  10. Positive Strep screen results to Finley Employee Health
  11. Strep screen results to Loras and Clarke Colleges
  12. Vet cultures as requested
  13. Positive Acid Fast Smear/Culture
    Contact the appropriate nurse’s station or doctor’s office and Epidemiologist.
  14. Positive Fungus Culture for Histoplasma capsulatum, Coccidioides immitis, Cryptococcus neoformans, Blastomyces dermatitidis and Sporothrix schenckii.
  15. Follow the Phoned/Faxed Result Documentation; CLICS procedure for documenting the called result.
  16. For critical values, follow the "Protocol for Reporting Critical Values” or “Critical Values Reporting, Microbiology” located on the UCL webpage.

Protocol for Phoned Order:

Federal regulations require that a laboratory have written confirmation of a physician's verbal order. When a physician or his designee calls United Clinical Laboratories with patient orders:

  1. Transcribe verbal physician orders on an outpatient order form. Read the entire transcribed verbal order back to the person calling the physician’s order for verification.
  2. Request that the order be confirmed in writing within 30 days. Record confirmation was requested along with the date of the request on the patient requisition.
  3. Direct the person calling the order to either send the written confirmation with the UCL courier or fax the confirmation to the site where the orders were called.
  4. Staple the written confirmation to the UCL patient requisition for permanent filing.

June 1988 M. Bonifas CAP Inspection Committee;
September 1992 S. Hosch (Revised: Added section I.3)
January 1994 C. Sullivan (Microbiology phoned result protocol)
December 1995 M.J. Bonifas (Phoned orders)
August 2000 M. English (Revised: combined all phoned result/order protocols)
August 2005 J. Mueller (Revised: patient identifiers)
April 2009 J. Wedig (Revised: protocol for phoned Microbiology results)


Outdated Materials, Use of

Principle:

The continued use of any reagent, standard, control or other time-dated laboratory consumable past its expiration date requires the authorization of the Technical Director. If the Technical Director is not available, a department/site supervisor may grant tentative authorization for the extended use of outdated materials, but this tentative authorization must be formally approved by the Technical Director at the earliest opportunity.

Procedure:

    1. Check every chemically or biochemically degradable laboratory product for its expiration date every time it is used.
    2. If the outdate is approaching and if a routine switch to fresh-dated material is not anticipated, notify the department head or laboratory manager.
    3. If the outdate has been exceeded and authorization by the Technical Director for continued use of the product is not posted, do NOT use it. Contact the department head or laboratory manager.
    4. Never assume that past authorization for the continued use of an outdated product will serve as tacit permission to do it again.
    5. Occasionally a manufacturer will “extend dating” on a product. Continued use of these products will still require authorization from the Technical Director.


June 1986 S. Raymond


Package Insert File Program

Principle:

It is fairly common for products used in the clinical laboratory to undergo modifications in manufacturing, handling instructions, material components, information, etc. Occasionally, one of these changes will have a considerable impact on the procedure governing the use of the product. In order to bolster our efforts at recognizing and reacting appropriately to these changes, the Package Insert File program has been implemented.

Program:

  1. All package inserts are filed alphabetically by product name (as it appears on the package insert) in a file cabinet or drawer, which is clearly labeled and readily accessible to the laboratory staff at all times.
  2. The package inserts are to be filed by work area. All package inserts for products used in a given work area must be filed in that work area. This may result in some duplication of package inserts within the laboratory. The following work areas have been identified:

    MMC-DBQ work areas: Blood Bank; remainder of laboratory

    MMC-DV work areas: entire laboratory

    Finley work areas: entire laboratory

    Cath. Square work areas: Histology/Cytology; Microbiology

    Medical Assoc. work areas: entire laboratory

  3. Anytime a primary package or box of any reagent or consumable used in the laboratory is opened, remove the package insert and compare the revision date of the current insert to the one on file:

      A. If the revision dates are the same, discard the insert from the package just opened.
      B. If the revision dates do not match:

          a. Compare the revisions of the current insert to the one on file to see what has changed.
          b. If the change is likely to affect the procedure, consult with the Site Supervisor/Manager before using the product.
          c. Clip the current insert to the one that has been on file and leave them on the Site Supervisor’s/Manager’s desk for review, information dissemination and re-filing.

  4. Sometimes the manufacturer will include a special product notice or bulletin with the package in lieu of changing the insert. In this case, the notice or bulletin should be stapled to the package insert and the pair should be treated as a revision. See “Product Notices and Safety Recalls” for handling policy.

 


June 1988 S. Raymond, CAP Inspection Committee
August 1994 L. McGovern (Revised: II.2.)

Specimen Integrity

Citrated Plasma For Coagulation Studies

    (Protime, APTT, Thrombin Time, Fibrinogen, DVVT and D-Dimer)

Principle:

Plasma is the blood component used in laboratory tests to evaluate clotting. Soluble citrates act as anticoagulants by combining with the calcium in whole blood to form insoluble calcium salt. Plasma thus obtained can be studied by the addition of sufficient calcium to neutralize the anticoagulant that was added.

Materials:

  1. Greiner brand (CardinalHealth #GR-454334B), 3.0 ml draw, tubes containing 0.3 ml 3.2% sodium citrate.
  2. Greiner brand (CardinalHealth #GR-454322), 2.0 ml draw tubes, containing 0.2 ml 3.2% sodium citrate.
    (Use of the 2 ml sodium citrate tubes must be approved by the Site Supervisor for each patient. The 2 ml sodium citrate tubes are not to be stored in the phlebotomy trays.)
  3. 100 µl Rainin pipette
  4. Rainin pipette tips (Rainin #201)
  5. 5.0 ml sterile plastic syringes (CardinalHealth #SY35005LL)
  6. Centrifuge

Procedure:

    Plasma/Anticoagulant Ratio:

  1. The ratio of plasma to citrate solution is of primary concern:
    1 volume anticoagulant to 9 volumes whole blood.
  2. Volume comparators for the 2.0 ml and 3.0 ml tubes have been provided at the bench as standards against which all coagulation samples must be evaluated for adequate fill and plasma/packed cell ratio.
    (See Citrate Comparator Prep for instructions on how these are made.)
    A comparator tube has two indicators drawn on it. The top indicator represents the meniscus of a properly filled tube. Patient samples must fill within this area.
    A patient who has a high Hematocrit will have a reduced ratio of plasma to whole blood with a relative increase in anticoagulant. The net effect will be the same as an under-filled tube.
    The lower indicator line of the comparator tube represents a Hematocrit of 55%. The red cell layer of a properly centrifuged patient sample should fall at or below this line. If the red cell layer falls above this line the patient must be redrawn according to the following procedure:

      A. 3.0 ml tube:

        a. Using a 100 µl Rainin pipettor, remove and discard 100 µl of the sodium citrate from an unused 3.0 ml draw Greiner vacuum tube.
        b. Using a sterile, plastic syringe, draw 3.0 ml of blood from the patient.
        c. Remove the needle from the syringe and immediately transfer 2.7 ml of the blood to the citrate tube that has been specially prepared.
        d. Replace the blue stopper and mix by inverting 10 times.
        e. Assay the sample within one hour.

      B. 2.0 ml tube:

        a. Using a 50 µl Rainin pipettor, remove and discard 50 µl of the sodium citrate from an unused 2.0 ml draw Greiner vacuum tube.
        b. Using a sterile, plastic syringe, draw 2.0 ml of blood from the patient.
        c. Remove the needle from the syringe and immediately transfer 1.8 ml of the blood to the citrate tube that has been specially prepared.
        d. Replace the blue stopper and mix by inverting 10 times.
        e. Assay the sample within one hour.

  3. Any samples prepared with the reduced anticoagulant should be assayed with the original sample when possible. If a patient known to have a Hematocrit > 55% has a regularly scheduled Coagulation test, only a reduced anticoagulant tube is used. Newborns with a Hematocrit of > 55% can be drawn using a 3.0 ml or 2.0 ml reduced anticoagulant tube.
      1. If the results of the “reduced anticoagulant sample” are less than or equal to the results of the “original sample”, report the result from the “reduced anticoagulant sample”.
      2. If the results of the “reduced anticoagulant sample” are greater than the results of the “original sample”, consult with a pathologist before reporting a result and contact the office of the Technical Director.
  4. If an order is received on a newborn less than 14 days old, for any of the tests listed above, check to see if a CBC was run on the newborn patient. If the Hematocrit was > 55%, collect a 3.0 ml or 2.0 ml draw reduced anticoagulant tube.
  5. If a patient with a known high Hematocrit has a regularly scheduled Coagulation test, a 3.0 ml reduced anticoagulant tube can be drawn first. Be sure to check the Hematocrit of the sample collected against a comparator tube in case there is a change in the patient’s Hematocrit.
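The volume arithmetic behind the reduced-anticoagulant tubes can be checked against the commonly cited CLSI H21 adjustment formula, C = 0.00185 × (100 − Hct) × V. The coefficient is an outside assumption, not a value stated in this procedure; the sketch only illustrates the arithmetic:

```python
# A check of the reduced-anticoagulant volume arithmetic against the commonly
# cited CLSI H21 formula, C = 0.00185 * (100 - Hct) * V. The coefficient is an
# outside assumption, not a value stated in this procedure.

def required_citrate_ml(hct_percent, blood_volume_ml):
    """Citrate volume (ml) suggested for a given hematocrit and blood volume."""
    return 0.00185 * (100 - hct_percent) * blood_volume_ml

# Standard 3.0 ml tube: 0.3 ml citrate + 2.7 ml blood gives the 1:9 ratio.
# At the Hct 55% comparator limit, the formula suggests about 0.22 ml:
print(round(required_citrate_ml(55, 2.7), 2))  # prints 0.22
```

For a 2.7 ml draw at the 55% comparator limit the formula yields roughly 0.22 ml of citrate, close to the 0.2 ml left after removing 100 µl from the standard 0.3 ml fill.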

    Venipuncture:

  1. The sample must not be hemolyzed or contaminated with tissue juices. If the venipuncture has been difficult, with much probing for the vein, the first tube filled should be discarded and a second tube should be filled and used as the test sample.
  2. If a syringe is used to draw the blood, it must be a new, disposable plastic syringe.
  3. It is not permissible to draw blood for coagulation tests from an arm with an I.V. solution containing heparin, except when there is no other alternative. Results of coagulation studies drawn above or below a heparin I.V. should be reported with the disclaimer: "Disclaimer: drawn above/below I.V. containing heparin." It is not necessary to notify a pathologist unless a problem or question arises that needs his/her input.

    Preparation of specimens for testing:

  1. Specimens should be centrifuged as soon as possible after collection. Centrifuge the specimen according to instructions on the centrifuge label for blood collection tubes. The plasma must be absolutely cell-free with the platelets and white cells packed out in a firm buffy coat.
  2. Specimens should be unstoppered only during testing and then only long enough to remove plasma for the assay(s). Specimens that are hemolyzed, under-filled or even slightly clotted, must be rejected. Normally lipemic or icteric specimens will not affect coagulation results on photo-optical instruments.
    If a result cannot be obtained on a lipemic sample the sample must be ultracentrifuged and rerun.
  3. In the event samples must be transferred to another site to be tested, the citrated plasma must be carefully removed from the centrifuged tube using a plastic dispopipette and placed in a plastic transfer tube. Be very careful not to disturb the cell layer while removing the plasma. Citrate samples that have been resuspended following centrifugation should not be recentrifuged and tested. If samples are resuspended, platelet activation may occur, which could affect results.

    Storage:

  1. Protime (PT):
      1. Specimens for PT assays, centrifuged or uncentrifuged with plasma remaining on top of the cells in an unopened tube at 18-24°C, should be tested within 24 hours from the time of specimen collection.
        Note: Storage at 2-4°C may result in cold activation of Factor VII and therefore alter the PT results.
      2. If the testing for PT specimens is not completed within 24 hours the plasma should be removed from the cells and transferred to a plastic tube. Samples are frozen at -20°C for up to two weeks. Frozen samples should be rapidly thawed at 37°C while gently mixed. Samples should be centrifuged and tested immediately. If testing cannot be performed immediately the sample may be held for a maximum of 2 hours refrigerated, until tested.
  2. APTT:
      1. Specimens for APTT assay stored at 2-4°C or 18-24°C should be centrifuged within one hour of collection and the plasma tested within four hours from the time of specimen collection.
        Note: The APTT may be affected on specimens that have been frozen.
      2. If the specimen is to be transported, the plasma should be removed within one hour of collection and tested within four hours from the time of specimen collection.
  3. Fibrinogen, Thrombin Time, D-Dimer and DVVT:
      1. Specimens stored at 2-4°C or at 18-24°C should be centrifuged and tested within four hours of specimen collection.
      2. If the testing is not completed within 4 hours the plasma should be removed from the cells and transferred to a plastic tube. Samples are frozen at -20°C for up to two weeks. Frozen samples should be rapidly thawed at 37°C while gently mixed. Samples should be centrifuged and tested immediately. If testing cannot be performed immediately the sample may be held for a maximum of 2 hours refrigerated, until tested.
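The storage windows above can be restated as a simple lookup, useful when deciding whether plasma must be separated and frozen. The times come from this procedure; the code itself is only an illustrative sketch, not laboratory software:

```python
# Restates the room-temperature testing windows from this procedure as a
# lookup; the code itself is only an illustrative sketch, not lab software.
STABILITY_HOURS = {
    "PT": 24,            # unopened tube, 18-24 C
    "APTT": 4,           # centrifuge within 1 h; test plasma within 4 h
    "Fibrinogen": 4,
    "Thrombin Time": 4,
    "D-Dimer": 4,
    "DVVT": 4,
}
FROZEN_LIMIT_DAYS = 14   # plasma at -20 C, per the freezing instructions

def must_freeze(test, hours_since_collection):
    """True when the room-temperature window for this test has lapsed."""
    return hours_since_collection > STABILITY_HOURS[test]

print(must_freeze("PT", 26), must_freeze("APTT", 3))  # prints: True False
```

A sample past its window goes to a plastic tube at -20°C, then is rapidly thawed at 37°C, centrifuged and tested, as described above.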

References:

  1. Davidsohn, Israel, M.D.; John Bernard Henry, M.D.; Clinical Diagnosis by Laboratory Methods, 15th Edition, p. 435; 1969.
  2. Koepke, J.A., M.D.; M.J. Olivier M.T.; American Journal of Clinical Pathology: “Pre-Instrumental Variables in Coagulation Testing”. Vol. 64, pp 591-596; 1975.
  3. NCCLS Vol. 17 No. 18: H21-A3; “Collection, Transport and Processing of Blood Specimens for Coagulation Testing and General Performance of Coagulation Assays”; approved guideline; 3rd Ed.
  4. Steve Raymond, Technical Director Pathology Associates; Dubuque, IA.
  5. NCCLS Vol. 23 No. 18: H21-A4; “Collection, Transport and Processing of Blood Specimens for Coagulation Testing and General Performance of Coagulation Assays”; approved guideline; 4th Ed.

 


2-25-81 S. Raymond
1-10-91 L. Kelley, MT(ASCP)
12-1-93 J. Schultz, MT(ASCP)(Revised: note under III-3-B)
6-28-96 L. Kelley, MT(ASCP) (Revised: added III.2.C.)
4-9-98 L. McGovern, (Revised: III.1. and III.4.)
February 1999 L. McGovern (Revised: III.3.Note)
April 2002 L. McGovern/S. Raymond (Revised: III.1.B-D., 3.B.Note, 4.; IV.3-4.)
May 2002 L. McGovern (Revised: II., III.B,D.)
June 2008 L. McGovern (Revised: updated for ACL Elite Pro)
August 2009 L. McGovern (Revised: updated for NCCLS guidelines Storage 1-3.; Reference 5.)
September 2009 L. McGovern (Revised: for Greiner 3.0 ml sodium citrate tubes)
April 2010 L. McGovern (Revised: for Greiner 2.0 ml sodium citrate tubes)

Specimen Integrity;
Hematology

Principle:

Hematology Specimen Integrity addresses criteria for inspecting, accepting and rejecting specimens received into the hematology section. It should be read and understood by all phlebotomists as well as anyone performing analysis in Hematology.

General Guidelines:

    Obtaining a Greiner vacuum tube or microtainer specimen:

  1. Tube and anticoagulant:
      1. CardinalHealth #GR-454209B EDTA (K2) 4 ml
      2. CardinalHealth #GR-454246B EDTA (K2)(HG) 3 ml
      3. CardinalHealth #363706 Dipotassium (K2) EDTA Microtainer MAP (Microtube for Automated Process)

Note: Heparin specimens from cardiac surgery cases may be accepted for hemoglobin only.

  2. Sample: whole blood.
  3. Volume:
      1. Optimum: full tube
      2. Minimum: tube ¼ full
      3. Minimum Microtainer: 375 µl (minimum blood volume required)
      4. Maximum Microtainer: 500 µl (do not fill above the 500 µl line)
  4. Specimens MUST BE INVERTED, gently but completely, 5 to 8 times immediately following collection. (If multiple Greiner tubes are being drawn, the filled tube must be mixed prior to the collection of the next tube or while the next tube is filling.) COMPLIANCE HERE IS ABSOLUTELY IMPERATIVE!
  5. Special Handling: DO NOT ASSAY COLD SPECIMENS! If a specimen is cold, it must be allowed to equilibrate to room temperature before mixing. DO NOT WARM THE SPECIMEN BY PLACING IT INTO A 37°C WATERBATH OR AN INCUBATOR. If the whole blood specimen is brought to room temperature quickly there is a tendency for the platelets to clump upon mixing.
  6. If a specimen was in any way difficult to obtain, make note of the type of difficulty on the requisition.

    Processing all specimens:

  1. Refer to the “Specimen Handling: Identification, Integrity & Rejection” policy in the Quality Assurance Manual.
  2. Checking for clots in specimens for any parameter of a CBC/Diff:

    Open tube sampling systems:

      1. Do a visual check for clots that may adhere to the stopper or the cap.
      2. Prior to running the sample, if clotting is questionable, insert two wooden applicator sticks into the tube and extend to the bottom of the tube. Twist the sticks and withdraw while watching for small clots adhering to the surface of the sticks. Separate the sticks to determine if a clot is present between the sticks. If clotting is present, the specimen must be rejected. If no sign of clotting is present, proceed with testing.
      3. Microtainer tubes are handled as described above, but only one stick is inserted into the microtainer when checking for clots.

    Closed tube sampling systems:

      1. Check the requisition for notes indicating any difficulty obtaining the specimen. (If difficulty was indicated then remove the stopper and proceed as described above.)
      2. Visibly check the closed tube for any signs of clotting. If clotting is present, the specimen must be rejected. If no sign of clotting is present, proceed with testing.
  3. If a specimen must be rejected once it has reached the lab:
      1. Call the appropriate nurse's station. Inform them why the specimen was rejected and ask for a new requisition if appropriate. Arrange for another specimen to be collected.
      2. If the specimen is from an outpatient, notify the doctor's office so they can notify the patient, or obtain the patient's phone number and arrange with the patient to recollect the specimen. If the specimen is from a non-UCL site (e.g., Medical Associates Lab or a veterinary office), notify them of the need to reject and they will contact the patient and/or doctor's office. If the specimen is from a nursing home patient, notify the nursing home and determine with them when the test may be rescheduled for collection.
      3. Write the initials of the individual who was informed on the current requisition, along with the date and time called and your initials.
      4. Make a note on the current requisition of why the specimen was rejected and file the accession copy with the current day's accession copies.
      5. In some cases, our office staff will need to be notified if a patient will be coming in for recollection or if a test needs to be credited.
  4. Using an EDTA tube that has been centrifuged:
    An EDTA tube that has been centrifuged may be used for CBC analysis as long as plasma has not been removed from the tube and all other specific guidelines are met. Before use, the EDTA tube must be placed on a rocker for 15 minutes to allow the formed elements to be resuspended.

Specific Guidelines:

    CBC/Diff:

  1. Storage stability: up to 24 hours at room temperature, up to 48 hours at 2-8°C.
  2. Microtainer specimens are stable for 12 hours at room temperature.
  3. Hemolyzed specimens are generally unacceptable. (Hemolysis may be acceptable in cases where the patient is hemolyzing intravascularly. Check with a pathologist.) Current instrumentation will generate a flag when hemolysis is not acceptable. Refer to “Peripheral Blood Smear (PBS) Review, Technologist/Pathologist” procedure, section IV.1.
  4. Hemoglobin, MCH, and MCHC on grossly lipemic specimens must be corrected. (Refer to the “Hemoglobin, Correction for Lipemia” procedure.) Current instrumentation will generate a flag when lipemia is not acceptable. Refer to “Peripheral Blood Smear (PBS) Review, Technologist/Pathologist” procedure, section IV.1.

    Hemoglobin and Hematocrit:

  1. Storage stability: up to 24 hours at room temperature, up to 48 hours at 2-8°C.
  2. Microtainer specimens are stable for 12 hours at room temperature.
  3. Hemolyzed specimens are generally unacceptable. (Hemolysis may be acceptable in cases where the patient is hemolyzing intravascularly. Check with a pathologist.) Current instrumentation will generate a flag when hemolysis is not acceptable. Refer to “Peripheral Blood Smear (PBS) Review, Technologist/Pathologist” procedure, section IV.1.
  4. Hemoglobin on grossly lipemic specimens must be corrected. (Refer to the “Hemoglobin, Correction for Lipemia” procedure.) Current instrumentation will generate a flag when lipemia is not acceptable. Refer to “Peripheral Blood Smear (PBS) Review, Technologist/Pathologist” procedure, section IV.1.

    Platelet Count:

  1. Storage stability: up to 24 hours at room temperature, up to 48 hours at 2-8°C.
  2. Microtainer specimens are stable for 12 hours at room temperature.

    Reticulocyte count:

  1. Storage stability: up to 24 hours at room temperature, up to 48 hours at 2-8°C.
  2. Microtainer specimens are stable for 12 hours at room temperature.
  3. Hemolyzed specimens are generally unacceptable. (Hemolysis may be acceptable in cases where the patient is hemolyzing intravascularly. Check with a pathologist.)

    WBC and Peripheral Blood Smear:

  1. Storage stability: up to 24 hours at room temperature, up to 48 hours at 2-8°C.
  2. Microtainer specimens are stable for 12 hours at room temperature.
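The storage-stability windows repeated through the guidelines above (24 hours at room temperature, 48 hours refrigerated, 12 hours for Microtainer specimens at room temperature) can be sketched as a single acceptability check. This is a minimal illustration with hypothetical names; the limits are the ones stated in this procedure.

```python
from datetime import timedelta

# Limits from this procedure; names are hypothetical.
ROOM_TEMP_LIMIT = timedelta(hours=24)      # Greiner tube, room temperature
REFRIGERATED_LIMIT = timedelta(hours=48)   # Greiner tube, 2-8°C
MICROTAINER_LIMIT = timedelta(hours=12)    # Microtainer, room temperature only

def within_stability(age, refrigerated=False, microtainer=False):
    """True if a hematology specimen of the given age is still within
    its storage-stability window."""
    if microtainer:
        return age <= MICROTAINER_LIMIT
    return age <= (REFRIGERATED_LIMIT if refrigerated else ROOM_TEMP_LIMIT)
```

For example, a 30-hour-old tube fails at room temperature but passes if it was refrigerated; a 13-hour-old Microtainer specimen fails.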

References:

  1. Bayer Advia 120 Operator Manual
  2. Miale, John B., Laboratory Medicine Hematology, 6th Ed., 1982, pp 351-357.
  3. Clinical Laboratory Handbook for Patient Preparation and Specimen Handling, Fascicle II: Hematology; CAP, Sept. 1982.
  4. BD Microtainer MAP (Microtube for Automated Process) with K2 EDTA. Becton Dickinson Company, Franklin Lakes, NJ. 4/2010.
  5. LH700 Series Operator’s Guide. Beckman Coulter. PN4277249C.

 


December 1985 J. Mueller
April 1997 J. Schmitz
January 2001 L. McGovern/S. Raymond (Revised: II.4. added)
May 2006 S. Hosch/L. McGovern (Revised: for new hematology analyzers)
March 2011 L. McGovern (Revised for K2 EDTA Microtainer MAP tubes, storage & stability info, ref 4-5 added)
April 2011 L. McGovern (Revised: Processing all specimens section)

Specimen Integrity;
Veterinary Specimens

Principle:

Criteria are outlined for inspecting, accepting and rejecting veterinary specimens received in Specimen Processing.

Protocol:

Refer to the “Specimen Handling: Identification, Integrity & Rejection” policy in the Quality Assurance Manual.

Note: This policy is limited to veterinary specimens and is not applicable to human specimens.

  1. The assumption will be made that the veterinarian who has placed the order has evaluated and accepted the condition of the specimen(s) being submitted. An attempt will be made to complete analysis on all veterinary specimens unless one of the following conditions is noted (see 2 in this section).
  2. The veterinarian’s office should be contacted if:
      1. insufficient sample is received to perform the analysis
      2. specimen for CBC is clotted
      3. specimen for coagulation studies is either improperly filled or clotted
  3. Specimen Processing will routinely adhere to the following guidelines in the handling of veterinary specimens without consulting a pathologist:
    Comments relating to specimen integrity are entered into the Report Comment field in CLICS and noted on the requisition.
    Examples are:
      1. hemolyzed
      2. >48 hours old
      3. serum bloody; respun
      4. lipemic
      5. lipemic; airfuged
  4. If there is no identification on the specimen, call the submitting office and attempt to get verbal identification. If the office cannot verify the identification over the phone, send the specimen back to them.

 


October 1993 L. Kelley/S. Raymond
November 2004 L. McGovern/S. Raymond (Revised: II.3.B.)

Specimen Integrity
Turbidity Assessment, Serum

Turbidity Scale:

Note: "Standardized print" refers to font attributes as they appear in this printed statement or in the body text of our analytical procedure template; i.e., Arial 10 - 12 - plain.

  1. Negative:
    Clear serum. Standardized print can be seen through serum, and letters are distinguishable. Light is transmitted through serum.
  2. Trace:
    Slight haze in serum. Standardized print can be seen through serum, and letters are distinguishable. Light is transmitted through serum.
  3. 1+:
    Light turbidity in serum. Standardized print can be seen through serum, but letters are not distinguishable. Light is transmitted through serum.
  4. 2+:
    Turbid serum. Standardized print cannot be seen through serum. Light is transmitted through serum.
  5. 3+:
    Heavy turbidity in serum. Standardized print cannot be seen through serum. Light is barely transmitted through serum.
  6. 4+:
    Very dense, milky white serum. Standardized print cannot be seen through serum. No light is transmitted through serum. Totally opaque.
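The scale above turns on three observations: whether standardized print can be seen through the serum, whether letters are distinguishable, and how much light is transmitted. As a minimal sketch with hypothetical names, the grades can be expressed as a lookup on those observations; note that Negative and Trace are separated only by the presence of a slight haze, which these three criteria alone do not capture.

```python
# Hypothetical lookup expressing the turbidity scale in this procedure.
# Key: (print visible, letters distinguishable, light transmitted) -> grade.
TURBIDITY_GRADES = {
    (True,  True,  "yes"):    "Negative or Trace",  # separated only by slight haze
    (True,  False, "yes"):    "1+",
    (False, False, "yes"):    "2+",
    (False, False, "barely"): "3+",
    (False, False, "no"):     "4+",
}

def grade_turbidity(print_visible, letters_distinct, light):
    """Map the three bench observations to a grade; raises KeyError for
    a combination the scale does not describe."""
    return TURBIDITY_GRADES[(print_visible, letters_distinct, light)]
```

For example, serum through which print is visible but letters are not distinguishable grades as 1+; totally opaque serum grades as 4+.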

Reference:

  1. Lipid Turbidity Scale. Beckman CX3 Operation Manual Appendix. Beckman Instruments, Brea, CA.
  2. Steve Raymond. Dubuque Pathology Associates.

 


L. McGovern; March 1996

Specimen Handling: Identification, Integrity & Rejection

Principle:

All specimens received in the laboratory must be properly identified and meet specific integrity requirements to assure the best patient care. This policy establishes guidelines for acceptability and steps to take to resolve problems.

Procedure:

All specimens must be labeled in the patient's presence per required standards (Joint Commission; College of American Pathologists).

The following criteria must be met for any specimen to be tested. Any deviations are documented on a Specimen Problem Form (SPF) and filed in the appropriate Incidence Log in the specimen processing area.

Specimen processing personnel are responsible for ensuring all criteria have been met before referring any specimen. If a referred specimen’s identification and/or collection documentation is incorrect or incomplete, call the referring laboratory for resolution and complete and file an SPF.

Certain instances require a client's participation in the completion of the SPF and are indicated in the instructions that follow.

1. Identification

      A. Laboratory Collectibles

        a. Specimen

          1. Two patient identifiers are required: patient name AND unique identification number (e.g., medical record number; Typenex number; birth date; patient's account number; client number)
          No patient identifiers: Reject unless extreme circumstances exist (e.g., unable to redraw patient). If questionable, consult a supervisor, pathologist or designee. Complete and file an SPF.
          One patient identifier: Return to phlebotomist/collector for complete patient identification. Complete and file an SPF.
          2. Collection date, time or collector’s initials missing:
          Transcribe the information from the accompanying requisition if the accession # matches, or return the specimen to the collector for completion. If the collector is not available use the appropriate disclaimer(s). Complete and file an SPF.

        b. CLICS requisition
        All collector information must be completed. Transcribe the information from the specimen to the requisition if the specimen has the missing information, or return the requisition to the collector for the missing information. No further documentation needed.

      B. UCL Client Collectibles (excluding Veterinary)

        a. Specimen

          1. Two patient identifiers are required: patient name AND unique identification number (e.g., medical record number; Typenex number; birth date; patient's account number; client number)
          No patient identifier: Call the client and reject the specimen unless extreme circumstances exist (e.g., unable to obtain another suitable specimen).
          Rejection agreed to by client: complete an SPF and attach a copy of the SPF to the order for office tracking. File the original SPF.
          Client requests the unlabeled specimen be tested:
          If the specimen does not have time restrictions, send the specimen (only if easily transported; e.g., a culturette swab), office paperwork and the SPF to the client for follow-up. Save a copy of the SPF in the pending area at the receiving site awaiting the return of the specimen and order. File the returned completed SPF.
          If the specimen must be tested the day of arrival for appropriate patient care, FAX the SPF to the client for identification confirmation, noting that the client’s signature denotes responsibility for the identification of the specimen. Results are held until the client returns (preferably by FAX the same day) the completed SPF. File the returned completed SPF.
          Enter the comment “Specimen received unlabeled; client verified.” in the Report Comment section of the result.

          2. One patient identifier:
          Call the client and inform them the specimen was not adequately identified. FAX an SPF to the client for identification verification. Hold the results until the completed SPF is returned (preferably by FAX the same day). File the returned completed SPF.
          3. Specimen does not have the date/time of collection:
          Use the appropriate disclaimer(s) and complete and file an SPF. No client contact is necessary.
          4. Specimen does not have the collector’s initials:
          Use “UNK” as the collector. No further documentation is necessary.

          5. Specimen requires a fixative and was received with no fixative label:
          Call the client and ask if a fixative was added. Add a fixative label if the client added fixative, or add the fixative (if appropriate) and a fixative label if the client did not. Complete and file an SPF.
          6. Specimen and client request do not match:
          A name or identification number mismatch must be clarified by the client. Call the client to inform them of the discrepancy and FAX the SPF. Hold the results until the completed SPF is returned. File the returned completed SPF.

        b. Client request (order)

          1. Call the Client for missing information and complete and file an SPF:
          Insurance Information
          Gender
          Date of Birth
          ICD9 code
          Ordering Physician
          Tests requested
          Specimen Source (type or body site) if not blood

          2. Date and/or Time of Collection:
          If not provided on the specimen, use the appropriate disclaimer(s) and complete and file an SPF. No client contact is necessary.
          3. Initials of Collector:
          If not provided on the specimen use “UNK” as the collector. No further documentation is necessary.

          4. Priority Level:
          No action is necessary.

      C. Veterinary Collectibles
      Note: Veterinary testing is not regulated by UCL’s accrediting agencies. Due to the unique nature of veterinary testing, only one patient identifier is required.

        a. Veterinary specimens require only one identifier (animal name or identification number). Call the veterinary office for clarification and complete and file an SPF if:

          1. Unlabeled specimen received with identifying paperwork.
          2. Specimen label does not match the request.

        b. If necessary information is not included on the request (or on the specimen) call the veterinary office for clarification and complete and file an SPF. Necessary information includes:

          1. Animal type
          2. Specimen source (type or body site) if not blood
          3. Owner’s name
          4. Ordering veterinarian

        c. If the Date and/or Time of collection are not provided use the appropriate disclaimer(s) and complete and file an SPF. No client contact is necessary.
        d. If the collector’s initials are not provided use “UNK” as the collector. No further documentation is necessary.

      D. Backup Testing Form
      If any information is missing, call the referring site. No further documentation is necessary.

        a. Name of the referring site
        b. Initials of person notifying the receiving site
        c. Date and time of notification
        d. Receiving site
        e. Accession numbers of each specimen
        f. Test(s) requested

      E. Aliquotted (separated) specimen

        a. If a specimen is accompanied by a requisition or backup testing form, only the accession number is required. An unlabeled aliquot is rejected unless extreme circumstances exist. Call the referring site and ask for a new, labeled aliquot. Complete and file an SPF.
        b. If no requisition or backup testing form accompanies the specimen all the following information is required: Call the referring site if more information is needed. No further documentation is necessary.

          1. Patient name
          2. Patient identification number
          3. Accession number
          4. Date and time of collection
          5. Initials of phlebotomist/collector.

      F. Blood Bank Specimen (refer to the “Patient Identification” policy)
      Blood Bank Specimens are rejected if the required identifiers are missing. Request a redraw and complete and file an SPF.

        a. The medical record number must be handwritten from the patient's hospital armband onto the specimen or the accompanying requisition.
        b. A Typenex number must be placed on the specimen (or handwritten) if the Typenex band is the patient's identification.

2. Integrity

      A. Retrievable specimens not meeting the required integrity for testing are rejected.

        a. Arrange for recollection.
        b. Cancel and/or reorder the test if necessary.
        c. Document the reason for rejection on the SPF and include all follow-up. File the completed SPF.

      B. Irretrievable specimens (e.g., CSF, biopsy, pre-antibiotic specimen) not meeting the required integrity for testing are brought to the attention of a supervisor, pathologist or designee (with the exception of veterinary specimens).

        a. Run the ordered test(s) if possible.
        b. Include an appropriate disclaimer comment with the result.
        c. Document the event on the SPF and include all follow-up. File the completed SPF.

      C. QNS specimens – consult with the ordering physician as to which tests to perform. Document the physician response on the SPF and cancel all tests that could not be performed. File the completed SPF.

 


1-20-89 M. English
8-13-93 M. English (Revised: format; II.3.B.)
3-14-95 S. Raymond (Revised: added II.3.B.c.)
6-23-96 M.J. Bonifas (Revised: II.1.B.1-6.)
4-3-97 J.A. Schmitz (Revised: added II.1.A.g.-i. & II.3.C.)
12-17-97 S. Hosch (Revised: added II.1.A.j. & II.1.B.g.)
1-13-98 E. Steiner (Revised: added II.1.B.h. & II.3.B.e.)
January 2007 L. McGovern (Revised: adding a test requiring new accession no.)
January 2009 S. Hosch (Revised: Nurse/Physician/UCL Client collectibles-patient identifiers)
July 2010 S. Rodriguez, S. Raymond (Revised: for two patient identifiers; mod. sample relabeling)

Reported Erroneous Results

Reported Erroneous Results

Principle:

The Reported Erroneous Results policy directs and documents correction and follow-up activity when an erroneous result has been released from the laboratory by phone, computer or charted report. The purpose of the policy is to prevent harm to the patient and the occurrence of similar errors in the future.

Protocol:

When a reported error is discovered, notify the supervisor (or acting supervisor) immediately. The supervisor will take the following steps, in order:

  1. Make an initial determination as to whether the error was isolated or was likely to have involved more than one patient; e.g.: specimen or result interchange, sequence error in the run, etc.
  2. If there is any question about whether the erroneous result(s) should be corrected, contact a pathologist at this time.
  3. Immediately launch a plan for retesting any other patient result(s) that may have been in error. Proceed to the next step without waiting for these results.
  4. Take the following actions for each report that is known to be in error:
      1. Phone the physician’s office or nursing station and ask the nurse or ward clerk to determine if the incorrect report has been charted.
      2. If it has not been charted:
          Instruct the nurse or ward clerk to intercept the incorrect report and send it back to the lab to your attention.
      3. If it has been charted:
          Make absolutely certain that the nurse or ward clerk is looking at the incorrect report, then
          have that person draw a single red line through the erroneous result(s) and write the word “ERROR” next to the erroneous result(s). Instruct the nurse or ward clerk to record the date & time and initial the entry.
      4. Ask the nurse or ward clerk whether the attending physician or any other physician has or may have received the report.
      5. If the physician(s) has (have) received the report, or if there is any uncertainty whether it has been received by the physician(s), contact each physician and inform him/her that:
        The reported “name of the test” result on “patient name” in “room #”, at “hospital” (if appropriate), drawn at “time and date”, was incorrect.
        If the corrected result is available at the moment, it may be given verbally to the physician.
      6. If a copy of the incorrect result has been forwarded to a data entry station for entry into the computer, either personally go to that data entry station and remove it or contact data entry personnel and have them intercept the report and return it to your attention.
      7. If the incorrect result has been entered into the computer, either replace the result with the text value “rslt removed-error” in the computer or direct a data input person to replace the result with the text value “rslt removed-error”.
      8. Check the laboratory’s appropriate primary record, if there is one, to see if it contains the incorrect result. If it does, write in red ink next to the result “WRONG RESULT” and initial the entry.
  5. Take steps now to obtain the correct result if it was not immediately available. If you are able to obtain the correct result:
      1. Call the nurse’s station or physician’s office and inform the nurse or ward clerk that:
        The corrected “name of the test” on “patient name” drawn at “time and date” in “room #” (if appropriate) is “correct result”.
      2. If the incorrect result(s) were removed from the computer:
        Input the correct result(s) and enter the following comment in the “Report Comment” section of the LIS:
        The “name of test” drawn at “time and date” has been corrected for error.
        Note: The CLICS application will append an additional comment “Test results have been amended;” followed by a date/time stamp of when the change occurred. This may seem somewhat redundant and is purposely so.

        If the original report contained test values in addition to the incorrect result(s), make sure that those original values are re-reported at this time.
        Do this personally or designate a data entry person to do it.
      3. During downtime scenario:
        Issue a new requisition/report (if appropriate) with the correct result and indicate on the requisition/report that
        The “name of test” drawn at “time and date” has been corrected for error.
        If the original requisition/report contained test values in addition to the incorrect result(s), make sure that those original values are re-reported at this time.
  6. Fill out the “Reported Erroneous Result Log” (RERL) and make sure that:
      1. A pathologist has been informed of the situation.
      2. The principal individual(s) involved with the generation of the incorrect result(s) have filled in the appropriate section. Each individual involved should print their name following their entry.
  7. Send the RERL to the Technical Director/CIO. Send the original; do NOT make a copy, since a copy can result in divergent documentation in those cases where additional follow-up is deemed necessary. The single-source RERL is digitized and indexed and can be retrieved either upon request to the Technical Director/CIO or directly by managers/supervisors who are authorized for OnBase access.

Reported Erroneous Results
RERL Log; page 1

Reported Erroneous Result Log
RERL Log;
page 2

 


6/20/89 S. Raymond/J. Brennan, M.D.
7/10/86 T. Edmonds, M.D. (Revised: II.2.D.a.)
September 2001 M. English/S. Raymond (Revised: II.2.E.b-c. & F.; format)
April 2007 M. English/S. Raymond (Revised: protocol 4.g. & RERL)

Calibration Verification; CLIA Specifications

Regulations for Calibration Verification:
Joint Commission Standard QSA.02.03.01

  1. The laboratory has a written procedure for calibration verification that includes the following, at a minimum:
      1. The requirements established by the instrument manufacturer.
      2. The number of calibration verification levels.
      3. The type of calibration verification materials used.
      4. The concentration of the calibration verification materials.
      5. The frequency of calibration verification.
      6. The acceptable performance limits for the calibration verification.
  2. The laboratory tests the reportable range of results during the calibration verification process, including a minimum value, a midpoint value and a maximum value, based on the manufacturer’s directions and instrument history.
    Note: The Joint Commission does not require the purchase of commercial linearity kits to meet this requirement. Quality control materials, previously tested proficiency testing samples with known results, and calibration materials are acceptable to use for calibration verification.
  3. Calibration verification is performed every six months.
    Note: Semiannual calibration verification is not required when the laboratory performs calibration at least once every six months using three or more levels of calibration materials that include a low, mid and high value.
  4. Calibration verification is performed whenever the following events occur:
      1. A complete change of reagents for a procedure is introduced, unless it is demonstrated that changing reagent lot numbers does not affect the range used to report patient test results, and control values are not adversely affected by reagent lot number changes.
      2. Major preventative maintenance is performed or critical parts are replaced that may influence test performance.
      3. Quality control results indicate that there may be a problem with the test system.
      4. An environmental change occurs, including instrument relocation.
      5. An instrument is replaced.
      6. Quality control materials reflect an unusual trend or shift or are outside the laboratory’s acceptable limits, and other means of assessing and correcting unacceptable quality control values fail to identify and correct the problem.
  5. The laboratory follows its procedure for calibration verification. The calibration verification performance is documented.

Calibration Verification Policy
Reference: CAMLAB Update 1, March 2012.
Reportable Range is verified during the Calibration Verification process on those procedures to which Calibration Verification applies.

For Calibration Verification, tests are broken down into categories and acceptable materials determined for each test.

Reportable Ranges are adjusted in the relevant analytical procedures to correspond with the six month AMR (Analytic Measurement Range - Calibration Verification).


Test List with assigned Calibration Verification Categories & material (pgs 1-5)

Calibration Verification Categories:

  1. If the laboratory performs a calibration protocol with 3 or more levels of calibration materials that include a low, mid, and high value at least every 6 months, the calibration verification requirement is met.
    • Tests included:
      DxC: AMPH, BARB, BENZ, COCM, OP2, PCP, THC5, UCRP, CAR, DIGN, GEN, PHNB, PHNY, THE, TOB, VPA, VANC, ACTM
      DxI/Access: T4 FREE, PSA, Myoglobin, Troponin I, CKMB, Estradiol, Progesterone, Cortisol, BhcG, TSH, FSH, LH, Folate, Ferritin, Prolactin, Free PSA, CA125, CEA, B12, T4, PTH, PTHio, Testosterone, AFP
      Architect: Homocysteine, Anti-CCP, 25-OH Vitamin D2/D3
      Xpand Plus: ALB, ALP, ALT, AST, BUN, CA, CHOL, CK, CRE, CRP, DBILI, GGT, GLU, HDLD, HBA1C, LD, LDLD, MA, PHOS, TBILI, TP, TG
  2. The calibration verification requirement does not apply to a variety of procedures, including but not limited to procedures involving an instrument for which calibration is not practical (e.g., the APTT procedure). These tests are exempt.
    • Tests included: APTT, Thrombin Time, Protime, Fibrinogen, DVV, Hepcon ACT, Gem PCL ACT, TU, Platelet Function testing (P2Y12), Sed Rate.
  3. Qualitative tests. Calibration determines a cutoff value that sample results are compared to and then results are reported as Positive or Negative. Calibration Verification not required.
    • Tests included:
      Architect: Hepatitis C Ab, Hepatitis BsAg, Hepatitis BsAb, Hepatitis A Ab, HIV Combo
      QuantiFERON TB Gold, Aptima Chlamydia & GC, Polymedco iFOB
      BacT/Alert: Blood Culture
  4. Unit Use Waived Tests: Calibration Verification not required.
    • Tests included: Hemocue HGB, PXP Glucose, Coaguchek XS INR, Cholestech (Chol/TG/HDL/Glucose), Bilichek, Metrika A1c, DCA 2000+ HgbA1c (Bayer), INRatio 2 INR.
  5. Unit Use – not waived/qualitative Calibration Verification not required.
    • Test included: Urine TCA (qualitative), Amnisure, GeneXpert: MRSA-Nasal Screen PCR, MRSA/SA Blood Culture PCR, MRSA/SA Deep Tissue, Strep B, C. diff, VanA, FLU.
  6. Semiquantitative Calibration Verification not required.
    • Tests included:
      Clinitek Status
      : UA Chemical Screen
  7. Unit Use – not waived. Calibration Verification required.
    • Tests included: BNP, iSTAT blood gases, iSTAT1 Troponin I, iSTAT Creatinine, i-STAT Lactate, i-STAT Chem 8+ (Na, K, Cl, TCO2, BUN, Glucose, Creatinine, iCa)
  8. Tests that require Calibration Verification.
    • Tests included:
      Beckman DxC/CX5
      : ALB, ALP, ALT, AMM, AMY, ASO, AST, BUN, CA, CHOL, CL, CK, CO2, CRE, DBIL, ETOH, FE, GGT, Glu, HDLD, HPT, IBCT, IGA, IGG, IGM, K, LD-L, LDLD, Lipase, LITH, MG, Na, PO4, RF, TBIL, TP, M-TP, TG, URIC ACID, CRP, MA, SALY, CRPH, Prealbumin, HbA1c
      Xpand Plus
      : NA, K, CL
      Elite Pro
      : D-Dimer
      Nanoduct Sweat Chloride
      LH500/LH750
      : RBC, HGB, WBC, PLT, Retic
      Gem Premier
      : pH, pO2, pCO2, Na, K, Cl, iCa, Glucose, Hgb, Hct
      Lead
      Osmolality
      Gem OPL
      : Total Hgb

      Note:
      “The requirements for analytical measurement range verification apply only to those parameters that can be calibrated and that are measured directly. These requirements do not apply to calculated parameters. Most oximeters can be calibrated for only total hemoglobin. Therefore, AMR verification should be performed for total hemoglobin, but it is not needed for derived quantities, such as carboxyhemoglobin fraction or methemoglobin fraction.”

      (CAP; Laboratory Accreditation Newsletter, Queries and Comments. 05/31/2011)
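
      The eight categories above amount to a lookup: given a test, decide whether a separate calibration verification is required. A minimal sketch in Python, assuming a simple decision-table layout (category membership is abbreviated to a few example tests each, not the complete rosters; note that in practice the instrument also matters, since e.g. ALB appears under both the Xpand Plus in category 1 and the DxC in category 8):

```python
# Illustrative decision table for the eight categories above.
# Membership sets are abbreviated examples, NOT the complete lists.
CATEGORIES = {
    1: {"cal_ver_required": False,  # met by routine 3-level calibration every 6 months
        "tests": {"AMPH", "DIGN", "T4 FREE", "PSA", "TSH"}},
    2: {"cal_ver_required": False,  # calibration not practical; exempt
        "tests": {"APTT", "Protime", "Fibrinogen", "Sed Rate"}},
    3: {"cal_ver_required": False,  # qualitative, cutoff-based
        "tests": {"Hepatitis C Ab", "HIV Combo", "Blood Culture"}},
    4: {"cal_ver_required": False,  # unit use, waived
        "tests": {"Hemocue HGB", "Coaguchek XS INR"}},
    5: {"cal_ver_required": False,  # unit use, not waived / qualitative
        "tests": {"Amnisure", "C. diff"}},
    6: {"cal_ver_required": False,  # semiquantitative
        "tests": {"UA Chemical Screen"}},
    7: {"cal_ver_required": True,   # unit use, not waived
        "tests": {"BNP", "iSTAT blood gases"}},
    8: {"cal_ver_required": True,   # quantitative, measured directly
        "tests": {"D-Dimer", "Lead", "Osmolality"}},
}

def calibration_verification_required(test: str) -> bool:
    """Return True if *test* needs a separate calibration verification."""
    for category in CATEGORIES.values():
        if test in category["tests"]:
            return category["cal_ver_required"]
    raise KeyError(f"{test!r} is not classified; consult the full lists above")
```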

20050322 L. McGovern/S.Raymond
20050922 L. McGovern
20081021 L. McGovern/S.Raymond
20091119 L. McGovern
20100427 L. McGovern
20101012 L. McGovern
20110504 L. McGovern
20111007 L. McGovern
20111121 L. McGovern
20111216 L. McGovern/S.Raymond

Information Technology Problem Report/ ITPR

ITPR Purpose/design:

The Information Technology Problem Reporting program, with its ITPR template [see ITPR template following this paragraph], guides laboratorians through the recognition, documentation and resolution of IT problems they might encounter. Like the IPR (Instrument Problem Report), it is an evidence-chain format designed to coordinate detection, recognition and remediation of LIS problems early on, ideally while they are still relatively minor and before they pose serious interruption of services and consumption of resources.

ITPR Content/layout:

The ITPR is divided into 7 sections:

  1. Demographic: This is the header section and includes the name and location of the I.T. hardware/software along with the date, time and name of the tech filing the report.
  2. Symptoms: This section requires that the operator attempt to register the hardware/software performance deviation as essentially associated with an operational malfunction, physically apparent malfunction, or some other broad category in an effort to trigger consideration of the nature of the error.
  3. Initial Attempts at Resolution: Laboratorians are typically encouraged to make initial attempts at resolving information technology problems with a reboot or restart of the I.T. hardware. This section asks for a brief description of any initial attempts.
  4. Status: Shutdown/Monitor: The operator, management personnel or I.T. Support Staff can shut the hardware/software down or elect to allow it to continue in use while formally calling for additional, specific observation. The choice is indicated by a check mark in the box provided at Section 4 of the ITPR. Space has been provided on the back of the ITPR for documentation of continued observation.
  5. Evidence of Resolution: In order to obviate assumptions about problem resolution, Section 5 of the ITPR requires some evidence that initial or front-line remedial action was indeed effective. This evidence could be as simple as a description of a physical phenomenon, such as resumption of operational activity of the hardware/software. Evidence for resolution will, of course, be dependent upon the nature of the problem and the symptoms that triggered its recognition.
  6. Call Activation: In those cases when initial attempts at problem-solving are not successful, or if the results of those attempts are uncertain, Section 6 of the ITPR is filled out. Recognition and action recorded on the ITPR to this point are communicated to the LIS Project Coordinator or the on-call IT Support Staff, who will be responsible for assessing the situation and bringing additional resources to bear if warranted.
  7. Review: As with most of the QA programs of DBQ Pathology Associates, the Information Technology Problem Reporting system is designed to disseminate information and then close the circle; i.e., bring information, observation, pattern recognition, resolution, conclusion, etc. back to the individual(s) who initiated the process. There are several levels of review incorporated into the ITPR:

        It is always reviewed by the LIS Project Coordinator.
        It is frequently reviewed by the LIS Director and/or Technical Director/Chief Information Officer (CIO).
        Final resolution is communicated by the LIS Project Coordinator to the Supervisor for dissemination of information to staff.

      Section 7 allows space for formal comment from LIS Director and/or the Technical Director/CIO.
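
The ITPR's sections map naturally onto a record structure. A hypothetical Python sketch of that layout (field names and the `resolved` helper are illustrative assumptions, not the actual form's labels):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ITPRRecord:
    """Hypothetical model of the ITPR form described above."""
    # Section 1: Demographic header
    system_name: str
    location: str
    date_time: str
    filed_by: str
    # Section 2: Symptoms (broad category, to trigger consideration of the error)
    symptom_category: str                 # "operational" | "physically apparent" | "other"
    # Section 3: Initial Attempts at Resolution (typically a reboot/restart)
    initial_attempts: str = ""
    # Section 4: Status: Shutdown/Monitor
    shut_down: bool = False               # False = left in use under observation
    observation_notes: List[str] = field(default_factory=list)
    # Section 5: Evidence of Resolution
    resolution_evidence: Optional[str] = None
    # Section 6: Call Activation (escalation when front-line fixes fail)
    escalated_to: Optional[str] = None    # e.g., "LIS Project Coordinator"
    # Section 7: Review
    reviewer_comments: List[str] = field(default_factory=list)

    def resolved(self) -> bool:
        # Per Section 5, resolution must be evidenced, never assumed.
        return self.resolution_evidence is not None
```

The `resolved` check mirrors the program's insistence that closure requires documented evidence rather than an operator's assumption.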


2005 S. Raymond (Technical Director/CIO); M. English (LIS Project Coordinator)

Product Notices and Safety Recalls

Almost every day in our system, a manufacturer/provider of clinical laboratory material (instrumentation, reagents, hard consumables, software) must communicate an important message or directive to the end user regarding the handling of a specific product.

Content of these notices ranges from the purely informational, requiring no action, through procedural workarounds, to discontinuance or, on occasion, outright recall.

Notices are sent by the manufacturer/provider to the laboratory site(s) where the product in question is registered with them. The format is usually a letter by mail, fax or email (sometimes all three) and is almost always accompanied by a form to be filled out and returned to the supplier, documenting that the target site has received the message and understood the content. (See example.)

Handling Product Notices

The laboratory site manager or department manager reviews the notice, reacts and formally responds to the provider. Initial activity prompted by the notice, including the fact that the loop has been closed with the manufacturer/provider, is documented on the Product Notice. It is dated and signed by the manager and then forwarded to the Office of the Technical Director for review, further action if warranted, and scanning/indexing for future reference and retrieval.

Oftentimes a Product Notice sent to one site will have a longer-term ripple effect on the rest of the enterprise, triggering responses such as, but not limited to:

    ● lot discontinuance
    ● product recovery
    ● redistribution
    ● procedural compensation
    ● consolidation of specific testing
    ● results retrieval and review
    ● physician notification


20070228 S. Raymond

Unlikely Results

In addition to Reference (normal) Range, Critical values, Important Called Results (ICR), Reportable Range and Delta flagging, CLICS screens results of select analytes for Unlikely values. An “Unlikely” result is statistically improbable under any circumstances and needs to be investigated.

  1. If the result is a manual entry, review the raw data. Transcription of the wrong number, a misplaced decimal, or entry into the wrong field is often causative.
  2. Make sure the sample type matches the requisition and, if applicable, is properly selected on the instrument.
  3. Carefully inspect the sample itself for clots, extreme cloudiness, very high hematocrit, etc. If any of these are present, request a repeat sample acquisition.
  4. Check with the person who obtained the specimen and try to verify that it was collected properly. IV contamination and improper aliquoting continue to contribute to this kind of error. If there is any doubt here, request a repeat sample acquisition.
  5. Verify system integrity by reviewing surrounding results of the same constituent. If surrounding results are suspect, discontinue testing for that analyte and immediately contact an instrument specialist.
  6. Repeat the trial with a control close to the same concentration and then review the results with the supervisor/site pathologist.
  7. Once sample integrity is verified and the repeat trial verifies the result, refer to the analytical procedure for steps to follow when a result breaches the Reportable Range.
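
The investigation above is essentially a short decision procedure. A hypothetical sketch, assuming boolean findings from each checklist step (the dictionary keys are illustrative assumptions, not CLICS fields):

```python
def investigate_unlikely_result(findings: dict) -> str:
    """Walk the Unlikely Results checklist above and return the indicated action.

    Illustrative only; `findings` holds booleans from each verification step.
    """
    # Step 1: for manual entries, check the raw data for transcription errors first
    if findings.get("manual_entry") and findings.get("transcription_error"):
        return "correct the transcription and re-verify"
    # Steps 2-3: sample type mismatch or physical flags (clots, cloudiness,
    # very high hematocrit) call for a new specimen
    if findings.get("sample_type_mismatch") or findings.get("sample_flags"):
        return "request repeat sample acquisition"
    # Step 4: doubtful collection (IV contamination, improper aliquoting)
    if findings.get("collection_in_doubt"):
        return "request repeat sample acquisition"
    # Step 5: suspect surrounding results for the same analyte mean a system problem
    if findings.get("surrounding_results_suspect"):
        return "discontinue analyte testing; contact instrument specialist"
    # Steps 6-7: repeat with a near-concentration control; a verified result that
    # breaches the Reportable Range follows the analytical procedure's steps
    return "repeat with matched control; if verified, follow Reportable Range procedure"
```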

20120215 S. Raymond (Technical Director/CIO)
20120306 S. Raymond (Technical Director/CIO)
