Laboratory Quality Assurance
Philosophy and design
Simply put, the product, or raison d’être, of clinical laboratory practice is the acquisition and conversion of analytic data into information that clients (most often physicians) can use in the effective and efficient management of patient care. “Quality Assurance” is the broad term typically applied to the system of processes, procedures and programs organized to provide a means of monitoring, evaluating and improving that product.
The DBQ Pathology Associates Quality Assurance Program can be functionally defined through a description of operational directives detailed either in the analytical procedures themselves or in the dedicated operational documents listed below:
Important Called Results (ICR)
Maintenance, Function Verification Protocol & Logs
Patient Parallel Testing
Quality Control; Histology/Cytology
Reportable Result (RR Breach)
Special Maintenance Reports (SMR)
Quality Control; General
Quality Control can be conceptually, if not practically, segmented from Quality Assurance. For our purposes it can be thought of as that component of Quality Assurance which concerns itself directly and immediately with the analytical process. Extended discussions attempting to dissect the difference between Quality Assurance and Quality Control are, however, largely low-level cerebral exercises not particularly beneficial in terms of the effective implementation of either.
A more fruitful discussion might center on two common, misguided notions that actually do affect the application of Quality Control:
Because it is considerably easier to transcribe data than it is to think about it, and because the former is frequently substituted for the latter, it is necessary to emphasize that, while one of its fundamental aspects is indeed clerical, the purpose of Quality Control is not transcription.
Neither is it the principal object of Quality Control to demonstrate how well things are going but, more nearly the opposite, to highlight anomaly and draw attention to any error that might be occurring. The focus of a well-staged program will be on early problem detection, definition and resolution; it should result ultimately in the systematic reduction of failure rate.
Quality Control; purpose/design:
The Quality Control program of DBQ Pathology Associates provides a means of detecting variance from expected analytical behavior and, if variance is detected, furnishes a straightforward framework or clear path of action which will:
Endogenous or “internal” controls are the best means of routine quality assessment on unit-dose testing.
Endogenous controls are built into the assay method itself and they provide an ostensible means of assessing integrity of reagent and proper execution of the procedure with each and every analytical trial. When available, endogenous controls will be formally integrated into testing verification and documentation.
Our Rapid Strep A procedure with custom log is a good example of effective endogenous control application.
[See excerpt from Strep A screen procedure, section X and its log template.]
Advances in biochemistry, microelectronics and computing have led to the development of many “Unit-dose” or “Unit-use” clinical laboratory methodologies.
Common to each is that different phases of the measurement process, e.g., matrix separation, reagent addition, incubation, tag, signal generation, etc. are all built into a discrete testing module. Design of these modules tends to be intricate, fairly clever and generally results in a one test – one trial, self-contained analytical system.
Advantages of this engineering genre are that it tends to be relatively portable and, even more importantly, it substantially insulates the operator from typical variables so that the expertise which has to be brought to bear in order to get consistent results is significantly reduced. (Unfortunately, along with the decreased technical expertise also comes a lack of training and discipline in terms of managing the data.)
Additional disadvantages need to be considered when administering this type of testing, including much higher disposable costs per reportable result, typically underestimated training and ongoing competency review costs, and batch or “run” control logistics. Exogenous controls cannot be included in a run to directly prove the integrity of the batch because every test is isolated from every other test; every test is a run of one. Traditional quality control mechanisms are simply not feasible. Hence the evolution of the notion and practice of “Equivalent QC”.
Although cost and control can be problematic, there are a number of applications where Unit-use testing may be the best fit: “bedside” applications where turnaround is critical to patient care; remote screening applications where trained laboratorians are not on board or available, as with “walk-in” cholesterol, strep or Hgb testing at some physician offices; and some very low-volume applications where larger, multi-analyte instruments are not feasible.
Traditional, bilevel control material with a substantial delta will be assayed when available and when appropriate. QC testing is routinely rotated among the personnel who perform the test. (See next paragraph for frequency.)
Physical properties of control material such as viscosity, optical effect, dissolved and formed elements, etc. will, where technologically feasible, mimic or parallel the matrix and content of the specimen types typically analyzed. Limit numbers (or target values when the results being generated are non-numerical) will be clearly defined along with explicit instructions for Testing Personnel to follow when a Q.C. attempt “fails” or falls outside of what has been determined to be acceptable.
Routine frequency: Unless described differently in the individual analytical procedure, bilevel, exogenous or “external” controls will be run on the default schedule; i.e.:
once every 24 hours on any day the test is performed
when there is a reagent lot change
immediately following calibration
when stipulated in specific maintenance protocol.
The process of generating a series of replicate analyses on this traditional, exogenous control material can play a significant role in monitoring the “correctness” of an assay; it does not, however, directly address the issue of accuracy. The statistical leverage provided by the effort is a numerical representation of repeatability or consistency; i.e., precision, and as such it can be an effective indicator of deviation from performance history.
An assay can of course be historically incorrect, consistently wrong or precisely inaccurate.
Defining Control Action Limits
Action limits for quality control material are routinely based on two elements: statistical and cognitive. The statistical base is derived from locally established system performance of typically no fewer than 60 consecutive trials over at least one and a half months after baseline studies and initially set at +/- 3SD. This statistical range is then evaluated and possibly adjusted to strike a balance between two basic dynamics - annoyance alerts and the clinical ramifications of being wrong. Any variance from this general approach is detailed in individual analytical or operational procedure.
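To make the statistical base concrete, here is a minimal sketch (ours, not part of any laboratory system; Python used purely for illustration) that derives a mean, SD and initial +/- 3SD action limits from a series of consecutive control trials:

```python
# Illustrative sketch only: deriving initial QC action limits from
# locally established performance. Names and values are ours.
from statistics import mean, stdev

def initial_action_limits(trials, sd_multiplier=3.0):
    """Mean, SD and +/- (sd_multiplier)SD limits from consecutive control trials."""
    if len(trials) < 60:
        raise ValueError("policy calls for no fewer than 60 consecutive trials")
    m = mean(trials)
    sd = stdev(trials)                      # sample standard deviation
    return m, sd, m - sd_multiplier * sd, m + sd_multiplier * sd

# trials = [...]  # >= 60 consecutive control results for one constituent/level
# m, sd, lo, hi = initial_action_limits(trials)
# The statistical range is then reviewed and may be widened or tightened
# to balance annoyance alerts against the clinical cost of being wrong.
```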
Many, but certainly not all, of our automated systems now produce analytical precision which is significantly beyond what is required or even helpful clinically. A simple reagent lot change can generate a minor, uncompensated shift in a control mean that will easily increase the incidence of breach in the traditional 2SD limit while not producing an “error” that translates to anything like clinical significance. On the other hand, there are some testing systems that still struggle to generate the kind of precision or stability required by clinical decisions potentially triggered by the results they produce. Ca++ at <3%, very low end troponins and ACTs come to mind. In these cases, limits may need to be tightened from the statistical margin as well as other precautions taken to ensure early warning.
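The lot-change effect described above is easy to quantify. Assuming control results are normally distributed (an idealization), this sketch computes the probability of breaching a fixed 2SD limit as the control mean shifts:

```python
# Illustrative sketch: probability that a control falls outside fixed
# +/-2SD limits when the mean shifts (e.g., after an uncompensated
# reagent lot change). Assumes normally distributed control results.
from statistics import NormalDist

def breach_probability(shift_in_sd, limit_in_sd=2.0):
    """P(result outside the original +/- limit) after a mean shift, in SD units."""
    z = NormalDist()                        # standard normal
    inside = z.cdf(limit_in_sd - shift_in_sd) - z.cdf(-limit_in_sd - shift_in_sd)
    return 1.0 - inside

print(f"no shift: {breach_probability(0.0):.1%}")   # ~4.6%
print(f"0.5 SD:   {breach_probability(0.5):.1%}")   # ~7.3%
print(f"1.0 SD:   {breach_probability(1.0):.1%}")   # ~16.0%
```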
Prior to releasing new control lots, means for each constituent at each level are derived from trials overlapped as unknowns against active control material. The number of trials necessary to establish reliable target means is directly dependent on the inherent precision of the analytical process; i.e., good precision requires fewer trials and vice versa. (A substantially abbreviated overlap is run on hematology controls due to the short shelf life of the material.)
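The inverse relationship between precision and the number of overlap trials can be illustrated with the standard error of the mean (SEM = SD/√n): the trial count needed to pin a target mean within a given uncertainty grows with the square of the method SD. The sketch below illustrates that statistical point only; it is not a prescribed protocol:

```python
# Illustrative sketch: overlap trials needed to establish a new-lot
# target mean. SEM = SD / sqrt(n), so n = (SD / acceptable SEM)^2.
import math

def trials_needed(method_sd, acceptable_sem):
    """Smallest n such that SD / sqrt(n) <= acceptable_sem."""
    return math.ceil((method_sd / acceptable_sem) ** 2)

print(trials_needed(1.0, 0.25))   # 16  -- precise method, few trials
print(trials_needed(3.0, 0.25))   # 144 -- noisy method, many trials
```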
Once defined, action limits as described above are stable and transfer from lot to lot of control typically over very long periods of time, often the life of the instrument, provided manufacture of the control product remains unchanged. Constituent means will vary a bit but the action limits or allowable ranges about constituent means stay the same.
Equivalent Quality Control with Unit Dose Testing
In response to the development and widespread deployment of Unit Dose (single trial) analytical testing and the material costs associated with each test trial, the Centers for Medicare & Medicaid Services (CMS) has modified regulation to allow decreased frequency of traditional control testing under specific conditions. This relaxation of rules with respect to control testing frequency is referred to as “Equivalent Quality Control Procedures” (EQC).
As described earlier, Unit Dose testing typically has a control mechanism designed into, or integral to, the testing unit; i.e., the cartridge, testpack or card. There are a variety of “flavors” or design applications for resident control processes, and the frequency of external control testing under the EQC rules is determined in large part by the adequacy or completeness with which a particular built-in control process challenges its testing system.
There are essentially three levels of EQC application with respect to external control testing frequency:
Complete (First Level EQC): bi-level material assayed once per month on Unit Dose systems that challenge their complete analytic process.
Partial (Second Level EQC): bi-level material assayed once per week on Unit Dose systems that challenge their analytic process only partially.
None: bi-level material assayed once per week on Unit Dose systems that do not have internal control mechanism resident on the testpack but have demonstrated acceptable stability (as defined by the manufacturer and accepted by the Technical Director) over a time period of at least 60 consecutive days.
As with routine QC, if either trial from the bi-level control pair falls outside acceptable limits as defined in the written procedure, the variant control is repeated once only on a new testing unit. If the repeat attempt is successful, patient specimen testing may proceed. If the repeat trial is unacceptable, patient testing on the system is suspended, patient test results are not reported, an entry is made in the Incidence Log and the supervisor is informed.
Note: “Successful” or not, all repeat control trials along with the initial results – both levels – are recorded in the appropriate Incidence Log for pattern review by the Dept/Site manager.
As new Unit Dose testing comes on board, an assessment of which EQC level is applied, if any, will be made. The EQC designation and rationale for it will be stated specifically in the written procedure.
Baseline function checks on this testing modality will include but not necessarily be limited to precision and stability evaluation. These initial QC challenges will be made up of external, bi-level control assays over a minimum of 10, 30 or 60 consecutive testing days depending on the thoroughness with which the internal control mechanism monitors the analytic components of the system; i.e., Complete, Partial or None.
Raw data and summary of this baseline function check along with any pertinent comment, will be recorded in the MVR (Method Verification Report) format, reviewed by the Technical Director and then scanned and indexed for rapid retrieval.
Frequency reduction in external control iteration afforded by EQC status for any test is automatically suspended if a failed control attempt does not recover on initial repeat. Once the problem has been defined, remediation applied and baseline function checks re-established, the EQC status may be re-instated.
The Technical Director may suspend EQC status for any reason:
“Since the purpose of control testing is to detect immediate errors and monitor performance over time, increasing the interval between control testing (i.e., weekly, or monthly) will require more extensive evaluation of patient test results when a control failure occurs. The director must consider the laboratory's clinical and legal responsibility for providing accurate and reliable patient results versus the cost implications of reducing the quality control testing frequency.”
CLIA; Interpretive guidelines for Laboratories and Laboratory Services. D5445, 20040527
Control “out”; response:
Control performance tolerances or control limits are positioned to trigger early recognition of possible error before it translates to clinically significant effect. (See Defining Control Action Limits)
Single constituent (test) control failure:
Routinely, if a control trial is aberrant and the “problem” isn’t readily apparent, Testing Personnel are instructed to, before doing anything else, check the Incidence Log for evidence of increased imprecision in the recent performance history of the test in question. Early recognition of imprecision can easily be missed when it depends on isolated control trials.
If there is evidence of increased imprecision, the department or site Supervisor must be consulted in order to coordinate considerations concerning degree of potential error and determine whether trouble-shooting needs to take place immediately or if it may be deferred somewhat.
Should the Incidence Log provide no evidence of current increased imprecision, the tech will change nothing with respect to the analytical system and repeat the failed control one time only. If the repeat trial generates results that are within stated tolerances, patient results associated with that control may be released. If the repeat control trial again fails to fall within defined tolerances, testing results of “unknowns” are recorded but not reported and the analytical procedure in question is suspended until the on-site Supervisor is informed and/or the Pathology Associates Technical Director or designee orchestrates investigation and resolution of the problem.
On the other hand, if the “problem” is obvious, fix it; rerun the control to verify the solution and make a succinct entry in the Incidence Log.
Both initial and repeat control results are documented in the appropriate log. If the initial response to control trials that are “out” appears to resolve the issue, observations and supporting data are recorded in the Incidence Log. Observations and supporting data relevant to extended problem solving are further documented in either Special Maintenance or Method Verification reporting format depending upon the path of investigation.
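A minimal sketch of the decision flow just described; every name here is a hypothetical placeholder, not part of any laboratory system:

```python
# Illustrative sketch of the single-constituent response described above.
# All function and parameter names are hypothetical placeholders.

def respond_to_failed_control(problem_obvious, log_shows_imprecision,
                              repeat_control_once):
    """Return the disposition of the run after an aberrant control trial.

    repeat_control_once: zero-argument callable that reruns the control
    one time only, changing nothing, and returns True if it is in range.
    """
    if problem_obvious:
        return "fix it, rerun the control to verify, note it in the Incidence Log"
    if log_shows_imprecision:
        return "consult the department or site Supervisor before proceeding"
    if repeat_control_once():
        return "release patient results associated with that control"
    return ("suspend the procedure; record but do not report unknowns; "
            "inform the Supervisor / Technical Director")
```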
Any authorized variation in this initial response to aberrant control trials is explicitly detailed in the individual written procedures sanctioned for use on-site. Initial response to hematology controls that are “out” is detailed in the Beckman Coulter LH750 Operate and Beckman Coulter LH500 Operate procedures.
Batch constituents (tests) control failure:
On routine batch control runs in which three or more trials on multiple constituents at any level are outside of their limits, cease testing on the analyzer. Consider the system “down”. Do not report patient results. Contact your site supervisor. The site supervisor will assess the situation, contact an Instrument Specialist for consultation and corroboration, notify a site pathologist and determine if Communication to Affected Services is required.
Unless controls are going to be used as standards, questions about accuracy have to be resolved by some other route. The ability of an analytical method to recover the actual or “true” target can be assessed by direct assay of a standard when the analyte is mass-measured. Accuracy can also be evaluated indirectly, even for activity-measured analytes, by comparison to group specific means extracted from databases built on the results of proficiency testing challenges. The latter is our most common method for routinely proving accuracy for many tests.
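One conventional way to express this indirect, group-based accuracy check is a standard deviation index (SDI). The sketch below is illustrative; the thresholds mentioned are common rules of thumb rather than policy from this manual:

```python
# Illustrative sketch: standard deviation index (SDI) against a
# method/reagent/instrument peer-group mean from proficiency data.

def sdi(lab_result, group_mean, group_sd):
    """Distance of the lab result from the group mean, in group SDs."""
    return (lab_result - group_mean) / group_sd

score = sdi(142.5, 140.2, 1.1)       # hypothetical sodium challenge
print(f"SDI = {score:+.1f}")         # +2.1
# |SDI| near 0 is unremarkable; a persistent one-directional bias
# across challenges suggests a systematic accuracy problem.
```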
Standards are also brought to bear in the routine calibration of mass-measured constituents and, occasionally, when trouble-shooting a test in which matrix effect of proficiency testing material may obfuscate the target.
Control trials; primary documentation:
Staff are instructed that it is absolutely essential to record the results of all analytical trials, whether the values are “in” or “out”, “acceptable” or “unacceptable”, “expected” or “unexpected”. Custom logs have been designed to receive this primary record chronologically and by method so that judgments regarding the validity of individual results can be made in the context of proximal analyses, both known and unknown.
Reagent and control lots are recorded at the bench when they are activated. Provision for documenting this information might be designed right into a Test Result Log if the procedure requires primary transcription of test results; it may appear on Levey-Jennings charts if they are being used to track test performance; or it may be recorded in a special, separate log or in the test Incidence Log. The nature of the specific analytical method dictates the medium.
3/7/77, S. Raymond; Technical Director, DBQ Pathology Associates
revised 5/27, 7/19/83 S. Raymond, J. Miller
revised 4/20/95 S. Raymond; QA format
revised 19991108 S. Raymond; clarification of routine control frequency
revised 20040614 S. Raymond; EQC and Unit-dose or Unit-use defined
revised 20051111 S. Raymond; Defining Control Action Limits
Rounding of a numeric value is sometimes necessary in the laboratory to conform to the reporting format stated in the analytical procedure. Some of the laboratory analyzers display or print results to tenths, hundredths or thousandths and, in order to report these results, it may be necessary to round them to the significant digit specified for reporting.
There are two basic rules to follow when rounding numbers:
- If the digit to the right of the last significant digit is less than 5, drop it and leave the last significant digit unchanged.
- If the digit to the right of the last significant digit is 5 or greater, drop it and increase the last significant digit by one.
Refer to the individual analytical procedures to determine the significant digit.
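As an illustration (not part of the written procedure), the following sketch applies the rounding rules above using Python’s decimal module, since the language’s built-in round() rounds halves to even rather than up:

```python
# Illustrative sketch: rounding an analyzer result to the reporting
# digit stated in the analytical procedure. ROUND_HALF_UP matches the
# "5 or greater rounds up" rule; built-in round() rounds halves to even.
from decimal import Decimal, ROUND_HALF_UP

def round_for_report(value, places):
    """Round value to `places` decimal places, halves rounding up."""
    quantum = Decimal(1).scaleb(-places)          # places=1 -> Decimal('0.1')
    return Decimal(str(value)).quantize(quantum, rounding=ROUND_HALF_UP)

print(round_for_report(3.14159, 1))   # 3.1 (dropped digit < 5: round down)
print(round_for_report(2.25, 1))      # 2.3 (dropped digit is 5: round up)
print(round_for_report(98.65, 0))     # 99
```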
July 1994 S. Hosch
revised 20060911 M. English; I., II.2.Examples
Results produced by the Laboratory are reviewed and checked for clerical errors, absurd or unlikely results and critical values prior to reporting.
January 1997 J. Mueller
revised July 2001 J. Mueller
revised 20060911 M. English; sections 1 & 2
Acceptable laboratory procedures will be written, practiced, referred to and reviewed from the standard format or template that has been developed by Pathology Associates for clinical laboratories under their direction. This established standardization will accommodate site-specific application while maintaining an important system-specific consistency.
[Click here to launch the Analytical Procedure Format template. (Gray text specifies appropriate pre-defined text editor styles).]
Package inserts are not acceptable substitutes for our written procedures.
Our laboratory policies and procedures have been written to be used. All of them have been converted to electronic medium so that they can be readily standardized, accessed, reviewed, improved and distributed throughout the system. They are subjected to a rigorous, formalized, bilevel review process.
The Analytical Procedure Review Program of DBQ Pathology Associates is a process operating continuously on two levels: Comprehensive and Interim.
Interim Review occurs on an annual cycle (calendar year) and is carried out by Managers, Site/Department Supervisors or technical staff designated by them. It can be thought of as a polishing phase for established analytical procedures that have been driven through the much more rigorous Comprehensive Review process at least once.
At the Interim Review level focus is on those things that might have drifted from the procedure as it was authorized following Comprehensive Review:
The Manager or Site/Dept. Supervisor will determine whether discrepancies discovered are minor in nature or whether they should trigger the Comprehensive Review process for the procedure in question. (The DBQ PA Technical Director will be available to resolve any doubt.) If, for example, the written procedure should be modified to reflect practice rather than the other way around or if language in the written procedure should be reworked to clarify steps that can potentially impact analytical outcome, then the review should be raised right up to the Comprehensive level.
Interim Review procedures that require minor changes are handled in the following way:
- The HTML document is re-translated to update the web page and the LIS Project Coordinator notifies the appropriate supervisors by e-mail when the new version is available on the UCL web page.
The Comprehensive Review process is cyclic but doesn’t conform to a preset time interval; some procedures are reviewed and revised multiple times within a cycle while others, because of their straightforward, stable nature, may undergo scrutiny but once in the same cycle. A thumbnail sketch of the Comprehensive Review follows:
The Procedure Manual TOC (table of contents) will appear at the beginning of each analytical procedure binder. Procedures will be grouped by Department and listed alphabetically. [See the following Procedure Manual TOC model.]
POCT procedure review:
The director named on the CLIA certificate or a qualified designee approves procedures before initial use of the test for patient testing, and then once every three years. (JCAHO Standard PC.16.40)
4/9/92 S. Raymond; Technical Director, DBQ Pathology Associates
revised: 11/2/93 S. Raymond
revised: 12/13/94 S. Raymond: Interim review, TOC
revised: 4/30/96 L. McGovern, S. Raymond: Interim review
revised: July 2002 M. English; S. Raymond: Interim & Comprehensive review, procedure format
revised: January 2007 L. McGovern: Added POCT procedure review
revised: September 2009 M. English, S. Raymond: Interim Review recording, procedure format, TOC, define annual review
The formal documentation of aberrance encountered in the course of performing clinical laboratory analyses is “Incidence Recording”.
Incidence Recording; purpose/design:
Because testing procedures are rotated from tech to tech with some frequency and because technologists are generally performing more than one procedure at any given time, problems that arise often appear isolated or spurious and many times go unattended.
As it turns out, most problems are not manifested by a single event but are accompanied by additional evidence that something is going, or has gone, awry.
The Incidence Recording System of DBQ Pathology Associates provides for the consolidation of apparently disparate events; the collective and chronological properties of the system can prompt early recognition of problems, accent their sometimes-chronic nature, and often lend direction in terms of solution.
What should be recorded:
Essentially, ANY unexpected, test-related abnormality observed by a technologist while performing an analysis should be documented in the Incidence Log; e.g.:
- control performance outside of posted limits,
- drift or shift within posted limits,
- duplicate imprecision,
- replicate imprecision,
- overt reagent condition,
- unusual pattern in patient results,
- isolated, bizarre patient result,
- any aberration in the test that is not obviously instrument related.
If a problem has already been noted by a Supervisor or the Technical Director (or designee) and formal measures toward a solution are underway, it may not be necessary to document additional observation. When in doubt, however, an entry must be made.
How to record:
- Write a brief description of the abnormality. (Include control data if pertinent and, if it is, note performance history on both controls.)
- Review recent Incidence Log entries for related history.
- Comment briefly about possible cause(s). If the cause is unclear, simply state this fact.
- Describe any action taken. If you repeat the analysis, specify any changes that may have been made.
- Initial and date the entry.
An example of a reasonably good Incidence Log entry:
9/12/79: “Osmolalities: 750 control came in at 770 w/a posted mean of 747 and a range of 727 - 767. Opened fresh 750 and it came in at 732. Patient values reported. In 10 trials, we have crossed the mean on the upper control only once. The high control appears to have deteriorated.” MJB
“The problem is actually worse than it looks. It is affecting patient results. The Lo control is drifting below its target and the Hi control has deteriorated more than recent values indicate since the standard is undergoing parallel deterioration. We need to get both controls and the standards into much smaller bottles w/ better sealing caps." SR
Sometime during the shift, preferably early, Testing Personnel should skim through Work Area Incidence Logs looking for recent entries in sections that cover analyses they are likely to perform on their “watch”. This will alert them to problems they might encounter and can save time while reducing confusion and frustration in the long run.
The appropriate section of the relevant Incidence Log very definitely needs to be reviewed by testing personnel in their initial reaction to aberrant control results.
Once per day, Department Managers need to go over recent entries in the Incidence Log(s) covering their Work Areas. They should be looking for patterns in the problems that are logged, and they need to satisfy themselves that reasonable progress is being made toward resolution. Overview by Site Supervisors or Managers can occur at lesser frequency, e.g., weekly or monthly, but it is very important that management at this level periodically and personally reassure themselves that entries are being made properly and that chronic problems are not going unattended. Problem solving can be ramped up to the Technical Director or Assistant to the Technical Director at any time by the Site Supervisor or Manager.
S. Raymond; 12/2/79
revised: 12/12/88 S. Raymond, format
revised 4/26/95 S. Raymond, review responsibilities
● Temperature Monitoring; purpose/design
● Common refrigerators/freezers/incubators
● Waterbaths, Drybaths, Heating blocks, Temperature-controlled cuvettes, Automated chemistry analyzers with reaction baths
● Room temperature
● Blood bank refrigerators
● Blood bank freezers
● Blood warmers
Temperature Monitoring; purpose/design:
There are a variety of specific storage or reaction thermal conditions that need to be created and sustained in the typical clinical laboratory setting. Tolerances or the amount of allowable deviation within these specific thermal conditions need to be defined and, in each case, procedures are needed which provide both a method for tracking the on-going effectiveness of each temperature-control system as well as directive for reacting to breaches in the defined tolerances.
Standard laboratory refrigerators, freezers and incubators will be “fitted” with solid-state LED Min-Max thermometers having remote probes (Sentry Hi/Lo®: 020-934). The display of these monitors will show the current temperature along with the lowest and highest readings captured since the unit was last reset. The display measurements are updated every 10 seconds.
Probes of these monitors will be immersed in an aqueous solution of ethylene glycol and located within the refrigerator/freezer/incubator compartment away from any circulating air fans. This arrangement should provide measurements which are easy to read and considerably more representative of the object temperature than the artifactual, transient thermal swings of ambient air which occur when the door is opened briefly to obtain or stock chilled contents.
Immersion Solution will typically be provided "ready to go" in an Erlenmeyer flask by the Materials Management department.
Preparation of the probe solution:
All temperature monitors in use must be verified annually against an N.B.S. standard thermometer over their range of use. All temperature monitors purchased must be verified against an N.B.S. thermometer over their range of use before being put into use.
Waterbaths, Drybaths, Heating blocks, Temperature-controlled cuvettes, Automated chemistry analyzers with reaction baths:
Blood bank refrigerators:
Since these refrigerators have a recording chart, the Sentry Hi/Lo® temperature devices are not used.
Note: For those refrigerators having an electronic temperature monitor, the monitor may be used in place of one of the spirit thermometers. These monitors must be verified with an N.B.S. thermometer prior to use and on an annual basis as is done with all Sentry Hi/Lo temperature devices and spirit type thermometers.
MMC-DV: Nursing Station
Blood bank freezers:
All freezers in which blood components for infusion are stored need to have recording thermometers with visual and audible alarms.
Finley: alarm power light
MMC-DV: alarm power light
MMC-DBQ: alarm power light and audible alarm (If the audible alarm does not sound, the 6-volt battery needs to be replaced. A replacement battery is obtained from the UCL Instrument Specialist office.)
MMC-DV: Nursing Station
It is the responsibility of the hospital’s Biomedical Department to ensure that all blood warmers in use at their facility are functioning properly.
The following areas will be verified on a quarterly basis.
1980; S. Raymond: General Policy: Temperature Verification and Temperature Recording
1987; M. English: Blood Bank Refrigerator Alarm procedure
2/20/95; S. Raymond: Temperature Monitoring; Standard Laboratory Refrigerators and Freezers (Sentry Hi/Lo®)
5/23/95; S. Raymond, L. McGovern: revised to QA format
2/8/96; L. McGovern: note in Banked Blood storage refrigerators section.
4/22/96; L. McGovern: revised Room Temperature section
8/29/96; L. McGovern: added Blood warmer section
Instrument Problem Report/ IPR
The Instrument Problem Reporting program, with its IPR template [see IPR template following this paragraph], provides a standardized framework for the sensitivity laboratorians typically develop to the nominal operational characteristics of the instrumentation they are charged with running. That sensitivity is used to recognize, compare and record deviation so that as many instrument-related problems as possible can be trapped early and resolved while they are still relatively minor, before they pose serious interruption of services and consumption of resources.
The IPR articulates closely with the routine Maintenance & Function Verification Logs/Procedures and the SMR (Special Maintenance Report).
The IPR is divided into 8 sections:
It is always reviewed by an Instrument Specialist.
It is frequently reviewed by the Technical Director.
It is always returned to the originating site for review by the site Supervisor and staff in the relevant department.
Section 8 allows space for formal comment from Instrument Specialists and/or the Technical Director before the IPR is cycled back to its place of origin.
IPR and SMR Storage
The IPRs and SMRs are filed together chronologically and alphabetically by instrument where readily accessible to the technical staff.
1974; S. Raymond: Instrument Specialist Program and IBM template
1986; S. Raymond: Macintosh template
9/24/95; S. Raymond; written description revised to QA format
August 2008; L. McGovern/S. Raymond: revised (section 7 added, IPR form revised)
Method Verification Report/MVR
Method Verification Report; purpose/design:
The Method Verification Report/MVR is used to document baseline studies on new assays as well as performance verification as determined by the Technical Director or the Assistant to the Technical Director on assays in use.
Any test method along with its performance specifications is available through the office of the Technical Director at any time to any client who requests it. The information will typically be transferred formally by means of the Method Verification Report.
Method Verification Report; content:
The MVR will include the assay/instrument/location/date/method verifier. The symptom or issue that triggered the report will be defined. All action taken and data collected will be listed step by step in chronological order. [see following MVR example.]
Method Verification Report; review:
MVRs will be reviewed and signed by the Technical Director and then routinely sent to the applicable Site Supervisor for review.
Method Verification Report; storage:
MVRs are to be stored on site in the applicable assay file for 2 years.
5/30/95; S. Raymond, L. McGovern
revised 1/18/98; method and performance spec. availability; S. Raymond
Inservice/Initial Competency Assessment:
A structured program of Competency Assessment accompanies the release of new instrument systems and new or substantially changed procedures, and is integrated into the orientation of new employees.
Inservice documentation is prepared by the instructor and consists of an Inservice Outline, Inservice Report, pre- and post-testing (intermittent) and Competency Evaluation. These formatted documents are used to affirm inservice content, participation and participants' ability to understand and perform the material covered during each inservice.
An outline lists the material to be covered in the inservice. The inservice outline is reviewed by the Technical Director and distributed by the instructor to the staff prior to the inservice.
Inservice Report; content/review:
The Inservice Report includes the subject/location/date/participants/instructor's name/material covered.
Inservice Report; storage:
Competency Evaluation; content/review:
Competency Evaluation; storage:
Recurring Competency Review:
Competency Review is performed on an annual basis to comply with Joint Commission Standard HR.01.06.01.
The staff member’s competency assessment includes the following:
-Direct observations of routine patient test performance, including patient preparation, if applicable, and specimen collection, handling, processing, and testing.
-Monitoring the recording and reporting of test results.
-Review of intermediate test results or worksheets, quality control, proficiency testing, and preventative maintenance performance.
-Direct observation of performance of instrument maintenance function checks and calibration.
-Test performance as defined by laboratory policy (for example, testing previously analyzed specimens, internal blind testing samples, or external proficiency testing samples).
-Problem-solving skills as appropriate to the job.
While the goal of the program is serious, there is no reason whatever for implementation to be oppressive in nature or punitive in any way. The idea is to provide a platform for technical staff to refresh memory, fine-tune proficiency, learn something and feel good about the process. The framework is self-instructional with deliberate margin for personalized scheduling and implementation at a pace that is not at odds with an already pressured profession. It will be managed largely at the site/department level and, as indicated earlier, cycled annually.
Recurring Competency Review program specifics:
Test Procedure Name
(Fill in the circle that represents the correct answer.)
o answer option 1
o answer option 2
o answer option 1
o answer option 2
o answer option 1
o answer option 2
5/30/95; S. Raymond, L. McGovern: Inservice/Initial Competency Assessment
10/13/96; S. Raymond: Recurring Competency Assessment
January 2009 R. Schaefer: Design Nature of Multiple Choice Questions and Answers
September 2009 L. McGovern: updated for current procedures
January 2012 L. McGovern/R. Schaefer Revised for Joint Commission Standard HR.01.06.01
Process and design:
In our context, the common Proficiency Testing Program (Survey) is a subscription service created to provide large-scale, comparative assessment of clinical laboratory testing by constituent (analyte). These programs usually begin with big batches of commercially prepared material which have been made to mimic the human specimens typically encountered in the clinical setting. These preparations are aliquotted and distributed by the manufacturer to participating laboratories across the land where they are assayed for those constituents included in the menus of the assaying laboratories. Testing results are then submitted to a central data processing facility to be collated.
Unless specific directive attending the proficiency testing material contraindicates, the samples themselves are handled within the context of established "Universal Precautions" (See Infection Control Policy manual); i.e., as common biologics capable of harboring and transmitting disease.
Most survey material is both extrinsically and intrinsically quite different from the specimens routinely being assayed, and it requires peculiar pre-analytical processing such as rehydration, reconstitution, equilibration, etc. Because the majority of Proficiency Testing results are graded and variance from nominal accuracy triggers punitive action, and because the government and certifying agencies are sometimes not sophisticated enough to differentiate artifactual error from error actually associated with patient testing, it is absolutely essential that survey material be prepared very carefully and in exact accordance with its package instructions. If there is a problem with the packaging that might affect the properties of the testing material, or if there is any kind of problem in the pre-analytical preparation of the testing material, do not proceed with testing. Contact the Technical Director or Assistant to the Technical Director.
Once survey specimens are ready to assay, they are not handled substantively differently from the way patient specimens are handled. It is our policy, in as much as possible, to integrate survey samples in a routine fashion into the regular workload; e.g., we don’t want to run them in duplicate or surround them by “known” trials if we’re not doing the same by written procedure to patient samples on which the target constituent(s) is being measured. The material is analyzed on site and there is no kibitzing between sites with respect to the results obtained prior to reporting to the subscription provider. Proficiency Testing is rotated among the personnel who perform the testing. In accordance with regulations, Subpart H 493-801(b)(1) of the Federal Register 19920228, individuals performing analyses, as well as the laboratory director or designee, will bear witness to the execution of this protocol by filling in and filing an Attestation form with data submission on each survey subscription event.
Raw data is copied from the laboratory’s primary record to the data submission forms provided. This occasionally requires some kind of unit conversion and almost always involves several transcription steps unrelated to patient testing; it is, therefore, another very troublesome source of artifactual error. Transcriptions to the report form must be checked by someone other than the transcriptionist before the data form is either forwarded directly or the data is submitted electronically to the provider of the survey subscription.
A copy of the data submission form along with any relevant primary records is retained for two years. The form can be helpful in terms of tracking down clerical errors from the primary record onto the data submission form itself and from the data submission form into the proficiency data processing system. Once the Evaluation Summary has been received back from the survey facility and reviewed, the data submission form is of little, if any, use; retention at this point is purely a matter of compliance with the certifying agency.
CAP: Technical personnel directly contributing to the assay of survey samples will sign the form provided with the data submission sheets. The form will be forwarded to the Office of the Technical Director where it will be counter-signed, scanned and indexed for rapid retrieval throughout the enterprise. Do not keep a copy on site.
API: Before testing data is submitted electronically, names of technical staff that participated in the testing are transcribed from the Attestation signature form onto the appropriate API e-form. Using the button built into the web page, print a copy. Send this copy along with the original Attestation signature form to the Office of the Technical Director where they will be counter-signed by the Technical Director, scanned and indexed for rapid retrieval throughout the enterprise. Do not keep copies of these documents on site.
Data Summary; evaluation and review:
Once the raw data has been received, entered, collated and computer-graded at the central data processing facility, an Evaluation Summary is printed and mailed back to the participants. One copy goes to the submitting laboratory site and another copy is forwarded to the Technical Director for review. The site/department Supervisor/Manager and the Technical Director are particularly sensitive to any survey trials or challenges which deviate from the group specific (method/reagent/instrument) performance pattern either in a glaring, punctate fashion or in a more subtle, general bias configuration.
Results which do not “fit” with their comparison group are assessed in terms of whether or not they are “wrong” and if so why. Obvious error like clerical mistakes can usually be recognized and resolved quickly by referencing the relevant primary record and data submission forms. Other deviations from nominal can be substantially more difficult to decipher; additional testing may be part of the investigation. If remedial action or adjustment is warranted, it is triggered by the Technical Director along with the appropriate follow-up.
The Site/Department Supervisor completes a pdf “Proficiency Testing – initial scripted response to ‘unsuccessful’ survey challenges” for each result that does not “fit” with the comparison group. The completed form is printed and forwarded to the Technical Director for review and completion of section VIII. The form is signed by the Technical Director and returned to the enrolled sites. The Proficiency Testing Scripted Response is attached to the Proficiency Testing Evaluation Summary. The results are reviewed and signed by the Site/Department Supervisor(s) and a Pathologist. The document is then posted for a period of time allowing the staff opportunity for review.
A single addendum is prepared by the Technical Director evaluating survey trials that are “not graded”. This includes testing events with fewer than 10 participants and those that do not obtain the agreement required for scoring.
Upon notification by the Joint Commission of an unsuccessful proficiency testing status, the Site Supervisor will notify the Technical Director and submit an appropriate plan of action within ten calendar days.
Evaluation Summaries are, of course, kept for a minimum of two years.
Analytes not covered by traditional proficiency testing programs:
There are a number of laboratory tests which, because of properties associated with either the analytical target or the matrix, do not lend themselves to the logistics of commercial Proficiency Testing and are not included in Surveys offered by “sanctioned” agencies. CLIA ‘88 regulations mandate the biannual “accuracy verification” of these procedures. This, of course, is not completely possible either but, where it is, a formal program exists. Refer to Intersite Analytical Method Parallel.
9/20/95; S. Raymond; Technical Director; DBQ Pathology Associates
2/24/04; S. Raymond; revised for handling Attestation sheets
9/18/07; S. Hosch/S. Raymond; revised Data Summary eval & review “Proficiency Testing-initial scripted response…”
Inoperable or “down” Test Systems & Backup
When analytical systems are not working properly tension typically mounts. Staff will immediately focus on local cause, effect and remediation and sometimes forget to warn recipients that their routine laboratory service has been compromised. The purpose of this policy is to help assure that accountable Management is formally involved, that compensatory systems work smoothly and that practitioners dependent upon the affected service can adjust their expectations in accordance with the situation.
Furthermore, because circumstances surrounding a “down” analytical system vary greatly from one event to the next, it is difficult, if not altogether counterproductive, to attempt pre-configuration of response in terms of backup instrumentation, testing location, timing, sample handling, results reporting logistics, etc. A general policy, on the other hand, which sets the stage for defining when a test system is inoperable and then describes guidelines for the design and implementation of the proper course of action in the face of “down” status, is meaningful.
Backup instruments and methods:
Written test procedures generally will not include reference to backup methods because the laboratory system will not typically maintain methods dedicated or designated specifically as “backup”. Any exception to this general posture will be covered in the appropriate test procedure; i.e., cross-references to particular backup method/instruments will be made.
Sometimes recognition that a test system is inoperable is painfully straightforward; the fact that there’s just no available reagent, or that the system has by itself gone into shutdown/fatal error mode, or that the analyzer is stone cold to the touch could certainly be enough to settle any doubt. More often, symptoms will be subtler, as with insidious imprecision or trace carryover, and the realization that a system is inoperable can be considerably less explicit.
An analytical system will be considered “down” if a physical malfunction renders it obviously inoperable, when system status has gone to auto shutdown (fatal error) and/or when one of the following personnel has determined that the analytical outcome, controls or otherwise, is unacceptable.
List of personnel who can officially declare an analytical system “down”:
- Instrument Specialists
- Technical Director or Assistant to the Technical Director
- Supervisor or Technician in concert with any of the above.
If the “down” status of an analytical system is instrument-related, an Instrument Specialist, if not already aware of the situation, is to be notified immediately.
The appropriate response for Management will be largely ad hoc; i.e., dictated by prevailing circumstance and fabricated in real time; a number of variables will have to be defined and many details may have to be attended in order to effect an orderly, efficient process. The management group will sort through the following considerations and orchestrate the warranted activity.
Communication to affected Services:
a. The communication must include identification of those tests the contacted department is likely to become anxious about, an estimate of how long they can expect to wait for results on the affected tests and a projection to recovery time; i.e., when they can expect the system to be back “up”.
b. In the Incidence Log of the affected test system record the time of the communication, the name(s) of the department(s) and client(s) that have been informed along with the name(s) of the person(s) contacted.
a. Pathologist on-site or on-call
b. Emergency Room (hospital)
c. Intensive Care (hospital)
d. House Supervisor, Nursing Service (hospital)
f. Acute Care, Pediatrics, Oncology (Medical Associates)
Alternate Testing sites and re-routing of specimens:
a. Contact supervisory staff at the alternate site(s) to let them know that specimens will or may be routed to them.
b. Determine if the alternate site(s) can absorb the extra workload with existing staff configurations.
c. If additional staff or extended hours will be required to manage the modified workflow at either the original testing site or the alternate testing site(s), make those arrangements.
d. Bolster the courier staff if necessary; e.g., rearrange routine work assignments of courier staff to cover the additional trafficking of specimens, retain the local cab company, commandeer testing, clerical or management personnel, contract with a professional courier service, etc.
The Technical Director or Assistant to the Technical Director will determine when an analytical system can be released from a “down” status. If the “down” status is instrument specific, an Instrument Specialist can make this determination in real time and then communicate the specifics subsequently to the office of the Technical Director.
A Technician on site will be appointed on an ad hoc basis to re-contact appropriate stations and personnel in order to inform them that the system is back in operation.
5/12/95; S. Raymond: Technical Director; DBQ Pathology Associates
10/10/95; revised to include “matched analytical system” definition; S. Raymond
1/27/96; revised specific language: Purpose/design, “down” declaration & stations/personnel to be contacted; S. Raymond
July 2010; revised: added Radiology; L. McGovern
Reagent Overlap, “serology” kits
New reagent lots must be checked against current reagent lots or with selected reference materials before being placed into service.
While not as prevalent as in the past, there can still be some significant variance in the avidity and affinity between batches of "serology" reagent due to the nature of manufacture and application of the biologic components inherent in the method group. "Overlapping" controls and/or reagents is a technique that can provide some modicum of assurance that the new material will behave comparably to the material already in use. It is noted that exogenous controls can vary independently of the reagent lot and that, while the reagent lot may change with a new kit lot, the controls may remain the same. In these circumstances the value of the overlap exercise is diminished. We will nevertheless, for the sake of simplicity, run the routine overlap protocol anyway.
“Serology Testing” has been divided into categories based on kit methodology and the packaging of reagents and controls.
Immunoassays with endogenous controls:
These assays have endogenous controls in every test cartridge to verify that the cartridge functioned properly and that the test was performed correctly.
QuickVue Flu A+B, Rapid Strep A, ßhCG, Mono, RSV, Rotavirus.
Overlap testing is not required on waived test kits (Rapid Strep A, ßhCG, QuickVue Influenza A+B, Mono, RSV). We have elected not to perform overlaps on waived test kits since requirements state it is not necessary and there is no added value in doing so.
Latex Agglutination Assays/Other Assays:
Control/Reagent Kit packaged together:
Rubella, RA, FDP, Directigen Meningitis Panel, Teichoic Acid, Cryptococcal Murex, Immunocard H. pylori, Immunocard Clostridium difficile Toxins A & B, Mycoplasma pneumoniae, RPR.
Standard Overlap Protocol:
1983; J. Schultz: Policy Regarding Overlapping of Controls for any Serological Procedure
9/19/96; L. McGovern, S. Raymond: Policy revised for QA language, explanation and format
2/18/98; S. Raymond: standardized routine overlap protocol between serology packaging configurations
Micropipettor Pipetting Techniques
Micropipettor Pipetting Techniques; principle:
There are three basic techniques that will be allowed when using two-stop micropipettors. All test procedures will clearly specify which technique is to be used when micropipetting is called for, and the language used to refer to the technique will conform to this protocol.
The Blood Bank Multiple Dispense Micropipettor (ID-TipMaster) is designed to deliver multiple volumes of 12.5, 25, and 50µl. The ID-TipMaster pipettor is intended for blood bank "Gel" procedure use only.
Micropipettor Pipetting Techniques; Two-Stop Micropipettors:
This technique will usually be specified when delivering a sample into a dry receptacle.
Note: In the "Reverse" mode several deliveries of the same sample may be made with a single pipette tip. The actual number of tubes that can be obtained from one tip will vary with the pipette size. Care must be taken to prevent the increasing residual sample from reaching the pipette tip filter.
This technique is used when delivering sample into solution.
a. Depress the plunger all the way to the second stop and then while keeping slight pressure on the plunger allow it to return to its original, or rest, position.
b. Repeat step "a" four more times (a total of five rinses), all the while keeping the pipette tip in the solution.
c. Depress the plunger all the way to the second stop and then while holding at this position, remove the pipette tip from the solution.
Note: Use a new tip for each sample delivery.
This technique is rarely used.
Note: Use a new tip for each sample delivery.
Blood Bank Multiple Dispense Micropipettor (ID-TipMaster):
Modified "Reverse" or "To Deliver" mode:
Note: With the ID-TipMaster pipettor several deliveries of the same sample may be made with a single pipette tip.
4/19/78; S. Raymond
8/24/91; S. Hafenbredl
11/26/96; S. Wallace, S. Rodriguez: added FP-2 micropipettor
1/28/97; L. McGovern: revised to QA format
August 2001; S. Wallace: added ID-TipMaster micropipettor
Reference Range Evaluation/Validation
Reference Ranges; kinds:
Within the context of clinical laboratory analytical testing, there are four major kinds of “reference range”.
The nature of the reference range can affect both its initial and on-going evaluation processes. For example, if the reference range is decreed, as with cholesterol at less than 200 mg/dl, it’s important to prove the in-house method against the analytical system that was used to establish the target, and probably inappropriate to launch some sort of local study either in a mechanical attempt to comply with regulation or in some misguided effort to validate the target rather than the method. Likewise, grouping and typing of RBCs in routine blood banking is limited to combinations of A, B, AB and O with positive, negative or weak D for Rh antigen. This is the “range” of expected values. Should it be checked periodically? No.
Reference Ranges; new tests:
For all new analytical methods or procedures in the clinical laboratory system under the direction of DBQ Pathology Associates, reference range considerations are made prior to installation of the test. The Technical Director or designee and a Medical Director sign off on the written procedure. Generally, factors such as our specific geographic location and population mix have no clinically significant impact on the disposition of the manufacturer’s statistical assessment and reference ranges are adopted directly from the supporting literature. With the “normal range” type of reference interval, if there is some reason to suspect a valid bias between results produced out of the local population and the manufacturer’s projection, we perform an expected range check, generally using a Gaussian model initially and then adopting NonParametric Percentile Estimate statistics if the early data seem to warrant it. [See HbA1c example following.]
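For illustration only (and separate from the HbA1c example referenced above), the following sketch shows both statistical approaches on locally collected values: a Gaussian estimate of the central 95% and a nonparametric percentile estimate:

```python
# Illustrative sketch: two estimates of a central-95% reference interval
# from locally collected results (e.g., HbA1c values from local subjects).
from statistics import mean, stdev, quantiles

def gaussian_interval(values):
    """Mean +/- 1.96 SD; reasonable only if the data look normally distributed."""
    m, sd = mean(values), stdev(values)
    return m - 1.96 * sd, m + 1.96 * sd

def percentile_interval(values):
    """Nonparametric 2.5th / 97.5th percentile estimate; no shape assumed."""
    cuts = quantiles(values, n=40)    # 39 cut points at 2.5% steps
    return cuts[0], cuts[-1]

# When the distribution really is Gaussian the two intervals converge;
# skewed early data argue for the nonparametric estimate.
```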
Reference Ranges; established tests:
If the analytical method is well-established in our laboratories, if it remains stable and, if the data generated by the test is providing unchallenged, expected and effective diagnostic leverage for our clients (the clinicians), existing reference ranges are tacitly validated.
On-going, formal assessment of posted reference ranges, including Critical Values and Phoned Result triggers, takes place on a number of levels both periodically and on a situational basis.
Periodic formal validation:
In the Procedure Review cycles, Interim as well as Comprehensive, the Incidence Logs are checked for problem patterns associated with the method. If there have been unresolved complaints of a nature that would implicate the posted reference range, an investigation is initiated by the site manager or supervisor in concert with the office of the Technical Director in response to observations noted in the Procedure Review Report Form.
The Procedure Review program also provides a planned occasion for the re-alignment of reference range postings on each test. The number and variety of places within the laboratory enterprise where ranges on a given analyte are strewn is surprising and it tends to foster a certain amount of drift from nominal that has to be countered.
A check-off protocol embedded within the Procedure Review process is designed to chase down the ranges of each analyte and make sure they match between written procedures, the Clinical Laboratory Testing Manual (CLTM), internal instrument parameters, LIS filters, interim and final reports, etc., at all laboratory locations under the direction of DBQ Pathology Associates.
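The comparison at the heart of that check-off protocol amounts to confirming that every posting of an analyte’s range agrees with the written procedure. A minimal sketch of the idea follows; the system names, analytes and ranges are hypothetical placeholders, not our actual postings.

    # Sketch of a range-posting reconciliation check: for each analyte,
    # confirm the range posted in every system matches the written
    # procedure. Names and values below are hypothetical.
    procedure_ranges = {"Glucose": (70, 99), "Potassium": (3.5, 5.1)}

    posted_ranges = {
        "CLTM":              {"Glucose": (70, 99), "Potassium": (3.5, 5.1)},
        "Instrument params": {"Glucose": (70, 99), "Potassium": (3.5, 5.0)},
        "LIS filters":       {"Glucose": (70, 99), "Potassium": (3.5, 5.1)},
    }

    for system, ranges in posted_ranges.items():
        for analyte, expected in procedure_ranges.items():
            found = ranges.get(analyte)
            if found != expected:
                print(f"MISMATCH: {analyte} in {system}: {found} vs procedure {expected}")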
Situational formal validation:
Occasionally, manufacturer/suppliers will make adjustments in their analytical methods that are designed to either optimize a particular system element such as sensitivity, specificity, dynamic range, etc. or simply offset their system so that its results are brought into alignment with some recognized standard or group. The former will sometimes impact the reference range of the revised method; the latter will always affect the reference range. In either case, the change is usually announced in advance by the manufacturer/supplier in a bulletin or a flagged package insert.
Our response to these announcements varies with the degree of anticipated impact. Generally, if the change is product-related rather than procedural, revised material will be acquired prior to its official release and the expected shift will be confirmed with a brief patient parallel. From there, the Technical Director in concert with Medical Directors will determine how the issue will be managed internally. It may or may not be appropriate, for example, to write a memo to all or a group of targeted physicians and then coordinate the memo mailing with:
a. revision to all of the analyte’s reference range postings,
b. launch date of the revised method and
c. timing of computer result entry so that analytical results generated before the change will be matched with the previous range and results generated after the method change will appear with the new reference range (see the sketch following this list). [Example: MVR re. Triglyceride offset to CDC reference method.]
d. In some instances it has been necessary to sequester and store specimen aliquots and to maintain the original method conditions for a period of time, either so that serial testing being conducted on some patients can be completed in context, or so that analyses from serially tested patients can still be actively paralleled with the replaced method until the physician is comfortable with the transition.
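The date-matching logic in item c reduces to selecting a range by comparing the result date with the method change-over date. A minimal sketch, assuming a hypothetical change-over date and hypothetical old and new ranges (the triglyceride figures below are placeholders, not values from the MVR example):

    # Sketch of the timing logic in item c: pick the reference range by
    # the date the result was generated, relative to the change-over date.
    # Dates and ranges are hypothetical.
    from datetime import date

    CHANGEOVER = date(2004, 7, 1)                  # launch date of revised method
    OLD_RANGE, NEW_RANGE = (40, 160), (35, 150)    # e.g., triglyceride mg/dl

    def reference_range(result_date: date):
        return OLD_RANGE if result_date < CHANGEOVER else NEW_RANGE

    print(reference_range(date(2004, 6, 30)))   # (40, 160) -- previous range
    print(reference_range(date(2004, 7, 2)))    # (35, 150) -- new range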
There are also occasions when reference range validation is prompted by physician query based on a feeling that test results appear to be running high or low relative to average expectation on a given analyte. We encourage physician feedback, and this type of question almost always elicits a rapid, open response. Assuming normal patterns in routine control data, we would very likely pull a representative number of sequentially reported results on the “suspect” analyte from recent archive and compare them to the reference range. If that comparison is unclear, or if it tends to support the notion that there has been a departure from nominal, additional reported results are recovered from the record and statistically reviewed. While it is not at all common, the investigation might lead to a more elaborate patient population distribution study and eventuate in some adjustment in the method or the posted reference range or both. [I.Phos example.]
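The first-pass comparison can be as simple as counting how many recently reported results fall outside the posted range; a central 95% interval should exclude roughly 5% of results from a reference-like population, so a much larger or markedly one-sided excess supports the suspicion of drift. A minimal sketch, with a hypothetical posted range and hypothetical results (not the I.Phos data):

    # Sketch of a first-pass drift check: what fraction of recently
    # reported results fall outside the posted reference range?
    # Analyte, range and results below are hypothetical.
    posted_low, posted_high = 2.5, 4.5   # e.g., inorganic phosphorus mg/dl
    recent = [3.1, 4.7, 4.6, 3.9, 4.8, 4.4, 5.0, 3.6, 4.9, 4.2,
              4.6, 3.8, 4.7, 4.1, 4.8, 3.5, 4.9, 4.3, 4.6, 4.0]

    above = sum(r > posted_high for r in recent)
    below = sum(r < posted_low for r in recent)
    print(f"{below} low, {above} high of {len(recent)} "
          f"({100 * (above + below) / len(recent):.0f}% outside posted range)")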
Analytical Range vs. Linear Limits
“Analytical Range” is a set of numbers which comes from the manufacturer and describes the outer markers at which a test method is reliable under optimum conditions. It has its roots in the K-1, FDA approval criteria for the specific analytical procedure. A wide range of variables, including chemical kinetics, wavelengths, sample-to-reagent ratio, timing, etc., is adjusted in order to effect a repeatable, discrete instrument response which is directly attributable to the target constituent and which will cover the kind of values that are likely to be encountered in the clinical setting with any frequency. Analytical Ranges are primarily of technical interest.
“Linear Limits”, on the other hand, are numbers that have been derived by us. They define the lowest and highest constituent level that we can routinely and independently prove will generate an instrument response which is mathematically identical to the response generated by the analyte standard(s) per unit of measure; i.e., concentration, activity, number, etc. These are the lowest and the highest test values we can report without effecting some manipulation of the sample or the system or both. We look to the “Linear Limits” as stated in our written procedures to determine whether we will have to dilute the sample, either directly or indirectly, and rerun it.
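One common way to demonstrate that proportionality is to assay a dilution series of known levels and confirm that measured results track the expected values (slope near 1, intercept near 0, acceptable recovery at every level). A minimal sketch follows; the levels, results and 95-105% tolerance are hypothetical placeholders, not our documented criteria.

    # Sketch of a linear-limits check: assay a dilution series of known
    # concentration and confirm measured response is proportional to
    # expected. Tolerances and data are hypothetical.
    expected = [10, 50, 100, 250, 500]            # prepared levels
    measured = [10.3, 49.1, 101.8, 247.0, 489.5]  # instrument results

    n = len(expected)
    mx = sum(expected) / n
    my = sum(measured) / n
    sxx = sum((x - mx) ** 2 for x in expected)
    sxy = sum((x - mx) * (y - my) for x, y in zip(expected, measured))
    slope = sxy / sxx
    intercept = my - slope * mx
    print(f"slope={slope:.3f}, intercept={intercept:.2f}")

    for x, y in zip(expected, measured):
        recovery = 100 * y / x
        flag = "" if 95 <= recovery <= 105 else "  <-- outside 95-105% tolerance"
        print(f"level {x}: recovery {recovery:.1f}%{flag}")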
8/9/97, S. Raymond; Technical Director, DBQ Pathology Associates
1/16/98 S. Raymond; revised
September 2003 S. Raymond; revised
Intersite Analytical Results Comparability Verification:
Comparing Test Results between sites; purpose/principle:
Occasionally, it becomes expedient or even necessary to temporarily route samples for testing that is routinely done at the site of initial processing to another laboratory site within the enterprise doing the same testing. Any number of circumstances might trigger this transfer, but typically it is driven by an effort to optimize workload, to trouble-shoot some element of the analytical system or data capture process, or to compensate for some short-term staffing imbalance. Whatever the cause, it is important to ensure that the results from the testing protocols used for this kind of backup are comparable.
The analytical systems of laboratory sites under the direction of Dubuque Pathology Associates; i.e., Mercy Dubuque, Finley, Cathedral Square, Mercy Dyersville, Medical Associates East and West campuses, are carefully matched and monitored under the design and direction of the group Technical Director.
Not only analytical methods, but also the instruments, collection/storage systems, written policies, procedures, reagent lots, control lots, calibrator lots and documentation formats are the same for testing that is common between sites.
This program of Distributed Standardization provides considerable stability in the local clinical laboratory community: routine method bias is virtually eliminated, reference ranges and reports are essentially identical, and rapid problem recognition and remedy are enabled.
Proficiency Testing subscriptions are also matched. Consultant copies of the Tri-annual Summary reports are forwarded to the Technical Director where the results of these challenges are formally evaluated primarily for deviation from group-specific means as well as bias throughout clinically appropriate ranges within our own group.
Investigation of any statistically significant discrepancy can take the form of control or reagent swapping, instrument component exchange, linearity checks, patient parallels or any combination of the above depending upon the nature of the apparent departure from the expected.
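One widely used statistic for the evaluation of “deviation from group-specific means” is the Standard Deviation Index (SDI), which expresses a proficiency result’s distance from the peer-group mean in units of the group standard deviation. A minimal sketch, with hypothetical survey numbers and a commonly cited review threshold of |SDI| >= 2 (our actual review criteria are set by the Technical Director):

    # Sketch: Standard Deviation Index (SDI) for proficiency results.
    # |SDI| near 0 is ideal; values approaching 2 warrant investigation.
    # All numbers below are hypothetical.
    def sdi(our_result, group_mean, group_sd):
        return (our_result - group_mean) / group_sd

    challenges = [(102.0, 100.0, 3.0), (4.9, 4.5, 0.15), (198.0, 200.0, 5.0)]
    for ours, mean, sd in challenges:
        value = sdi(ours, mean, sd)
        flag = "  <-- review" if abs(value) >= 2 else ""
        print(f"result {ours} vs group {mean} (SD {sd}): SDI {value:+.2f}{flag}")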
Intersite Analytical Method Parallel (JCAHO “Method Validation”)
The Analytical Method Parallel is performed twice per year at the direction of the office of the Technical Director, to comply with Joint Commission regulations that state:
● “The laboratory performs correlations to evaluate the results of the same test performed with different methodologies or instruments or at different locations. (QSA.02.08.01).”
● “The laboratory verifies the accuracy and reliability obtained for nonregulated analytes and for those regulated analytes for which compatible proficiency testing samples are not available. (QSA.01.05.01)”
Daily Hematology Review:
The City Control (Refer to “Hematology City Control: Preparation, Application and Management” procedure) is reviewed daily to monitor the performance of the different Hematology instruments in use at sites under the direction of Pathology Associates.
● Where the same constituents are assayed by more than one method within the system, results from third party proficiency testing, if common to the methods, will be formally compared (e.g., ACT, Blood Gases, HbA1c, Hematology).
● Where the same constituents are assayed by the same method at different sites, results from third party proficiency testing are formally compared. (Since this is done automatically, these tests are not listed in the table.)
● Where the same constituents are assayed by different methods within the system and different third party proficiency testing is performed, two samples will be assayed at the sites performing the testing and the results will be formally compared, as shown in the sketch following this list (e.g., [Gem Premier Hematocrit / LH750 Hematocrit], [PXP Glucose / DxC Glucose], [DxC Sodium, Potassium, Glucose / Gem Premier Sodium, Potassium, Glucose], [Access Troponin / i-STAT1 Troponin], [Xpand Plus Creatinine / i-STAT Creatinine], [i-STAT1 iCa / Gem Premier iCa], [DxC HbA1c / A1cNow], [ABO Rh gel method / ABO Rh tube method], [Antibody Screen gel method / Antibody Screen tube method]). This parallel is not required for waived testing; however, it is performed for internal purposes where possible.
● Where the same constituents are assayed by the same method at different sites within the system and third party proficiency testing is not available, five samples are split and assayed at the sites performing the testing and the results are formally compared. Since all sites performing this testing are CLIA certified, this fulfills requirement QSA.01.05.01 for the tests in this category (e.g., Cryoglobulin, Methylene Blue Stain, MPV).
● For MRSA/SA-BC, MRSA/SA-SSTI and VanA, third party proficiency testing is not available. According to Cepheid, the Microbiologics controls can be used for proficiency testing. The Microbiologics controls are routinely run with each new lot number and each shipment.
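For the two-sample parallels described above, the paired results from the two methods can be formally compared as a percent difference against an analyte-specific acceptance limit. A minimal sketch; the paired values and the 10% limit are hypothetical placeholders (actual limits are set per analyte by the Technical Director):

    # Sketch of a two-sample method parallel: run the same two samples on
    # both methods and compare the paired results as a percent difference.
    # The 10% acceptance limit is a hypothetical placeholder.
    pairs = [                       # (label, method A result, method B result)
        ("sample 1", 140.0, 138.0),
        ("sample 2", 152.0, 149.0),
    ]
    LIMIT_PCT = 10.0

    for label, a, b in pairs:
        diff_pct = 100 * abs(a - b) / ((a + b) / 2)
        verdict = "acceptable" if diff_pct <= LIMIT_PCT else "investigate"
        print(f"{label}: A={a}, B={b}, diff {diff_pct:.1f}% -> {verdict}")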
Intersite Analytical Method Parallel (JCAHO “Method Validation”)
Test | Method/Instrument(s) | Site(s) | Specimen | Comparison protocol
Blood gases | Gem Premier 3000 | MMC-DBQ Cardiac Cath | Aqueous blood gas survey material | Formally compare survey results
HbA1c | DxC / Metrika A1c Now | Finley Diabetes Center | EDTA whole blood | Formally compare survey results (GH2 survey); 2 samples run on DxC and A1c Now
Hematocrit / Hemoglobin | Beckman Coulter LH750 / LH500 | | EDTA whole blood | City control run daily
Hematocrit | Gem Premier HCT | MMC-DBQ Cardiac Cath | Heparin whole blood collected at different times during bypass procedure | Two samples will be run on the Gem Premier 3000 and then taken to the MMC-DBQ Lab to be run on the LH750
Hemoglobin | Hemocue 201 Hgb | MMC Respiratory Care; Clayton County VNA | EDTA whole blood | Two samples are run on the MMC LH750 & Hemocue; city control run weekly
Glucose | PXP meters / DxC | MMC Nursing Service; Finley Nursing Service | | Formally compare survey results; two samples run on DxC and a representative PXP meter
Sodium, Potassium, Glucose | DxC / Gem Premier | | Heparin whole blood | Formally compare survey results; two samples are run on MMC DxC & Gem Premier
Troponin | Access / i-STAT1 | | Lithium heparin plasma (Access); lithium heparin whole blood (i-STAT1) | Two samples are run on the Access and i-STAT1 at each site
Creatinine | Xpand Plus / i-STAT | MAW | Lithium heparin plasma (Xpand Plus); lithium heparin whole blood (i-STAT) | Formally compare survey results; two samples are run on Xpand Plus and i-STAT at MAW
Methylene Blue Stain | Difco TB Methylene Blue | | | Five-sample split between sites (see above)
iCa | i-STAT1 Chem 8+ / Gem Premier | | Lithium heparin whole blood | Two samples assayed and formally compared
Chem 8+ | i-STAT1 Chem 8+ | | Lithium heparin whole blood/plasma | Two samples assayed and formally compared
MPV | Beckman Coulter LH750/LH500 | | EDTA whole blood | Five-sample split between sites (see above)
MRSA/SA-BC | Cepheid | | | Microbiologics Kwik Stik assayed controls run with every new lot number and every shipment
MRSA/SA-SSTI | Cepheid | | | Microbiologics Kwik Stik assayed controls run with every new lot number and every shipment
VanA | Cepheid | | | Microbiologics Kwik Stik assayed controls run with every new lot number and every shipment
1. All results are recorded on the Data Acquisition Log and sent to the office of the Technical Director at United Clinical Labs-Cathedral Square.
2. Results will be evaluated on a case-by-case basis by the UCL Technical Director to determine if method match is adequate in the specific clinical setting.
3. Records will be scanned and indexed for recovery.
2/16/00; S. Raymond: Technical Director; DBQ Pathology Associates
December 2002, S. Raymond/L. McGovern (Revised: Intersite Analytical Method Parallel added)
July 2004 L. McGovern (Revised: Intersite Analytical Method Parallel)
February 2005 L. McGovern (Revised: Intersite Analytical Method Parallel)
July 2005 L. McGovern (Revised: Intersite Analytical Method Parallel)
September 2005 L. McGovern (Revised: Intersite Analytical Method Parallel)
October 2006 L. McGovern (Revised: Intersite Analytical Method Parallel) *1 Megan Sawchuk, MT JCAHO agent in phone conference follow-up to UCL PPR, Oct. 2006
July 2007 L. McGovern (Revised: Intersite Analytical Method Parallel)
October 2007 L. McGovern (Revised: Intersite Analytical Method Parallel)
February 2008 L. McGovern (Revised: Intersite Analytical Method Parallel)
August 2008 L. McGovern (Revised: removed Primidone)
October 2008 L. McGovern (Revised: added Acetone)
April 2009 L. McGovern (Revised: added MRSA/SA-BC, MRSA/SA-SSTI)
July 2009 L. McGovern (Revised: added APT)
September 2009 L. McGovern (Revised: MRSA/SA-BC, MRSA-SA-SSTI, added iCa)
March 2010 L. McGovern (Revised: Intersite Analytical Method Parallel)
September 2010 L. McGovern (Revised: Intersite Analytical Method Parallel)
September 2011 L. McGovern (Revised: Intersite Analytical Method Parallel)
March 2012 L. McGovern (Revised: Intersite Analytical Method Parallel)
Phoned Result and/or Order
Protocol For Phoning A Result
● Who called the result.
● The name of the person who is recording/reading back the result and their location.
● The date and time the result was called.
Example: SW to BF (ER) 9/16/05 0900
Protocol For Taking A Phoned Result
● Patient name and unique identifier number or date of birth
● Date and time of call
● Test name
● Date and time of test collection
● Called by
● Called to
● Special notes
Protocol for Phoned Microbiology Result:
Call the following test results to the nurse’s station (if inpatient or nursing home resident) or doctor’s office (if outpatient):
Protocol for Phoned Order:
Federal regulations require that a laboratory have written confirmation of a physician's verbal order. When a physician or his designee calls United Clinical Laboratories with patient orders:
June 1988 M. Bonifas CAP Inspection Committee;
September 1992 S. Hosch (Revised: Added section I.3)
January 1994 C. Sullivan (Microbiology phoned result protocol)
December 1995 M.J. Bonifas (Phoned orders)
August 2000 M. English (Revised: combined all phoned result/order protocols)
August 2005 J. Mueller (Revised: patient identifiers)
April 2009 J. Wedig (Revised: protocol for phoned Microbiology results)
Outdated Materials, Use of
The continued use of any reagent, standard, control or other time-dated laboratory consumable past its expiration date will require the authorization of the Technical Director. If the Technical Director is not available, a department/site supervisor may grant tentative authorization for the extended use of outdated materials, but this tentative authorization must be formally approved by the Technical Director at the earliest opportunity.
1. Check every chemically or biochemically degradable laboratory product for its expiration date every time it is used.
2. If the outdate is approaching and if a routine switch to fresh-dated material is not anticipated, notify the department head or laboratory manager.
3. If the outdate has been exceeded and authorization by the Technical Director for continued use of the product is not posted, do NOT use it. Contact the department head or laboratory manager.
4. Never assume that past authorization for the continued use of an outdated product will serve as tacit permission to do it again.
5. Occasionally a manufacturer will “extend dating” on a product. Continued use of these products will still require authorization from the Technical Director.
June 1986 S. Raymond
Package Insert File Program
It is fairly common for products used in the clinical laboratory to undergo modifications in manufacturing, handling instructions, material components, information, etc. Occasionally, one of these changes will have a considerable impact on the procedure governing the use of the product. In order to bolster our efforts at recognizing and reacting appropriately to these changes, the Package Insert File program has been implemented.
MMC-DBQ work areas:
remainder of laboratory
MMC-DV work areas:
Finley work areas:
Cath. Square work areas:
Medical Assoc. work areas:
A. If the revision dates are the same, discard the insert from the package just opened.
B. If the revision dates do not match:
a. Compare the revisions of the current insert to the one on file to see what has changed.
b. If the change is likely to affect the procedure, consult with the Site Supervisor/Manager before using the product.
c. Clip the current insert to the one that has been on file and leave them on the Site Supervisor’s/Manager’s desk for review, information dissemination and re-filing.
June 1988 S. Raymond, CAP Inspection Committee
August 1994 L. McGovern (Revised: II.2.)
Citrated Plasma For Coagulation Studies
(Protime, APTT, Thrombin Time, Fibrinogen, DVVT And D-Dimer)
Plasma is the blood component used in laboratory tests to evaluate clotting. Soluble citrates act as anticoagulants by binding the calcium in whole blood so that it is no longer available to the clotting cascade. Plasma thus obtained can be studied after the addition of sufficient calcium to neutralize the anticoagulant that was added. (The blood-to-citrate ratio arithmetic behind the tube preparations below is sketched after section B.)
A. 3.0 ml tube:
a. Using a 100 µl Rainin pipettor, remove and discard 100 µl of the sodium citrate from an unused 3.0 ml draw Greiner vacuum tube.
b. Using a sterile, plastic syringe, draw 3.0 ml of blood from the patient.
c. Remove the needle from the syringe and immediately transfer 2.7 ml of the blood to the citrate tube that has been specially prepared.
d. Replace the blue stopper and mix by inverting 10 times.
e. Assay the sample within one hour.
B. 2.0 ml tube:
a. Using a 50 µl Rainin pipettor, remove and discard 50 µl of the sodium citrate from an unused 2.0 ml draw Greiner vacuum tube.
b. Using a sterile, plastic syringe, draw 2.0 ml of blood from the patient.
c. Remove the needle from the syringe and immediately transfer 1.8 ml of the blood to the citrate tube that has been specially prepared.
d. Replace the blue stopper and mix by inverting 10 times.
e. Assay the sample within one hour.
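The pipetting adjustments above preserve the intended blood-to-citrate proportion when blood is transferred by syringe. A minimal sketch of the arithmetic follows; the pre-fill citrate volume used in the example is a hypothetical placeholder (the actual volume comes from the tube manufacturer), so the 9:1 figure shown is illustrative rather than a statement about these specific tubes.

    # Sketch of the dilution arithmetic behind the tube modifications above.
    # The pre-fill citrate volume is a placeholder; use the value published
    # by the tube manufacturer, not the figure assumed here.
    def blood_to_citrate_ratio(prefill_ml, removed_ml, blood_ml):
        citrate = prefill_ml - removed_ml
        return blood_ml / citrate

    # Hypothetical example: if a tube held 0.40 ml citrate, removing 0.10 ml
    # and adding 2.7 ml blood would yield the conventional 9:1 ratio.
    print(blood_to_citrate_ratio(0.40, 0.10, 2.7))   # 9.0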
Preparation of specimens for testing:
2-25-81 S. Raymond
1-10-91 L. Kelley, MT(ASCP)
12-1-93 J. Schultz, MT(ASCP)(Revised: note under III-3-B)
6-28-96 L. Kelley, MT(ASCP) (Revised: added III.2.C.)
4-9-98 L. McGovern, (Revised: III.1. and III.4.)
February 1999 L. McGovern (Revised: III.3.Note)
April 2002 L. McGovern/S. Raymond (Revised: III.1.B-D., 3.B.Note, 4.; IV.3-4.)
May 2002 L. McGovern (Revised: II., III.B,D.)
June 2008 L. McGovern (Revised: updated for ACL Elite Pro)
August 2009 L. McGovern (Revised: updated for NCCLS guidelines Storage 1-3.; Reference 5.)
September 2009 L. McGovern (Revised: for Greiner 3.0 ml sodium citrate tubes)
April 2010 L. McGovern (Revised: for Greiner 2.0 ml sodium citrate tubes)
Hematology Specimen Integrity addresses criteria for inspecting, accepting and rejecting specimens received into the hematology section. It should be read and understood by all phlebotomists as well as anyone performing analysis in Hematology.
Obtaining a Greiner vacuum tube or microtainer specimen:
Note: Heparin specimens from cardiac surgery cases may be accepted for hemoglobin only.
Processing all specimens:
Open tube sampling systems:
Closed tube sampling systems:
Hemoglobin and Hematocrit:
WBC and Peripheral Blood Smear:
December 1985 J. Mueller
April 1997 J. Schmitz
January 2001 L. McGovern/S. Raymond (Revised: II.4. added)
May 2006 S. Hosch/L. McGovern (Revised: for new hematology analyzers)
March 2011 L. McGovern (Revised for K2 EDTA Microtainer MAP tubes, storage & stability info, ref 4-5 added)
April 2011 L. McGovern (Revised: Processing all specimens section)
Criteria are outlined for inspecting, accepting and rejecting veterinary specimens received in Specimen Processing.
Refer the “Specimen Handling: Identification, Integrity & Rejection” policy in the Quality Assurance Manual.
Note: This policy is limited to veterinary specimens and is not applicable to human specimens.
October 1993 L. Kelley/S. Raymond
November 2004 L. McGovern/S. Raymond (Revised: II.3.B.)
Turbidity Assessment, Serum
Note: "Standardized print" refers to font attributes as they appear in this printed statement or in the body text of our analytical procedure template; i.e., Arial 10 - 12 - plain.
L. McGovern; March 1996
Specimen Handling: Identification, Integrity & Rejection
All specimens received in the laboratory must be properly identified and meet specific integrity requirements to assure the best patient care. This policy establishes guidelines for acceptability and steps to take to resolve problems.
All specimens must be labeled in the patient's presence per required standards (Joint Commission; College of American Pathologists).
The following criteria must be met for any specimen to be tested. Any deviations are documented on the Specimen Problem Form (SPF) and filed in the appropriate Incidence Log in the specimen processing area.
Specimen processing personnel are responsible for ensuring all criteria have been met before referring any specimen. If a referred specimen’s identification and/or collection documentation is incorrect or incomplete, call the referring laboratory for resolution and complete and file an SPF.
Certain instances require a client's participation in the completion of the SPF and are indicated in the instructions that follow.
A. Laboratory Collectibles
1. Two patient identifiers are required: patient name AND unique identification number (e.g., medical record number; Typenex number; birth date; patient's account number; client number)
No patient identifiers: Reject unless extreme circumstances exist (e.g., unable to redraw patient). If questionable, consult a supervisor, pathologist or designee. Complete and file an SPF.
One patient identifier: Return to phlebotomist/collector for complete patient identification. Complete and file an SPF.
2. Collection date, time or collector’s initials missing
Transcribe the information from the accompanying requisition if the accession # matches, or return the specimen to the collector for completion. If the collector is not available use the appropriate disclaimer(s). Complete and file an SPF.
b. CLICS requisition
All collector information must be completed. Transcribe the information from the specimen to the requisition if the specimen has the missing information, or return the requisition to the collector for the missing information. No further documentation needed.
B. UCL Client Collectibles (excluding Veterinary)
1. Two patient identifiers are required: patient name AND unique identification number (e.g., medical record number; Typenex number; birth date; patient's account number; client number)
No patient identifier: Call the client and reject the specimen unless extreme circumstances exist (e.g., unable to obtain another suitable specimen).
Rejection agreed to by client; complete an SPF and attach a copy of the SPF to the order for office tracking. File the original SPF.
Client requests the unlabeled specimen be tested:
If the specimen does not have time restrictions, send the specimen (only if easily transported; e.g., a culturette swab), office paper work and the SPF to the client for follow-up. Save a copy of the SPF in the pending area at the receiving site awaiting the return of the specimen and order. File the returned completed SPF.
If the specimen must be tested the day of arrival for appropriate patient care, FAX the SPF to the client for identification confirmation, noting that the client’s signature denotes responsibility for the identification of the specimen. Results are held until the client returns (preferably by FAX the same day) the completed SPF. File the returned completed SPF.
Enter the comment “Specimen received unlabeled; client verified.” in the Report Comment section of the result.
2. One patient identifier:
Call the client and inform them the specimen was not adequately identified. FAX an SPF to the client for identification verification. Hold the results until the completed SPF is returned (preferably by FAX the same day). File the returned completed SPF.
3. Specimen does not have the date/time of collection:
Use the appropriate disclaimer(s) and complete and file an SPF. No client contact is necessary.
4. Specimen does not have the collector’s initials:
Use “UNK” as the collector. No further documentation is necessary.
5. Specimen requires a fixative and was received with no fixative label:
Call the client and ask if a fixative was added. Add a fixative label if the client added fixative or add the fixative (if appropriate) and fixative label if the client did not add fixative. Complete and file an SPF.
6. Specimen and client request do not match:
Name or identification number mismatch must be clarified by the client. Call the client to inform them of the discrepancy and FAX the SPF. Hold the results until the completed SPF is returned. File the returned completed SPF.
b. Client request (order)
1. Call the Client for missing information and complete and file an SPF:
Date of Birth
Specimen Source (type or body site) if not blood
2. Date and/or Time of Collection:
If not provided on the specimen use the appropriate disclaimer(s) and complete and file an SPF. No client contact is necessary.
3. Initials of Collector:
If not provided on the specimen use “UNK” as the collector. No further documentation is necessary.
4. Priority Level:
No action is necessary.
C. Veterinary Collectibles
Note: Veterinary testing is not regulated by UCL’s accrediting agencies. Due to the unique nature of veterinary testing, only one patient identifier is required.
a. Veterinary specimens require only one identifier (animal name or identification number). Call the veterinary office for clarification and complete and file an SPF if:
1. Unlabeled specimen received with identifying paperwork.
2. Specimen label does not match the request.
b. If necessary information is not included on the request (or on the specimen) call the veterinary office for clarification and complete and file an SPF. Necessary information includes:
1. Animal type
2. Specimen source (type or body site) if not blood
3. Owner’s name
4. Ordering veterinarian
c. If the Date and/or Time of collection are not provided use the appropriate disclaimer(s) and complete and file an SPF. No client contact is necessary.
d. If the collector’s initials are not provided use “UNK” as the collector. No further documentation is necessary.
D. Backup Testing Form
If any information is missing, call the referring site. No further documentation is necessary.
a. Name of the referring site
b. Initials of person notifying the receiving site
c. Date and time of notification
d. Receiving site
e. Accession numbers of each specimen
f. Test(s) requested
E. Aliquotted (separated) specimen
a. If a specimen is accompanied by a requisition or backup testing form, only the accession number is required. An unlabeled aliquot is rejected unless extreme circumstances exist. Call the referring site and ask for a new, labeled aliquot. Complete and file an SPF.
b. If no requisition or backup testing form accompanies the specimen, all of the following information is required. Call the referring site if more information is needed. No further documentation is necessary.
1. Patient name
2. Patient identification number
3. Accession number
4. Date and time of collection
5. Initials of phlebotomist/collector.
a. The medical record number must be handwritten from the patient's hospital armband onto the specimen or the accompanying requisition.
b. A Typenex number must be placed on the specimen (or handwritten) if the Typenex band is the patient's identification.
A. Retrievable specimens not meeting the required integrity for testing are rejected.
B. Irretrievable specimens (e.g., CSF, biopsy, pre-antibiotic specimen) not meeting the required integrity for testing are brought to the attention of a supervisor, pathologist or designee (with the exception of veterinary specimens).
C. QNS specimens – consult with the ordering physician as to which tests to perform. Document the physician response on the SPF and cancel all tests that could not be performed. File the completed SPF.
1-20-89 M. English
8-13-93 M. English (Revised: format; II.3.B.)
3-14-95 S. Raymond (Revised: added II.3.B.c.)
6-23-96 M.J. Bonifas (Revised: II.1.B.1-6.)
4-3-97 J.A. Schmitz (Revised: added II.1.A.g.-i. & II.3.C.)
12-17-97 S. Hosch (Revised: added II.1.A.j. & II.1.B.g.)
1-13-98 E. Steiner (Revised: added II.1.B.h. & II.3.B.e.)
January 2007 L. McGovern (Revised: adding a test requiring new accession no.)
January 2009 S. Hosch (Revised: Nurse/Physician/UCL Client collectibles-patient identifiers)
July 2010 S. Rodriguez, S. Raymond (Revised: for two patient identifiers; mod. sample relabeling)
Reported Erroneous Results
The Reported Erroneous Results policy directs and documents correction and follow-up activity when an erroneous result has been released from the laboratory by phone, computer or charted report. The purpose of the policy is to prevent harm to the patient and the occurrence of similar errors in the future.
When a reported error is discovered, notify the supervisor (or acting supervisor) immediately. The supervisor will take the following steps in the following order:
Reported Erroneous Results
RERL Log; page 1
Reported Erroneous Result Log
RERL Log; page 2
6/20/89 S. Raymond/J. Brennan, M.D.
7/10/86 T. Edmonds, M.D. (Revised: II.2.D.a.)
September 2001 M. English/S. Raymond (Revised: II.2.E.b-c. & F.; format)
April 2007 M. English/S. Raymond (Revised: protocol 4.g. & RERL)
Calibration Verification; CLIA Specifications
Regulations for Calibration Verification:
Joint Commission Standard QSA.02.03.01
Calibration Verification Policy
Reference: CAMLAB Update 1, March 2012.
Reportable Range is verified during the Calibration Verification process on those procedures to which Calibration Verification applies.
For Calibration Verification, tests are broken down into categories and acceptable materials determined for each test.
Reportable Ranges are adjusted in the relevant analytical procedures to correspond with the AMR (Analytic Measurement Range) verified at each six-month Calibration Verification.
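Verification itself reduces to assaying materials with assigned values near the low end, midpoint and high end of the range and confirming acceptable recovery at each level. A minimal sketch; the assigned values, measured results and +/-10% tolerance are hypothetical placeholders, not criteria from this policy.

    # Sketch of an AMR verification check: assay materials with assigned
    # values spanning the reportable range and confirm each recovery falls
    # within the acceptance criterion. All numbers are hypothetical.
    levels = [            # (assigned value, measured value)
        (5.0, 5.2),       # low end of range
        (250.0, 246.0),   # midpoint
        (500.0, 515.0),   # high end
    ]
    TOLERANCE_PCT = 10.0

    for assigned, measured in levels:
        recovery = 100 * measured / assigned
        ok = abs(recovery - 100) <= TOLERANCE_PCT
        print(f"assigned {assigned}: recovery {recovery:.1f}% "
              f"{'PASS' if ok else 'FAIL - AMR not verified at this level'}")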
Test List with assigned Calibration Verification Categories & material (pgs 1-5)
Calibration Verification Categories:
- Tests included:
DxC: AMPH, BARB, BENZ, COCM, OP2, PCP, THC5, UCRP, CAR, DIGN, GEN, PHNB, PHNY, THE, TOB, VPA, VANC, ACTM
DxI/Access: T4 FREE, PSA, Myoglobin, Troponin I, CKMB, Estradiol, Progesterone, Cortisol, BhcG, TSH, FSH, LH, Folate, Ferritin, Prolactin, Free PSA, CA125, CEA, B12, T4, PTH, PTHio, Testosterone, AFP
Architect: Homocysteine, Anti-CCP, 25-OH Vitamin D2/D3.
Xpand Plus: ALB, ALP, ALT, AST, BUN, CA, CHOL, CK, CRE, CRP, DBILI, GGT, GLU, HDLD, HBA1C, LD, LDLD, MA, PHOS, TBILI, TP, TG.
- Tests included: APTT, Thrombin Time, Protime, Fibrinogen, DVV, Hepcon ACT, Gem PCL ACT, TU, Platelet Function testing (P2Y12), Sed Rate.
- Tests included:
Architect: Hepatitis C Ab, Hepatitis BsAg, Hepatitis BsAb, Hepatitis A Ab, HIV Combo
QuantiFERON TB Gold, Aptima Chlamydia & GC, Polymedco iFOB
BacT/Alert: Blood Culture
- Tests included: Hemocue HGB, PXP Glucose, Coaguchek XS INR, Cholestech (Chol/TG/HDL/Glucose), Bilichek, Metrika A1c, DCA 2000+ HgbA1c (Bayer), INRatio 2 INR.
- Tests included:
Clinitek Status: UA Chemical Screen
- Tests included: BNP, iSTAT blood gases, iSTAT1 Troponin I, iSTAT Creatinine, i-STAT Lactate, i-STAT Chem 8+ (Na, K, Cl, TCO2, BUN, Glucose, Creatinine, iCa)
- Tests included:
Beckman DxC/CX5: ALB, ALP, ALT, AMM, AMY, ASO, AST, BUN, CA, CHOL, CL, CK, CO2, CRE, DBIL, ETOH, FE, GGT, Glu, HDLD, HPT, IBCT, IGA, IGG, IGM, K, LD-L, LDLD, Lipase, LITH, MG, Na, PO4, RF, TBIL, TP, M-TP, TG, URIC ACID, CRP, MA, SALY, CRPH, Prealbumin, HbA1c
Xpand Plus: NA, K, CL
Elite Pro: D-Dimer
Nanoduct Sweat Chloride
LH500/LH750: RBC, HGB, WBC, PLT, Retic
Gem Premier: pH, pO2, pCO2, Na, K, Cl, iCa, Glucose, Hgb, Hct
Gem OPL: Total Hgb
Note: “The requirements for analytical measurement range verification apply only to those parameters that can be calibrated and that are measured directly. These requirements do not apply to calculated parameters. Most oximeters can be calibrated for only total hemoglobin. Therefore, AMR verification should be performed for total hemoglobin, but it is not needed for derived quantities, such as carboxyhemoglobin fraction or methemoglobin fraction.”
(CAP; Laboratory Accreditation Newsletter, Queries and Comments. 05/31/2011)
20050322 L. McGovern/S.Raymond
20050922 L. McGovern
20081021 L. McGovern/S.Raymond
20091119 L. McGovern
20100427 L. McGovern
20101012 L. McGovern
20110504 L. McGovern
20111007 L. McGovern
20111121 L. McGovern
20111216 L. McGovern/S.Raymond
Information Technology Problem Report/ ITPR
The Information Technology Problem Reporting program, with its ITPR template [see ITPR template following this paragraph], assists laboratorians in recognizing, documenting and responding to IT problems they might encounter. Like the IPR (Instrument Problem Report), it is an evidence-chain format designed to coordinate detection, recognition and remediation of LIS problems early on, hopefully while they are still relatively minor and before they seriously interrupt services and consume resources.
The ITPR is divided into 8 sections:
It is always reviewed by the LIS Project Coordinator.
It is frequently reviewed by the LIS Director and/or Technical Director/Chief Information Officer (CIO).
Final resolution is communicated by the LIS Project Coordinator to the Supervisor for dissemination of information to staff.
Section 7 allows space for formal comment from LIS Director and/or the Technical Director/CIO.
2005 S. Raymond (Technical Director/CIO); M. English (LIS Project Coordinator)
Product Notices and Safety Recalls
Not infrequently, almost every day in our system, a manufacturer/provider of clinical laboratory material (instrumentation, reagent, hard consumables, software) will have to communicate an important message or directive to the end user regarding the handling of a specific product.
Content of these notices ranges from the purely informational, requiring no action, through procedural workarounds to discontinuance or even recall on occasion.
Notices are sent by the manufacturer/provider to the laboratory site(s) where the product in question is registered with them. The format is usually a letter by mail, fax or email (sometimes all three) and is almost always accompanied by a form to be filled out and returned to the supplier documenting that the target site has received the message and understood the content. (See example)[4 pgs thumbnail hyperlink]
Handling Product Notices
The laboratory site manager or department manager reviews the notice, reacts and formally responds to the provider. Initial activity prompted by the information, including the fact that the loop has been closed with the manufacturer/provider, is documented on the Product Notice. It is dated and signed by the manager and then forwarded to the Office of the Technical Director for review, further action if warranted, and scanning/indexing for future reference and retrieval.
Oftentimes a Product Notice sent to one site will have a longer-term ripple effect on the rest of the enterprise, triggering responses such as, but not limited to:
● lot discontinuance
● product recovery
● procedural compensation
● consolidation of specific testing
● results retrieval and review
● physician notification
20070228 S. Raymond
In addition to Reference (normal) Range, Critical values, Important Called Results (ICR), Reportable Range and Delta flagging, CLICS screens results of select analytes for Unlikely values. An “Unlikely” result is statistically improbable under any circumstances and needs to be investigated.
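In practice the Unlikely screen is a simple bounds check applied before release. A minimal sketch, with hypothetical bounds (actual Unlikely limits are maintained in CLICS under the direction of the Technical Director):

    # Sketch of an "Unlikely value" screen: results outside bounds that are
    # statistically improbable under any circumstances are held for
    # investigation rather than released. Bounds here are hypothetical.
    UNLIKELY_BOUNDS = {
        "Potassium": (1.0, 10.0),   # mmol/l
        "Glucose":   (10, 1500),    # mg/dl
    }

    def screen(analyte, value):
        low, high = UNLIKELY_BOUNDS[analyte]
        if not (low <= value <= high):
            return f"{analyte} {value}: UNLIKELY - hold result and investigate"
        return f"{analyte} {value}: passes unlikely-value screen"

    print(screen("Potassium", 12.4))   # held for investigation
    print(screen("Glucose", 95))       # released through this screen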
20120215 S. Raymond (Technical Director/CIO)
20120306 S. Raymond (Technical Director/CIO)