Nuclear Medicine: Quality Control for NM Detectors
Review of frequently tested quality control measures for nuclear medicine detectors including dose calibrators, well counters, gamma cameras, and G-M counters.
Edited episode transcription:
The key with a Geiger Mueller counter is that it has high sensitivity and is commonly used to detect contamination. So, you are measuring low levels of radiation with a Geiger Mueller counter.
With other detectors such as an ion chamber like a cutie pie, you are often measuring elevated levels of radiation.
Neither can tell the difference between types of radiation or different energies. They simply detect that radiation is present.
There are several things you need to remember in terms of quality control:
Geiger Mueller Counters:
Daily quality control for a Geiger Mueller counter includes a battery check. Make sure the battery works.
There is also a daily constancy test, where you measure the count rate sensitivity. To do this, you hold the probe up to a long half-life radiation source, such as Cesium 137, that is often attached to the Geiger Mueller counter itself.
You should also perform a daily background count rate, which is measured in an area remote from radioactive sources. So do not measure this in the hot lab but get it out in the general non-radiation areas and confirm that this survey meter is not contaminated.
If you go in the patient waiting area and the Geiger Mueller counter is detecting a lot of radiation, hopefully you did not have a major spill where patients are waiting. More likely you have contaminated your probe. That is the way to check.
At installation, annually, or after repair, you also need to calibrate the Geiger Mueller probe for accuracy, which is determined by using a long-lived radioactive source, again often Cesium 137, but in this case (unlike the source you use to measure count rate sensitivity) the source activity has been strictly validated.
Remember that Geiger Mueller probes can give you a millirem per hour reading, but they cannot tell you the energy of the radiation they are detecting.
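The daily Geiger Mueller checks above can be sketched as a simple pass/fail routine. This is purely illustrative: the function name, the 20% constancy tolerance, and the background count limit are assumptions for the sketch, not regulatory values; use the action levels published by your department.

```python
def gm_daily_qc(battery_ok, check_source_cpm, reference_cpm, background_cpm,
                constancy_tolerance=0.20, background_limit_cpm=100):
    """Sketch of daily Geiger Mueller QC: battery, constancy, background.

    Tolerance and background limit are illustrative assumptions only.
    """
    results = {"battery": battery_ok}
    # Constancy: today's count rate on the attached long-lived Cs-137
    # check source should agree with the reference value within tolerance.
    deviation = abs(check_source_cpm - reference_cpm) / reference_cpm
    results["constancy"] = deviation <= constancy_tolerance
    # Background: measured away from the hot lab; a high reading in a
    # non-radiation area suggests the probe itself is contaminated.
    results["background"] = background_cpm <= background_limit_cpm
    return results
```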
Well Counters:
For quality control for a well counter, you need to do daily calibration, which is typically performed with Cesium 137. Remember, a well counter is like a photomultiplier tube unit in a gamma camera, but it is just one unit (unlike the multiple tubes/units used in a gamma camera). You can drop a radioactive sample inside of this well, so it has high geometric efficiency. This type of detector uses indirect conversion of radiation to light via scintillation.
Daily calibration: Unlike a Geiger Mueller probe, you can separate energies with a well counter. You want to make sure when you are calibrating a well counter that you center it around the photopeak of Cesium, which you likely will not need to memorize for the Core Exam. However, it is 662 keV just so you know.
At installation, annually, and after repair for a well counter or thyroid probe, you need to measure efficiency. You can also think of this as sensitivity. Your day-to-day measurements on this should be within 10%. If you fall outside of this range on your daily measurements, you need to recalibrate the instrument.
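The efficiency idea above can be made concrete. Efficiency is counts detected per disintegration, and 1 microcurie corresponds to 3.7e4 disintegrations per second (2.22e6 per minute). The 10% day-to-day tolerance below comes from the text; the function names are illustrative.

```python
def counting_efficiency(measured_cpm, activity_uci):
    """Counting efficiency of a well counter: counts detected per decay.

    1 microcurie = 3.7e4 disintegrations/second = 2.22e6 per minute.
    """
    dpm = activity_uci * 2.22e6  # disintegrations per minute
    return measured_cpm / dpm

def within_daily_tolerance(today_efficiency, baseline_efficiency, tolerance=0.10):
    # Day-to-day efficiency should stay within ~10% of the baseline
    # established at installation; otherwise recalibrate the instrument.
    return abs(today_efficiency - baseline_efficiency) / baseline_efficiency <= tolerance
```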
Gamma Cameras:
Daily uniformity testing: What you do is image either a Cobalt 57 flood source or a Technetium 99m source to ensure that the image is uniform. This is a flood image. You check whether there is an area of non-uniformity, meaning one area looks brighter or is absent despite flooding the entire camera with counts. If so, something is wrong. Non-uniformity can denote a problem with a photomultiplier tube, or it can be a problem with your collimator for a gamma camera.
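One standard way to quantify how uniform the flood image is (not named in the episode, but common practice) is the NEMA-style integral uniformity figure: 100 * (max - min) / (max + min) over the pixel counts in the useful field of view. A minimal sketch:

```python
def integral_uniformity(flood_counts):
    """NEMA-style integral uniformity of a flood image, in percent:
    100 * (max - min) / (max + min) over the pixel counts.
    Lower values mean a more uniform flood."""
    hi, lo = max(flood_counts), min(flood_counts)
    return 100.0 * (hi - lo) / (hi + lo)
```

A perfectly flat flood would score 0%; a dead photomultiplier tube region drags the minimum down and the figure up.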
Weekly spatial resolution and linearity testing: To do this you use a four-quadrant bar phantom. If you haven't seen what this looks like, make sure to look it up. To assess adequate spatial resolution, you identify the quadrant in the image that has the smallest detectable line. For linearity, you want to look and make sure that the lines (on the bar phantom) look straight.
Flood images: Remember that when you do a flood image with Technetium, you're using a point source that you hold away some distance from the gamma camera so that the counts shower down evenly across the face of the detector. If you use a Cobalt source, that is a sheet that just lies flat directly on the detector.
Intrinsic and extrinsic resolution: Extrinsic spatial resolution is the resolution of the system when you have the collimator on it. I remember extrinsic resolution by thinking about the radiation source being more external to the gamma camera because you have the collimator in between the gamma camera and the source. Intrinsic resolution has no collimator, and you're testing purely the performance of the gamma camera itself.
Dose calibrators:
A dose calibrator is a gas-filled detector and does not use a photomultiplier tube as in a scintillation-based detector like a well counter. There actually is gas inside in which ion pairs form when exposed to radiation. A dose calibrator has optimal geometric efficiency, but it cannot separate between energies like you can with a well counter. Remember that a well counter is a scintillation-based detector, like a gamma camera, that has a photomultiplier tube. Gamma camera systems can distinguish between different energies from isotopes, and you use that ability to discriminate between energies to improve your spatial resolution (by rejecting counts that lie outside of the expected energy range of the radioisotope you are imaging, thereby reducing scatter detection). Because a dose calibrator cannot separate between energies, the technologist must tell the dose calibrator what radioisotope they are going to measure. There are buttons on the dose calibrator that you push for each different type of radioisotope. A dose calibrator is good at determining the actual radiation dose present.
Dose calibrator quality control testing: Daily you do the constancy test. Remember that you are constantly (daily) testing constancy. This test is performed at installation and daily thereafter. This is a measure of precision. For this test, you typically use Cesium 137, which has a 30-year half-life, which therefore yields very reproducible day-to-day measurements. You measure this every day on all the different isotope settings that you are likely to use. You place this Cesium 137 source in the calibrator, measure and record the value, and compare it with recent values to determine the constancy of your system.
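The constancy comparison described above is a simple tolerance check: because Cesium 137 has a 30-year half-life, day-to-day decay is negligible and today's reading should essentially match the established reference. The 5% tolerance in this sketch is an illustrative assumption; use the limit specified in your QC program.

```python
def constancy_check(today_reading_mci, reference_mci, tolerance=0.05):
    """Daily dose calibrator constancy: a long-lived Cs-137 source should
    read essentially the same value every day. The 5% default tolerance
    is an illustrative assumption, not a regulatory limit."""
    return abs(today_reading_mci - reference_mci) / reference_mci <= tolerance
```

In practice you would repeat this on each isotope setting you routinely use and record every value.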
Linearity testing: Every quarter you need to do the linearity test. For this test you prove that the calibrator can reliably measure sources at the radiation levels you are likely to encounter. Therefore, you need to do this from the microcurie into the millicurie range. There are a few ways you can do this. One option is to take a lot of Technetium 99m, say up to 300 millicuries. Measure this at baseline at full activity (no decay) and then continue to measure at regular intervals for up to 48 hours. As the Technetium decays you check whether the measurements you get follow the predicted decay of Technetium, and therefore determine the linearity of your dose calibrator. A disadvantage of this is that it takes 48 hours and is labor intensive.
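The decay method above reduces to comparing each reading against predicted exponential decay, A(t) = A0 * 0.5^(t / T_half), with the Technetium 99m half-life of about 6 hours. A sketch, with illustrative function names:

```python
TC99M_HALF_LIFE_H = 6.0  # Tc-99m half-life in hours (approximately)

def predicted_activity(a0_mci, hours_elapsed):
    """Expected Tc-99m activity after decaying from the initial assay."""
    return a0_mci * 0.5 ** (hours_elapsed / TC99M_HALF_LIFE_H)

def linearity_deviations(a0_mci, measurements):
    """measurements: list of (hours_elapsed, measured_mci) pairs.
    Returns (hours, fractional deviation from predicted decay) for each
    reading; large deviations indicate a non-linear calibrator response."""
    return [(t, (m - predicted_activity(a0_mci, t)) / predicted_activity(a0_mci, t))
            for t, m in measurements]
```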
Another option is to take a single high activity technetium source, measure it, and then put successive lead shielding around that source. You then measure the Technetium with successively increasing shielding (with known half-value layers and thicknesses) and see if the measurements you get are in line with what is predicted from the progressive increase in shielding that you apply to the system.
Both methods will demonstrate linearity.
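The shielding method likewise compares readings against a prediction, this time from simple exponential attenuation with known half-value layers: each HVL of lead should halve the reading. A sketch, assuming narrow-beam attenuation (buildup from broad-beam scatter is ignored here):

```python
def predicted_through_shield(a0_mci, thickness_mm, hvl_mm):
    """Expected reading through lead of the given thickness, assuming
    simple exponential attenuation with a known half-value layer (HVL).
    Broad-beam buildup is ignored in this sketch."""
    return a0_mci * 0.5 ** (thickness_mm / hvl_mm)
```

You compare the reading at each shield thickness against this prediction, just as the decay method compares against predicted decay.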
Annual accuracy testing: The key to this is to remember the A (Annual) with the A (Accuracy). So, you're constantly testing Constancy (so that's daily) and Annually you're testing Accuracy. This test shows that the dose calibrator gives you the correct readings throughout the entire energy scale. It's honestly surprising to me that you only must do this annually, but I guess it makes sense that at installation and then every year you do a test and prove that it is accurately measuring the dose of a standard radioisotope that is designed for this accuracy testing. Then once you've established that it is accurate, every single day from that point on, you check to make sure that it’s maintaining constancy. That makes me feel better about it that you get an accurate measurement then every single day (constantly) you're testing the constancy to make sure the dose calibrator doesn't vary. And if you've shown every day that you have constant values then you also know that you're maintaining the accuracy of the system. And every year you then must go back and prove that this is still accurate, although you should already know it will be because you've been constantly testing the constancy.
Geometry testing: At installation, after repair, and anytime you do anything major to this system, you have to test the geometry. This test shows that you get the same reading no matter how you put the sample into the system. You can have a very large amount of radioactivity in a very small amount of fluid, or a very small amount of radioactivity in a large amount of fluid, and you can imagine how that is geometrically different inside the detector. If the system is working well, you should get a good measurement regardless.
One way to test this is to take activity in a very small amount of water and then progressively dilute it into a larger amount of water. In between, test and make sure that the dose calibrator reading is not changing, because all you are doing is adding water to the system; this changes the geometry but not the activity within the dose calibrator. If you have a deviation of greater than 10%, then you are obligated to record that value, fix the problem, and note the repair or recalibration you performed. Then retest to make sure you are now in the expected range and go forward from there.
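The dilution procedure above can be sketched as a check that every reading stays within 10% of the smallest-volume baseline. The 10% threshold comes from the text; the function name and data layout are illustrative assumptions.

```python
def geometry_check(readings_by_volume_ml, tolerance=0.10):
    """readings_by_volume_ml: {volume_ml: measured_mci} for the same
    activity progressively diluted into larger volumes. Every reading
    should agree with the smallest-volume (baseline) reading within the
    tolerance; larger deviations must be recorded and corrected."""
    baseline = readings_by_volume_ml[min(readings_by_volume_ml)]
    failures = {vol: reading for vol, reading in readings_by_volume_ml.items()
                if abs(reading - baseline) / baseline > tolerance}
    return failures  # an empty dict means the geometry test passed
```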