Daily Instrument Tuning
Delta Plus Advantage:
Although our instrument configuration does change over time, we currently have three peripherals connected to the Delta Plus: an EA and a TOC analyzer, both connected via a Conflo interface, and a GasBench with the MultiPrep heating block. During tuning, gas must flow from the peripheral that will be used to the mass spectrometer, and that peripheral must be in its operational mode: i.e., the EA or TOC analyzer should be in "work" mode.
MAT 253:
The MAT 253 has three peripherals: a TC/EA with associated Conflo interface, a GC-C/TC with associated interface, and a dual inlet. For dual inlet analyses, the SGE valves for the continuous flow inlets should be closed. For continuous flow measurements, only the SGE valve for the appropriate peripheral should be open. We record the ion source pressure with each peripheral connected, and that number has remained constant over the years.
The overall tuning process includes:
1) background checks for leaks or high moisture
2) peak centering the reference gas
3) autofocusing the instrument
4) stability checks
5) linearity checks (including the H3 factor measurement for hydrogen)
1) Background checks:
Before any analyses it is necessary to check the background levels of H2O, N2, O2, and Ar. We start with the instrument in the CO2 gas configuration, then do a mass jump with the center cup to the appropriate mass (say, m/z 18 for H2O). We then do a peak center at this mass and record the beam intensity on the center cup. We repeat this process for m/z 28 (N2), 32 (O2), and 40 (Ar).

High N2 and O2 indicate a leak to atmosphere somewhere in the system. Additionally, a high water background will produce a higher m/z 32 background. High water can indicate moisture collecting somewhere or failing water traps; in particular, if the instrument has been vented to atmosphere, a high water background can be expected for a couple of days.

Although the atmosphere is about 1% argon, this gas is not as sensitive an indicator of leaks as N2 and O2. In a properly operating system, argon is mainly present as an impurity in the UHP helium, so for us the argon background is an indication of the quality of our helium. Occasionally a new helium tank gives an argon background 50 to 100 times higher than usual. In that case we return the tank to the supplier and ask for another one: high argon tells us that the quality of the helium is not reliable, and it also makes it difficult to troubleshoot leaks. There is more information in the "troubleshooting" section, but briefly: if we suspect leaks, we set the center cup to monitor m/z 40 in instrument control and spray the fittings with argon. A big enough leak will make the m/z 40 beam rise, but a high argon background makes such leaks harder to see.
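The daily background log described above can be sketched as a simple check of each monitored mass against an acceptance level. The mass-to-species mapping comes from the procedure; the threshold values in any real use would be instrument-specific and are not given here, so they are left as caller-supplied parameters.

```python
# Minimal sketch of the daily background check. The monitored masses are
# from the procedure above; acceptance thresholds are instrument-specific
# and must be supplied by the operator (the values used in testing below
# are purely illustrative, not our actual acceptance values).
BACKGROUND_MASSES = {18: "H2O", 28: "N2", 32: "O2", 40: "Ar"}

def check_backgrounds(readings_mv, thresholds_mv):
    """Compare center-cup background readings (mV) against thresholds.

    readings_mv and thresholds_mv are dicts keyed by m/z.
    Returns a list of (m/z, species, reading) tuples that exceed
    their threshold; an empty list means all backgrounds are acceptable.
    """
    flagged = []
    for mz, species in BACKGROUND_MASSES.items():
        if readings_mv[mz] > thresholds_mv[mz]:
            flagged.append((mz, species, readings_mv[mz]))
    return flagged
```

A high m/z 28 and 32 reading together would point at an air leak, while a high m/z 40 alone would point at the helium supply, as discussed above.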
2) Peak centering the reference gas
After we have documented the background levels, we set the center cup to m/z 45 and insert the CO2 reference into the open split. A new peak center is done and the high voltage value is recorded.
3) Autofocusing the instrument
Special precautions need to be taken for hydrogen analysis; these are not addressed here, so check the "Hydrogen" section instead. With the reference gas on, we choose the autofocus option and let Isodat do the work. We did some initial studies to see how the beam intensity and isotope ratio vary with the tuning parameters; this should be done for any new gas on any instrument in order to characterize its behavior. After the autofocus is done, the new parameters are saved with the "add to gas configuration" option.
4) Stability checks
Instrument stability (and precision) is checked by running a series of reference gas pulses and measuring the precision of the result. The manual refers to this as the "zero enrichment" test. We generally define the second reference gas peak as the standard in the method file and set its isotope ratios to zero. We then check the standard deviation of a series of 10 reference gas pulses. For a reason that we really don't understand, the first run usually comes out worse than the others, so we perform this check two to three times in sequence. For δ13C, δ18O (from CO2), and δ15N we generally get standard deviations below 0.05‰. It is important that the standard deviation here be low, as it represents the best precision that can be obtained with the instrument. If poor precision occurs here, it is necessary to figure out why the precision has degraded before proceeding with any sample analyses.
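The zero enrichment acceptance test reduces to a standard deviation over the pulse deltas. A minimal sketch, using made-up delta values for illustration (the 0.05‰ limit is the one quoted above):

```python
import statistics

def zero_enrichment_sd(deltas_permil):
    """Sample standard deviation (per mil) of a series of
    reference-gas pulse deltas measured against the peak
    that was defined as the zero standard."""
    return statistics.stdev(deltas_permil)

# Ten illustrative pulse deltas (per mil); these are invented
# numbers, not real instrument data.
pulses = [0.012, -0.020, 0.005, 0.031, -0.015,
          0.008, -0.003, 0.022, -0.011, 0.001]
assert zero_enrichment_sd(pulses) < 0.05  # passes the stability check
```

If this check fails, the source of the instability has to be found before any samples are run, as noted above.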
5) Linearity checks
Another issue with mass spectrometers is linearity. Linearity refers to the reproducibility of a result with different amounts of sample or signal. One way to test this is by performing a series of 10 reference gas pulses, but manually increasing the reference gas pressure at the end of each pulse. The result is a series of reference gas peaks stepping up in intensity. The spec for both of our mass spectrometers is that the isotope ratio should not change by more than 0.06‰/V. That is, if one reference gas peak is 1 volt higher in intensity than another, its measured isotope ratio should not differ by more than 0.06‰ from the first peak. This is monitored by looking at the slope of the regression line for the series of gas pulses. For any continuous flow device, the best way to perform a linearity check is to vary the amount of a real standard that is analyzed and observe the effect of sample size on the measured isotope ratio. This most closely reflects the real run conditions, and it also allows an appropriate linearity correction to be made to the data.
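The regression slope used to judge linearity can be sketched as an ordinary least-squares fit of measured delta against peak intensity. The 0.06‰/V limit is the spec quoted above; the data in the test are illustrative, not real measurements.

```python
def linearity_slope(intensities_v, deltas_permil):
    """Least-squares slope of measured delta (per mil) versus
    peak intensity (V) for a series of stepped reference pulses.
    A magnitude above the 0.06 per-mil/V spec indicates a
    non-linearity that needs attention."""
    n = len(intensities_v)
    mean_x = sum(intensities_v) / n
    mean_y = sum(deltas_permil) / n
    num = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(intensities_v, deltas_permil))
    den = sum((x - mean_x) ** 2 for x in intensities_v)
    return num / den
```

In practice the same calculation applies whether the x-axis is reference-gas peak height or the amount of a real standard, which is why the stepped-standard version of the test doubles as the basis for a linearity correction.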
δ13C, δ18O and δ15N: All of the above checks are performed for each reference gas that will be used.
δ2H: All of the above checks are performed for hydrogen. In addition, the H3+ factor must be determined for hydrogen; this is effectively a linearity correction as well.
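The H3+ correction can be sketched as follows. H3+ forms in the source in proportion to the square of the H2+ beam, so the measured m/z 3 signal is the HD+ signal plus a term K·(i2)², where K is the H3+ factor; K is found by measuring the i3/i2 ratio at several reference-gas pressures, since that ratio is linear in i2 with slope K. The numerical values below are illustrative only, not our instrument's actual H3 factor.

```python
def correct_h3(i3_measured, i2, k):
    """Remove the H3+ contribution from the m/z 3 signal.

    i3_measured : measured m/z 3 intensity (HD+ plus H3+)
    i2          : measured m/z 2 intensity (H2+)
    k           : H3+ factor, determined from the slope of
                  i3/i2 versus i2 at several gas pressures
    Returns the H3+-corrected m/z 3 intensity.
    """
    return i3_measured - k * i2 ** 2
```

With an illustrative factor k = 2e-6 and i2 = 1000, the H3+ term removed is 2.0 in the same intensity units as i3.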