Optical power meter detector characteristics
Optical power meters can be usefully characterized by the type of detector used:
InGaAs (Indium Gallium Arsenide)
InGaAs detectors provide excellent performance in the 1000 to 1650 nm window; however, their accuracy in the 850 nm window is often very poor, due to very high wavelength sensitivity in that region. They are therefore the preferred detectors for work on high performance single mode systems, and for accurate power measurements on WDM systems beyond 1550 nm, InGaAs detectors are required.
Ge (Germanium)
Ge detectors provide inexpensive, modest performance over the 850 to 1550 nm bands, but above 1550 nm they are effectively restricted to relative (loss) measurements only. If used on WDM systems at wavelengths above 1550 nm, expect absolute errors of a few dB. Absolute accuracy can also degrade within the 1550 nm band itself, eg at 1580 nm in cold conditions.
They are a good solution for installers who do a mixture of multimode and single mode work at 850 and 1300 nm, with occasional jobs at 1550 nm. They can also measure the 630 - 670 nm red light used in plastic fibers or by visible fault locators.
Si (Silicon)
Si detectors provide excellent accuracy at 850 nm and visible wavelengths. They are therefore used on 850 nm multimode & PCS systems, and 660 nm POF systems.
However, note that many cable test standards require loss testing at 1300 nm on multimode systems, in which case a silicon detector is not adequate.
High Power detectors consist of one of the above with an attenuating filter placed in front of the detector. Different attenuator elements vary in their wavelength sensitivity, coherence sensitivity, polarization sensitivity and reflectance, all of which can badly affect meter accuracy. The attenuators used on Kingfisher meters have been carefully optimized to provide nearly ideal performance for this application.
Alternatively, in a laboratory the best way to attenuate high power measurements is to use an integrating sphere, which provides high accuracy.
Semiconductor optical detectors convert one photon of light energy into one electron of electrical current, with an efficiency that is wavelength dependent. Fig 9.1 shows the response curves for three detectors, including "traditional" InGaAs showing the poor characteristics at 850 nm.
The energy in each photon of light is determined by the light wavelength, and must be at least as big as the electron band gap of the detector material. At the wavelength where the photon energy exactly matches the band gap energy, peak responsivity is achieved. If the energy is too low, no current flows. If there is excess energy (eg shorter wavelengths), then the excess energy is lost as heat.
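The photon-energy relationship described above can be sketched numerically. This is an illustrative example, not from the original text; the function names and the assumed InGaAs band gap of roughly 0.75 eV (cutoff near 1650 nm) are the author's own illustrative choices.

```python
import math  # not strictly needed, kept for clarity of intent

H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electron-volt

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of one photon at the given wavelength: E = h*c / lambda, in eV."""
    return H * C / (wavelength_nm * 1e-9) / EV

def detects(wavelength_nm: float, band_gap_ev: float) -> bool:
    """A photon can generate a carrier only if its energy reaches the band gap."""
    return photon_energy_ev(wavelength_nm) >= band_gap_ev

BAND_GAP_INGAAS = 0.75  # assumed value, eV (cutoff near 1650 nm)

print(round(photon_energy_ev(1550), 3))     # ~0.8 eV
print(detects(1550, BAND_GAP_INGAAS))       # True: enough energy
print(detects(1800, BAND_GAP_INGAAS))       # False: photon energy below band gap
```

At 850 nm the photon carries about 1.46 eV, nearly twice the band gap; the excess appears as heat, consistent with the reduced responsivity described above.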
The detector response to wavelengths longer than the peak efficiency is very temperature sensitive. So for precision work, it is not advisable to use a detector in this region.
An important part of these detectors is an anti-reflection coating on the detector surface, without which the detector material would reflect most of the light before it ever reached the active region. The detector package window should also be made of optically flat glass with an anti-reflective coating; many cheaper detectors fail in this area. The performance of both the package window and the detector coating can significantly affect overall detector performance.
The responsivity, or calibration, of a particular detector at a particular wavelength is usually defined in Amps per Watt (A/W).
Correct calibration is important when measuring absolute power levels such as transmitter output and receiver input power. However attenuation measurements are always relative power readings, in which case only linearity and repeatability are important.
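The distinction above can be made concrete with a short sketch: absolute power follows from photocurrent divided by responsivity, while a loss (relative) measurement subtracts two such readings, so the responsivity term cancels. The 0.95 A/W figure is an assumed, illustrative responsivity, not a value from the text.

```python
import math

def power_dbm(current_a: float, responsivity_a_per_w: float) -> float:
    """Absolute power: P = I / R, expressed in dBm (relative to 1 mW)."""
    power_w = current_a / responsivity_a_per_w
    return 10 * math.log10(power_w / 1e-3)

R = 0.95  # assumed InGaAs responsivity near 1550 nm, A/W

p_ref  = power_dbm(95e-6, R)    # reference reading (no device under test)
p_test = power_dbm(47.5e-6, R)  # reading through the device under test
loss_db = p_ref - p_test        # R cancels in the subtraction

print(round(p_ref, 2))    # -10.0 dBm (absolute: depends on R being correct)
print(round(loss_db, 2))  # 3.01 dB (relative: only linearity matters)
```

Note that an error in R shifts both absolute readings equally but leaves the loss figure untouched, which is why only linearity and repeatability matter for attenuation measurements.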
Calibration significantly affects accuracy. A power meter detector requires calibration close to the wavelengths at which it will be used, and the calibration should be properly traceable to a national standard. ‘Calibration uncertainty’ refers to the uncertainty under ideal conditions, eg with a particular fiber and connector type, wavelength, power level, and at a laboratory temperature.
Most electrical multimeters quote ‘basic accuracy’, which includes all calibration, linearity, ageing and temperature effects, and so gives a good idea of what a meter will achieve in real use. However with optical meters, quoted "accuracy" is often just "calibration uncertainty", so it gives no clue as to expected performance under real conditions. This is probably a hang-over from the days when these were really scientific instruments, but it is most unhelpful to current users.
So to find out what an instrument will achieve in field conditions, you may need to assess the effects of changing fiber types, connectors, power levels, wavelengths (eg within a band) and temperature, and whether any additional "zeroing" procedures are required to achieve this performance. This assessment is often hard to make, since specifications are left deliberately vague.
Kingfisher quotes a "Total Uncertainty" specification for power meters, which is equivalent to a "basic accuracy" specification, since it includes all of these effects.
Most national standards authorities suggest a standard re-calibration period of 1 year for electronic instruments. However most optical meter specifications are so imprecise compared to those of electronic instruments that this interval is generally excessive, and in any case calibration does not adjust linearity or other offsets that users might find worrying. The almost universal reason we find for meters going out of calibration during use is dirt or physical detector damage, since the optical detector is relatively delicate and easily damaged.
Fig 9.2: Effect of temperature on spectral response, and therefore accuracy, for an InGaAs detector.
However, users may still be advised to perform a low cost verification check more than once a year, prompted by a simple question: what are the cost and likelihood of an inaccurate instrument? This suggests that the shrewd user could usefully implement a process of inexpensive regular checks, backed by occasional formal re-calibration.
Calibration accuracy can be upset by changing connector type or connector ferrule materials. On some types of instrument, we have observed errors of 1 dB due to this effect. It is not easy for the typical user to analyze this rather worrying issue. We can say that Kingfisher meters are designed to minimize this effect, with errors typically within 0.02 dB.
At high power levels, localized detector saturation occurs, which makes the measurement increasingly non-linear. The level at which this happens depends on the beam profile or geometry. Therefore saturation effects can vary on a case by case basis, typically somewhere above 0 dBm. Instrument specifications tend to be vague, hopeful, or optimistic on the saturation effect. This has only started to become an issue recently, with the introduction of high powered optical amplifiers and some high power linear transmission systems. Look very carefully at accuracy specifications at the required power levels. If in doubt, check with a precision attenuator placed before the meter.
At low power levels, detector amplifier drift dominates, and the detected power can become highly non-linear. This effect can be highly temperature sensitive, to the extent that accuracy is only guaranteed within ±1 °C of the temperature at which "zero" compensation was performed: not much use on a field instrument! Many instruments require the user to apply some sort of "zero" compensation to meet accuracy specifications. This is often not done, or done incorrectly, making measurements suspect. The whole subject of "zeroing" is avoided by many suppliers, who only refer to it halfway through the operating manual. When specifying an instrument, be very careful when interpreting specifications.
In relation to the dynamic range of an instrument, look for the achieved accuracy over the stated range. It is very easy to make an instrument display a number over a wide dynamic range, but quite another to achieve a specified accuracy over the same dynamic range. Kingfisher instruments are fully specified over their entire dynamic range.
Wavelength sensitivity needs to be checked for your application. As a cautionary example, an InGaAs sensor may have a specified accuracy of ±0.2 dB at 850 nm, yet when measuring across the 850 nm band, eg ±30 nm, the achievable accuracy is no better than ±1.5 dB. In contrast, for WDM applications from 1480 - 1625 nm, an InGaAs sensor delivers good accuracy even if it is only calibrated at 1550 nm: the typical error with an instrument set for 1550 nm but used at 1625 nm is only 0.3 dB.
Germanium has a number of specific weaknesses, which limit its application to low precision work:
- Linearity is only about ±0.04 dB across the measuring range.
- Surface uniformity is relatively poor, at ±0.09 dB: if a light spot is moved across the detector surface, the reading will vary by this much.
- Calibration drifts with temperature, typically ±0.1 dB at 1300 nm.
- Responsivity is affected by highly coherent light from DFB lasers, leading to errors of up to 0.5 dB.
- Sensitivity is limited.
Because of these combined effects, InGaAs provides measurement accuracy at 1300 and 1550 nm which is markedly better than Ge, even if the instrument specifications do not show it.
How much measurement resolution do I need?
Measurement instruments are available with resolutions of 0.1 to 0.001 dB. It may occasionally be possible to make use of 0.001 dB resolution in carefully controlled laboratory conditions, but there is very little use for such resolution in this application.
0.1 dB resolution has a major drawback: it cannot be used to reliably measure the performance of high quality connectors or splices, since the measurement uncertainty involved exceeds ±0.14 dB (±1 digit on each of 2 measurements, combined). This is purely due to display limitations, and assumes otherwise perfect performance.
It therefore becomes apparent that 0.01 dB ( 0.23% ) resolution is ideal for most work. It is for this reason that Kingfisher instruments generally provide a resolution of 0.01 dB.
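The arithmetic behind these resolution figures can be sketched briefly. A loss measurement combines two readings (reference and measurement), each carrying a ±1 digit quantization uncertainty; combining the two in quadrature gives the ±0.14 dB figure quoted above. The function name here is illustrative.

```python
import math

def loss_uncertainty_db(resolution_db: float, readings: int = 2) -> float:
    """Quantization uncertainty of a loss value built from several readings,
    each uncertain by +/- 1 display digit, combined in quadrature."""
    return math.sqrt(readings) * resolution_db

print(round(loss_uncertainty_db(0.1), 2))   # 0.14 dB: too coarse for good splices
print(round(loss_uncertainty_db(0.01), 3))  # 0.014 dB: adequate for most work

# 0.01 dB expressed as a linear power ratio:
print(round((10 ** (0.01 / 10) - 1) * 100, 2))  # ~0.23 %
```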
Instrument battery life is obviously a major convenience issue. Kingfisher meters have a battery life in excess of 170 hours. Also important are an auto turn-off feature and a low battery indicator. Where possible, alkaline batteries are preferable to re-chargeable types, due to greater operational convenience and the problem of disposing of toxic cadmium waste.
Connector: An interchangeable connector is obviously highly desirable, allowing the user to change connector styles. An interchangeable connector also allows better cleaning during use.
Speed of response: Many meters seem to have an excessively slow response speed, or spend their time auto-ranging. Always try this out when specifying new meters.
A proper reference function is a great productivity feature, since attenuation measurements are made relative to a reference. It should allow a separate reference for each wavelength, and recall the previous reference when the unit is turned back on.
Ease of use: how long will it take to train operators? Kingfisher meters with similar general features and controls are available over the full spectrum of price and performance. A single method of operating all meters could save a large organization plenty on training costs.
It is handy if your meter can also perform some sort of tone identification. This is useful for continuity checking, or locating breaks.
Simplicity of use is very important for a good level of measurement confidence.
It is handy if re-calibration can be performed without opening the instrument, and without adjusting potentiometers.
When examining a meter, check to see that it is reasonably easy to clean the input. Some instruments are seriously deficient in this area.
There is some confusion about the requirement to display power in both logarithmic (dB, dBm) and linear (mW, uW, nW, pW) units. Traditionally, multi-purpose meters for general scientific applications can display either type of unit. However in this particular application it is usual to use dB or dBm units.
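For readers unfamiliar with the two unit systems, the conversion is straightforward; this short sketch (illustrative function names) shows the standard dBm/mW relationship.

```python
import math

def mw_to_dbm(p_mw: float) -> float:
    """Linear power in mW to logarithmic dBm (0 dBm = 1 mW)."""
    return 10 * math.log10(p_mw)

def dbm_to_mw(p_dbm: float) -> float:
    """Logarithmic dBm back to linear mW."""
    return 10 ** (p_dbm / 10)

print(mw_to_dbm(1.0))            # 0.0 dBm
print(round(mw_to_dbm(0.5), 2))  # -3.01 dBm (half power is about -3 dB)
print(dbm_to_mw(-30))            # 0.001 mW, ie 1 uW
```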
There are a couple of ways of achieving productivity gains in dual wavelength measurement situations:
- An "Autotest" type feature can display both wavelengths simultaneously: the source and meter communicate, and the meter measures each wavelength alternately. This requires both instruments to have compatible protocols.
- A different arrangement can be used where the source power levels can be adjusted. The meter is referenced to the source and the output power of one wavelength adjusted relative to the other, so that the meter reads 0 dB at both wavelengths, without changing meter wavelengths. Attenuation at both wavelengths can now be measured, without adjusting the meter.
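The second arrangement can be sketched as simple arithmetic: the meter is referenced once, the source's second-wavelength output is trimmed until the referenced reading is also 0 dB, and thereafter any inserted device reads its loss directly at either wavelength. This is an illustrative sketch with assumed power levels, not a description of any particular instrument's procedure.

```python
def referenced_reading(measured_dbm: float, reference_dbm: float) -> float:
    """A referenced meter displays measured power minus the stored reference."""
    return measured_dbm - reference_dbm

ref = -3.0  # dBm: meter referenced directly to the source at wavelength 1

# Source output at wavelength 2 is trimmed until it also delivers -3.0 dBm,
# so the meter shows 0 dB at both wavelengths without changing its settings:
print(referenced_reading(-3.0, ref))   # 0.0 dB

# Inserting a device under test with 0.45 dB of loss now reads directly:
print(referenced_reading(-3.45, ref))  # -0.45 dB, at either wavelength
```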
Kingfisher International are specialist designers and manufacturers of handheld fiber optic test equipment.
Our equipment is used in all phases of fiber optic manufacture, installation and maintenance. Our comprehensive documentation and resources help you get started easily.
Kingfisher Products are Australian made, with a global distribution & support network spanning over 70 countries.