This article builds on our understanding of flame detectors developed in issue 761 of this magazine by discussing the need for a method by which “real world” detector performance can be compared.

At present, flame detector performance is usually measured by exposing the unit to a standard test fire (usually a 0.09 m² n-heptane fire) in ideal conditions and recording the maximum distance at which it responds, for a given sensitivity setting. The ability of a detector to reject false alarm sources is also usually tested; what is not universally tested is the detector's response to a standard fire in the presence of various false alarm stimuli.

We previously learned1 that all flame detectors have strengths and limitations. How, then, do we obtain an impartial assessment of the relative performance of one manufacturer and/or technology over another?

One such way is for manufacturers to engage Factory Mutual to test their detector against the requirements of the FM3260 standard test procedure2. This allows a review of each detector to be published (if the manufacturer chooses to release this information) and presents the fire and gas (F&G) system designer with an independent data set from which the available flame detectors may be compared.

FM3260 is a challenging test procedure that allows manufacturers to promote a device that is robust enough to perform well in the tests. It also allows operators to review the results, if the manufacturer is prepared to release the details. The FM3260 report includes the following fire tests, which are the most relevant for comparing relative responses:

  • Section 4.1 Baseline Sensitivity Test
  • Section 4.2 Flame Response Sensitivity Test
  • Section 4.3 False Stimuli Response Test
  • Section 4.4 Field of View

When discussing flame detection performance, many manufacturers quote maximum headline sensitivity and the ability of their product to reject false alarms. Few, however, discuss device desensitisation: the situation in which external factors reduce the baseline sensitivity to a level where the product may no longer respond in an acceptable way. Some might suggest that knowledge of device desensitisation is as important as positive fire detection and the reduction of false alarms.

The table over the page gives an indication of the response change for a typical triple IR detector under various test conditions; note that the flame detector sensitivity and the fire type tested are unchanged.

FM3260 has had a positive impact on the flame detection industry from the perspective of the purchaser and the F&G designer. This information allows designers to accurately map the capabilities of detectors in the environments in which they are to be placed. On oil and gas assets, these environments will most of the time contain the stimuli present in this test procedure.

Developing an effective viewing distance3

As discussed previously1, flame detectors can be desensitised by sunlight, optical contamination and modulated blackbodies, to name but a few. We therefore need a way to use the data from FM3260, along with other factors, to provide a fair comparison between the coverage of different detectors. This is often referred to as the “effective viewing distance.”

To perform the comparison we need a common starting point; in this instance we use the “industry standard” fire of a 0.09 m² n-heptane fire, in ideal conditions. Other factors that have an impact on detector sensitivity are:

Figure 2: Triple IR.

1) Reduction in sensitivity to genuine flame in the presence of unwanted / false alarm stimuli:

Values for this can be taken from the manufacturer or the detector manual, if supplied. Unwanted stimuli may include sunlight (direct, modulated, unmodulated, reflected), welding activities, blackbody radiation (modulated/unmodulated), fluorescent/incandescent lamps, shielded and unshielded quartz halogen lamps, etc. If the information is not readily available, this could be a sign of poor performance against such tests. For the purposes of design, however, it is not unreasonable to use values similar to those of other detectors utilising the same technology.

2) Reduction in sensitivity due to dirty optics:

As an optical fault may not be raised until potentially 50% degradation of the detector's view, the unit could be operating at as little as 51% of its stated viewing distance without indicating an optical fault. It is therefore critical that the design takes account of this potential reduction.

3) Reduction in sensitivity due to detector field of view:

This factor represents the reduction in sensitivity across the claimed field of view relative to the maximum, which is found along the centreline. As the fire moves away from the centreline, the sensitivity reduces. This value differs from device to device, as it depends on the detector's unique field of view; examples are given below for two different detection technologies.
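The three factors above can be combined into a single worst-case design figure by treating each as a fraction of baseline sensitivity retained and derating the ideal-conditions test-fire distance accordingly. The sketch below illustrates the idea; the baseline distance and the individual derating fractions are illustrative assumptions for a hypothetical detector, not published FM3260 results.

```python
# Illustrative effective-viewing-distance sketch. All numeric values are
# assumed for a hypothetical detector; real values must come from the
# manufacturer's data or the FM3260 report.

def effective_viewing_distance(baseline_m, false_alarm_factor,
                               optics_factor, fov_factor):
    """Derate the ideal-conditions test-fire distance by each factor.

    Each factor is the fraction of baseline sensitivity retained
    (1.0 = no reduction) under that condition.
    """
    return baseline_m * false_alarm_factor * optics_factor * fov_factor

# Assumed figures:
#   44 m baseline to a 0.09 m^2 n-heptane fire in ideal conditions,
#   0.80 retained in the presence of a false alarm stimulus,
#   0.51 retained just below the 50% dirty-optics fault threshold,
#   0.70 retained at the edge of the claimed field of view.
distance = effective_viewing_distance(44.0, 0.80, 0.51, 0.70)
print(round(distance, 1))  # → 12.6 (worst-case design distance in metres)
```

The point of the multiplication is that the reductions compound: a detector quoted at 44 m could, under this set of assumed conditions, only be relied upon to roughly 12–13 m, which is the figure the F&G layout should be designed around.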


We have previously seen1 that when recommending a flame detector it is essential that the spectral characteristics of the flame are matched to the detection technology. In addition, the environment in which the unit is to be used must be understood with regard to interferents (smoke, oil films, particulates, water) that will reduce detector sensitivity, and to false alarm sources (sun, flare, exhausts, hot process). In simple terms, we need a detector that responds to the fire (positive flame detection) while reducing the potential for false alarms and recognising that some factors may impede device performance (understanding desensitisation).

We have added to our knowledge base a means of comparing detector sensitivity that takes into account factors such as false alarm stimuli, dirty optics and the detector field of view.

For further information, go to