Flame Tracker Module Calibration FSM-1002-001 on GE Mark V

This flame scanner module calibration question concerns a GE Mark V controlling a 7EA gas turbine.

We had a secondary flame scanner "crap" out on us, so I replaced it with an ITS flame scanner that followed GE spec. About two months after that install, we started getting nuisance diagnostic alarms on the Alarm Scanner: D_1748_R (S and T), "TCE1 Flame Detector #5 out of limits". They came in at zero speed, while the unit wasn't running. I did some digging and found that if the scanner picks up an intensity of 2 Hz or more at zero speed, it will send this alarm.

Troubleshooting Methods
  1. Hooked a calibrator in series to read the loop current: it read 3.7 mA, and the signal at the Mark V read 12 Hz. This was with the flame scanner removed from the can and the lens capped.
    1. According to the manual, a signal under 4.25 mA should read ZERO.
  2. I sourced the flame scanner module with a calibrator, and to get 12 Hz I had to source 4.0455 mA. Kind of odd!
  3. To rule out a cabling issue, I plugged can 4S's cable into 3S and it worked without any issues. The same scanner read 3.54 mA on that channel, which proved the cabling was fine.
  4. I looked up the ITS flame scanner data sheet and noticed its sensitivity was > 4 mA at 310 nm, while the GE flame scanner's was > 5 mA at 310 nm. It is possible the ITS flame scanner is just a bit more sensitive, causing it to send a signal to the Mark V. What is odd is that when I took the ITS scanner and plugged it into 3S, I never got a diagnostic alarm. This had me intrigued.
    1. This had me confused and led me to believe that the scanner would NOT pick up or generate an output signal below 5 mA; that is not true.

I decided to open up the flame scanner module and noticed each input channel has a zero and a span POT (see picture below). Looking in the I/O configuration utility for the Mark V, I "assume" the range is 0 - 255 pulses, or 0 - 4080 Hz: the Mark V counts the pulses generated by the flame scanner over 1/16 of a second, so 255 * 16 = 4080. That gives 4 mA = 0 Hz, 5 mA = 256 Hz (per the manual), and 20 mA = 4080 Hz. If you are wondering, the flame detection constant in our system is 16. Which, again, would lead you to believe that the Mark V would "block" any signal lower than 5 mA. Not the case.
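If it helps anyone, the pulse-counting math above can be sketched in a few lines. This is my own hedged sketch: the function name and the strictly linear 4-20 mA scaling are my assumptions, not documented Mark V behavior.

```python
# Sketch of the Mark V pulse-counting math described above.
# The linear 4 mA -> 0 Hz, 20 mA -> 4080 Hz scaling is an assumption.

PULSES_MAX = 255        # 8-bit pulse count per sample window
WINDOWS_PER_SEC = 16    # pulses are counted over 1/16 of a second
HZ_MAX = PULSES_MAX * WINDOWS_PER_SEC  # 255 * 16 = 4080 Hz

def expected_hz(ma: float) -> float:
    """Ideal frequency for a given loop current, clamped to zero at 4 mA."""
    if ma <= 4.0:
        return 0.0
    return (ma - 4.0) / (20.0 - 4.0) * HZ_MAX

print(HZ_MAX)            # 4080
print(expected_hz(5.0))  # 255.0 -- one count shy of the manual's 256 Hz
print(expected_hz(20.0)) # 4080.0
```

Note that a strictly linear scale gives 255 Hz at 5 mA, one count off from the manual's 256 Hz, so the real scaling may differ slightly from this sketch.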

FSM-1002-002 REV A.jpg

Since I did not want to remove the module from service, or risk breaking the unit, we had a spare module that I was able to perform some tests on. Using a calibrator, I sourced the module.

4 mA = 58 Hz
5 mA = 302 Hz
20 mA = 3854 Hz
21 mA = 4084 Hz

I found it rather odd that 5 mA gave 302 Hz; I would have thought it would be 256 Hz. Anything under 3.873 mA was 0 Hz. Again, shouldn't anything under 4.25 mA be zero? Am I missing something?
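For what it's worth, here is a quick sketch comparing those bench readings against an ideal linear 4 mA -> 0 Hz, 20 mA -> 4080 Hz line. The linearity, and the helper name, are my assumptions.

```python
# Compare the spare-module bench readings above against an ideal
# linear 4 mA -> 0 Hz, 20 mA -> 4080 Hz line (linearity assumed).

readings_hz = {4.0: 58, 5.0: 302, 20.0: 3854, 21.0: 4084}

def ideal_hz(ma: float) -> float:
    """Ideal frequency on the assumed linear scale, clamped at 4 mA."""
    return max(0.0, (ma - 4.0) / 16.0 * 4080)

for ma, measured in readings_hz.items():
    offset = measured - ideal_hz(ma)
    print(f"{ma:4.1f} mA: measured {measured:4d} Hz, "
          f"ideal {ideal_hz(ma):6.1f} Hz, offset {offset:+6.1f} Hz")
```

On these numbers the low end reads roughly 50-60 Hz high while the high end reads low, which looks like a zero offset plus a span error rather than a single adjustment being off.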

The manual mentions NOTHING about these POTs in the module, so I am curious whether anyone has any experience calibrating the channels.

For now, I put the ITS flame scanner in 3S and moved the scanner that was in 3S (a GE one) to 4S. That solved the problem but did not satisfy my curiosity.
 
I can't help you, but I want to compliment you on the detail in your post, describing what the problem is/was and what you've done to troubleshoot it and the thorough reasoning involved. It was an enjoyable read. Your level of detail occurs, maybe, once every two years on a forum, so it stands out. I'll be following to see what response you might get.
 
RJSolo,

First, I second David_2's commendation about the thoroughness of your post, including what was done in trying to understand the problem and what the results were. We rarely see that on Mark* questions, so, thank you.

The Mark V was designed to use Geiger-Mueller UV flame detectors and there were dedicated inputs on the <P> core for them. The faster processing speeds afforded by the TCEA cards meant any flame intensity signal had to be connected to the Mark V <P> core. THEN, GE purchased Reuter-Stokes which had a "better" flame detector (and since GE owned the company they could make even more profit on the sale of every flame detector!), BUT because the Mark V couldn't handle the 4-20 mA Reuter-Stokes flame detector inputs an interface module had to be developed. Hence, the Flame Trakker Interface Module was developed to allow for the use of the 4-20 mA Flame Trakkers to send a frequency to the <P> core flame intensity inputs for flame detection.

Second, very early Flame Trakker Interface Module pots were covered with a small dab of caulking, which always meant to me, "Do not touch." It's been a while since I've had to open one of the interface modules (though nearly EVERY Mark V site I visit these days with the interface modules has the covers removed), and I haven't noticed whether the dabs of caulking are still used.

I don't know how much help you would get by contacting Reuter-Stokes directly, or if they even have a technical support group. But, I seem to sense that you are of the opinion that the module output is 4-20 mA--and it's not; it's a frequency that's "proportional" to the 4-20 mA signal from the Flame Trakker. And, there's always the chance my perception is incorrect. In the absence of any manufacturer's documentation, I would contact R-S directly and see if you can find someone who's got a few minutes to help answer your questions.

Please write back to let us know how this turns out! And, kudos for your original post!!!
 
I just realized how terrible my grammar can be at 5:30 am. Hah!

I appreciate the feedback. I contacted Reuter-Stokes, so I will wait to hear an answer. I understand, from a business standpoint, keeping information "secret"; however, it really is a doozy when GE documentation lacks critical information that could help in troubleshooting.

CSA, there is still a dab of red caulk on these POTs. It's like when someone says "don't press the button" and you want to press the button! That is how I feel when POTs are caulked.

Curious_One, I appreciate the attachment. This documentation varies slightly from what I have on file. While it didn't help much more, I will add it to my files. I will also attach a TIL, TIL2028, which helps slightly but not in my case.
 


I talked with Reuter Stokes and here is the summary from them:

  1. The minimum flame detection IO configuration setting of "16", as mentioned in TIL2028, is correct. It is independent of the Mark V diagnostic alarm D_1748_R S T.
  2. Factory acceptance for the flame sensor module is to source 4 mA, which should equal ZERO. They then source 5 mA, which should read around 256 Hz +/- 6 Hz. After that, who really cares (I'm sure to a point they care). Essentially we are just trying to detect flame, which registers at or around the 310 nm UV wavelength. In GE's case, that is detected at 5 mA.
  3. They do have a one year warranty on their modules, however, in my case, mine is 20 years old... I am pretty sure my backup module is 20 years old too.
  4. He informed me that there is no recommendation, or even a procedure, for adjusting the POTS.
    1. Reading between the lines (and in my opinion), they wouldn't release it even if they have it. I am sure this has to do with liability and insurance issues. If sites start messing around with flame detection and the procedure was incorrect, it could be very bad. Since the POTs are caulked/painted, they would know if they had been adjusted.
  5. If a tech were to be sent out to look at my modules, no repairs would be done. It sounds like an upgrade, newer style detector/module combo, would be installed in its place.
  6. Based on my testing, he said both my modules would have failed the factory test: since I detected 58 Hz at 4 mA, it would have failed.
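The factory acceptance criteria in items 2 and 6 boil down to a simple pass/fail check. A hedged sketch (the helper name and the exact-zero comparison at 4 mA are mine):

```python
# Sketch of the factory acceptance test as summarized above:
# 4 mA must read zero, and 5 mA must read 256 Hz +/- 6 Hz.

def passes_acceptance(hz_at_4ma: float, hz_at_5ma: float) -> bool:
    """True only if both factory acceptance points are within tolerance."""
    return hz_at_4ma == 0 and abs(hz_at_5ma - 256) <= 6

# The spare module in this thread: 58 Hz at 4 mA, 302 Hz at 5 mA.
print(passes_acceptance(58, 302))  # False -- fails, as Reuter-Stokes said
print(passes_acceptance(0, 254))   # True
```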


My overall conclusion on this issue:

Since the ITS flame detector electronics are set by design to report the UV wavelength at 4 mA (vs. the 5 mA GE design), in combination with an out-of-tolerance flame scanner module input channel, it "created" the nuisance diagnostic alarm. In no way did it affect turbine performance or flame detection when it was needed. By moving the ITS flame scanner to another can (another input channel), I think I just got lucky and that channel had a little more deadband, thus NOT causing a diagnostic alarm. My advice: don't be cheap, stick with a replacement scanner from GE Reuter-Stokes.

IN NO WAY am I advocating this; however, if you have a NIST-certified, high-accuracy/resolution current source calibrator, you could probably source 4 mA, and if you get 58 Hz, dial the ZERO POT down (lower), then source 5 mA and check that you are around 256 Hz. If NOT, replace your module! You could also buy a calibration lamp in the UV wavelength and test your scanners.

Thanks
 
RJSolo,
I have a procedure for adjusting the pots that the commissioning GE tech left. I also have a drawing, but due to company policy, I cannot upload it here. Here's the procedure:
The offset adjustment potentiometers are R14 and R29. There are TWO EACH of these potentiometers. The R14 are for scanners 1 and 3, R29 are for scanners 2 and 4. These adjustments are very sensitive.

1. Apply 4.05 milliamps to the channel to be adjusted. The source, or plus connection, of the current source is connected to pin 2 of the input connector. The return, or negative connection, of the current source is connected to pin 6 of the output connector.
2. Adjust the appropriate potentiometer clockwise until the output frequency is zero. Rock the adjustment back and forth and set it as close to the cutoff point as possible.
3. Lock the potentiometer screw with nail polish.
When there is no light, the scanner output is 3.6 mA - 3.8 mA.

Hope you can find this useful.
 
I understand the company policy issue and I appreciate you taking the time to summarize it on here.

The other day, I decided to adjust the channel 1 offset POT on a flame scanner module NOT installed on the unit; it was a backup module. On channel 1 it must have been R29 (I didn't pay attention to the label), the bottom POT on channel 1, because when I sourced 4 mA to the input channel, I had to adjust that POT to the point where it would cut out, which was around 3.873 mA (10.2 Hz). Any lower, and I was unable to pick up a frequency on either the oscilloscope or the calibrator.

If anyone is curious about bench calibration wiring for the FS module, I posted it below. Disclaimer: always consult your manual for wiring schematics in case they differ.

FS Module Input:
Term 1: Loop Calibrator - (Used a Fluke 707 and Beamex MC6. If you use the MC6, use the + output)
Term 2: Loop Calibrator + (Used a Fluke 707 and Beamex MC6. If you use the MC6, use the - output)
Term 3: None

FS Module Output:
Term 1: Oscilloscope Input Channel (Used Fluke 199c)
Term 2: (Was not testing this channel)
Term 3: None
Term 4 & 5: Jumped together, Sourced 24V (DC Power Supply)
Term 6 & 7: Jumped together; common/neutral/return from the 24V supply. Connect the oscilloscope reference here as well.
Term 8: None


I am glad to see my "playing around" was close to, if not spot on with, the GE procedure. Again, thank you for posting this; big help!
 