I am looking for your help with this problem. I use an IR back-scatter sensor to measure suspended solids in sludge inline. However, I think the pressure in the line prevents it from getting a correct measurement: when I did the calibration I tried three different concentration samples (static) and could see a difference in the mA and/or voltage reading, but when I installed the sensor in the pipeline I could not see any difference in the reading, so I presume it is a pressure problem.
So, is there any solution, either a modification to the sensor's electronic board to handle the pressure, or a change to the pipeline to reduce the pressure (such as a bypass), or any other solution you would suggest?
Thank you very much; I am really looking forward to your advice.
It is definitely not pressure. Typically the issue is the optical window and its cleanliness.
How do the process grab samples read on the bench?
Thank you very much for your reply. I don't quite understand what you mean by how the process grab samples read on the bench.
When I put the head of the sensor in different static samples (in a small tank), I read different voltage/current values, which is good. But when I install it in the pipeline (100 mm diameter) and turn on the pump so circulation starts, I read only one value regardless of the sludge concentration.
That is why I thought it was a flow/pressure issue.
Also, could you please advise the best way to clean the sensor (how to avoid the cleaning problem)?
OK, I understand now: your static sample is from the process, and that answers the question.
You've described the measurement as an Infrared Back scatter measurement of particulates.
That is certainly done in gas streams, but if your device has been adapted to liquid slurries, it will have an optical window to protect the sensing element from the process.
If that is the case, a successful measurement will depend on proper installation and some provision for keeping the window clean.
Recognize that in liquid slurries you may be dealing with stratified flow, and the probe may see only the part of the flow with minimal particulates.
If cleaning is required, it would be called out in the operating manual.
Thank you very much. Yes, cleaning is required; I installed the probe horizontally as recommended and I also clean it, but without success so far.
The comment on liquid slurries/stratification is very important.
Additionally, have you considered entrained bubbles? Bubbles could make the measurement sensitive to pressure.
Thank you for your reply. I am at the calibration stage, so I put sludge of a known concentration (let's say 2%) into a tank. I got a reading, which is fine, but when I change the concentration by adding water to get a 1% sample, the reading does not change.
Yes, I considered bubbles, so I changed the installation position: it was originally vertical, which I was advised to change because bubbles mostly collect at the top of the pipe, so I rotated it to horizontal (180 degrees).
The sensor I am using measures sludge concentration up to 5% SS; it is an IR back-scattering sensor at 950 nm.
I installed the probe in a pipe (100 mm diameter), and I use a MyDAQ with LabVIEW for data acquisition. I put a 1 kOhm resistor between the signal wire and ground to measure the current: I knew this sensor's output is 0 to 5 mA, which is why I chose that resistor (the voltage across it gives the current directly, 1 V per 1 mA). When I turn on the pump to circulate the sludge, I read 0.65 mA with a sludge sample of 0.57% SS. I then added thicker sludge (1.87% SS), but I still read the same 0.65 mA from the sensor. Granted, 1.87% is still not a very high concentration. So I am really looking for help; I may be missing something.
I also installed a pressure gauge on the pipe, and it shows essentially nothing, as the pipe is open-ended.
(I can see a change in mA when I put an object in front of the lens.)
Thank you very much; I am looking forward to hearing from you.
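For reference, the shunt-resistor reading described in this post can be sketched in a few lines (a minimal sketch, assuming only the 1 kOhm resistor and 0 to 5 mA span stated above; the function name is just for illustration):

```python
R_SHUNT_OHMS = 1000.0  # 1 kOhm resistor between the signal wire and ground


def voltage_to_current_ma(v_measured: float) -> float:
    """Convert the voltage measured across the shunt resistor to sensor
    current in mA. By Ohm's law I = V / R; with R = 1 kOhm, 1 V across
    the resistor corresponds to 1 mA, so a 0-5 mA output spans 0-5 V.
    """
    return v_measured / R_SHUNT_OHMS * 1000.0


i_ma = voltage_to_current_ma(0.65)  # ~0.65 mA, the reading quoted above
```

One sanity check this wiring enables: an object in front of the lens changes the mA reading, so the DAQ side is working, which would point the flat in-pipe readings back at the probe and the process rather than the electronics.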
"sensor output is 0 to 5mA"
unusual signal span for industrial equipment. just scanning the various options I see, for example 4 to 20mA DC or 0 to 1V DC, etc., etc.
It appears, from an earlier thread, that this is not a commercial sensor.
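The difference between the spans mentioned above can be made concrete (a small sketch; the function names are mine). A standard 4-20 mA loop has a "live zero": 4 mA means 0% of span, so a reading well below 4 mA indicates a fault, whereas a 0-5 mA output cannot distinguish a true zero from a dead sensor:

```python
def span_fraction_4_20(ma: float) -> float:
    """Fraction of span for a standard 4-20 mA loop.

    The live zero at 4 mA means 0% of span; readings well below
    4 mA usually indicate a broken loop or a failed transmitter.
    """
    return (ma - 4.0) / 16.0


def span_fraction_0_5(ma: float) -> float:
    """Fraction of span for the 0-5 mA output described in this thread.

    Here 0 mA is a valid reading, so a dead sensor and a true zero
    look identical at the DAQ.
    """
    return ma / 5.0
```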
If I understand exactly what you mean: it is a commercial sensor, and it has been on the market for a long time (the IL55).
At a minimum, you need to calibrate your sensor with a stable light source, in air and in liquids, at the given wavelength, and determine the linearity of your transducer electronics.
Until that is done actual testing is virtually meaningless.
Optical attenuators (for your wavelength) are commonly used. You'll have to research that, as the product offerings have changed quite a bit over the last decade.
The purpose of this calibration is to verify the overall functionality of the optics and electronics. Calibrating for particle concentration is another matter entirely; for that you would need pre-calibration samples.
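The linearity check and the concentration calibration discussed above could be combined with a simple least-squares fit over pre-calibration samples (a sketch; the (mA, %SS) pairs below are invented placeholders, not real data):

```python
# Hypothetical pre-calibration samples: (sensor reading in mA, lab %SS).
# These numbers are placeholders; real pairs must come from the process.
samples = [(0.5, 0.57), (1.1, 1.20), (1.8, 1.90), (2.4, 2.60)]

n = len(samples)
sx = sum(x for x, _ in samples)
sy = sum(y for _, y in samples)
sxx = sum(x * x for x, _ in samples)
sxy = sum(x * y for x, y in samples)

# Ordinary least-squares line: y = slope * x + intercept.
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

# Coefficient of determination: r_squared near 1 means the
# optics/electronics respond linearly over the sampled range.
mean_y = sy / n
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in samples)
ss_tot = sum((y - mean_y) ** 2 for _, y in samples)
r_squared = 1.0 - ss_res / ss_tot


def ma_to_ss_percent(ma: float) -> float:
    """Map a sensor reading (mA) to suspended solids (%SS) via the fit."""
    return slope * ma + intercept
```

If r_squared comes out low on bench samples, that points at the transducer electronics; if the bench fit is good but the in-pipe reading stays flat, that points back at installation, window fouling, stratification, or bubbles.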