I am designing a piece of production automation equipment and trying to decide between 4-20 mA and 0-10 Vdc monitor/control signals. I come from a lab environment and have always felt that a 0-10 Vdc signal will inherently have better accuracy, resolution, and long-term stability than a 4-20 mA circuit. However, the consensus from the plant floor always seems to call for 4-20 mA circuits in order to minimize transmission errors. Resolution and accuracy are key to this project. I am looking for comments and/or information sources on this subject.
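To frame the trade-off concretely, here is a minimal sketch of one practical difference between the two signal types: the 4 mA "live zero" of a current loop lets the receiver distinguish a true 0% reading from an open wire or dead transmitter, whereas a 0 V reading is ambiguous. Function names and the 3.6 mA under-range fault threshold below are my own illustrative assumptions, not from any particular standard.

```python
def scale_current_loop(i_ma, lo=0.0, hi=100.0):
    """Map a 4-20 mA reading onto engineering units lo..hi.

    Because 0% of span is represented by 4 mA (a "live zero"),
    a reading near 0 mA can only mean a broken wire or a failed
    transmitter, so the fault is detectable at the receiver.
    """
    if i_ma < 3.6:  # assumed under-range fault threshold for illustration
        raise ValueError("loop fault: open circuit or dead transmitter")
    return lo + (hi - lo) * (i_ma - 4.0) / 16.0


def scale_voltage(v, lo=0.0, hi=100.0):
    """Map a 0-10 V reading onto engineering units lo..hi.

    Here 0 V is ambiguous: it could be a legitimate 0% signal
    or a broken wire, so wiring faults pass through undetected.
    """
    return lo + (hi - lo) * v / 10.0
```

For example, a mid-scale signal gives `scale_current_loop(12.0)` = 50.0 and `scale_voltage(5.0)` = 50.0, but only the current-loop version can flag a 0 mA open-circuit condition as a fault rather than reporting it as 0%.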