I have been reading the Modbus over serial line specification V1.0 (http://www.modbus.org) and have been left a little perplexed by the line biasing scheme.
In a normal RS485 network, when the A line is positive with respect to the B line the line is in an idle state (mark). The Modbus standard defines the A line as D0 and the B line as D1.
RS485 networks can suffer from false start bits between data packets, which can occur while the line is in a high-impedance state. There are three ways to ensure that this does not occur (or at least does not cause the loss of a packet):
1) Packets start with a preamble and/or a doorstep (an idle line period), which allows the main packet to be received even if the UART had been falsely triggered.
2) A rigidly respected handover period is defined to ensure that the master returns to driving the idle state within a time that is less than the minimum time for which the slave continues to drive the idle line, and vice versa. That way the line is never left undriven.
3) Bias resistors are used to hold the line in a known state whenever it is in a high-impedance state.
Modbus implements neither 1 nor 2, which is no bad thing, as the third method is by far the superior one: it has no bandwidth overhead. Unlike some standards such as DMX, which hold the line in BREAK between packets, Modbus expects an idle line state, and therefore I would expect the bias resistors to hold the line in the idle state, that is with A/D0 pulled up to 5V and B/D1 pulled down to 0V.
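To put some numbers on what idle-state biasing buys you, here is a minimal sketch of the potential-divider arithmetic. The 5V supply, 620 ohm bias resistors and two 120 ohm terminators are my assumed example values, not figures taken from the spec:

```c
#include <stdio.h>

/* Idle-state differential voltage on a biased RS485 line.
 * Assumed example values: 5 V supply, 620 ohm pull-up on one line,
 * 620 ohm pull-down on the other, two 120 ohm terminators. */
int main(void)
{
    const double vcc = 5.0;      /* supply voltage     */
    const double r_bias = 620.0; /* each bias resistor */
    const double r_term = (120.0 * 120.0) / (120.0 + 120.0); /* terminators in parallel */

    /* Divider: Vcc -> r_bias -> line -> r_bias -> GND,
     * with the paralleled terminators across the line. */
    double v_diff = vcc * r_term / (2.0 * r_bias + r_term);

    printf("Idle differential: %.0f mV\n", v_diff * 1000.0);
    return 0;
}
```

With those values the idle differential comes out at about 231 mV, only just above the 200 mV an RS485 receiver is required to resolve, which shows why the bias resistors cannot be made arbitrarily large.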
The specification, however, seems to recommend biasing to a break state.
The recommendation also states that if line polarization (biasing) is used then the maximum number of nodes must be reduced by 4. A common value for these bias resistors is 620 ohms. If we regard the supply as a short circuit, they represent a 1.2K load on the line, which would seem to be equivalent to 10 RS485 devices (a standard RS485 unit load is 12K). On the other hand, the standard recommends a 150 ohm terminating resistor, a valid value, but as we all know RS485 is frequently terminated with 120 ohm resistors. The 1.2K bias impedance in parallel with one terminating resistor is 133 ohms, so I would suggest, both from a theoretical viewpoint and from practical experience, that the exact value really is not that critical.
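To make the loading arithmetic explicit (nothing Modbus-specific here, just the resistor sums from the paragraph above):

```c
#include <stdio.h>

/* Parallel combination of two resistances. */
static double parallel(double r1, double r2)
{
    return (r1 * r2) / (r1 + r2);
}

int main(void)
{
    const double r_bias_total = 620.0 + 620.0; /* pull-up + pull-down in series across the supply (~1.2K) */
    const double unit_load = 12000.0;          /* one standard RS485 unit load */

    printf("Bias load: %.0f ohms = %.1f unit loads\n",
           r_bias_total, unit_load / r_bias_total);
    printf("Bias in parallel with a 150 ohm terminator: %.0f ohms\n",
           parallel(r_bias_total, 150.0));
    return 0;
}
```

This prints roughly 9.7 unit loads and 134 ohms, matching the round figures above and, to my mind, supporting the view that the exact terminator value is not critical.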
The standard also suggests that terminating resistors in series with a capacitor may be more appropriate for biased lines. Such a technique reduces power consumption in both biased and unbiased schemes. The only difference for biased networks is that it may allow higher values for the terminating resistors, say 10K, because the stop bit pulls the line into the idle state and thus only a minimal (leakage) current from the bias resistors is required to maintain the level. The specification, however, does not allow such techniques, as the maximum value for a bias resistor is 650 ohms. This in itself is slightly odd (though not critical): one school of thought indicates 680 ohms as an optimal value and hence uses 620 ohms as the nearest standard resistor value; the Modbus standard presumably takes the average of these two values!
Another interesting point about the bias resistor specification is that bias resistors are treated as items that may be required by a device, and the device's documentation must state whether it requires them. This is very odd: **any** RS485 Modbus device is susceptible to false start bits, and the question of whether to use bias resistors is a system one, not something dependent on the device manufacturer. Personally I would strongly recommend bias resistors in all Modbus implementations, yet the standard requires a manufacturer to state that their device must have them, which rather implies an inferior-spec device, so presumably nobody does insist on them. I feel this part of the specification should be reworded to recommend bias resistors on all networks, and to recommend that all master devices be able to provide this bias.
BTW, another issue I have with these specs is the requirement for even parity as the default on RTU devices. Although I appreciate that the original Modicon devices all used even parity, this was probably because the UARTs they used had parity generators/checkers built in. Not all UARTs have such facilities, including many popular microcontrollers of the sort that would be highly suitable for Modbus devices. Implementing parity in code can be a serious overhead in a simple device, and given that there is a 16-bit CRC, which is a more than adequate error detection technique for 256-byte packets (IP makes do with a 16-bit checksum, and hard disks have used 16-bit CRCs, for much larger packets), perhaps the even parity requirement should be downgraded to optional and odd parity, which I suspect is never used, eliminated completely.
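For reference, the CRC the argument leans on is cheap even on a small microcontroller. The Modbus RTU CRC-16 uses the reflected polynomial 0xA001 with an initial value of 0xFFFF; a minimal bitwise sketch (the spec's table-driven version trades 512 bytes of ROM for speed):

```c
#include <stdint.h>
#include <stddef.h>

/* Modbus RTU CRC-16: reflected polynomial 0xA001 (0x8005),
 * initial value 0xFFFF, no final XOR. */
uint16_t modbus_crc16(const uint8_t *data, size_t len)
{
    uint16_t crc = 0xFFFF;

    while (len--) {
        crc ^= *data++;                     /* fold in the next byte */
        for (int bit = 0; bit < 8; bit++) { /* shift out 8 bits      */
            if (crc & 0x0001)
                crc = (crc >> 1) ^ 0xA001;
            else
                crc >>= 1;
        }
    }
    return crc; /* appended to the RTU frame low byte first */
}
```

Software parity, by contrast, has to be computed per character inside the transmit and receive paths, which is where the overhead bites on a UART without hardware parity.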
The specification is also vague about higher baud rates. It suggests that baud rates increase by doubling the standard values (115K, 230K etc.). This makes little sense, as at higher baud rates most UARTs require a 'baud rate crystal' to get the correct clock, which in a microcontroller can be inconvenient: the UART clock is usually derived from the CPU clock, which may need other values to meet timing requirements etc. Given that above 115K one is unlikely to be using a standard serial port, it would make more sense to switch over to round-number baud rates at higher speeds, i.e. 250K, 500K, 1M etc.
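To illustrate the divisor problem, here is a sketch assuming a 16 MHz crystal and the common UART scheme baud = f_cpu / (16 * (divisor + 1)); real parts vary, but the rounding behaviour is the same:

```c
#include <stdio.h>

int main(void)
{
    const double f_cpu = 16000000.0; /* a round-number 16 MHz CPU crystal */
    const long rates[] = { 115200, 230400, 250000, 500000, 1000000 };

    for (int i = 0; i < 5; i++) {
        /* nearest integer divisor for this target rate */
        long div = (long)(f_cpu / (16.0 * rates[i]) - 1.0 + 0.5);
        double actual = f_cpu / (16.0 * (div + 1));
        printf("%7ld baud: divisor %2ld, actual %8.0f (%+.1f%% error)\n",
               rates[i], div, actual,
               100.0 * (actual - rates[i]) / rates[i]);
    }
    return 0;
}
```

From a 16 MHz clock, 115200 baud comes out 3.5% slow and 230400 is unreachable, while 250K, 500K and 1M divide exactly, which is precisely the case for round-number rates at higher speeds.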
Just my 2c; perhaps I have misunderstood something, but it would be interesting to hear what others think. ;-)