For data acquisition cards and also PLC analog input (AI) cards there are two separate specifications: accuracy and resolution. Let's take an example: an AI card with a 0-5 V input range, 16 bits of resolution, and 0.1% accuracy.

The resolution says that this card can distinguish voltage differences down to 5 V / 2^16 ≈ 76.3 µV. On the other hand, an accuracy of 0.1% (which is very usual for AI cards) means an absolute error of 0.1% × 5 V = 5 mV, so every value you read from the PLC should be regarded as ±5 mV.

So the question is: what is the purpose of such high resolution in this example, given that the accuracy will spoil the result anyway?
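To make the comparison concrete, here is a small sketch of the arithmetic behind the numbers above (the range, bit depth, and accuracy figure are the assumed values from the example, not from any particular card's datasheet):

```python
# Assumed example values: 0-5 V range, 16-bit ADC, 0.1% of full-scale accuracy.
full_scale_v = 5.0
bits = 16
accuracy_pct = 0.1  # percent of full scale

resolution_v = full_scale_v / 2**bits           # smallest distinguishable step
accuracy_v = full_scale_v * accuracy_pct / 100  # worst-case absolute error

print(f"Resolution: {resolution_v * 1e6:.1f} uV per count")   # ~76.3 uV
print(f"Accuracy:   +/-{accuracy_v * 1e3:.1f} mV")            # +/-5.0 mV
print(f"Error band spans ~{accuracy_v / resolution_v:.0f} counts")
```

The last line is the crux of the question: the ±5 mV uncertainty covers on the order of 66 ADC counts, so the lowest several bits appear to carry no absolute-accuracy information.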