As part of a study project, I want to mathematically calculate the output of a digital PID controller with an input scan time of 2 ms. The conditions I intend to set are as follows:
My PV scale is 0 - 100% (Corresponding to 4 - 20mA simulated Process input)
SP is 50%
I'll source exactly 51% current as the PV input to the PID and hold it there. I will connect a mA meter to the controller output.
I'll set Kp to 1 and Ti (integral time) to 10. I am ignoring Td (derivative time constant), assuming it contributes nothing for a constant error.
Given these facts, how do I mathematically calculate the output in mA beginning from time T = 0? Theoretically, at what intervals will the output update, and what should the new value of the output be at each update?
Thank you,
Nice
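For concreteness, here is a minimal sketch of the discrete PI calculation described above. Note the assumptions, which your controller's manual should confirm: Ti is in seconds (some controllers use minutes per repeat), the action is direct (error = PV − SP = +1%), the integral uses simple rectangular accumulation, and the 0–100% output maps linearly onto 4–20 mA:

```python
def simulate_pi(kp=1.0, ti=10.0, dt=0.002, sp=50.0, pv=51.0, n_steps=5):
    """Discrete PI controller: output updates once per scan (every dt seconds)."""
    integral = 0.0
    outputs_ma = []
    for _ in range(n_steps):
        error = pv - sp              # direct-acting: constant +1% here
        integral += error * dt       # accumulate error over each 2 ms scan
        out_pct = kp * (error + integral / ti)
        out_pct = max(0.0, min(100.0, out_pct))   # clamp to output span
        outputs_ma.append(4.0 + out_pct / 100.0 * 16.0)  # % -> 4-20 mA
    return outputs_ma

print(simulate_pi())
```

Under these assumptions, the proportional term contributes a constant Kp·e = 1% (0.16 mA above the 4 mA live zero), and each 2 ms scan the integral term adds Kp·e·dt/Ti = 1·1·0.002/10 = 0.0002%, i.e. the output steps up by 0.000032 mA every scan, starting at about 4.16 mA at the first scan after T = 0.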