Phidgets 2.1->2.2 conversion questions
Posted: Fri Dec 29, 2017 10:06 am
Happy New Year! In lieu of fireworks, I'm working on upgrading my Phidgets application to Library 2.2. Three questions, if I may, in that regard ...
This Python application is hosted on Debian Jessie Linux (though I don't think these questions are specific to that language or platform) and uses a 1203 2x20 LCD + 8/8/8 Interface Kit to monitor several digital inputs and two analog inputs. The system has run reliably 7x24 for over 5 years. I am now preparing to add some new functions and convert it to Phidgets 2.2.
1. I need to reliably monitor a 12 V battery-powered engine starter circuit, through an 1135 Precision Voltage Sensor, for voltage spikes/troughs with durations as short as 75 ms and magnitudes as small as 0.25 V. I therefore monitor this analog input at a fixed 32 ms sampling interval with its change trigger disabled (set to 0).
To minimize the loss of precision from two concatenated rounding/averaging stages (the Phidgets library's and my code's), I use "raw" analog input on this channel and do my own unit-to-voltage conversion, averaging, and filtering. This works well, providing stable readings with a precision of about 0.07 V; attempts to extract more precision yield fairly constant jitter. (I could probably do better with a voltage divider instead of the 1135, but I want the isolation that the 1135's differential voltage sensing provides, and 0.1 V precision is sufficient for this application.)
It appears that Phidgets 2.2 does NOT provide a raw analog data API for InterfaceKit analog inputs. Is this correct? If so, what would be the least-smoothed form of input available? Or is there a better way to achieve the same objectives in 2.2?
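For concreteness, here is roughly how I expect to set this channel up in 2.2, on the assumption that a VoltageRatioInput with no sensor type set is the closest 2.2 analogue to a raw 2.1 input. The channel number is a placeholder, and the conversion constants are the 1135's published 2.1 ones (Voltage = (SensorValue/200 - 2.5) / 0.0681); my real code substitutes my calibrated values:

[code]
from Phidget22.Devices.VoltageRatioInput import VoltageRatioInput

def on_ratio_change(ch, ratio):
    # ratio runs 0..1; the 2.1 SensorValue was ratio * 1000,
    # so SensorValue/200 is simply ratio * 5.
    volts = (ratio * 5.0 - 2.5) / 0.0681
    # ... my own averaging/filtering goes here ...
    print("%.3f V" % volts)

ch = VoltageRatioInput()
ch.setChannel(0)                      # placeholder: the starter-circuit input
ch.setOnVoltageRatioChangeHandler(on_ratio_change)
ch.openWaitForAttachment(5000)
ch.setDataInterval(32)                # fixed 32 ms sampling, as in my 2.1 code
ch.setVoltageRatioChangeTrigger(0.0)  # change trigger disabled
[/code]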
2. It appears that I can set a sensor type only if I operate the input as a VoltageInput, not a VoltageRatioInput (i.e., setSensorType() on a VoltageRatioInput channel raises a PhidgetException). I find this strange, since the 1135 is billed as a ratiometric sensor. Is it nonetheless correct?
In fact, operating in 2.2 as a VoltageInput with the sensor type set to the 1135 (0x2bde), I get readings substantially different (ca. 2 V low) from those of my 2.1 code, whose conversion factors I calibrated against a good-quality multimeter at the signal source: at normal battery voltage (compensating for the roughly 0.02 V drop between the nominally 12 V source and the 1135 input terminals, due to ca. 15 m of cable and a 10k series resistor) and at zero (shorted source).
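Condensed to its essentials, the 2.2 test that produces the low readings looks like the following sketch (channel number again a placeholder, and the sensor type given by the library's named constant rather than a raw value):

[code]
from Phidget22.Devices.VoltageInput import VoltageInput
from Phidget22.VoltageSensorType import VoltageSensorType

def on_sensor_change(ch, value, unit):
    print("sensor value: %.3f" % value)   # reads ca. 2 V below my 2.1 numbers

ch = VoltageInput()
ch.setChannel(0)                          # placeholder channel
ch.setOnSensorChangeHandler(on_sensor_change)
ch.openWaitForAttachment(5000)
ch.setSensorType(VoltageSensorType.SENSOR_TYPE_1135)
ch.setDataInterval(32)
[/code]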
3. After running tests with 2.2 and then restarting my 2.1 application, I get low voltage readings in the 2.1 application, too. The only way I have found to clear this anomaly is to cycle power on the 1203. Can the 2.2 setSensorType() function leave some persistent hardware/firmware state in the 1203 that the 2.1 library does not clear? If so, is there a 2.2 function sequence that reliably resets that state?
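For what it's worth, the most obvious reset I can think of from the 2.2 side would be something like the sketch below, with SENSOR_TYPE_VOLTAGE as my guess at the "no conversion" state; I don't know whether this actually touches whatever state persists in the 1203:

[code]
from Phidget22.Devices.VoltageInput import VoltageInput
from Phidget22.VoltageSensorType import VoltageSensorType

ch = VoltageInput()
ch.setChannel(0)                      # placeholder channel
ch.openWaitForAttachment(5000)
ch.setSensorType(VoltageSensorType.SENSOR_TYPE_VOLTAGE)  # back to plain volts
ch.close()                            # then restart the 2.1 application
[/code]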
Thanks for any insight you can provide.