Phidgets 2.1->2.2 conversion questions


Post by wbclay »

Happy new year! In lieu of fireworks, I'm working on upgrading my Phidgets application to Library 2.2. Three questions, if I may, in that regard ...

This Python application is hosted on Debian Jessie Linux (though I don't think these questions are specific to that language or platform) and uses a 1203 2x20 LCD + 8/8/8 Interface Kit to monitor several digital inputs and two analog inputs. The system has run reliably 7x24 for over 5 years. I am now preparing to add some new functions and convert it to Phidgets 2.2.

1. I need to reliably monitor a 12 V battery-powered engine starter circuit through an 1135 Precision Voltage Sensor for voltage spikes/troughs with durations as short as 75 ms and magnitudes as small as 0.25 V. I therefore monitor this analog input at a fixed 32 ms sampling rate with its change trigger disabled (0).
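For what it's worth, the timing constraint above can be sketched as plain Python, independent of the Phidgets library: at a 32 ms sample interval, any excursion lasting at least 75 ms is guaranteed to cover at least two consecutive samples (75 / 32 > 2), so two consecutive out-of-band readings can be treated as a real event while single-sample glitches are ignored. The function and threshold names here are illustrative, not from the original application.

```python
# Illustrative event detector for the setup described above:
# fixed 32 ms sampling, looking for spikes/troughs of >= 0.25 V
# that last >= 75 ms.

SAMPLE_MS = 32
MIN_EVENT_MS = 75
THRESHOLD_V = 0.25
# Number of consecutive out-of-band samples a 75 ms event must span:
MIN_SAMPLES = MIN_EVENT_MS // SAMPLE_MS  # 75 // 32 -> 2

def detect_events(samples, baseline):
    """Return (start_index, length) for each run of >= MIN_SAMPLES
    consecutive samples deviating >= THRESHOLD_V from baseline."""
    events, run_start = [], None
    for i, v in enumerate(samples):
        if abs(v - baseline) >= THRESHOLD_V:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= MIN_SAMPLES:
                events.append((run_start, i - run_start))
            run_start = None
    # Close out a run that extends to the end of the buffer.
    if run_start is not None and len(samples) - run_start >= MIN_SAMPLES:
        events.append((run_start, len(samples) - run_start))
    return events
```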

To minimize loss of precision from two concatenated rounding/averaging processes (the Phidgets library's and my code's), I use "raw" analog input on this channel and do my own raw-count-to-voltage conversion, averaging, and filtering. This works well, providing stable readings with a precision of about 0.07 V; attempting to obtain more precision produces fairly constant jitter. (I could probably do better with a voltage divider instead of the 1135, but I want the isolation that the 1135's differential voltage sensing provides, and 0.1 V precision is sufficient for this application.)
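A sketch of the phidget21-era pipeline described above: the 12-bit raw reading (0-4095) is scaled to the familiar 0-1000 "sensor value" and converted with the 1135's published linear formula, with the moving average applied in user code rather than in the library. The 1135 formula used here (V = (SensorValue - 500) * 0.06, i.e. roughly +/-30 V full scale) and the 12-bit raw range are my assumptions from the Phidgets docs; verify both against the current 1135 product page before relying on them.

```python
from collections import deque

RAW_FULL_SCALE = 4095       # assumed 12-bit raw value from phidget21
SENSOR_FULL_SCALE = 1000.0  # phidget21 "sensor value" scale

def raw_to_volts(raw):
    """Convert a raw phidget21 count to volts via the assumed
    1135 formula: V = (SensorValue - 500) * 0.06."""
    sensor_value = raw * SENSOR_FULL_SCALE / RAW_FULL_SCALE
    return (sensor_value - 500.0) * 0.06

class MovingAverage:
    """User-side smoothing, so the library's own averaging
    can stay out of the signal path."""
    def __init__(self, n):
        self.window = deque(maxlen=n)
    def add(self, raw):
        self.window.append(raw_to_volts(raw))
        return sum(self.window) / len(self.window)
```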

It appears that Phidgets 2.2 does NOT provide a raw analog data API for InterfaceKit analog inputs. Is this correct? If so, what would be the least-smoothed form of input available? Or is there a better way to achieve the same objectives in 2.2?
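For context, the acquisition settings in question translate to phidget22 roughly as below. This is a sketch, not working application code: the channel object is passed in so the configuration can be exercised without hardware, and `setSensorType` plus the `SENSOR_TYPE_1135` constant value are my reading of the Phidget22 Python API, which should be verified against its documentation.

```python
# Assumed value of Phidget22's VoltageSensorType.SENSOR_TYPE_1135
# (sensor-type codes appear to be the sensor number * 10):
SENSOR_TYPE_1135 = 11350

def configure_channel(ch, on_voltage_change):
    """Apply the fixed-rate, no-trigger settings described above
    to an attached VoltageInput-like channel object."""
    ch.setOnVoltageChangeHandler(on_voltage_change)
    ch.setDataInterval(32)           # fixed 32 ms sampling
    ch.setVoltageChangeTrigger(0.0)  # 0 = deliver every sample
    ch.setSensorType(SENSOR_TYPE_1135)

# With real hardware this would be used roughly as:
#
#   from Phidget22.Devices.VoltageInput import VoltageInput
#   ch = VoltageInput()
#   ch.setChannel(1)
#   ch.openWaitForAttachment(5000)
#   configure_channel(ch, lambda _ch, volts: print(volts))
```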

2. It appears that I can set the sensor type only if I operate the input as a VoltageInput, not a VoltageRatioInput (i.e., setSensorType() on a VoltageRatioInput channel raises a PhidgetException). I find this strange, since the 1135 is billed as a ratiometric sensor. Is it nonetheless correct?

In fact, operating in 2.2 as a VoltageInput with the sensor type set to the 1135 (0x2bde), I get readings substantially different (ca. 2 V low) from those of my 2.1 code, whose conversion factors I have calibrated to match a good-quality multimeter's readings at the signal source at normal battery voltage (compensating for the roughly 0.02 V drop from the nominally 12 V source at the 1135 input terminals due to ca. 15 m of cable with a 10k series resistor) and at zero (shorted source).

3. After running tests with 2.2 and restarting my 2.1 application, I get low voltage readings in the 2.1 application, too. It appears that the only way to clear this anomaly is to cycle power on the 1203. Can the 2.2 setSensorType() function set some persistent hardware/firmware state in the 1203 that the 2.1 library will not clear? If so, is there some 2.2 function sequence to reliably reset that state?

Thanks for any insight you can provide.
Post by Patrick (Lead Developer) »

Phidget22 always gives you the most accurate (raw) value - as a double where phidget21 gave you an integer.

The 1135 is non-ratiometric - where did you see it billed as ratiometric? You should really change your code to non-ratiometric; otherwise the voltage reported by the 1135 depends on the USB voltage, which can vary. I suspect that your calibration is just wrong, as it was performed against USB voltage instead of the non-ratiometric 5 V reference.
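A hedged numeric illustration of this point (the voltages are made up, not measured values from this thread): a non-ratiometric sensor outputs an absolute voltage, so reading it in ratiometric mode ties the reported value to the actual USB bus voltage. If user code converts the ratio assuming a 5.000 V reference while the bus actually sits elsewhere, every reading is scaled by 5.00 / actual_bus, and a calibration performed under those conditions bakes that wrong gain in.

```python
ASSUMED_REF_V = 5.00  # reference the user code assumes

def apparent_volts(sensor_out_v, actual_bus_v):
    """What a ratiometric read of a non-ratiometric sensor reports
    when the conversion assumes a fixed 5 V reference."""
    ratio = sensor_out_v / actual_bus_v  # what the ADC actually measures
    return ratio * ASSUMED_REF_V         # what the user code computes
```

With the bus at exactly 5.00 V the numbers agree; with a sagging or high bus they diverge by the ratio of the two references.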

Opening as a VoltageInput under phidget22 will disable the ratiometric setting on the 1203, which defaults to on in firmware - your phidget21 software probably isn't setting this value explicitly, rather assuming that it's set.
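Following that suggestion, phidget21 code can set the ratiometric state explicitly at startup instead of relying on the firmware default. A minimal sketch, with the device object passed in so it can be exercised without hardware; `openPhidget`, `waitForAttach`, and `setRatiometric` are my understanding of the phidget21 Python InterfaceKit API and should be checked against your bindings.

```python
def open_interface_kit(ik, ratiometric=True):
    """Open an InterfaceKit-like object and pin the ratiometric
    state explicitly rather than assuming the firmware default."""
    ik.openPhidget()
    ik.waitForAttach(10000)          # timeout in ms
    ik.setRatiometric(ratiometric)   # don't trust the power-on default
```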

-Patrick
Post by wbclay »

Thanks for the quick reply.

Why did I think the 1135 is ratiometric? A misinterpretation of https://www.phidgets.com/?view=articles ... getSensors. Under heading "Easy Case Calibration – Sensor with Linear Output", numbered item 2 states, "With other sensors, make sure “Ratiometric” is checked ...." I took "other sensors" to mean all members of the preceding sensor list except the 3133 described in item 1. I now see "non-ratiometric" on the product spec page.

I may have also misunderstood the term itself. I understood ratiometric to mean the sensor output is scaled with respect to possible variations in the sensor supply voltage. From your reply, I gather that (mis)understanding is exactly backward. This wasn't an issue until I recently started using a second analog input and the 1135; my other analog input was just a potentiometer used to control LED screen brightness - no precision concern there!

Back to the drawing board ...