In the previous article I showed my successful implementation of ShArc, a novel bend/shape sensor using capacitive sensing. My video showed that the system could track the bend shape of the sensor, but the results were not quantitatively analysed.
To take this a little further, I wanted to show the relationship between the signal that the sensor generates and the radius of curvature. To begin with, I 3D printed a flexible covering for the flex using TPU filament and the modified extruder I designed to enable TPU printing on my Monoprice 3D printer.
The cover is flexible but holds the layers of flex together to ensure there are no air gaps between the layers. It also provides a flat surface on the top and bottom which is an improvement over the elastic bands previously holding the layers together.
This flat surface allows the sensor to be bent around a surface of known curvature in order to hold the sensor in a known shape. For this I 3D printed a number of bent cantilevers that could slot into the plastic base.
Using cantilevers with radii of curvature of 23 mm, 43 mm and 83 mm (as well as a flat cantilever), I measured the response of the sensor.
For each radius of curvature, we expect to see a certain amount of “shift” at each pixel. This is the amount that we expect the TX flex to move relative to the RX flex. With a flat cantilever (a radius of infinity) we expect a shift of 0mm. For a constant radius of curvature we expect to see more shift on each subsequent pixel as each pixel is further from the fixed point at the end of the sensor.
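The expected shift follows from simple geometry: two layers separated by a small gap and fixed together at one end slide past each other by an amount proportional to the arc length from the fixed point and inversely proportional to the bend radius. A minimal sketch of that model is below; note that `LAYER_SEPARATION` and `PIXEL_PITCH` are illustrative values I have made up, not measurements from the actual sensor.

```python
# Expected relative shift between the TX and RX flex layers when the
# sensor is bent to a constant radius of curvature. Two layers separated
# by a gap t and fixed together at one end slide relative to each other
# by s * t / r at arc length s from the fixed point.

LAYER_SEPARATION = 0.2  # mm -- assumed gap between TX and RX layers
PIXEL_PITCH = 10.0      # mm -- assumed spacing between adjacent pixels

def expected_shift(pixel_index, radius_mm):
    """Expected shift (mm) at a pixel for a given bend radius.

    A flat sensor corresponds to an infinite radius, giving zero shift.
    """
    if radius_mm == float("inf"):
        return 0.0
    # Distance along the sensor from the fixed end to this pixel.
    arc_length = (pixel_index + 1) * PIXEL_PITCH
    return arc_length * LAYER_SEPARATION / radius_mm

# A flat cantilever produces no shift; tighter bends produce more shift,
# and each subsequent pixel shifts more than the one before it.
print(expected_shift(0, float("inf")))  # 0.0
print(expected_shift(3, 43.0))
```

This matches the qualitative expectations above: zero shift when flat, and a shift that grows linearly with distance from the fixed end for a constant radius.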
We expect to see an approximately linear relationship between the sensor response (in ADC counts) and the shift, and we also expect each pixel to have an identical relationship to shift. The data I collected is shown here, with sensor response (ADC counts) on the y-axis and expected shift on the x-axis:
While each pixel does show an approximately linear response to shift, we see that each pixel has a different gradient. There could be a number of reasons for this.
Firstly, it is hard to tell whether the TPU cover is adequately holding the layers together: the layers may separate during bending, creating air gaps and inaccurate readings. Secondly, it is difficult to ensure a constant radius of curvature; there could be error in the 3D printed cantilevers, or in my ability to hold the sensor in contact with them, and the TPU cover is itself made of two layers super-glued together, which could give a variable bending modulus along the length of the sensor. Thirdly, the sensor model assumes a parallel-plate capacitance, which is not accurate: in reality there will be fringing field lines. The sensor may also be interacting with external objects, such as my hand, during bending. Finally, I do not take into account the capacitance contributed by the traces on the sensor.
Given that each pixel has a close-to-linear response, it seems likely that these inaccuracies can be calibrated out.
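One way to do that calibration is to fit a straight line (counts against known shift) independently for each pixel, then invert the fit to recover shift from raw ADC counts. The sketch below uses a plain least-squares fit on made-up example numbers, not the readings from my experiment:

```python
# Per-pixel linear calibration: fit counts = gain * shift + offset by
# ordinary least squares, then invert the fit so raw ADC counts can be
# mapped back to shift. Each pixel gets its own fit, which absorbs the
# per-pixel gradient differences seen in the data.

def fit_line(xs, ys):
    """Least-squares slope and intercept for y = m*x + c."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    m = num / den
    return m, mean_y - m * mean_x

def make_calibration(shifts, counts):
    """Return a function mapping ADC counts back to shift for one pixel."""
    m, c = fit_line(shifts, counts)
    return lambda adc: (adc - c) / m  # invert counts = m * shift + c

# Hypothetical calibration data for one pixel: known shifts (mm) from
# the cantilevers and the ADC counts recorded at each.
shifts = [0.0, 0.05, 0.10, 0.20]
counts = [1000.0, 1150.0, 1300.0, 1600.0]
to_shift = make_calibration(shifts, counts)
print(round(to_shift(1450.0), 3))  # -> 0.15
```

Running one such fit per pixel turns the family of different gradients into a single consistent shift estimate across the sensor.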