Show HN: I made a new sensor out of 3D printer filament for my PhD

Original link: https://paulbupejr.com/developing-the-optigap-sensor-system/

This article describes OptiGap, a new sensor system the author created during their PhD research. They explain the motivation, experiments, findings, and subsequent development that led to the OptiGap sensor. The sensor's function is to detect where along a flexible rod or strip a bend occurs, a capability known as "bend localization." OptiGap exploits air gaps within flexible optical light pipes to produce the distinct coded patterns essential for identifying the bend position. The invention grew out of the author's experiments transmitting light through various light pipes while researching bend-detection sensors. The unexpected discovery came when they noticed a difference in light transmission while testing clear 3D printer filament attached to a tape measure, specifically near a point where electrical tape was attached. Prompted by this observation, they ran additional tests that confirmed the relationship between adhesive-induced stretching of the filament and the accompanying drop in light transmission. They continued exploring by examining the consequences of controlling where these air gaps were placed, which led to the hypothesis of encoding absolute sensor position using patterned series of air gaps along adjacent light pipes. With inverse Gray code used to implement this encoding scheme, the resulting system can act as a single high-precision fiber optic sensor. Refinements throughout development included shrinking the sensor's physical size with thinner plastic optical fiber, adopting a simpler photodiode and infrared LED setup, and adding real-time machine learning via a microcontroller and a naive Bayes classifier.

During my electrical engineering PhD journey, I created a unique sensor project that took up roughly a third of my studies. Despite the challenges, the experience was enjoyable and was completed within three years. From the start, I wanted practical applications rather than theory or a forgotten thesis. With the support of an outstanding advisor, I managed to strike a balance between innovation and practicality. The process involved complex problem-solving, with seemingly simple concepts proving complicated to implement. My passion lies in designing graphical user interfaces for sensor data using libraries like PyQtGraph; these tools significantly enhanced the presentation and interpretation of the data.

Original article
Reading Time: 9 minutes

This article explores the research and development journey behind my new sensor system, OptiGap, a key component of my PhD research. I’m writing this in a storytelling format to offer insights into my decision-making process and the evolution leading to the final implementation. It should hopefully provide a glimpse into the sometimes-shrouded world of PhD research and may appeal to those curious about the process. For a deeper dive into technical specifics, simulations, and existing research on this subject, my dissertation is available online here.

What does it do?

In very general terms, this sensor is basically a rope that, if bent, can tell you where along its length it was bent. The fancy term for that is “bend localization.”

OptiGap’s application is mainly within the realm of soft robotics, which typically involves compliant (or ‘squishy’) systems, where the use of traditional sensors is often not practical. The name OptiGap, a fusion of “optical” and “gap,” reflects its core principle of utilizing air gaps within flexible optical light pipes to generate coded patterns essential for bend localization.

How the OptiGap Sensor System Started

The idea for OptiGap came about while I was experimenting with light transmission through various light pipes (optical cables) for use as a bend detection sensor. I was initially trying to see how I could effectively “slow down” light through the fiber…a seemingly straightforward task, right?

During this process, I attached a section of clear 3D printer filament (1.75mm TPU) to a piece of tape measure for an experiment and incidentally discovered that when I bent the tape measure (and filament) at the spot where the electrical tape was attached, there was a significant drop in light transmission. I hypothesized that this was because the sticky residue of the electrical tape was causing the filament to stretch, which in turn reduced the light transmission.

To verify this hypothesis, I attached a longer piece of TPU to a tape measure and began bending it at various points to observe how light transmission would change.

I wrote a small Linux I2C driver for the VL53L0X ToF sensor to run on a Raspberry Pi and push the data to a socket using ZeroMQ. I then created a rough GUI in Python to pull the sensor data from the socket and visualize the light transmission data in realtime, shown in the GIF below, which very quickly validated my hypothesis. This validation marked the “Eureka!” moment that sparked the eventual development of the OptiGap sensor.
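A minimal sketch of that data path in Python with pyzmq — a sender process pushing readings to a socket and a GUI process pulling them. The endpoint name and message fields here are my own illustrative stand-ins, not the original code:

```python
import zmq

# One process PUSHes distance readings over ZeroMQ; the GUI process
# PULLs them off the socket. An inproc endpoint is used here so the
# sketch runs in a single process; the real setup would use tcp://.
ctx = zmq.Context.instance()

def make_sender(endpoint="inproc://optigap"):
    s = ctx.socket(zmq.PUSH)
    s.bind(endpoint)  # bind must happen before connect for inproc
    return s

def make_receiver(endpoint="inproc://optigap"):
    r = ctx.socket(zmq.PULL)
    r.connect(endpoint)
    return r

sender = make_sender()
receiver = make_receiver()
sender.send_json({"t": 0.0, "range_mm": 123})  # one fake ToF reading
sample = receiver.recv_json()
```

PUSH/PULL queues messages to connected peers, which avoids the dropped-message “slow joiner” behavior of PUB/SUB for a simple one-to-one pipeline like this.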

Initial OptiGap discovery
My excited face while validating my discovery.

The OptiGap Realization

I realized that since I could control where the light was being attenuated, I could use this to encode information about the position of the bend on the sensor. Using electrical tape was not a practical solution, so I started looking for a more reliable and consistent way to create these attenuations. This led me to the idea of cutting the filament and then reattaching it together using a flexible rubber (silicone) sleeve, leaving a small air gap, as shown in the image below.

The main working principle of the air gap is that translation and/or rotation of one light pipe face relative to the other changes the fraction of light transmitted across the gap. The greater the bend angle, the more light escapes across the gap. The resulting change in intensity of the optical signal can then be correlated with known patterns for use as a sensor.
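One simple way to reason about this is a toy geometric model: treat a bend as a lateral offset between the two circular fiber faces and take the transmitted fraction to be their overlap area. This is my own illustrative simplification (it ignores beam divergence and Fresnel losses), not the model from the dissertation:

```python
import numpy as np

def lateral_offset_transmission(d, a):
    """Fraction of light coupled between two identical circular fiber
    faces of radius a whose axes are laterally offset by d, using the
    classic overlap-of-two-circles area, normalized by the face area."""
    d = np.clip(d, 0.0, 2.0 * a)  # beyond 2a the faces no longer overlap
    overlap = (2 * a**2 * np.arccos(d / (2 * a))
               - (d / 2) * np.sqrt(4 * a**2 - d**2))
    return overlap / (np.pi * a**2)

def bend_transmission(theta_rad, gap_len, a):
    """Small-angle toy model: a bend of angle theta across a gap of
    length gap_len shifts one face laterally by ~gap_len * tan(theta)."""
    return lateral_offset_transmission(gap_len * np.tan(theta_rad), a)
```

The key qualitative behavior matches the description: transmission is 1 at zero offset, falls monotonically as the bend angle grows, and reaches 0 once the faces no longer overlap.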

The Big Idea

I then proceeded to test this idea by creating multiple air gaps in a row and bending the filament to measure the attenuation.

As depicted in the GIF below, the optical intensity decreases at each air gap, with a more noticeable decrease as the bend angle increases. This initial experimentation served as proof of concept, demonstrating the feasibility of the idea. It led to the formulation of my final hypothesis of utilizing a pattern of these air gaps to encode information regarding the sensor’s bending and employing a naive Bayes classifier on a microcontroller to decode the bend location.

Validating the attenuation at the air gaps.

This concept resembles the functionality of a linear encoder. Linear encoders gauge an object’s linear movement, typically comprising a slider rail with a coded scale akin to a measuring ruler and a sensing head that moves across this scale to read it. Linear (absolute) encoders emit a distinct code at each position, ensuring consistent identification of displacement.

The OptiGap system, functioning like an absolute encoder, would encode absolute positions using patterns of bend-sensitive air gaps along parallel light pipes, effectively serving as a singular fiber optic sensor.

Encoding the Bend Location using Inverse Gray Code

Absolute encoders commonly employ Gray code, a binary system where two successive values differ in only one bit. This property allows for various applications, including error checking. However, Gray code isn’t optimal for the OptiGap sensor system. Here, we aim for consecutive values to differ by the maximum number of bits to facilitate easier differentiation. This necessity gave rise to Inverse Gray code.

Inverse Gray code is a binary code where two successive values differ by the maximum (n-1) number of bits. To implement this, I simply create cuts in the filament wherever there’s a “1” in the Inverse Gray code sequence. This approach can scale to any number of bits. For the prototype, I utilized 3 bits, providing 8 possible positions.
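As a sketch of how such a code can be constructed (the exact sequence in the dissertation may differ from what this produces), a small backtracking search can order all n-bit codewords so that consecutive codes differ in at least n-1 bits:

```python
def hamming(a, b):
    """Number of differing bits between two integer codewords."""
    return bin(a ^ b).count("1")

def find_sequence(bits, min_dist):
    """Backtracking search for an ordering of all 2**bits codewords in
    which every pair of consecutive codes differs in >= min_dist bits."""
    def extend(seq, remaining):
        if not remaining:
            return seq
        for c in sorted(remaining):
            if hamming(seq[-1], c) >= min_dist:
                found = extend(seq + [c], remaining - {c})
                if found:
                    return found
        return None
    return extend([0], set(range(1, 2 ** bits)))

# 3 bits -> 8 positions; consecutive codes differ in at least 2 bits.
codes = find_sequence(3, 2)
# "Cut the filament wherever there's a 1": one bit per fiber strand,
# one codeword per position along the sensor.
patterns = [format(c, "03b") for c in codes]
```

Each column of the resulting bit patterns then tells you where to cut one of the three fiber strands.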

Visualization of the OptiGap Sensor System

The illustration below depicts the signal patterns of the OptiGap sensor system for each bend position using three fibers. By employing a naive Bayes classifier, the sensor system can discern bend positions based on signal patterns. The third graph represents actual sensor data from the prototype system, utilized for training the classifier on the microcontroller.

The OptiGap Prototype

I proceeded to construct a prototype of the OptiGap sensor system, utilizing 3 strands of clear TPU 3D printer filament, each featuring a distinct pattern of air gaps. The image below showcases the filament just before cutting, with the cut pattern indicated on a piece of tape.

For the prototype, I employed a commercial 3:1 fiber optic coupler to merge the light from the 3 strands into a single fiber optic cable, resulting in the completion of the sensor prototype, as depicted below.

This marked the final phase of validating the hypothesis and operational theory behind the OptiGap sensor.

Reducing the Physical Size

The initial prototype proved to be large and bulky, primarily due to the size of the 3D printer filament used. Drawing from previous experience, I recognized that PMMA (plastic) optical fiber offered a smaller and more flexible alternative suitable for this application. Consequently, I assessed 500, 750, and 1000 micron unjacketed PMMA optical fibers from Industrial Fiber Optics, Inc. for the sensor strands, resulting in a significant reduction in sensor size.

I conducted tests on all three types of fibers to evaluate their light transmission and flexibility. Among them, the 500 micron fiber emerged as the optimal choice overall, although all three exhibited sufficient flexibility for this application.

Reducing the Optical Transceiver Complexity

I decided to switch from using the complex VL53L0X ToF sensor to a simple photodiode and IR LED setup to reduce the complexity of the system and to increase modularity. This also allowed me to use a microcontroller to read the sensor data, which was a significant improvement over the initial prototype.

I then created a demo system for the sensor based around an STM32 microcontroller and a photodiode/IR LED setup.

Realtime Machine Learning on a Microcontroller

The final stage in developing the OptiGap sensor system involved integrating a naive Bayes classifier onto the STM32 microcontroller to decode the bend location from the sensor data. I opted for a naive Bayes classifier due to its efficiency compared to if-statements or lookup tables, its capability to handle new or previously unseen data, and its potential for increased accuracy by considering relationships between multiple input variables.

Implementing the naive Bayes classifier proved to be relatively straightforward. This classifier is a probabilistic model based on applying Bayes’ theorem to determine how a measurement can be assigned to a particular class, with the class representing the bend location in this context. I utilized the Arm CMSIS-DSP library for the classifier implementation.

Fitting the Sensor Data

The initial step in integrating the classifier was to fit the sensor data to a Gaussian distribution for each air gap pattern. To expedite this process, I developed a Python GUI for rapid labeling and fitting of the data using GNB (Gaussian Naive Bayes) from the scikit-learn library.

I later improved this UI to be more general and to allow for more complex data fitting.

The probabilities for each class were computed and saved as a header for use on the microcontroller.
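A minimal sketch of this fit-and-export step, using synthetic stand-in data and scikit-learn’s GaussianNB. The header layout is my own assumption for illustration, not the actual generated format:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for labeled intensity readings: 3 channels (one
# per fiber), clustered per bend segment. Illustrative data only.
rng = np.random.default_rng(0)
means = rng.uniform(0.2, 1.0, size=(8, 3))  # 8 segments x 3 fibers
X = np.vstack([m + 0.02 * rng.standard_normal((50, 3)) for m in means])
y = np.repeat(np.arange(8), 50)

clf = GaussianNB().fit(X, y)

def export_header(clf):
    """Dump the fitted per-class means, variances, and priors as C
    arrays, in the spirit of feeding a Gaussian naive Bayes instance
    on the microcontroller."""
    var = clf.var_ if hasattr(clf, "var_") else clf.sigma_  # renamed in sklearn 1.0
    lines = ["/* Auto-generated Gaussian naive Bayes parameters */"]
    for name, arr in (("nb_theta", clf.theta_),
                      ("nb_var", var),
                      ("nb_prior", clf.class_prior_)):
        flat = ", ".join(f"{v:.6f}f" for v in np.ravel(arr))
        lines.append(f"static const float {name}[] = {{ {flat} }};")
    return "\n".join(lines)

header = export_header(clf)
```

On the MCU side, those same arrays (means, variances, priors) are exactly what a Gaussian naive Bayes predictor such as the one in CMSIS-DSP consumes.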

Filtering the Sensor Data

To enhance the accuracy of the classifier, I implemented a two-stage filtering process on the STM32. The initial stage involved a basic moving average filter, followed by a Kalman filter in the second stage.
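A plain-Python sketch of that two-stage pipeline, with illustrative filter constants rather than the tuned values from the actual system:

```python
import numpy as np

def moving_average(x, window=8):
    """Stage 1: simple moving average over the raw intensity samples."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

class ScalarKalman:
    """Stage 2: 1-D Kalman filter tracking a slowly varying level.
    q = process noise variance, r = measurement noise variance."""
    def __init__(self, q=1e-4, r=1e-2, x0=0.0, p0=1.0):
        self.q, self.r, self.x, self.p = q, r, x0, p0

    def update(self, z):
        self.p += self.q                # predict: uncertainty grows
        k = self.p / (self.p + self.r)  # Kalman gain
        self.x += k * (z - self.x)      # correct toward measurement
        self.p *= (1.0 - k)
        return self.x

# Demo: noisy, nominally constant optical intensity.
rng = np.random.default_rng(1)
raw = 0.7 + 0.05 * rng.standard_normal(500)
smoothed = moving_average(raw)
kf = ScalarKalman(x0=float(smoothed[0]))
filtered = np.array([kf.update(z) for z in smoothed])
```

The moving average knocks down high-frequency noise cheaply, and the Kalman stage then trades responsiveness for a much steadier estimate fed to the classifier.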

The OptiGap Sensor System Demo

The GIFs provided below illustrate various stages of the OptiGap sensor system, encompassing assembly and the operational demonstration of the final sensor system.

System Overview

Assembly of an OptiGap Sensor using TPU Filament

Attenuation of Light through the OptiGap Sensor

Fitting of the Sensor Data

Segment Classification using PMMA Optical Fiber

Segment Classification using TPU Filament

Underwater Operation

OptiGap Design Specifications

Key Properties & Parameters

Material Recommendations

Next Steps

I’ve made significant progress on the OptiGap system beyond what’s documented here, including its integration into another modular actuation and sensing system I developed called EneGate.

This has involved custom PCB design and systems integration, detailed in my dissertation. Additionally, I’ve prototyped miniature PCB versions of the optics to interface with the PCBs for the EneGate system.

I’ve also validated OptiGap on a real-world soft robotic system, with full details set to be presented in an upcoming RoboSoft paper titled “Embedded Optical Waveguide Sensors for Dynamic Behavior Monitoring in Twisted-Beam Structures.”

Commercialization

There’s an ongoing commercialization aspect to this research as well. Feel free to reach out if you’re interested in further details.

That’s it for now!

I don’t want to make this too long so I’ll end here. I hope this provided some insight into the research and development process involved in something like this. If you have any questions or would like to learn more, don’t hesitate to contact me!
