Collecting data for machine learning algorithms is an important part of the learning process. Human-machine interface systems operate on the user's physical movements, and recording these gestures is a standard method for creating datasets. However, this process is time-consuming and requires many volunteers to reduce the risk of model overfitting. In this paper, we present a new method for automating data collection. The volunteers were replaced by a robotic arm with a mounted electric circuit that simulates the impedance of the human hand. Data recording and labeling were performed by a dedicated application that controlled the manipulator kinematics. The application generated randomized effector paths so that the collected data resembled data gathered from people. Gestures were recognized by a system based on capacitive sensors and a neural network algorithm, executed on a microcontroller, that processed the sensor signals. The system was trained with data collected using the manipulator and tested with human users. We describe the benefits of the proposed method, the most significant being that data collection was three times faster than with manual methods.