IPMash RAS scientists created a robotic wheelchair controlled by the "power of thought"
Mathematicians from the Institute of Problems in Mechanical Engineering of the Russian Academy of Sciences (IPMash RAS) and St. Petersburg State University have created an algorithm for noninvasive wheelchair control based on neural signals from the brain. The development was presented by Alexander Fradkov, who heads the work, at the VI International Conference on Neural Networks and Neurotechnologies (NeuroNT’ 2025) at LETI (Saint Petersburg Electrotechnical University).
Cybernetic neuroscience is a new scientific field that combines the methods of computational neuroscience and cybernetics to study control processes in the nervous system and the brain. It explores mathematical models of neural ensembles using control-theory approaches such as feedback synthesis, parameter estimation, and classification of brain states based on electroencephalography (EEG) signals.
The scientists of IPMash RAS are among the leaders of this new field: they formulated its essence and were among the first to develop it systematically. Modern advances in neurotechnology open up new horizons for controlling equipment through neural interfaces, including robots, wheelchairs and robotic prostheses, and significantly improve the diagnosis of nervous diseases and pathological conditions of the brain. In addition, the use of mathematical models of neural ensembles and of individual regions of the cerebral cortex allows a deeper understanding of the principles of brain function, which contributes to the development of new methods of treatment and rehabilitation.
The IPMash RAS scientists have been working in this area in recent years together with scientists from St. Petersburg State University. For example, to improve the quality of modeling human brain function, they built learning network versions of the FitzHugh–Nagumo and Hindmarsh–Rose models.
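For readers unfamiliar with these models, the sketch below simulates a single FitzHugh–Nagumo neuron, the simpler of the two, with forward Euler integration. The parameter values, step size and initial state are common textbook choices assumed here for illustration; they are not the settings used in the IPMash RAS learning network versions.

    import numpy as np

    # FitzHugh-Nagumo neuron: v is the fast membrane variable, w the slow recovery variable.
    #   dv/dt = v - v^3/3 - w + I_ext
    #   dw/dt = eps * (v + a - b * w)
    # Parameter values below are standard textbook choices (assumed, for illustration only).
    def fitzhugh_nagumo(i_ext=0.5, a=0.7, b=0.8, eps=0.08, dt=0.01, steps=10_000):
        v, w = -1.0, 1.0                  # assumed initial state
        trace = np.empty((steps, 2))
        for k in range(steps):
            dv = v - v**3 / 3.0 - w + i_ext
            dw = eps * (v + a - b * w)
            v += dt * dv                  # forward Euler step
            w += dt * dw
            trace[k] = (v, w)
        return trace

    if __name__ == "__main__":
        trace = fitzhugh_nagumo()
        print("final state:", trace[-1])  # for this input the model produces periodic spiking

A network version couples many such units, and a learning version additionally tunes their parameters from recorded data; the press release does not specify those details.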
"We are not standing still: now, together with our students, we have developed a robotic wheelchair that is controlled directly by brain signals. Thanks to the algorithms we created, it accurately detects when a person wants to move to the right and when to the left," said Alexander Fradkov, Chief Research Scientist at the IPMash RAS Complex Systems Management Laboratory and Professor at St. Petersburg State University.
The wheelchair responds to the user's intentions by recognizing brain signals through electroencephalography (EEG). Machine learning algorithms analyze brain activity and identify the patterns corresponding to the commands "forward", "left", "right" and "stop". To increase accuracy, adaptive methods such as the modified Yakubovich–Bragman algorithm and the "implicit strip" method are used, which separate the signals effectively even with a limited amount of data.

The system is based on multi-stage processing of brain signals. First, the EEG data is cleaned of noise with band-pass filters that isolate the key frequency ranges (for example, the alpha and beta rhythms). Then the machine learning algorithms analyze the patterns of brain activity and match them to the user's particular intentions. To increase accuracy, the system adapts: the parameters of the neural ensemble model are continuously refined to match the individual characteristics of the user. The final stage converts the recognized commands into signals for the wheelchair's electric drives, ensuring smooth and precise movement. Thus, the software acts as a "translator" between the brain and the mechanics, connecting neuroscience, cybernetics and robotics.

The advantages of such a system are its non-invasiveness and personalization. Unlike traditional interfaces that require electrodes to be implanted directly into the brain, this system uses external EEG sensors. The algorithms adjust themselves to the individual characteristics of the user's brain, which speeds up learning and improves control accuracy. In addition, the technology makes it possible to adapt the system to new types of commands, expanding its functionality. In the future, such developments could be used not only for rehabilitation but also for controlling other devices, from smart homes to exoskeletons, opening up new opportunities for people with limited mobility.
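The following minimal sketch illustrates the general shape of such a pipeline: band-pass filtering of the EEG, feature extraction, classification into commands, and mapping to a drive command. It assumes scipy and scikit-learn; the 250 Hz sampling rate, the 8–30 Hz band, the log-variance features, the linear discriminant classifier and the command names are illustrative stand-ins, not the adaptive Yakubovich-type algorithms described above.

    import numpy as np
    from scipy.signal import butter, filtfilt
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    FS = 250.0                      # assumed EEG sampling rate, Hz
    COMMANDS = ["forward", "left", "right", "stop"]

    def bandpass(eeg, low=8.0, high=30.0, fs=FS, order=4):
        """Keep the alpha/beta range (8-30 Hz, an assumed choice) of each EEG channel."""
        b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
        return filtfilt(b, a, eeg, axis=-1)

    def band_power_features(eeg):
        """Log-variance of each filtered channel: a simple stand-in feature set."""
        filtered = bandpass(eeg)
        return np.log(np.var(filtered, axis=-1) + 1e-12)

    def train_decoder(epochs, labels):
        """epochs: (n_trials, n_channels, n_samples); labels: integer indices into COMMANDS."""
        X = np.array([band_power_features(e) for e in epochs])
        clf = LinearDiscriminantAnalysis()
        clf.fit(X, labels)
        return clf

    def decode_command(clf, window):
        """Turn one EEG window (n_channels, n_samples) into a drive command string."""
        label = int(clf.predict(band_power_features(window)[None, :])[0])
        return COMMANDS[label]

In the real system, the fixed classifier would be replaced by the adaptive schemes mentioned above, which keep refining their parameters for the individual user, and the output of decode_command would be passed to the controller of the wheelchair's electric drives.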
Alexander Fradkov also presented the development at the 6th International Conference "Neurotechnologies and Neurointerfaces" and at the 11th International Conference on Physics and Control.