Innovative, touch-sensitive avatar-robotic arm based on real-time haptics

(Nanowerk News) Researchers at Keio University’s Haptics Research Center have developed a ‘real-time avatar-robotic arm’ that transmits sound, vision, and highly sensitive feelings of touch to remotely located users. This innovative touch-sensitive robotic technology, which was reported in the October 2017 issue of IEEE Transactions on Industrial Electronics ("Artificial Replacement of Human Sensation Using Haptic Transplant Technology") and demonstrated at CEATEC (October 2017, Tokyo), is expected to find applications in areas such as industrial manufacturing, harvesting farm produce, and nursing care.
Touch-sensitive avatar-robotic arm based on real-time haptics, developed by Takahiro Nozaki and colleagues at the Keio University Haptics Research Center.
There is demand for robotic technology to overcome the daunting challenges of the 21st century, such as providing care for the elderly in rapidly aging industrialized nations, supporting labor-intensive agriculture, and responding to extreme emergencies where humans cannot intervene directly, such as nuclear power station disasters.
Against this background, a growing number of researchers are focusing on the potential of ‘haptics’ – man-machine communication based on touch – to solve these and related problems. In its simplest form, haptics lets users feel a sense of touch via vibrations or forced motion. Such technology employs touch sensors that can be difficult to calibrate and often malfunction in extreme environments with high heat or radiation. Furthermore, conventional haptics technology is based on vibrations and is only pseudo-tactile: although it can be used for games and entertainment, its range of industrial applications is very limited.
Takahiro Nozaki and colleagues at the Faculty of Science and Technology and the Haptics Research Center at Keio University developed a haptics-based avatar-robot with a General Purpose Arm (GPA) that transmits sound, vision, movement, and, importantly, a highly sensitive sense of touch (force/tactile transmission) to a remotely located user in real time. “This ‘real haptics’ is an integral part of Internet of Actions (IoA) technology, with applications in manufacturing, agriculture, medicine, and nursing care,” says Nozaki.
This is the world’s first high-precision force/tactile transmission technology that can record human movements, edit them, and reproduce them. In addition, the arm does not employ conventional touch sensors, making it cheaper, more compact, and more robust against malfunction and noise. The core technology behind this avatar-robot consists of high-precision motors integrated into the avatar arm and the algorithms that drive them: high-precision control of force and position is critical for transmitting a sense of touch without using touch sensors.
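The principle of detecting contact without a touch sensor can be illustrated with a disturbance-observer-style estimate, in which the external load torque is inferred from the motor’s current reference and measured velocity. The following Python sketch is a minimal illustration with assumed parameters (nominal inertia, torque constant, observer cut-off, sampling period); it is not the actual algorithm running in the avatar arm.

import numpy as np

# Assumed nominal motor parameters and control period (illustrative only).
J_n = 1e-4    # nominal rotor inertia [kg*m^2]
Kt_n = 0.05   # nominal torque constant [N*m/A]
g = 500.0     # observer low-pass cut-off [rad/s]
dt = 1e-4     # sampling period [s]

def estimate_load_torque(i_ref_log, omega_log):
    """Estimate the external load torque from the current reference and
    measured velocity, so that no force/torque sensor is required."""
    estimates = []
    tau_hat = 0.0
    omega_prev = omega_log[0]
    for i_ref, omega in zip(i_ref_log, omega_log):
        accel = (omega - omega_prev) / dt        # numerical differentiation
        tau_raw = Kt_n * i_ref - J_n * accel     # raw disturbance estimate
        tau_hat += g * dt * (tau_raw - tau_hat)  # first-order low-pass filter
        estimates.append(tau_hat)
        omega_prev = omega
    return np.array(estimates)

In words: whatever torque the commanded current should have produced but does not appear as acceleration is attributed to the external load, i.e. to contact with an object.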
Nozaki and colleagues have launched ‘Motion Lib’ to commercialize their real-haptics technology. The main product is an integrated circuit called the ‘ABC-CORE’ force/tactile controller. The chip controls the force output of DC and AC servomotors and performs force/tactile transmission between two motors synchronized in motion. Importantly, since the load force applied to the motor is calculated by an algorithm on the chip, there is no need to install force or torque sensors.
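The two-motor synchronization described above is commonly formulated as bilateral control: the position difference between the operator-side and remote-side motors is driven to zero while the sum of their estimated external forces is driven to zero, which reproduces the action-reaction law and lets the operator feel the remote object. The Python sketch below illustrates that control goal with assumed gains and observer-based force estimates (such as the one sketched earlier); it is not the ABC-CORE implementation.

Kp, Kd = 400.0, 40.0   # position and velocity gains (assumed)
Kf = 1.0               # force gain (assumed)

def bilateral_torque_refs(x_op, v_op, f_op, x_rm, v_rm, f_rm):
    """Return torque references for the operator-side and remote-side motors.

    x_*, v_*: measured position and velocity; f_*: external load torque
    estimated by a disturbance observer, so no touch sensor is involved.
    """
    # Force channel: drive the sum of external forces toward zero (action-reaction).
    tau_force = -Kf * (f_op + f_rm) / 2.0
    # Position channel: drive the position/velocity difference toward zero.
    tau_pos = (Kp * (x_rm - x_op) + Kd * (v_rm - v_op)) / 2.0
    return tau_force + tau_pos, tau_force - tau_pos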

Background

High-precision robotic arms are widely used in industry, for example for repetitive actions on automobile assembly lines. However, such robotic arms only repeat a preprogrammed series of commands, grabbing well-defined, solid components used for constructing cars.
The challenge is to recognize the shape, material composition (soft or hard), and position of an object, and to manipulate it according to real-time instructions from a user located at a distance from the arm, with the arm acting as a real-time avatar.
The critical technical breakthroughs in motor control and robotics behind the robotic avatar developed by Nozaki and co-workers were first reported by Keio University’s Kouhei Ohnishi in 1983, in a paper titled “Torque-speed regulation of DC motor based on load torque estimation method” (IPEC-Tokyo ’83, p. 1209).
Ohnishi continued to develop his ideas in a 1993 paper on ‘sensorless torque control’ (IEEE Transactions on Industrial Electronics, 40, 259, 1993), followed by his proposals for ‘motion control in mechatronics’ (IEEE/ASME Transactions on Mechatronics, 1, 56, 1996). Then, in 2004, Ohnishi addressed the issue of conveying a ‘good sense of remote objects’ at AMC 2004 in Kawasaki, Japan.

Future work

Nozaki has set up a consortium with 30 companies to undertake proof-of-concept projects for commercializing this technology as an integral part of the Internet of Actions (IoA). The assist-avatar robotic GPA is being tested for use in helping farmers pick fruit and in other agricultural applications.
Source: Keio University