Vibrotactile Feedback Enables Precision Robotic Bone Milling on Curved Surfaces
In the rapidly evolving field of surgical robotics, achieving high precision during delicate procedures such as bone milling remains a significant challenge. The complex geometry of human bone, particularly its curved surfaces, introduces variables that traditional robotic systems often struggle to manage. A new study from researchers at Nankai University introduces a breakthrough method that leverages the subtle vibrations generated during bone milling to enable robots to maintain both precise cutting depth and optimal tool angle—critical factors for safety and efficacy in orthopedic and neurosurgical operations.
The research, published in the peer-reviewed journal Robot, presents a novel control strategy that transforms the mechanical chatter of a milling tool into a rich source of real-time feedback. Unlike conventional systems that rely heavily on force sensors or preoperative imaging, this approach uses a compact, low-cost three-axis accelerometer mounted directly on the surgical tool. By analyzing the frequency components of the vibration signals, the system can simultaneously monitor and adjust both the depth of cut and the orientation of the tool relative to the bone surface. This dual-control capability marks a significant advancement over previous methods, which typically focused on depth control alone, often under simplified, flat-surface conditions.
The work was led by Jinggang Wang, Guangming Xia, Yu Dai, and Jianxun Zhang from the Institute of Robotics and Automatic Information System at Nankai University in Tianjin, China. Their findings offer a promising pathway toward more autonomous and reliable robotic surgical assistants, particularly in procedures like spinal laminectomy or cranial reshaping, where millimeter-level accuracy is paramount and the risk of damaging adjacent soft tissues is high.
The Challenge of Curved Surface Milling
Bone milling is a fundamental technique in many surgical interventions, used to remove bone tissue for decompression, implant placement, or tumor resection. However, the biological properties of bone—its low stiffness and tendency to deform under load—make it a challenging material to machine. High-speed milling tools, while effective, can easily penetrate too deeply if not carefully controlled, potentially injuring critical structures such as the dura mater, nerves, or blood vessels.
In manual surgery, experienced surgeons rely on a combination of visual cues and haptic feedback—the feel of the tool’s vibration and resistance—to gauge the cutting process. This sensory information allows them to make real-time adjustments to the tool’s position and angle. Replicating this intuitive skill in a robotic system has been a long-standing goal in medical robotics.
Most existing robotic systems for bone milling focus on maintaining a constant depth of cut. They typically use force/torque sensors to detect the reaction forces from the bone and adjust the tool’s position accordingly. While effective in controlled environments, these sensors are often bulky, expensive, and difficult to integrate into compact surgical tools. Moreover, force signals can be noisy and are influenced by factors such as tool wear and bone density variations.
Another common approach involves using preoperative CT or MRI scans to plan a surgical path. Intraoperative navigation systems then guide the robot along this path. However, this method has a critical limitation: bone and surrounding tissues can shift or deform during surgery, rendering the preoperative plan inaccurate. This “brain shift” or “tissue deformation” problem undermines the reliability of image-guided systems in real-time control.
Acoustic signals—sounds generated by the milling process—have also been explored as a feedback source. While easier to collect than force data, sound is highly susceptible to environmental noise and ambient disturbances in the operating room, making it less reliable for precise control.
The Nankai University team recognized that vibration signals, which are the physical origin of both force and sound, offer a more direct and robust alternative. “Vibration is an intrinsic property of the machining process,” explained Yu Dai, one of the senior authors of the study. “It is generated right at the tool-bone interface and can be measured with high fidelity using small, durable sensors. Our goal was to decode this signal to extract not just depth information, but also the tool’s orientation, which is crucial on curved surfaces.”
From Signal to Sensation: Decoding the Vibration Language
The core innovation of the study lies in its ability to extract two distinct pieces of information from a single vibration signal: the depth of cut and the angle between the tool axis and the local bone surface. This is achieved through a sophisticated signal processing pipeline that transforms raw acceleration data into actionable control commands.
The researchers began by establishing a theoretical model of the milling process. They treated the surgical tool as a dynamic system with equivalent mass and stiffness, subject to periodic cutting forces generated by the rotating flutes of the end mill. According to classical vibration theory, the resulting displacement of the tool contains harmonic components at multiples of the spindle rotation frequency. These harmonics carry information about the cutting conditions.
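The article summarizes this model only qualitatively. A minimal Python sketch of the idea treats the tool as a single mass-spring-damper driven by a periodic cutting force; all parameter values below are illustrative stand-ins, not the paper's identified values. The spectrum of the simulated displacement shows energy concentrated at multiples of the 500 Hz spindle frequency, which is the property the control system exploits:

```python
import numpy as np

# Illustrative tool dynamics (NOT the paper's identified parameters).
m, c, k = 0.01, 5.0, 4.0e4      # equivalent mass (kg), damping, stiffness
f_spindle = 500.0               # spindle frequency, Hz (30,000 RPM / 60)
dt = 1e-5                       # simulation step (s)

# Periodic cutting force: each flute engagement modeled as a
# half-wave-rectified sine, which contains the spindle frequency
# and its harmonics.
t = np.arange(0.0, 0.2, dt)
force = np.maximum(0.0, np.sin(2 * np.pi * f_spindle * t))

# Semi-implicit Euler integration of m*x'' + c*x' + k*x = F(t).
x = v = 0.0
disp = np.empty_like(t)
for i, F in enumerate(force):
    v += dt * (F - c * v - k * x) / m
    x += dt * v
    disp[i] = x

# Spectrum of the steady-state displacement: the dominant non-DC
# peak sits at the 500 Hz spindle frequency, with smaller peaks
# at its multiples.
steady = disp[len(disp) // 2:]
spec = np.abs(np.fft.rfft(steady))
freqs = np.fft.rfftfreq(len(steady), d=dt)
peak = freqs[np.argmax(spec * (freqs > 100.0))]   # ignore the DC term
```

The choice of forcing waveform here is a placeholder; any periodic flute-engagement profile yields the same qualitative result of harmonics at multiples of the rotation frequency.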
To capture these vibrations, the team mounted a commercial three-axis digital accelerometer (KX122-1037) near the tip of a 4 mm ball-end milling tool. The sensor continuously recorded acceleration in three orthogonal directions: tangential (along the feed direction), axial (along the tool’s rotation axis), and normal (perpendicular to the feed and axial directions). The data was streamed in real time to a digital signal processor (OMAP-L137), where it underwent Fast Fourier Transform (FFT) analysis to convert the time-domain signal into the frequency domain.
The FFT analysis revealed that the amplitude of the first harmonic (at the spindle frequency of 500 Hz, corresponding to 30,000 RPM) in the tangential acceleration signal correlated strongly with the cutting depth. In a series of controlled experiments on artificial bone plates, the researchers found a near-linear relationship between the harmonic amplitude and the depth, allowing them to create a calibration curve. This meant that by simply measuring the vibration level in the tangential direction, the system could accurately estimate how deep the tool was cutting into the bone.
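In code, that depth estimate could look like the following sketch. The sampling rate and the linear calibration coefficients are assumptions for illustration; the article does not give the paper's actual buffer size, sampling rate, or fitted coefficients:

```python
import numpy as np

FS = 25_600          # accelerometer sampling rate, Hz -- an assumed value
F_SPINDLE = 500.0    # spindle frequency: 30,000 RPM / 60

def first_harmonic_amplitude(window: np.ndarray, fs: float = FS) -> float:
    """Amplitude of the spectral line at the spindle frequency.

    `window` is one buffer of tangential acceleration samples.
    """
    n = len(window)
    w = np.hanning(n)
    spectrum = np.fft.rfft(window * w)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    bin_idx = np.argmin(np.abs(freqs - F_SPINDLE))
    # Normalize by the window sum so a pure sine of amplitude A reads A.
    return 2.0 * np.abs(spectrum[bin_idx]) / w.sum()

def depth_from_amplitude(amp: float, slope: float, intercept: float) -> float:
    """Invert the near-linear calibration amp = slope * depth + intercept."""
    return (amp - intercept) / slope
```

With a buffer length that places 500 Hz on an exact FFT bin (e.g. 1024 samples at 25.6 kHz), the windowed estimate recovers the harmonic amplitude directly, and the calibration inversion turns it into a depth reading.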
However, the team discovered a critical confounding factor: the angle of the tool relative to the bone surface. On a curved bone, this angle is constantly changing as the tool moves. The researchers found that changes in this angle significantly affected the vibration signal, particularly in the axial direction. When the tool was held at a steeper angle, the flutes engaged the bone differently, altering the cutting forces and, consequently, the vibration pattern.
This presented both a challenge and an opportunity. If unaccounted for, changes in tool angle could be misinterpreted by a depth-only control system as changes in depth, leading to errors. But the Nankai team realized they could turn this into a feature. By analyzing the first harmonic amplitude in the axial acceleration signal, they were able to establish a second calibration curve that related this amplitude to the tool-bone angle.
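The dual lookup can be sketched as two independent inversions, one per calibration curve. The slopes and intercepts below are placeholders standing in for fits obtained from the kind of controlled cutting trials described above:

```python
def estimate_depth_and_angle(amp_tangential: float, amp_axial: float,
                             depth_cal=(40.0, 2.0),    # assumed: amp = 40*depth + 2
                             angle_cal=(0.8, 5.0)):    # assumed: amp = 0.8*angle + 5
    """Invert both calibration curves to recover cutting depth (mm) and
    tool-bone angle (degrees) from the two first-harmonic amplitudes."""
    slope_d, icpt_d = depth_cal
    slope_a, icpt_a = angle_cal
    depth = (amp_tangential - icpt_d) / slope_d
    angle = (amp_axial - icpt_a) / slope_a
    return depth, angle
```

In reality the two channels are not fully independent, since the tool angle also perturbs the tangential signal; the point of the fusion controller described next is to drive the angle back to its setpoint so that the depth calibration stays valid.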
“We essentially taught the robot to ‘feel’ two things at once,” said Guangming Xia, another lead author. “It can feel how deep it is cutting from the tangential vibrations, and it can feel the angle of its tool from the axial vibrations. This dual-sense capability is what allows it to handle complex, curved surfaces.”
A Closed-Loop Control System for Real-Time Adaptation
With these two calibration relationships established, the researchers integrated them into a closed-loop control system. The architecture of this system is elegant in its simplicity and robustness. The real-time vibration data is processed to extract the two harmonic amplitudes. These values are then fed into their respective calibration equations to compute the current depth and angle.
These computed values are compared to the surgeon’s desired setpoints—the target depth and the initial tool angle. The differences, or errors, are then used as inputs to a Proportional-Integral-Derivative (PID) controller. The PID controller calculates the necessary corrections to the robot’s motion.
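The article names PID control without giving gains. A generic discrete PID of the kind that could drive each axis looks like the following; the gains, timestep, and toy plant in the usage note are illustrative, not the study's tuned values:

```python
class PID:
    """Discrete PID controller: output = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self._integral = 0.0
        self._prev_error = 0.0

    def update(self, setpoint: float, measured: float) -> float:
        error = setpoint - measured
        self._integral += error * self.dt
        derivative = (error - self._prev_error) / self.dt
        self._prev_error = error
        return (self.kp * error
                + self.ki * self._integral
                + self.kd * derivative)
```

For the depth loop, `setpoint` would be the surgeon's target depth and `measured` the vibration-derived estimate, with the output interpreted as a Z-axis position correction; the angle loop mirrors this on the rotational axis.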
The robot used in the study was a custom-built 4-degree-of-freedom system with three linear axes (X, Y, Z) and one rotational axis (α). The Z-axis controls the depth of cut, while the α-axis controls the tool’s tilt angle. The X and Y axes control the lateral movement along the bone surface.
A key design consideration was the potential for interference between the depth and angle control loops. Adjusting the tool angle to maintain optimal contact could inadvertently change the depth, and vice versa. To mitigate this, the team implemented a staggered control strategy. The depth control loop ran continuously at a high frequency, making fine adjustments to the Z-axis position every 10 milliseconds. The angle control loop, however, ran at a much lower frequency, updating the α-axis position only once every 25 depth control cycles (i.e., every 250 milliseconds).
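That staggered schedule is easy to express in code. In this sketch the two controllers are abstracted as callables; the 10 ms depth period and 1:25 update ratio come from the study, while everything else is illustrative:

```python
DEPTH_PERIOD_S = 0.010   # depth loop: every 10 ms
ANGLE_EVERY_N = 25       # angle loop: every 25th depth cycle (250 ms)

def control_step(cycle: int, depth_error: float, angle_error: float,
                 depth_ctrl, angle_ctrl):
    """One scheduler tick: always correct Z, occasionally correct alpha."""
    dz = depth_ctrl(depth_error)           # fast loop, runs every cycle
    dalpha = 0.0
    if cycle % ANGLE_EVERY_N == 0:         # slow loop, staggered
        dalpha = angle_ctrl(angle_error)
    return dz, dalpha
```

Running the angle correction an order of magnitude slower than the depth correction is what keeps the two loops from fighting: by the time the tilt changes, the fast depth loop has already settled at the new geometry.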
“This decoupling was crucial,” said Jinggang Wang, the first author. “It allowed the depth control to remain stable and responsive, while still enabling the system to make necessary adjustments to the tool angle. It prevented the two control actions from fighting each other.”
Demonstrating Superior Performance on Curved Bone
The true test of the system came in experiments on a femur model with a near-circular cross-section, simulating a real-world curved bone surface. The researchers conducted two sets of experiments: one with only depth control (no angle adjustment), and one with the full depth-angle fusion control.
In the first scenario, the robot initially maintained the target depth of 0.5 mm. However, as the tool progressed along the curved surface, the natural change in geometry caused the tool-bone angle to increase. This change in angle altered the vibration signature, which the depth-only system misinterpreted as an increase in cutting depth. In response, the robot retracted the tool slightly, resulting in a progressively shallower cut. By the end of the 60-second milling pass, the actual depth had decreased significantly, creating an uneven and potentially inadequate surgical result.
In the second scenario, with the full fusion control active, the outcome was dramatically different. The system detected the change in tool angle via the axial vibration signal and commanded the rotational axis to tilt the tool, restoring the optimal angle. This action kept the vibration signature consistent, allowing the depth control loop to maintain the tool at the precise 0.5 mm depth throughout the entire pass. The resulting milling groove was uniform and consistent from start to finish.
Quantitative analysis confirmed the superiority of the fusion control method. Without angle control, the average milling depth was 0.455 mm with a standard deviation of 0.046 mm, indicating significant variation and systematic under-cutting. With angle control, the average depth was 0.499 mm with a standard deviation of only 0.028 mm, closely matching the target and showing much greater consistency. The maximum deviation from the target was reduced from -0.150 mm to -0.078 mm.
The researchers also tested the system at a higher target depth of 0.75 mm and a faster feed rate of 0.5 mm/s. While performance remained good at the higher depth, the increased feed rate led to greater errors, highlighting a trade-off between speed and precision. The system also performed well with a different initial tool angle (60 degrees), demonstrating its robustness to varying starting conditions.
Implications for the Future of Robotic Surgery
The implications of this research extend far beyond the specific application of bone milling. The principle of using high-frequency vibration signals as a rich source of haptic feedback could be applied to a wide range of robotic manipulation tasks, from minimally invasive surgery to industrial automation.
One of the most compelling advantages of this approach is its simplicity and cost-effectiveness. The three-axis accelerometer used in the study is a small, off-the-shelf component that is significantly cheaper and easier to integrate than a six-axis force/torque sensor. This could make advanced robotic surgical capabilities more accessible, particularly in resource-limited settings.
Furthermore, the system’s reliance on direct physical interaction with the material makes it inherently robust to the limitations of preoperative imaging. It doesn’t need a perfect 3D model of the anatomy; it adapts in real time to the actual surface it is encountering. This “reactive” intelligence, grounded in physical sensation, complements the “planned” intelligence derived from imaging.
The work also represents a significant step toward bio-inspired robotics. By mimicking the way human surgeons use haptic feedback, the system embodies a more natural and intuitive form of machine intelligence. “Our goal is not to replace the surgeon,” said Jianxun Zhang, the project lead. “It is to augment their capabilities, to provide them with a tool that has superhuman precision and consistency, while still operating under their ultimate supervision and control.”
The research team is now working on expanding the system to handle more complex bone geometries and varying bone densities. They are also exploring the integration of machine learning algorithms to make the system adaptive, capable of learning the unique vibration signatures of different bone types and surgical tools.
As surgical robotics continues to advance, the line between human skill and machine precision will continue to blur. The work from Nankai University demonstrates that by listening closely to the subtle language of vibration, robots can gain a new sense of touch, one that promises to make surgical procedures safer, more accurate, and ultimately, more effective for patients.
Jinggang Wang, Guangming Xia, Yu Dai, and Jianxun Zhang, Institute of Robotics and Automatic Information System, Nankai University. Published in Robot, DOI: 10.13973/j.cnki.robot.210001