Project src-HALS
Collaborative robotized system for hand-assisted laparoscopic surgery
Funding Organization: Ministerio de Ciencia e Innovación (CICYT)
Reference: DPI2013-47196-C3-1-R
Participants: Universidad de Málaga, Universidad Miguel Hernández, Universidad de Valladolid
Period: From 01/01/2014 to 31/12/2016
Main Researcher: Víctor Fernando Muñoz Martínez
System Abilities
| Group | Ability | Level | Description |
|---|---|---|---|
| Configurability | Mechatronic Configuration | 2 | User run-time configuration. The configuration, both in terms of software operating parameters and mechatronic configuration, can be altered by the user during the cycle of operation. |
| Adaptability | Parameters | 2 | Individual parameter adaptation. The system alters individual parameters in any part of the system based on assessments of performance local to the module on which the parameter operates. |
| | Components | 2 | Adaptation of individual components. The system selects one of several processing components based on online feedback during operation. |
| | Task | 2 | Single task adaptation. A single task performed during the process cycle is adapted over time to optimise a particular metric. This adaptation is achieved by strategic overview of the performance of the system while carrying out the task. Adaptation is the result of accumulated experience. |
| Interaction | Human-Robot | 5 | Task sequence control. The system is able to execute sub-tasks autonomously. On completion of a sub-task, user interaction is required to select the next sub-task, resulting in a sequence of actions that make up a completed task. |
| | Human-Robot Feedback | 2 | Augmented haptic feedback. The system feeds back visual information about the state of the operating environment around the robot based on data captured locally at the robot. The user must interpret this visual imagery to assess the state of the robot or its environment. |
| | Robot-Robot | 1 | Communication of own status. Two or more robots communicate basic status information and task-specific status. Status information is pre-defined for the task. The information communicated only relates to the state of the robot within the task. |
| | Human-Robot Safety | 1 | Basic safety. The robot operates with a basic level of safety appropriate to the task. Maintaining safe operation may depend on the operator being able to stop operation or continuously enable the operating cycle. The maintenance of this level of safety does not depend on software. |
| Dependability | Dependability | 2 | Fails safe. The robot design is such that there are fail-safe mechanisms built into the system that will halt the operation of the robot and place it into a safe mode when failures are detected. This includes any failures caused by in-field updates. Dependability is reduced to the ability to fail safely in a proportion of failure modes. Fail-safe dependability relies on being able to detect failure. |
| Motion | Unconstrained | 4 | Position-constrained path motion. The robot can execute a path motion where the path is constrained by physical objects or by defined zones that must be avoided. The robot is able to execute a path to an unvisited location obeying constraints. |
| | Constrained | 1 | Compliant motion. The robot can execute motions that change in response to external forces applied to the robot such that the force exerted on the external body is controlled. The robot is able to maintain position and path in the absence of any external force. The force acts on the robot only at the intended tool tip, and the environment is static and rigid. |
| Manipulation | Grasping | 1 | Simple pick and place. The robot is able to grasp any object at a known pre-defined location using a single pre-defined grasp action. The robot is then able to move or orient the object and finally un-grasp it. The robot may also use its Motion Ability to move the object in a particular pattern or to a particular location. Grasping uses open-loop control. |
| | Holding | 1 | Simple holding of known object. The robot retains the object as long as no external perturbation of the object occurs. |
| | Handling | 1 | Simple release. The robot is able to release an object at a known pre-defined location, but the resulting orientation of the object is unknown. The object should not be prematurely released. |
| Cognitive | Action | 3 | Sense-driven action. The robot is able to modulate its action in proportion to parameters derived from its perceptions. The perceptions are used to drive the selection of pre-defined actions or the parameters of pre-defined actions. |
| | Envisioning | 2 | Dynamic motion prediction. The robot is able to project the effect of its motion to predict short-term interactions with both static and dynamic objects in the environment that the system can detect. |
| | Human Interaction | 2 | Task context interaction. The system is able to interpret commands from the user that utilise task context semantics within a domain-specific communication framework appropriate to the range of the task. The system is able to relay task status to the user using task context semantics suitable for the task. |
Abstract
Hand-assisted laparoscopic surgery (HALS) is a new intermediate scenario between conventional laparoscopy and laparotomy, in which the surgeon inserts one hand into the abdominal cavity for organ manipulation while handling the conventional minimally invasive surgical tools with the other. This is a useful approach in complex situations where conventional laparoscopic surgery cannot be performed, and it has been demonstrated that HALS does not imply longer recovery times for patients.
The aim of the present proposal is to develop a robotic system for HALS procedures based on the concept of the co-worker robot, in which the manipulator works side by side with the surgeon, collaborating during the surgical maneuvers and learning from practice. Experience is thus expected to improve the robot's assistance and to make the system able to detect emergency situations. To that end, the system will be composed of a manipulator able to handle both an articulated laparoscopic tool and an endoscope, and another arm specialized in the motion control of miniature robots inside the abdominal cavity. In addition, the system will include a human-machine interface based on a smart surgical glove, and it will emulate the concept of a "transparent abdomen" by combining real images with augmented reality. The system is expected to recognize the current stage of the surgical workflow by means of the surgeon's hand gestures, the motions of the surgical instrument, or physiological signals combined with a patient-procedure model. With this information, the robotic arms will be able to collaborate with the surgeon, assisting with the articulated tool and placing the endoscope and the miniature robots in suitable locations to provide a complete and appropriate view of the surgical field. Additionally, the "transparent abdomen" system will provide the surgeon with information of interest. All of these elements will be integrated into a cognitive architecture, which will include a supervisor system to watch over the system's integrity, the performance of the surgical intervention, and the safety of the patient.
Hence, this project will tackle techniques and methodologies for the performance of collaborative movements with surgical instruments, taking into account both position and force control to assist in maneuvers such as knotting, clipping, or cauterization. Secondly, the identification of HALS gestures will be addressed, which, together with the integration of physiological signals, will help to identify the current stage of the surgical workflow. Moreover, methodologies to emulate the "transparent abdomen" concept will be developed by means of personalized virtual models of the patient, a vision system composed of an endoscope and miniature camera robots, and the information provided by the smart surgical glove. Finally, all these developments will be used to establish a cognitive architecture oriented to the robot co-worker concept. A set of in-vitro experiments will be performed in order to verify the work developed during the project.
Global solution
Figure 1 represents a possible configuration of the collaborative robotic system for hand-assisted surgery. The system assists the surgeon by emulating her/his left hand, providing an interface based on the "transparent abdomen" concept, and monitoring the patient through physiological signals. The assistant robot has two robotic arms. One of them handles an articulated surgical instrument and assists with automatic movements in tasks that require two hands, such as blood-vessel knotting or suturing. The other arm handles a set of miniature camera robots along the abdominal wall.
Figure 1
The system incorporates knowledge of the procedure protocol, the surgeon's gestures, and a patient-intervention model obtained through physiological signals. The hand the surgeon inserts into the abdominal cavity wears a smart glove fitted with a set of sensors to locate the fingertips, as well as other micro-devices. With this information, the robotic system is able to predict the current state of the intervention and the surgical maneuver in order to decide how to assist the surgeon. The assistance also includes augmented reality, using a patient model and data from the smart glove, with the aim of providing a global view of the operative field as if it were open surgery.
Using a cognitive approach, the robotic system learns how to improve its behavior and is able to identify unexpected or emergency situations. The cognitive architecture employed is depicted in Figure 2. The nucleus of the system is the module called the co-worker robot with augmented reality capabilities, which is in charge of deciding the actuation of the system depending on the current state of the intervention and the knowledge learnt in previous trials. The detection of an emergency may change the actuation of the system depending on the situation detected. The long-term memory includes the patient-intervention model, the virtual model, and the surgical gestures.
Figure 2
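As a rough illustration only, the following Python sketch shows how such a decision cycle could be organized, with emergency detection taking precedence over the stage-dependent assistance action. The stage names, function bodies, and action labels are hypothetical placeholders, not the project's actual implementation.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Stage(Enum):
    """Hypothetical stages of a simplified HALS workflow."""
    EXPLORATION = auto()
    DISSECTION = auto()
    KNOTTING = auto()
    EXTRACTION = auto()

@dataclass
class LongTermMemory:
    """Placeholders for the three stores named in the text."""
    patient_intervention_model: dict = field(default_factory=dict)
    virtual_model: dict = field(default_factory=dict)
    surgical_gestures: dict = field(default_factory=dict)

def detect_emergency(biosignals, endoscope_frame):
    """Placeholder for the bleeding/emergency detector."""
    return None

def estimate_stage(glove_data, tool_motion, biosignals, memory):
    """Placeholder: fuse glove, tool, and biosignal data with the
    learned models to estimate the current intervention stage."""
    return Stage.KNOTTING

def coworker_cycle(glove_data, tool_motion, biosignals, endoscope_frame, memory):
    """One decision cycle: emergency handling overrides the
    stage-dependent assistance action chosen for the robotic arms."""
    emergency = detect_emergency(biosignals, endoscope_frame)
    if emergency is not None:
        return ("handle_emergency", emergency)
    stage = estimate_stage(glove_data, tool_motion, biosignals, memory)
    actions = {
        Stage.EXPLORATION: "reposition_camera_robot",
        Stage.DISSECTION: "hold_tissue_with_articulated_tool",
        Stage.KNOTTING: "assist_knotting_with_articulated_tool",
        Stage.EXTRACTION: "present_extraction_view",
    }
    return (actions[stage], stage)
```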
Proposed goals
1. Development of collaborative strategies with the surgeon based on the use of a smart surgical glove
This objective involves a complete collaborative system in which the robot acts depending on the current state of the intervention, using data from a smart surgical glove fused with other surgical data.
2. Design and development of an augmented reality system to simulate a transparent abdomen
This objective tackles the design of a complete transparent-abdomen system using patient-specific data obtained during the preoperative or intraoperative phases. The system will include augmented reality to display information from points of view not accessible to the camera.
3. Design and development of a system for detection and resolution of emergency situations
This objective involves a system able to foresee emergency situations based on data from physiological signals and endoscopic images.
4. Integration and evaluation of the efficacy of the complete system through a set of in-vitro experiments
This objective consists of integrating all the technologies developed during the project into a set of experiments. In this way, the advantages of the system versus traditional surgery will be analyzed.
Achievements
Architecture definition and design of experiments
- Definition of a generic HALS scenario in order to identify the requirements of the robotic system regarding the motion of the surgical instruments and the point of view provided by the miniature camera robot.
- Identification and design of the architecture components and the communications among them. For this purpose, the SOAR cognitive architecture has been used.
- Definition of three test scenarios:
- Transparent abdomen concept
- Surgeon gesture identification and collaborative movements.
- Emergency situations detection through biosignals and intraoperative images.
Long-term memory definition
Universidad de Málaga:
- Set-up definition for the use of the miniature camera robot in a collaborative scenario.
- In-vitro set-up to provide the appropriate point of view with the miniature camera robot depending on the current state of a simulated intervention composed of a set of maneuvers.
- On-line learning to improve the behavior of the camera robot for each particular surgeon.
These data have been used to define the encoding of the procedural, semantic, and episodic memories. A learning rule to adapt the behavior of the robot to each surgeon's preferences has also been established. Finally, all of this has been integrated into the control architecture, and a set of experiments has been performed in order to validate the work.
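As an illustration of what such a learning rule might look like (the project's actual rule is not reproduced here), the sketch below adapts a per-surgeon preferred camera viewpoint with an exponential moving average of the surgeon's corrections; the data layout, learning rate, and pan/tilt parametrization are all assumptions.

```python
from collections import defaultdict

ALPHA = 0.2  # assumed learning rate

# Episodic-style store: (surgeon, maneuver) -> preferred camera pan/tilt offset.
preferred_view = defaultdict(lambda: {"pan": 0.0, "tilt": 0.0})

def update_preference(surgeon, maneuver, commanded_view, corrected_view):
    """Move the stored preference toward the viewpoint the surgeon
    manually corrected to, one exponential-moving-average step at a time."""
    pref = preferred_view[(surgeon, maneuver)]
    for axis in ("pan", "tilt"):
        error = corrected_view[axis] - commanded_view[axis]
        pref[axis] += ALPHA * error

# Example: during knotting, the surgeon nudged the camera 5 degrees in pan.
update_preference("surgeon_A", "knotting",
                  {"pan": 10.0, "tilt": -5.0}, {"pan": 15.0, "tilt": -5.0})
print(preferred_view[("surgeon_A", "knotting")])  # {'pan': 1.0, 'tilt': 0.0}
```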
Movements and interaction forces planner for robotized surgical instruments
Universidad de Málaga:
- Motorization of a commercial surgical instrument in order to actuate the grasper and its orientation. Both the electronics and the software have been developed.
- Definition of the motion primitives of the robotic arm that handles the instrument. The control strategy is based on a parallel force/position scheme that uses a force/torque sensor located between the manipulator's wrist and the motorized surgical instrument (a minimal sketch follows this list). Additionally, a control strategy has been developed to predict interaction with soft tissue.
- Development of learning-by-demonstration strategies in collaboration with the European Space Agency.
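The following is a minimal one-axis sketch of a parallel force/position law of the kind named above, assuming a PI force loop whose output displaces the position reference; the gains and interface are illustrative, not the project's tuned controller.

```python
class ParallelForcePositionController:
    """Minimal 1-DOF sketch of a parallel force/position law: an inner
    position loop tracks the reference trajectory while a PI force loop,
    fed by the wrist force/torque sensor, offsets that reference along
    the contact direction until the measured force reaches its setpoint."""

    def __init__(self, kp=50.0, kf=0.002, kfi=0.01, dt=0.001):
        self.kp, self.kf, self.kfi, self.dt = kp, kf, kfi, dt
        self.force_integral = 0.0

    def step(self, x, x_ref, f_measured, f_ref):
        # PI force loop: its output displaces the position reference, so
        # the force objective dominates along the constrained axis (the
        # defining property of the parallel force/position scheme).
        f_error = f_ref - f_measured
        self.force_integral += f_error * self.dt
        x_cmd = x_ref + self.kf * f_error + self.kfi * self.force_integral
        # Proportional position loop producing a velocity command.
        return self.kp * (x_cmd - x)

# Example: hold position 0.10 m while raising contact force from 1.5 N to 2 N.
ctrl = ParallelForcePositionController()
v_cmd = ctrl.step(x=0.10, x_ref=0.10, f_measured=1.5, f_ref=2.0)
```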
Personalized patient's virtual model generation
Universidad Miguel Hernández:
- Generation of a new patient virtual model based on the one developed in the previous project. This new model includes fluid dynamics to simulate bleeding, as well as the miniature robots developed in the project.
- Integration of data from the cameras in the virtual model.
- Development of a physical set-up for testing surgical robots. This set-up incorporates deformable bodies, flexible tubes with a pulsating liquid, and electronic signals that simulate an actual patient.
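As a toy illustration of the kind of drive signal such a pulsating-liquid circuit might use, the sketch below generates an arterial-like pressure waveform; the waveform shape and all parameters are assumptions, not the bench's actual electronics.

```python
import math

def pulsatile_pressure(t, heart_rate_bpm=70, systolic=120.0, diastolic=80.0):
    """Toy arterial-like pressure waveform (mmHg): a decaying sinusoidal
    pulse during the first ~30% of each beat, diastolic pressure otherwise."""
    beat = (t * heart_rate_bpm / 60.0) % 1.0  # phase within the current beat
    pulse = math.exp(-5.0 * beat) * math.sin(math.pi * min(beat / 0.3, 1.0))
    return diastolic + (systolic - diastolic) * max(pulse, 0.0)

# Example: sample one second of the waveform at 10 Hz.
samples = [round(pulsatile_pressure(t / 10.0), 1) for t in range(10)]
```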
Operative field supervision and emergency situation detection
Universidad Miguel Hernández and Universidad de Valladolid:
- Detection of unexpected bleeding inside the abdominal cavity based on image analysis (an illustrative sketch follows this list).
- Detection of bandages, based on image-analysis strategies, to guarantee their extraction at the end of the surgical procedure.
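As a hedged illustration of image-based bleeding detection (not the project's published method), the sketch below thresholds strongly red, saturated regions of an endoscopic frame with OpenCV 4 and raises a flag when their area exceeds an assumed limit; all thresholds are illustrative.

```python
import cv2
import numpy as np

def bleeding_alarm(frame_bgr, red_hue_margin=10, min_area=500):
    """Segment strongly red, saturated pixels of an endoscopic frame and
    raise an alarm when their total area exceeds an assumed limit."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around hue 0 in HSV, so combine both ends of the hue range.
    low = cv2.inRange(hsv, (0, 120, 60), (red_hue_margin, 255, 255))
    high = cv2.inRange(hsv, (180 - red_hue_margin, 120, 60), (180, 255, 255))
    # Remove speckle noise before measuring the segmented area.
    mask = cv2.morphologyEx(low | high, cv2.MORPH_OPEN,
                            np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    area = sum(cv2.contourArea(c) for c in contours)
    return mask, area > min_area  # (segmentation mask, alarm flag)
```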
Augmented reality for simulating the transparent abdomen concept
Universidad de Málaga:
- Intraoperative vision system based on miniature camera and lighting robots. Design of a miniature camera robot with two degrees of freedom to cover the intraoperative field without displacing the mini-robot.
- External magnetic handle to attach the miniature robots to the abdominal wall.
- Cable-driven actuation system to control the robot's degrees of freedom (see the sketch after this list).
- In-vivo experiments to test the design of the miniature robots during a nephrectomy intervention.
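A minimal sketch of a cable-driven actuation mapping, assuming an antagonistic cable pair per axis wound on a pulley of assumed radius; the real robot's geometry is not reproduced here.

```python
import math

PULLEY_RADIUS_M = 0.004  # assumed 4 mm pulley radius

def cable_displacements(pan_deg, tilt_deg, radius=PULLEY_RADIUS_M):
    """Return per-cable length changes (metres) for commanded pan/tilt
    angles: each joint rotation maps to equal and opposite displacements
    of its antagonistic cable pair."""
    dl_pan = radius * math.radians(pan_deg)
    dl_tilt = radius * math.radians(tilt_deg)
    return {
        "pan_pull": +dl_pan, "pan_release": -dl_pan,
        "tilt_pull": +dl_tilt, "tilt_release": -dl_tilt,
    }

print(cable_displacements(30.0, -15.0))
```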
Universidad Miguel Hernández:
- Development of the electronics for biopatches to allow NFC wireless communication.
- Preliminary work with virtual reality libraries to develop a system that integrates the information from the cameras into the virtual model (an illustrative composition step is sketched below).
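As an illustrative sketch of the composition step such a system might use, assuming the endoscopic image is already registered and resized to match the virtual render, the two views can simply be alpha-blended:

```python
import cv2

def transparent_abdomen_view(virtual_render_bgr, endoscope_bgr, alpha=0.6):
    """Blend the live endoscopic image over the rendered virtual model so
    the internal anatomy appears 'through' the abdomen. `alpha` weights the
    live image; both inputs must share the same shape."""
    return cv2.addWeighted(endoscope_bgr, alpha, virtual_render_bgr, 1 - alpha, 0)
```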
Surgeon finger position and orientation computation
Universidad de Valladolid:
- Real-time computation of the position and orientation of the surgeon's fingers from smart-glove data on the degree of bending of each phalanx and metacarpal (a forward-kinematics sketch follows this list).
- Hand volume generation to avoid collisions between the surgeon's hand and the miniature robots.
- Hand volume generation to avoid collisions with other components of the system.
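A minimal planar forward-kinematics sketch of how a fingertip position could be recovered from the glove's bend readings; the phalanx lengths and single-plane assumption are illustrative, not the project's calibrated hand model.

```python
import math

# Assumed segment lengths (metres): proximal, middle, distal phalanges.
PHALANX_LENGTHS = [0.045, 0.025, 0.020]

def fingertip_position(bend_angles_deg, lengths=PHALANX_LENGTHS):
    """Accumulate the bend angle of each phalanx along the kinematic chain
    to locate the fingertip in a hand-fixed plane."""
    x = y = 0.0
    cumulative = 0.0
    for bend, length in zip(bend_angles_deg, lengths):
        cumulative += math.radians(bend)
        x += length * math.cos(cumulative)
        y += length * math.sin(cumulative)
    return x, y

print(fingertip_position([20.0, 35.0, 25.0]))  # a moderately flexed finger
```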
Surgical sensors integration and control in the smart glove
Universidad de Valladolid:
- Integration of a pressure sensor in the smart glove to recover the sense of touch lost due to the pressure on the surgeon's arm that is typical of HALS procedures.
- Preliminary work on the integration of a temperature sensor in the smart glove in order to determine the temperature during cauterization.
Events detection through physiological variables evolution
Universidad de Valladolid:
- Study of monitoring techniques in HALS procedures to determine which physiological variables to study in order to generate a model. The model will be generated from the evolution of the physiological variables obtained from the operating-room equipment as well as from the smart-glove sensors (a minimal detector sketch follows this list).
- Preliminary work to integrate the model with the bleeding detection system.
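As a hedged sketch of what such an event detector could look like (the project's validated patient-intervention model is not reproduced), the code below flags samples that deviate from a sliding-window baseline by more than an assumed z-score threshold.

```python
from collections import deque
import statistics

class PhysiologicalEventDetector:
    """Keep a sliding window per variable and raise an event when a new
    sample departs from the window baseline by more than `z_limit`
    standard deviations. Thresholds and variable choices are assumptions."""

    def __init__(self, window=60, z_limit=3.0):
        self.window, self.z_limit = window, z_limit
        self.history = {}

    def update(self, variable, value):
        buf = self.history.setdefault(variable, deque(maxlen=self.window))
        event = False
        if len(buf) >= 10:  # need a minimal baseline before flagging
            mean = statistics.fmean(buf)
            std = statistics.pstdev(buf) or 1e-9
            event = abs(value - mean) / std > self.z_limit
        buf.append(value)
        return event

# Example: a sudden jump in heart rate triggers the alarm.
det = PhysiologicalEventDetector()
for hr in [72, 73, 71, 74, 72, 73, 72, 71, 73, 72, 110]:
    alarm = det.update("heart_rate", hr)
print("alarm:", alarm)
```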
Main Results
- src-HALS: Demonstration Platform
- Cognitive Camera Robotic Assistant