Autonomous Suturing in Robotic Surgery Using Reinforcement Learning and 3D Visual Feedback
DOI: https://doi.org/10.52783/jns.v14.2754
Keywords: Autonomous Suturing, Robotic Surgery, 3D Vision, AI Surgery, Needle Control, Tissue Modeling, Surgical AI, Real-Time Feedback, Suture Accuracy, Motion Planning, Medical Robotics
Abstract
Autonomous suturing is a critical advancement in robotic-assisted surgery, offering the potential to enhance surgical precision, reduce surgeon workload, and improve patient outcomes. Traditional robotic-assisted suturing relies on human teleoperation, which can introduce variability and fatigue-related errors. This paper explores the integration of reinforcement learning (RL) and 3D visual feedback to develop a fully autonomous robotic suturing system. The proposed framework consists of a robotic arm equipped with a needle driver, a 3D stereo vision system for real-time depth perception, and a deep RL model optimized for suturing tasks. Our approach trains an RL agent in a simulated environment, where it learns optimal suturing strategies through trial-and-error interaction. The RL model accounts for needle trajectory, suture tension, and tissue deformation while maximizing placement accuracy and minimizing tissue damage. The 3D vision module provides high-resolution depth maps to guide the robot in real time, enabling precise needle insertion and suture placement. The system is validated on synthetic tissue models, demonstrating superior performance in terms of precision, suture uniformity, and adaptability to tissue variations. Experimental results indicate that our RL-based approach outperforms traditional teleoperated suturing, achieving higher accuracy and lower variability. Despite challenges such as real-time computation constraints and dynamic tissue behavior, this research demonstrates the feasibility of autonomous robotic suturing. Future work will focus on enhancing real-time adaptability, optimizing computational efficiency, and extending the system to a broader range of surgical procedures. This study represents a significant step toward fully autonomous robotic surgery.
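The abstract states that the RL model weighs needle trajectory, suture tension, and tissue deformation while maximizing accuracy and minimizing tissue damage. A shaped reward of that form might be sketched as follows — this is a minimal illustrative example, not the paper's actual reward; the function name, weights, and nominal-tension parameter are all assumptions for illustration.

```python
import numpy as np

def suturing_reward(needle_tip, target, tension, deformation,
                    target_tension=0.5, w_acc=1.0, w_tension=0.5, w_deform=0.5):
    """Hypothetical per-step reward for a suturing RL agent.

    Rewards accurate needle placement; penalizes deviation from a nominal
    suture tension and penalizes tissue deformation (a proxy for damage).
    All names and weights are illustrative assumptions.
    """
    # Accuracy term: negative Euclidean distance from needle tip to the
    # desired insertion point (e.g., in millimetres from the depth map).
    accuracy = -np.linalg.norm(np.asarray(needle_tip, dtype=float)
                               - np.asarray(target, dtype=float))
    # Tension term: squared deviation from a nominal tension, so both
    # slack and over-tight sutures are penalized.
    tension_penalty = (tension - target_tension) ** 2
    # Deformation term: squared scalar deformation measure from the
    # tissue model, penalizing tissue damage.
    deform_penalty = deformation ** 2
    return (w_acc * accuracy
            - w_tension * tension_penalty
            - w_deform * deform_penalty)

# A perfect placement at nominal tension with no deformation scores 0;
# any error in placement, tension, or deformation lowers the reward.
```

A reward in this shape lets standard deep RL algorithms trade off placement accuracy against tissue stress via the weights, which is consistent with (though not identical to) the trade-offs the abstract describes.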
License

This work is licensed under a Creative Commons Attribution 4.0 International License.