000274184 001__ 274184
000274184 005__ 20240918113002.0
000274184 0247_ $$2doi$$a10.1016/j.media.2023.102770
000274184 0247_ $$2pmid$$apmid:36889206
000274184 0247_ $$2ISSN$$a1361-8415
000274184 0247_ $$2ISSN$$a1361-8431
000274184 0247_ $$2ISSN$$a1361-8423
000274184 0247_ $$2altmetric$$aaltmetric:143417154
000274184 037__ $$aDKFZ-2023-00479
000274184 041__ $$aEnglish
000274184 082__ $$a610
000274184 1001_ $$aWagner, Martin$$b0
000274184 245__ $$aComparative validation of machine learning algorithms for surgical workflow and skill analysis with the HeiChole benchmark.
000274184 260__ $$aAmsterdam [u.a.]$$bElsevier Science$$c2023
000274184 3367_ $$2DRIVER$$aarticle
000274184 3367_ $$2DataCite$$aOutput Types/Journal article
000274184 3367_ $$0PUB:(DE-HGF)16$$2PUB:(DE-HGF)$$aJournal Article$$bjournal$$mjournal$$s1726651782_8429
000274184 3367_ $$2BibTeX$$aARTICLE
000274184 3367_ $$2ORCID$$aJOURNAL_ARTICLE
000274184 3367_ $$00$$2EndNote$$aJournal Article
000274184 520__ $$aSurgical workflow and skill analysis are key technologies for the next generation of cognitive surgical assistance systems. These systems could increase the safety of the operation through context-sensitive warnings and semi-autonomous robotic assistance or improve training of surgeons via data-driven feedback. In surgical workflow analysis up to 91% average precision has been reported for phase recognition on an open data single-center video dataset. In this work, we investigated the generalizability of phase recognition algorithms in a multicenter setting including more difficult recognition tasks such as surgical action and surgical skill. To achieve this goal, a dataset with 33 laparoscopic cholecystectomy videos from three surgical centers with a total operation time of 22 h was created. Labels included framewise annotation of seven surgical phases with 250 phase transitions, 5514 occurrences of four surgical actions, 6980 occurrences of 21 surgical instruments from seven instrument categories and 495 skill classifications in five skill dimensions. The dataset was used in the 2019 international Endoscopic Vision challenge, sub-challenge for surgical workflow and skill analysis. Here, 12 research teams trained and submitted their machine learning algorithms for recognition of phase, action, instrument and/or skill assessment. F1-scores were achieved for phase recognition between 23.9% and 67.7% (n = 9 teams), for instrument presence detection between 38.5% and 63.8% (n = 8 teams), but for action recognition only between 21.8% and 23.3% (n = 5 teams). The average absolute error for skill assessment was 0.78 (n = 1 team). Surgical workflow and skill analysis are promising technologies to support the surgical team, but there is still room for improvement, as shown by our comparison of machine learning algorithms. This novel HeiChole benchmark can be used for comparable evaluation and validation of future work. In future studies, it is of utmost importance to create more open, high-quality datasets in order to allow the development of artificial intelligence and cognitive robotics in surgery.
000274184 536__ $$0G:(DE-HGF)POF4-315$$a315 - Bildgebung und Radioonkologie (POF4-315)$$cPOF4-315$$fPOF IV$$x0
000274184 588__ $$aDataset connected to CrossRef, PubMed, Journals: inrepo02.dkfz.de
000274184 650_7 $$2Other$$aEndoscopic vision
000274184 650_7 $$2Other$$aLaparoscopic cholecystectomy
000274184 650_7 $$2Other$$aSurgical data science
000274184 650_7 $$2Other$$aSurgical workflow analysis
000274184 7001_ $$aMüller-Stich, Beat-Peter$$b1
000274184 7001_ $$aKisilenko, Anna$$b2
000274184 7001_ $$aTran, Duc$$b3
000274184 7001_ $$aHeger, Patrick$$b4
000274184 7001_ $$aMündermann, Lars$$b5
000274184 7001_ $$aLubotsky, David M$$b6
000274184 7001_ $$aMüller, Benjamin$$b7
000274184 7001_ $$aDavitashvili, Tornike$$b8
000274184 7001_ $$aCapek, Manuela$$b9
000274184 7001_ $$0P:(DE-He78)97e904f47dab556a77c0149cd0002591$$aReinke, Annika$$b10
000274184 7001_ $$0P:(DE-He78)a06ba45fcf672f893e3d0946fe3f3483$$aReid, Carissa$$b11
000274184 7001_ $$aYu, Tong$$b12
000274184 7001_ $$aVardazaryan, Armine$$b13
000274184 7001_ $$aNwoye, Chinedu Innocent$$b14
000274184 7001_ $$aPadoy, Nicolas$$b15
000274184 7001_ $$aLiu, Xinyang$$b16
000274184 7001_ $$aLee, Eung-Joo$$b17
000274184 7001_ $$aDisch, Constantin$$b18
000274184 7001_ $$aMeine, Hans$$b19
000274184 7001_ $$aXia, Tong$$b20
000274184 7001_ $$aJia, Fucang$$b21
000274184 7001_ $$aKondo, Satoshi$$b22
000274184 7001_ $$aReiter, Wolfgang$$b23
000274184 7001_ $$aJin, Yueming$$b24
000274184 7001_ $$aLong, Yonghao$$b25
000274184 7001_ $$aJiang, Meirui$$b26
000274184 7001_ $$aDou, Qi$$b27
000274184 7001_ $$aHeng, Pheng Ann$$b28
000274184 7001_ $$aTwick, Isabell$$b29
000274184 7001_ $$aKirtac, Kadir$$b30
000274184 7001_ $$aHosgor, Enes$$b31
000274184 7001_ $$aBolmgren, Jon Lindström$$b32
000274184 7001_ $$aStenzel, Michael$$b33
000274184 7001_ $$avon Siemens, Björn$$b34
000274184 7001_ $$aZhao, Long$$b35
000274184 7001_ $$aGe, Zhenxiao$$b36
000274184 7001_ $$aSun, Haiming$$b37
000274184 7001_ $$aXie, Di$$b38
000274184 7001_ $$aGuo, Mengqi$$b39
000274184 7001_ $$aLiu, Daochang$$b40
000274184 7001_ $$aKenngott, Hannes G$$b41
000274184 7001_ $$aNickel, Felix$$b42
000274184 7001_ $$aFrankenberg, Moritz von$$b43
000274184 7001_ $$aMathis-Ullrich, Franziska$$b44
000274184 7001_ $$0P:(DE-He78)bb6a7a70f976eb8df1769944bf913596$$aKopp-Schneider, Annette$$b45
000274184 7001_ $$0P:(DE-He78)26a1176cd8450660333a012075050072$$aMaier-Hein, Lena$$b46
000274184 7001_ $$0P:(DE-HGF)0$$aSpeidel, Stefanie$$b47
000274184 7001_ $$0P:(DE-HGF)0$$aBodenstedt, Sebastian$$b48
000274184 773__ $$0PERI:(DE-600)1497450-2$$a10.1016/j.media.2023.102770$$gVol. 86, p. 102770 -$$p102770$$tMedical image analysis$$v86$$x1361-8415$$y2023
000274184 909CO $$ooai:inrepo02.dkfz.de:274184$$pVDB
000274184 9101_ $$0I:(DE-588b)2036810-0$$6P:(DE-He78)97e904f47dab556a77c0149cd0002591$$aDeutsches Krebsforschungszentrum$$b10$$kDKFZ
000274184 9101_ $$0I:(DE-588b)2036810-0$$6P:(DE-He78)a06ba45fcf672f893e3d0946fe3f3483$$aDeutsches Krebsforschungszentrum$$b11$$kDKFZ
000274184 9101_ $$0I:(DE-588b)2036810-0$$6P:(DE-He78)bb6a7a70f976eb8df1769944bf913596$$aDeutsches Krebsforschungszentrum$$b45$$kDKFZ
000274184 9101_ $$0I:(DE-588b)2036810-0$$6P:(DE-He78)26a1176cd8450660333a012075050072$$aDeutsches Krebsforschungszentrum$$b46$$kDKFZ
000274184 9131_ $$0G:(DE-HGF)POF4-315$$1G:(DE-HGF)POF4-310$$2G:(DE-HGF)POF4-300$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$aDE-HGF$$bGesundheit$$lKrebsforschung$$vBildgebung und Radioonkologie$$x0
000274184 9141_ $$y2023
000274184 915__ $$0StatID:(DE-HGF)0113$$2StatID$$aWoS$$bScience Citation Index Expanded$$d2022-11-18
000274184 915__ $$0StatID:(DE-HGF)0160$$2StatID$$aDBCoverage$$bEssential Science Indicators$$d2022-11-18
000274184 915__ $$0StatID:(DE-HGF)0100$$2StatID$$aJCR$$bMED IMAGE ANAL : 2022$$d2023-10-21
000274184 915__ $$0StatID:(DE-HGF)0200$$2StatID$$aDBCoverage$$bSCOPUS$$d2023-10-21
000274184 915__ $$0StatID:(DE-HGF)0300$$2StatID$$aDBCoverage$$bMedline$$d2023-10-21
000274184 915__ $$0StatID:(DE-HGF)0600$$2StatID$$aDBCoverage$$bEbsco Academic Search$$d2023-10-21
000274184 915__ $$0StatID:(DE-HGF)0030$$2StatID$$aPeer Review$$bASC$$d2023-10-21
000274184 915__ $$0StatID:(DE-HGF)0199$$2StatID$$aDBCoverage$$bClarivate Analytics Master Journal List$$d2023-10-21
000274184 915__ $$0StatID:(DE-HGF)0150$$2StatID$$aDBCoverage$$bWeb of Science Core Collection$$d2023-10-21
000274184 915__ $$0StatID:(DE-HGF)1160$$2StatID$$aDBCoverage$$bCurrent Contents - Engineering, Computing and Technology$$d2023-10-21
000274184 915__ $$0StatID:(DE-HGF)9910$$2StatID$$aIF >= 10$$bMED IMAGE ANAL : 2022$$d2023-10-21
000274184 9201_ $$0I:(DE-He78)E130-20160331$$kE130$$lE130 Intelligente Medizinische Systeme$$x0
000274184 9201_ $$0I:(DE-He78)C060-20160331$$kC060$$lC060 Biostatistik$$x1
000274184 980__ $$ajournal
000274184 980__ $$aVDB
000274184 980__ $$aI:(DE-He78)E130-20160331
000274184 980__ $$aI:(DE-He78)C060-20160331
000274184 980__ $$aUNRESTRICTED