PDZSeg: Adapting the Foundation Model for Dissection Zone Segmentation with Visual Prompts in Robot-assisted Endoscopic Submucosal Dissection
cs.CV
cs.AI
Release Date: November 27, 2024
Authors: Mengya Xu1, Wenjin Mo2, Guankun Wang1, Huxin Gao1, An Wang1, Zhen Li3, Xiaoxiao Yang3, Hongliang Ren1
Affiliations: 1. Dept. of Electronic Engineering, CUHK, Hong Kong, China; 2. Dept. of Computer Science and Engineering, Sun Yat-sen University, Guangzhou, China; 3. Dept. of Gastroenterology, Qilu Hospital of Shandong University, Jinan, China.

| Visual prompt | Model | Dissection zone IoU | Dissection zone Dice | No-go zone IoU | No-go zone Dice | Mean IoU | Mean Dice |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Without visual prompt | SETR | 43.91 | 61.02 | 97.93 | 98.95 | 70.92 | 79.99 |
| Without visual prompt | STDC | 41.92 | 59.07 | 97.79 | 98.88 | 69.86 | 78.97 |
| Without visual prompt | Point-Rend | 44.71 | 61.79 | 97.65 | 98.81 | 71.18 | 80.30 |
| Without visual prompt | Fast-SCNN | 41.86 | 59.01 | 97.77 | 98.87 | 69.81 | 78.94 |
| Without visual prompt | DeepLabv3 | 45.95 | 62.97 | 97.90 | 98.94 | 71.93 | 80.95 |
| Without visual prompt | Our DZSeg | 47.55 | 63.01 | 98.05 | 99.04 | 72.80 | 81.02 |
| With point visual prompt | SETR | 50.53 | 67.14 | 98.18 | 99.08 | 74.36 | 83.11 |
| With point visual prompt | STDC | 51.64 | 68.11 | 98.21 | 99.10 | 74.93 | 83.61 |
| With point visual prompt | Point-Rend | 54.49 | 70.54 | 98.25 | 99.12 | 76.37 | 84.83 |
| With point visual prompt | Fast-SCNN | 51.51 | 68.00 | 98.25 | 99.12 | 74.88 | 83.56 |
| With point visual prompt | DeepLabv3 | 55.57 | 71.44 | 98.37 | 99.18 | 76.97 | 85.31 |
| With point visual prompt | SAM-Adapter | 41.43 | 55.44 | / | / | / | / |
| With point visual prompt | Our PDZSeg | 56.37 | 71.43 | 98.46 | 99.22 | 77.41 | 85.32 |
| With bounding box visual prompt | SETR | 64.48 | 78.40 | 98.75 | 99.37 | 81.61 | 88.89 |
| With bounding box visual prompt | STDC | 69.41 | 81.94 | 98.90 | 99.45 | 84.15 | 90.69 |
| With bounding box visual prompt | Point-Rend | 67.87 | 80.86 | 98.80 | 99.40 | 83.33 | 90.13 |
| With bounding box visual prompt | Fast-SCNN | 66.36 | 79.78 | 98.73 | 99.36 | 82.55 | 89.57 |
| With bounding box visual prompt | DeepLabv3 | 69.24 | 81.83 | 98.90 | 99.45 | 84.07 | 90.64 |
| With bounding box visual prompt | Our PDZSeg | 69.85 | 82.00 | 98.97 | 99.48 | 84.41 | 90.73 |
| With short scribble visual prompt | SETR | 62.38 | 76.83 | 98.66 | 99.33 | 80.52 | 88.08 |
| With short scribble visual prompt | STDC | 62.65 | 77.04 | 98.58 | 99.28 | 80.61 | 88.16 |
| With short scribble visual prompt | Point-Rend | 64.42 | 78.36 | 98.74 | 99.37 | 81.58 | 88.86 |
| With short scribble visual prompt | Fast-SCNN | 61.43 | 76.11 | 98.60 | 99.30 | 80.02 | 87.70 |
| With short scribble visual prompt | DeepLabv3 | 64.15 | 78.16 | 98.73 | 99.36 | 81.44 | 88.76 |
| With short scribble visual prompt | Our PDZSeg | 65.45 | 78.41 | 98.83 | 99.41 | 82.14 | 88.91 |
| With long scribble visual prompt | SETR | 70.85 | 82.94 | 98.97 | 99.48 | 84.91 | 91.21 |
| With long scribble visual prompt | STDC | 70.19 | 82.49 | 98.94 | 99.47 | 84.56 | 90.98 |
| With long scribble visual prompt | Point-Rend | 74.73 | 85.54 | 99.17 | 99.58 | 86.95 | 92.56 |
| With long scribble visual prompt | Fast-SCNN | 71.87 | 83.63 | 99.04 | 99.52 | 85.46 | 91.58 |
| With long scribble visual prompt | DeepLabv3 | 72.30 | 83.93 | 99.08 | 99.54 | 85.69 | 91.73 |
| With long scribble visual prompt | Our PDZSeg | 74.06 | 84.30 | 99.15 | 99.56 | 86.60 | 91.93 |
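For reference, the table's Mean IoU and Mean Dice columns are consistent with averaging the per-class scores over the two classes (dissection zone and no-go zone). A minimal sketch of how such per-class IoU and Dice can be computed from binary masks, assuming standard definitions (the function names and mask layout here are illustrative, not the paper's implementation):

```python
import numpy as np

def iou_dice(pred: np.ndarray, gt: np.ndarray):
    """IoU and Dice for one class, given boolean prediction/ground-truth masks."""
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    total = pred.sum() + gt.sum()
    iou = inter / union if union else 1.0          # empty-vs-empty counts as perfect
    dice = 2.0 * inter / total if total else 1.0
    return float(iou), float(dice)

def mean_scores(per_class):
    """Average (IoU, Dice) pairs over classes, as in the Mean IoU / Mean Dice columns."""
    ious, dices = zip(*per_class)
    return sum(ious) / len(ious), sum(dices) / len(dices)
```

For example, a prediction covering two pixels against a one-pixel ground truth gives `iou_dice(...) == (0.5, 2/3)`; averaging the dissection-zone and no-go-zone pairs reproduces the last two columns of the table (e.g. (43.91 + 97.93) / 2 = 70.92 for SETR without a prompt).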