StructFormer: Document Structure-based Masked Attention and its Impact on Language Model Pre-Training
cs.CL
Release Date: November 25, 2024
Authors: Kaustubh Ponkshe¹, Venkatapathy Subramanian¹, Natwar Modani², Ganesh Ramakrishnan¹
Affiliations: ¹Indian Institute of Technology Bombay; ²Adobe Research, India

**End-to-end (predicted input)**

| Task | StructFormer Precision | StructFormer Recall | StructFormer F1 | SciREX Baseline Precision | SciREX Baseline Recall | SciREX Baseline F1 |
|---|---|---|---|---|---|---|
| Salient Clusters | 0.2581 | 0.61271 | 0.3419 | 0.2230 | 0.6000 | 0.3070 |
| Binary Relations | 0.0550 | 0.5100 | 0.0890 | 0.0650 | 0.4110 | 0.0960 |
| 4-ary Relations | 0.0019 | 0.2760 | 0.0037 | 0.0070 | 0.1730 | 0.0080 |