Abstract
This paper explores the task of radiology report generation, which aims at generating free-text descriptions for a set of radiographs. One significant challenge of this task is how to maintain consistency between the images and the lengthy report. Previous research explored solving this issue through planning-based methods, which generate reports based only on high-level plans. However, these plans usually contain only the major observations from the radiographs (e.g., lung opacity), lacking much necessary information, such as the observation characteristics and preliminary clinical diagnoses. To address this problem, the system should also take the image information into account together with the textual plan and perform stronger reasoning during the generation process. In this paper, we propose an Observation-guided radiology Report GenerAtioN framework (ORGAN). It first produces an observation plan and then feeds both the plan and radiographs for report generation, where an observation graph and a tree reasoning mechanism are adopted to precisely enrich the plan information by capturing the multi-formats of each observation. Experimental results demonstrate that our framework outperforms previous state-of-the-art methods regarding text quality and clinical efficacy.
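The two-stage pipeline the abstract describes (first predict an observation plan, then generate the report conditioned on both the plan and the images) can be sketched in outline. The following is a hypothetical toy illustration only, not the authors' released code: every function, threshold, and observation name here is invented, and the rule-based stand-ins merely mimic the roles of the learned planner, the observation graph/tree-reasoning enrichment, and the generator.

```python
# Toy sketch of an observation-guided, two-stage report pipeline.
# Stage 1 predicts an observation plan from image features;
# Stage 2 generates the report conditioned on the enriched plan.
# All components are simple rule-based stand-ins, purely illustrative.

def predict_observation_plan(image_features):
    """Stage 1 stand-in: map image features to high-level observations."""
    plan = []
    if image_features.get("opacity_score", 0.0) > 0.5:
        plan.append("lung opacity")
    if image_features.get("cardiac_ratio", 0.0) > 0.55:
        plan.append("enlarged cardiac silhouette")
    return plan or ["no finding"]

def enrich_plan(plan):
    """Stand-in for the observation-graph / tree-reasoning step:
    attach multiple textual formats to each planned observation."""
    formats = {
        "lung opacity": ["opacity", "increased density"],
        "enlarged cardiac silhouette": ["cardiomegaly", "enlarged heart"],
        "no finding": ["clear lungs"],
    }
    return {obs: formats.get(obs, [obs]) for obs in plan}

def generate_report(image_features, enriched_plan):
    """Stage 2 stand-in: produce report text from the enriched plan."""
    sentences = [
        f"The study shows {obs} ({', '.join(variants)})."
        for obs, variants in enriched_plan.items()
    ]
    return " ".join(sentences)

features = {"opacity_score": 0.8, "cardiac_ratio": 0.4}
plan = predict_observation_plan(features)
report = generate_report(features, enrich_plan(plan))
```

The point of the sketch is the data flow: the plan constrains what the generator talks about, while the enrichment step supplies the finer-grained observation variants that a bare plan would lack.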
| Original language | English |
|---|---|
| Title of host publication | Long Papers |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 8108-8122 |
| Number of pages | 15 |
| ISBN (Electronic) | 9781959429722 |
| Publication status | Published - 2023 |
| Event | 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023 - Toronto, Canada |
| Duration | 9 Jul 2023 → 14 Jul 2023 |
Publication series
| Name | Proceedings of the Annual Meeting of the Association for Computational Linguistics |
|---|---|
| Volume | 1 |
| ISSN (Print) | 0736-587X |
Conference
| Conference | 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023 |
|---|---|
| Country/Territory | Canada |
| City | Toronto |
| Period | 9/07/23 → 14/07/23 |
ASJC Scopus subject areas
- Computer Science Applications
- Linguistics and Language
- Language and Linguistics
Cite this
- APA
- Author
- BIBTEX
- Harvard
- Standard
- RIS
- Vancouver
Hou, W., Xu, K., Cheng, Y., Li, W., & Liu, J. (2023). ORGAN: Observation-Guided Radiology Report Generation via Tree Reasoning. In Long Papers (pp. 8108-8122). (Proceedings of the Annual Meeting of the Association for Computational Linguistics; Vol. 1). Association for Computational Linguistics (ACL).
Hou, Wenjun; Xu, Kaishuai; Cheng, Yi et al. / ORGAN: Observation-Guided Radiology Report Generation via Tree Reasoning. Long Papers. Association for Computational Linguistics (ACL), 2023. pp. 8108-8122 (Proceedings of the Annual Meeting of the Association for Computational Linguistics).
@inproceedings{f53dd3b19ea94752bcabedcb3f0dc30a,
title = "ORGAN: Observation-Guided Radiology Report Generation via Tree Reasoning",
abstract = "This paper explores the task of radiology report generation, which aims at generating free-text descriptions for a set of radiographs. One significant challenge of this task is how to correctly maintain the consistency between the images and the lengthy report. Previous research explored solving this issue through planning-based methods, which generate reports only based on high-level plans. However, these plans usually only contain the major observations from the radiographs (e.g., lung opacity), lacking much necessary information, such as the observation characteristics and preliminary clinical diagnoses. To address this problem, the system should also take the image information into account together with the textual plan and perform stronger reasoning during the generation process. In this paper, we propose an Observation-guided radiology Report GenerAtioN framework (ORGAN). It first produces an observation plan and then feeds both the plan and radiographs for report generation, where an observation graph and a tree reasoning mechanism are adopted to precisely enrich the plan information by capturing the multiformats of each observation. Experimental results demonstrate that our framework outperforms previous state-of-the-art methods regarding text quality and clinical efficacy.",
author = "Wenjun Hou and Kaishuai Xu and Yi Cheng and Wenjie Li and Jiang Liu",
note = "Publisher Copyright: {\textcopyright} 2023 Association for Computational Linguistics.; 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023 ; Conference date: 09-07-2023 Through 14-07-2023",
year = "2023",
language = "English",
series = "Proceedings of the Annual Meeting of the Association for Computational Linguistics",
publisher = "Association for Computational Linguistics (ACL)",
pages = "8108--8122",
booktitle = "Long Papers",
address = "United States",
}
Hou, W, Xu, K, Cheng, Y, Li, W & Liu, J 2023, ORGAN: Observation-Guided Radiology Report Generation via Tree Reasoning. in Long Papers. Proceedings of the Annual Meeting of the Association for Computational Linguistics, vol. 1, Association for Computational Linguistics (ACL), pp. 8108-8122, 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023, Toronto, Canada, 9/07/23.
ORGAN: Observation-Guided Radiology Report Generation via Tree Reasoning. / Hou, Wenjun; Xu, Kaishuai; Cheng, Yi et al.
Long Papers. Association for Computational Linguistics (ACL), 2023. p. 8108-8122 (Proceedings of the Annual Meeting of the Association for Computational Linguistics; Vol. 1).
Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review
TY - GEN
T1 - ORGAN
T2 - 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023
AU - Hou, Wenjun
AU - Xu, Kaishuai
AU - Cheng, Yi
AU - Li, Wenjie
AU - Liu, Jiang
N1 - Publisher Copyright: © 2023 Association for Computational Linguistics.
PY - 2023
Y1 - 2023
N2 - This paper explores the task of radiology report generation, which aims at generating free-text descriptions for a set of radiographs. One significant challenge of this task is how to correctly maintain the consistency between the images and the lengthy report. Previous research explored solving this issue through planning-based methods, which generate reports only based on high-level plans. However, these plans usually only contain the major observations from the radiographs (e.g., lung opacity), lacking much necessary information, such as the observation characteristics and preliminary clinical diagnoses. To address this problem, the system should also take the image information into account together with the textual plan and perform stronger reasoning during the generation process. In this paper, we propose an Observation-guided radiology Report GenerAtioN framework (ORGAN). It first produces an observation plan and then feeds both the plan and radiographs for report generation, where an observation graph and a tree reasoning mechanism are adopted to precisely enrich the plan information by capturing the multiformats of each observation. Experimental results demonstrate that our framework outperforms previous state-of-the-art methods regarding text quality and clinical efficacy.
AB - This paper explores the task of radiology report generation, which aims at generating free-text descriptions for a set of radiographs. One significant challenge of this task is how to correctly maintain the consistency between the images and the lengthy report. Previous research explored solving this issue through planning-based methods, which generate reports only based on high-level plans. However, these plans usually only contain the major observations from the radiographs (e.g., lung opacity), lacking much necessary information, such as the observation characteristics and preliminary clinical diagnoses. To address this problem, the system should also take the image information into account together with the textual plan and perform stronger reasoning during the generation process. In this paper, we propose an Observation-guided radiology Report GenerAtioN framework (ORGAN). It first produces an observation plan and then feeds both the plan and radiographs for report generation, where an observation graph and a tree reasoning mechanism are adopted to precisely enrich the plan information by capturing the multiformats of each observation. Experimental results demonstrate that our framework outperforms previous state-of-the-art methods regarding text quality and clinical efficacy.
UR - http://www.scopus.com/inward/record.url?scp=85174397814&partnerID=8YFLogxK
M3 - Conference article published in proceeding or book
AN - SCOPUS:85174397814
T3 - Proceedings of the Annual Meeting of the Association for Computational Linguistics
SP - 8108
EP - 8122
BT - Long Papers
PB - Association for Computational Linguistics (ACL)
Y2 - 9 July 2023 through 14 July 2023
ER -
Hou W, Xu K, Cheng Y, Li W, Liu J. ORGAN: Observation-Guided Radiology Report Generation via Tree Reasoning. In Long Papers. Association for Computational Linguistics (ACL). 2023. p. 8108-8122. (Proceedings of the Annual Meeting of the Association for Computational Linguistics).