ORGAN: Observation-Guided Radiology Report Generation via Tree Reasoning

Abstract

This paper explores the task of radiology report generation, which aims at generating free-text descriptions for a set of radiographs. One significant challenge of this task is how to correctly maintain the consistency between the images and the lengthy report. Previous research explored solving this issue through planning-based methods, which generate reports only based on high-level plans. However, these plans usually only contain the major observations from the radiographs (e.g., lung opacity), lacking much necessary information, such as the observation characteristics and preliminary clinical diagnoses. To address this problem, the system should also take the image information into account together with the textual plan and perform stronger reasoning during the generation process. In this paper, we propose an Observation-guided radiology Report GenerAtioN framework (ORGAN). It first produces an observation plan and then feeds both the plan and radiographs for report generation, where an observation graph and a tree reasoning mechanism are adopted to precisely enrich the plan information by capturing the multiformats of each observation. Experimental results demonstrate that our framework outperforms previous state-of-the-art methods regarding text quality and clinical efficacy.
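To make the two-stage flow described in the abstract concrete, below is a minimal, runnable Python sketch of the data flow only: a planner proposes high-level observations, an observation graph enriches each one with its other formats (characteristics and preliminary diagnoses), a tree-reasoning step expands those branches, and a generator conditions on both the enriched plan and the image features. All names, the toy lookup table, and the stub logic are illustrative assumptions, not the authors' implementation.

# Illustrative sketch only: mocks the abstract's two-stage pipeline
# (observation plan -> plan-and-image-conditioned report generation).
from dataclasses import dataclass, field

@dataclass
class ObservationNode:
    """One observation with its multiple formats: label, characteristics, diagnoses."""
    label: str
    characteristics: list[str] = field(default_factory=list)
    diagnoses: list[str] = field(default_factory=list)

def plan_observations(image_features: list[float]) -> list[str]:
    # Stage 1: a real system would classify visual features into observations;
    # here we return a fixed high-level plan for illustration.
    return ["lung opacity", "cardiomegaly"]

def enrich_with_graph(plan: list[str]) -> list[ObservationNode]:
    # Observation graph (toy lookup table): link each planned observation
    # to related characteristics and preliminary clinical diagnoses.
    graph = {
        "lung opacity": ObservationNode("lung opacity", ["patchy", "bilateral"], ["pneumonia"]),
        "cardiomegaly": ObservationNode("cardiomegaly", ["mild"], []),
    }
    return [graph[o] for o in plan if o in graph]

def tree_reason(node: ObservationNode) -> str:
    # Tree reasoning: expand an observation into its child formats,
    # keeping only the branches that exist for this node.
    parts = [node.label] + node.characteristics + node.diagnoses
    return ", ".join(parts)

def generate_report(image_features: list[float], nodes: list[ObservationNode]) -> str:
    # Stage 2: condition on both the enriched plan and the image features.
    # A real decoder would attend over both; here we just verbalize the plan.
    return " ".join(f"Findings suggest {tree_reason(n)}." for n in nodes)

features = [0.12, 0.87, 0.45]           # stand-in for encoded radiograph features
plan = plan_observations(features)      # stage 1: high-level observation plan
report = generate_report(features, enrich_with_graph(plan))
print(report)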

Original language: English
Title of host publication: Long Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 8108-8122
Number of pages: 15
ISBN (Electronic): 9781959429722
Publication status: Published - 2023
Event: 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023 - Toronto, Canada
Duration: 9 Jul 2023 - 14 Jul 2023

Publication series

Name: Proceedings of the Annual Meeting of the Association for Computational Linguistics
Volume: 1
ISSN (Print): 0736-587X

Conference

Conference: 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023
Country/Territory: Canada
City: Toronto
Period: 9/07/23 - 14/07/23

ASJC Scopus subject areas

  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics


Cite this


APA:
Hou, W., Xu, K., Cheng, Y., Li, W., & Liu, J. (2023). ORGAN: Observation-Guided Radiology Report Generation via Tree Reasoning. In Long Papers (pp. 8108-8122). (Proceedings of the Annual Meeting of the Association for Computational Linguistics; Vol. 1). Association for Computational Linguistics (ACL).

Author:
Hou, Wenjun ; Xu, Kaishuai ; Cheng, Yi et al. / ORGAN: Observation-Guided Radiology Report Generation via Tree Reasoning. Long Papers. Association for Computational Linguistics (ACL), 2023. pp. 8108-8122 (Proceedings of the Annual Meeting of the Association for Computational Linguistics).

BibTeX:
@inproceedings{f53dd3b19ea94752bcabedcb3f0dc30a,
  title = "ORGAN: Observation-Guided Radiology Report Generation via Tree Reasoning",
  abstract = "This paper explores the task of radiology report generation, which aims at generating free-text descriptions for a set of radiographs. One significant challenge of this task is how to correctly maintain the consistency between the images and the lengthy report. Previous research explored solving this issue through planning-based methods, which generate reports only based on high-level plans. However, these plans usually only contain the major observations from the radiographs (e.g., lung opacity), lacking much necessary information, such as the observation characteristics and preliminary clinical diagnoses. To address this problem, the system should also take the image information into account together with the textual plan and perform stronger reasoning during the generation process. In this paper, we propose an Observation-guided radiology Report GenerAtioN framework (ORGAN). It first produces an observation plan and then feeds both the plan and radiographs for report generation, where an observation graph and a tree reasoning mechanism are adopted to precisely enrich the plan information by capturing the multiformats of each observation. Experimental results demonstrate that our framework outperforms previous state-of-the-art methods regarding text quality and clinical efficacy.",
  author = "Wenjun Hou and Kaishuai Xu and Yi Cheng and Wenjie Li and Jiang Liu",
  note = "Publisher Copyright: {\textcopyright} 2023 Association for Computational Linguistics.; 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023 ; Conference date: 09-07-2023 Through 14-07-2023",
  year = "2023",
  language = "English",
  series = "Proceedings of the Annual Meeting of the Association for Computational Linguistics",
  publisher = "Association for Computational Linguistics (ACL)",
  pages = "8108--8122",
  booktitle = "Long Papers",
  address = "United States",
}

Harvard:
Hou, W, Xu, K, Cheng, Y, Li, W & Liu, J 2023, ORGAN: Observation-Guided Radiology Report Generation via Tree Reasoning. in Long Papers. Proceedings of the Annual Meeting of the Association for Computational Linguistics, vol. 1, Association for Computational Linguistics (ACL), pp. 8108-8122, 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023, Toronto, Canada, 9/07/23.

Standard:
ORGAN: Observation-Guided Radiology Report Generation via Tree Reasoning. / Hou, Wenjun; Xu, Kaishuai; Cheng, Yi et al.
Long Papers. Association for Computational Linguistics (ACL), 2023. p. 8108-8122 (Proceedings of the Annual Meeting of the Association for Computational Linguistics; Vol. 1).

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

RIS:
TY  - GEN
T1  - ORGAN
T2  - 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023
AU  - Hou, Wenjun
AU  - Xu, Kaishuai
AU  - Cheng, Yi
AU  - Li, Wenjie
AU  - Liu, Jiang
N1  - Publisher Copyright: © 2023 Association for Computational Linguistics.
PY  - 2023
Y1  - 2023
N2  - This paper explores the task of radiology report generation, which aims at generating free-text descriptions for a set of radiographs. One significant challenge of this task is how to correctly maintain the consistency between the images and the lengthy report. Previous research explored solving this issue through planning-based methods, which generate reports only based on high-level plans. However, these plans usually only contain the major observations from the radiographs (e.g., lung opacity), lacking much necessary information, such as the observation characteristics and preliminary clinical diagnoses. To address this problem, the system should also take the image information into account together with the textual plan and perform stronger reasoning during the generation process. In this paper, we propose an Observation-guided radiology Report GenerAtioN framework (ORGAN). It first produces an observation plan and then feeds both the plan and radiographs for report generation, where an observation graph and a tree reasoning mechanism are adopted to precisely enrich the plan information by capturing the multiformats of each observation. Experimental results demonstrate that our framework outperforms previous state-of-the-art methods regarding text quality and clinical efficacy.
AB  - This paper explores the task of radiology report generation, which aims at generating free-text descriptions for a set of radiographs. One significant challenge of this task is how to correctly maintain the consistency between the images and the lengthy report. Previous research explored solving this issue through planning-based methods, which generate reports only based on high-level plans. However, these plans usually only contain the major observations from the radiographs (e.g., lung opacity), lacking much necessary information, such as the observation characteristics and preliminary clinical diagnoses. To address this problem, the system should also take the image information into account together with the textual plan and perform stronger reasoning during the generation process. In this paper, we propose an Observation-guided radiology Report GenerAtioN framework (ORGAN). It first produces an observation plan and then feeds both the plan and radiographs for report generation, where an observation graph and a tree reasoning mechanism are adopted to precisely enrich the plan information by capturing the multiformats of each observation. Experimental results demonstrate that our framework outperforms previous state-of-the-art methods regarding text quality and clinical efficacy.
UR  - http://www.scopus.com/inward/record.url?scp=85174397814&partnerID=8YFLogxK
M3  - Conference article published in proceeding or book
AN  - SCOPUS:85174397814
T3  - Proceedings of the Annual Meeting of the Association for Computational Linguistics
SP  - 8108
EP  - 8122
BT  - Long Papers
PB  - Association for Computational Linguistics (ACL)
Y2  - 9 July 2023 through 14 July 2023
ER  -

Vancouver:
Hou W, Xu K, Cheng Y, Li W, Liu J. ORGAN: Observation-Guided Radiology Report Generation via Tree Reasoning. In Long Papers. Association for Computational Linguistics (ACL). 2023. p. 8108-8122. (Proceedings of the Annual Meeting of the Association for Computational Linguistics).

