• General Department of the National Medical Products Administration; General Office of the National Health Commission

Hallucination Suppression Methods for Discharge Summary Generation Based on Large Model Fine-Tuning

Corresponding author: Li Yinchi, lyc@rjh.com.cn
DOI: 10.12201/bmr.202503.00005
Statement: This article is a preprint and has not been peer-reviewed. It reports new research that has yet to be evaluated and so should not be used to guide clinical practice.

    Abstract: Purpose/Significance This paper addresses the hallucination problem in discharge summary generation with large language models, focusing on accuracy and reliability when handling complex medical documents. As the complexity of medical data increases, improving the generative ability and contextual consistency of large language models becomes crucial. Method/Process This study constructs a high-quality, multi-level medical instruction dataset and employs a staged instruction fine-tuning strategy that guides the model through tasks from simple to complex. In addition, a data replay and mixed training mechanism is introduced during fine-tuning so that the model retains and reuses previously learned knowledge when tackling new tasks. Result/Conclusion Experiments show that the proposed method significantly reduces hallucinations in model output and improves the accuracy and reliability of medical text generation. Its strength lies in the effective integration of curriculum learning and the replay mechanism, which not only enhances the model's adaptability to complex tasks but also preserves the professionalism of the generated content, demonstrating strong practicality and reliability.
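    The abstract does not give the paper's exact training pipeline, but the staged fine-tuning with data replay it describes can be sketched at the data-preparation level. The snippet below is a minimal illustration under assumed parameters: a hypothetical three-stage curriculum (field extraction → section summarization → full summary generation) and an assumed 20% replay ratio, where each later stage mixes in examples replayed from earlier stages to mitigate catastrophic forgetting.

    ```python
    import random

    def build_stage_mixture(stages, replay_ratio=0.2, seed=0):
        """For each curriculum stage, mix in a fraction of examples
        replayed from all earlier stages (data replay + mixed training)."""
        rng = random.Random(seed)
        mixtures = []
        seen = []  # examples from completed stages, available for replay
        for stage in stages:
            n_replay = int(len(stage) * replay_ratio)
            replay = rng.sample(seen, min(n_replay, len(seen))) if seen else []
            batch = stage + replay
            rng.shuffle(batch)  # interleave new and replayed examples
            mixtures.append(batch)
            seen.extend(stage)  # this stage becomes replayable later
        return mixtures

    # Hypothetical three-stage curriculum, simple to complex.
    stages = [
        [{"task": "extract", "id": i} for i in range(10)],
        [{"task": "summarize_section", "id": i} for i in range(10)],
        [{"task": "full_summary", "id": i} for i in range(10)],
    ]
    mixed = build_stage_mixture(stages, replay_ratio=0.2)
    print([len(m) for m in mixed])  # first stage has no replay pool yet
    ```

    Each per-stage mixture would then be used as the fine-tuning set for that stage; the task names and replay ratio here are illustrative assumptions, not the paper's reported configuration.
    
    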

    Key words: Discharge Summary Generation; Hallucination Suppression; Fine-Tuning Training

    Submitted: 3 March 2025

    Copyright: The copyright holder for this preprint is the author/funder, who has granted biomedRxiv a license to display the preprint in perpetuity.
  • Figures and Tables


  • Version history: V1 — submitted 2024-12-23 — bmr.202503.00005V1

Get Citation

Jiang Shengyao, Yuan Cheng, Zhu Lifeng, Li Yinchi, Fan Yawei, Zhang Weiyan, Ruan Tong, Shao Wei. Hallucination Suppression Methods for Discharge Summary Generation Based on Large Model Fine-Tuning. 2025. biomedRxiv.202503.00005

Article Metrics

  • Read: 122
  • Download: 2
  • Comment: 0
