
Application of Computer Vision for Automated Processing of Medical Documents

Author

Listed:
  • Y. A. Kurliuk
  • N. A. Larchenko
  • M. V. Davydov
  • E. K. Kurlyanskaya

Abstract

This paper examines the automation of medical image processing for diagnosing arterial hypertension using artificial intelligence and computer vision technologies. A software component has been developed that automatically extracts and structures information from visual representations of medical documents (including biochemical analysis results, complete blood counts, and 24-hour blood pressure monitoring data), minimizing errors and accelerating the process of entering and interpreting medical information. Algorithms for image preprocessing (increasing image resolution, noise removal, and tilt correction), segmentation, and text recognition were developed and tested using the Real-ESRGAN and EasyOCR neural network models. Particular attention was paid to improving text recognition quality in the presence of characteristic artifacts that arise when scanning or photographing documents. CER and WER metrics were used to evaluate quality, and the module's performance was assessed with and without super-resolution. The results of the study confirmed the effectiveness of the proposed approach and demonstrated that the integration of Real-ESRGAN technology improves the accuracy of medical image processing in the presence of significant noise and low-resolution source data. The practical significance of the study lies in simplifying and accelerating the process of diagnosing hypertension and creating the basis for a personalized approach to patient treatment.
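
For orientation, the sketch below illustrates the kind of pipeline the abstract describes: optional super-resolution, basic denoising and tilt correction, text recognition with EasyOCR, and CER/WER scoring. It is a minimal sketch, not the authors' implementation: the Real-ESRGAN step is represented by a hypothetical upscale_fn callable, the input file name, language list, and reference string are assumptions, and the denoising/deskew steps are generic OpenCV stand-ins for the preprocessing algorithms developed in the paper.

```python
# Minimal sketch of a document-OCR pipeline with CER/WER evaluation.
# Assumptions: upscale_fn wraps a Real-ESRGAN model (API not shown here),
# "blood_test_scan.png", the language list, and the reference text are
# hypothetical placeholders.

import cv2
import numpy as np
import easyocr


def preprocess(image: np.ndarray, upscale_fn=None) -> np.ndarray:
    """Optionally upscale, then denoise and roughly deskew a scanned page."""
    if upscale_fn is not None:          # e.g. a Real-ESRGAN wrapper
        image = upscale_fn(image)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    gray = cv2.fastNlMeansDenoising(gray, h=10)

    # Rough skew estimate from the minimum-area rectangle around text pixels;
    # OpenCV's angle convention varies between versions, so this is heuristic.
    thresh = cv2.threshold(gray, 0, 255,
                           cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)[1]
    coords = np.column_stack(np.where(thresh > 0)).astype(np.float32)
    angle = cv2.minAreaRect(coords)[-1]
    if angle > 45:
        angle -= 90
    h, w = gray.shape
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    return cv2.warpAffine(gray, M, (w, h), flags=cv2.INTER_CUBIC,
                          borderMode=cv2.BORDER_REPLICATE)


def levenshtein(a, b) -> int:
    """Edit distance, used for both CER (characters) and WER (words)."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (x != y)))
        prev = cur
    return prev[-1]


def cer(reference: str, hypothesis: str) -> float:
    return levenshtein(reference, hypothesis) / max(len(reference), 1)


def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    return levenshtein(ref, hyp) / max(len(ref), 1)


if __name__ == "__main__":
    reader = easyocr.Reader(["en", "ru"])      # assumed document languages
    image = cv2.imread("blood_test_scan.png")  # hypothetical input scan
    clean = preprocess(image)                  # pass upscale_fn=... to compare
    recognized = " ".join(reader.readtext(clean, detail=0))
    ground_truth = "Hemoglobin 132 g/L"        # hypothetical reference text
    print("CER:", cer(ground_truth, recognized))
    print("WER:", wer(ground_truth, recognized))
```

CER and WER are computed here as edit distance over characters and words, respectively, normalized by the reference length, which matches the usual definitions of these metrics; running the script with and without an upscaling callable mirrors the paper's with/without super-resolution comparison.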

Suggested Citation

  • Y. A. Kurliuk & N. A. Larchenko & M. V. Davydov & E. K. Kurlyanskaya, 2025. "Application of Computer Vision for Automated Processing of Medical Documents," Digital Transformation, Educational Establishment “Belarusian State University of Informatics and Radioelectronics”, vol. 31(4).
  • Handle: RePEc:abx:journl:y:2025:id:972
    DOI: 10.35596/1729-7648-2025-31-4-55-64

    Download full text from publisher

    File URL: https://dt.bsuir.by/jour/article/viewFile/972/374
    Download Restriction: no

    File URL: https://libkey.io/10.35596/1729-7648-2025-31-4-55-64?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:abx:journl:y:2025:id:972. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact the editorial office (email available below).

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.