An Intelligent Trial Eligibility Screening Tool Using Natural Language Processing With a Block-Based Visual Programming Interface: Development and Usability Study.
Background: Clinical trial eligibility screening using electronic medical records (EMRs) is challenging because of the complexity of patient data and the variability of clinical terminologies. Manual screening is time-consuming, requires specialized knowledge, and can lead to inconsistent participant selection, potentially compromising patient safety and research outcomes. This challenge is especially pressing in time-sensitive conditions such as acute ischemic stroke. Although computerized clinical decision support tools offer solutions, most require software engineering expertise to update, limiting their practical utility when eligibility criteria change.
Objective: We developed and evaluated the intelligent trial eligibility screening tool (iTEST), which combines natural language processing with a block-based visual programming interface designed to enable clinicians to create and modify eligibility screening rules independently. In this study, we assessed iTEST's rule evaluation module using pre-configured rules and compared its effectiveness with that of standard EMR interfaces.
Methods: We conducted an experiment at a tertiary teaching hospital in Taiwan with 12 clinicians using a 2-period crossover design. The clinicians assessed the eligibility of 4 patients with stroke for 2 clinical trials using both the standard EMR interface and the iTEST in a counterbalanced order, yielding 48 evaluation scenarios. The iTEST comprised a rule authoring module built on Google Blockly and a rule evaluation module that used MetaMap Lite to extract medical concepts from unstructured EMR documents and combined them with structured laboratory data. The primary outcome was accuracy in determining eligibility. Secondary outcomes were task completion time, cognitive workload measured with the National Aeronautics and Space Administration Task Load Index scale (range 0-100, with lower scores indicating a lower cognitive workload), and system usability measured with the System Usability Scale (range 0-100, with higher scores indicating higher system usability).
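To make the rule evaluation step concrete, the following is a minimal, hypothetical Python sketch (not the authors' implementation; the function, rule fields, concept strings, and lab thresholds are all illustrative) of how a preconfigured eligibility rule might combine NLP-extracted concepts from unstructured notes with structured laboratory values:

```python
def check_eligibility(concepts, labs, rule):
    """Return True when every inclusion concept is present, every
    required lab value falls in range, and no exclusion concept matches.

    concepts: set of concept strings extracted from clinical notes
    labs: dict mapping lab name to numeric value
    rule: dict with include_concepts, exclude_concepts, lab_ranges
    """
    inclusions_met = all(c in concepts for c in rule["include_concepts"])
    exclusions_hit = any(c in concepts for c in rule["exclude_concepts"])
    # A missing lab defaults to NaN, which fails the range check,
    # so incomplete data conservatively yields "not eligible."
    labs_ok = all(
        lo <= labs.get(name, float("nan")) <= hi
        for name, (lo, hi) in rule["lab_ranges"].items()
    )
    return inclusions_met and labs_ok and not exclusions_hit


# Illustrative rule loosely modeled on acute-stroke trial criteria
rule = {
    "include_concepts": {"acute ischemic stroke"},
    "exclude_concepts": {"intracranial hemorrhage"},
    "lab_ranges": {"platelet_count": (100_000, 450_000)},
}
concepts = {"acute ischemic stroke", "hypertension"}  # e.g., from an NLP pipeline
labs = {"platelet_count": 210_000}
print(check_eligibility(concepts, labs, rule))  # True
```

In this sketch, the rule dictionary stands in for whatever structure a block-based authoring interface might emit; the evaluation logic itself is deliberately simple Boolean composition over extracted concepts and lab ranges.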
Results: The iTEST significantly improved accuracy scores (from 0.91 to 1.00, P<.001) and reduced completion time (from 3.18 to 2.44 min, P=.004) compared with the standard EMR interface. Users reported a lower cognitive workload (National Aeronautics and Space Administration Task Load Index scale, 39.7 vs 62.8, P=.02) and higher System Usability Scale scores (71.3 vs 46.3, P=.01) with the iTEST. Improvements in perceived cognitive workload were particularly notable for temporal demand, effort, and frustration.
Conclusions: The iTEST demonstrated superior performance in clinical trial eligibility screening, delivering improved accuracy, reduced completion time, lower cognitive workload, and better usability when evaluating preconfigured eligibility rules. The improved accuracy is critical for patient safety, as misidentification of eligibility criteria could expose patients to inappropriate treatments or exclude them from beneficial trials. The iTEST's adaptability and its ability to process both structured and unstructured data make it valuable for time-sensitive scenarios and evolving research protocols. Future research should evaluate clinicians' ability to create and modify eligibility rules using the block-based authoring interface and should assess the iTEST across diverse types of clinical trials and health care settings.
(© Ya-Han Hu, Yi-Ying Cheng, Chung-Ching Lan, Yu-Hsiang Su, Sheng-Feng Sung. Originally published in JMIR Medical Informatics (https://medinform.jmir.org).)