Use the Search Websuite to scan PubMed, PubMed Central, journal hosts, and journal archives for full text.
Send your search term to multiple engines at once.
A dictionary built from aggregated review articles in nephrology, medicine, and the life sciences.
Your one-stop path from a search term to the PDF of peer-reviewed, on-topic knowledge.

Fetch the abstract from NCBI (a minimal retrieval sketch follows this list).

  • DOI: 10.1093/jamia/ocab077
  • Sci-Hub: http://scihub22266oqcxt.onion/10.1093/jamia/ocab077
  • PDF via Google Scholar
  • PMID: 33956981; PMC: 8135980
  • Free full-text PDF from Europe PMC (PMID 33956981)
  • PDF from PMC (free)
  • HTML from PMC (free)
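
The link-outs above resolve through standard retrieval services. As a minimal sketch, the following Python snippet fetches the plain-text abstract for this PMID through NCBI's public E-utilities efetch endpoint; the endpoint and parameters are standard efetch usage, while the function name and everything else is illustrative, not part of this site.

    import urllib.parse
    import urllib.request

    # NCBI E-utilities efetch endpoint (public; light use needs no API key).
    EFETCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi"

    def fetch_pubmed_abstract(pmid: str) -> str:
        """Return the plain-text PubMed record (including abstract) for a PMID."""
        params = urllib.parse.urlencode({
            "db": "pubmed",
            "id": pmid,
            "rettype": "abstract",
            "retmode": "text",
        })
        with urllib.request.urlopen(f"{EFETCH}?{params}", timeout=30) as resp:
            return resp.read().decode("utf-8")

    if __name__ == "__main__":
        print(fetch_pubmed_abstract("33956981"))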

PMID 33956981 · J Am Med Inform Assoc 2021; 28(8): 1703-1711

  • A neuro-symbolic method for understanding free-text medical evidence
  • Kang T; Turfah A; Kim J; Perotte A; Weng C
  • J Am Med Inform Assoc 2021 Jul; 28(8): 1703-1711. PMID: 33956981
  • OBJECTIVE: We introduce Medical evidence Dependency (MD)-informed attention, a novel neuro-symbolic model for understanding free-text clinical trial publications with generalizability and interpretability. MATERIALS AND METHODS: We trained one head in the multi-head self-attention model to attend to the Medical evidence Dependency (MD) and to pass linguistic and domain knowledge on to later layers (MD informed). This MD-informed attention model was integrated into BioBERT and tested on 2 public machine reading comprehension benchmarks for clinical trial publications: Evidence Inference 2.0 and PubMedQA. We also curated a small set of recently published articles reporting randomized controlled trials on COVID-19 (coronavirus disease 2019) following the Evidence Inference 2.0 guidelines to evaluate the model's robustness to unseen data. RESULTS: The integration of the MD-informed attention head improves BioBERT substantially in both benchmark tasks, with gains of up to 30% in F1 score, and achieves new state-of-the-art performance on Evidence Inference 2.0. It achieves 84% and 82% in overall accuracy and F1 score, respectively, on the unseen COVID-19 data. CONCLUSIONS: MD-informed attention empowers neural reading comprehension models with interpretability and generalizability via reusable domain knowledge. Its compositionality can benefit any transformer-based architecture for machine reading comprehension of free-text medical evidence. (An illustrative sketch of guiding one attention head with a symbolic dependency mask follows the MeSH terms below.)
  • |*Artificial Intelligence[MESH]
  • |*Clinical Trials as Topic[MESH]
  • |*Models, Neurological[MESH]
  • |*Natural Language Processing[MESH]
  • |COVID-19[MESH]
  • |Computer Simulation[MESH]
  • |Data Mining[MESH]
  • |Humans[MESH]
  • |Information Storage and Retrieval/*methods[MESH]
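
The published model wires this Medical evidence Dependency signal into one head of BioBERT's multi-head self-attention; that implementation is not reproduced here. As a rough, assumption-laden sketch in plain PyTorch, the snippet below shows one generic way a single attention head could be pulled toward a symbolic dependency mask via an auxiliary loss; the class name GuidedSelfAttention, the md_mask tensor, and the mean-squared-error guidance term are all illustrative choices, not the authors' method.

    import torch
    import torch.nn as nn

    class GuidedSelfAttention(nn.Module):
        """Multi-head self-attention where head 0 can be nudged, via an
        auxiliary loss, toward a precomputed symbolic relation mask
        (e.g. token pairs linked by a Medical evidence Dependency parse)."""

        def __init__(self, dim: int = 64, num_heads: int = 8):
            super().__init__()
            assert dim % num_heads == 0
            self.num_heads = num_heads
            self.head_dim = dim // num_heads
            self.qkv = nn.Linear(dim, 3 * dim)
            self.out = nn.Linear(dim, dim)

        def forward(self, x, md_mask=None):
            # x: (batch, seq, dim); md_mask: (batch, seq, seq), 1.0 where a
            # symbolic dependency links token i to token j, else 0.0.
            b, s, d = x.shape
            qkv = self.qkv(x).reshape(b, s, 3, self.num_heads, self.head_dim)
            q, k, v = qkv.permute(2, 0, 3, 1, 4)           # each (b, h, s, hd)
            attn = (q @ k.transpose(-2, -1)) / self.head_dim ** 0.5
            attn = attn.softmax(dim=-1)                    # (b, h, s, s)
            ctx = (attn @ v).transpose(1, 2).reshape(b, s, d)

            aux_loss = x.new_zeros(())
            if md_mask is not None:
                # Row-normalize the mask into a target distribution and pull
                # head 0's attention toward it; rows with no links simply
                # push head 0 toward a flat distribution.
                target = md_mask / md_mask.sum(-1, keepdim=True).clamp(min=1.0)
                aux_loss = ((attn[:, 0] - target) ** 2).mean()
            return self.out(ctx), aux_loss

    if __name__ == "__main__":
        layer = GuidedSelfAttention(dim=64, num_heads=8)
        tokens = torch.randn(2, 10, 64)
        mask = torch.zeros(2, 10, 10)
        mask[:, 0, 3] = 1.0                                # toy dependency link
        hidden, aux = layer(tokens, mask)
        print(hidden.shape, float(aux))                    # torch.Size([2, 10, 64]) ...

In training, aux_loss would be added to the task loss so that the guided head keeps encoding the symbolic relations while the remaining heads stay free, which matches the intuition of passing reusable domain knowledge on to later layers.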


Linkout box
  • DeepDyve
  • Pubget (overpricing)
  • Abstract from NCBI