Use my Search Websuite to scan PubMed, PubMed Central, journal hosts, and journal archives for full text.
Send your search term to multiple engines at once.
A dictionary built from aggregated review articles in nephrology, medicine, and the life sciences.
Your one-stop pathway from a search word straight to the PDF of peer-reviewed, on-topic knowledge.

Fetch the abstract from NCBI


10.1007/s11030-025-11400-6

http://scihub22266oqcxt.onion/10.1007/s11030-025-11400-6
Fetch the PDF via Google Scholar
PMID: 41264203
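The PDF can usually be reached by resolving the DOI through the doi.org resolver, which redirects to the publisher's landing page. Below is a minimal Python sketch of that step; it uses only the standard library, the function name is my own, and publishers may still block automated requests or keep the full text behind a subscription.

    import urllib.request

    def resolve_doi(doi: str) -> str:
        """Follow doi.org redirects and return the publisher landing-page URL."""
        req = urllib.request.Request(f"https://doi.org/{doi}", method="HEAD")
        with urllib.request.urlopen(req, timeout=30) as resp:
            return resp.geturl()  # final URL after all redirects

    if __name__ == "__main__":
        print(resolve_doi("10.1007/s11030-025-11400-6"))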

Note: the NCBI E-utilities request https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi?dbfrom=pubmed&id=41264203&cmd=llinks failed with HTTP 429 Too Many Requests, so the LinkOut data could not be retrieved.
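A polite client avoids the 429 by spacing requests and retrying with exponential backoff; NCBI allows roughly three E-utilities requests per second without an API key. Here is a minimal Python sketch of the same elink call (standard library only, helper name my own):

    import time
    import urllib.error
    import urllib.request

    EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi"

    def fetch_llinks(pmid: str, retries: int = 5, backoff: float = 1.0) -> str:
        """Fetch LinkOut providers for a PubMed ID, backing off on HTTP 429."""
        url = f"{EUTILS}?dbfrom=pubmed&id={pmid}&cmd=llinks"
        for attempt in range(retries):
            try:
                with urllib.request.urlopen(url, timeout=30) as resp:
                    return resp.read().decode("utf-8")
            except urllib.error.HTTPError as err:
                if err.code != 429:
                    raise
                time.sleep(backoff * 2 ** attempt)  # wait, then retry
        raise RuntimeError(f"elink still rate-limited after {retries} attempts")

    if __name__ == "__main__":
        print(fetch_llinks("41264203")[:500])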


PMID: 41264203 · Mol Divers 2025; ?(?): ?


  • A transfer learning framework for PTP1B inhibitor activity prediction: differential modeling of natural and non-natural products with web platform implementation
  • Wang Z; Sun L; Ren A; Yang F; Chang Y
  • Mol Divers 2025 Nov; ?(?): ?. PMID: 41264203
  • Protein tyrosine phosphatase 1B (PTP1B) is a key therapeutic target for diabetes, obesity, and cancer, but the development of its inhibitors faces challenges including low selectivity and poor bioavailability. Although deep learning (DL) can accelerate drug discovery, prior models often overlooked structural distinctions between non-natural products (NNPs) and natural products (NPs) in chemical datasets. In this study, we separated PTP1B inhibitors and decoys into NP and NNP subsets to build activity prediction models tailored to their respective chemical spaces. Using transfer learning (TL), we enhanced model performance specifically for NPs. Five-fold cross-validation was used for hyperparameter optimization and for evaluating the activity prediction performance of the three model architectures. The results showed that Attentive FP (AFP) performed best among graph neural networks, Extended-Connectivity Fingerprints 4 (ECFP4) led among multi-layer perceptron (MLP) models using molecular fingerprints, and PubChem10M_SMILES_BPE_450k (P10M) excelled among SMILES-based Transformers. The new models for NPs, derived from the three architectures via TL (pre-trained on NNPs, then fine-tuned on NPs), all outperformed their original counterparts. Random splitting further confirmed the enhancing effect of TL on NP activity prediction and the generalization ability of the models. We also developed a web platform (http://ptp1bpredict.top) that allows independent use of the AFP, MLP-ECFP4, and P10M models, including their transfer-learned variants, to predict PTP1B inhibition by NNPs and NPs. In summary, this work provides a novel strategy for DL-based screening of PTP1B inhibitors.
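To illustrate the pipeline the abstract describes, the sketch below featurizes SMILES strings as ECFP4 bit vectors (Morgan fingerprints with radius 2) and mimics the pre-train-on-NNPs, fine-tune-on-NPs step with a scikit-learn MLP via warm_start. All molecules and labels here are placeholders rather than the paper's data, and the published models (AFP, MLP-ECFP4, P10M) are far more elaborate; this is only a sketch of the idea under those assumptions.

    import numpy as np
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem
    from sklearn.neural_network import MLPClassifier

    def ecfp4(smiles: str, n_bits: int = 2048) -> np.ndarray:
        """ECFP4 is equivalent to a Morgan fingerprint with radius 2."""
        mol = Chem.MolFromSmiles(smiles)
        fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=n_bits)
        arr = np.zeros((n_bits,), dtype=np.float32)
        DataStructs.ConvertToNumpyArray(fp, arr)
        return arr

    # Placeholder "NNP" pre-training set and a smaller "NP" fine-tuning set;
    # labels (1 = active, 0 = inactive) are invented for illustration.
    nnp_smiles = ["CCO", "CCN(CC)CC", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]
    nnp_labels = np.array([0, 0, 1, 1])
    np_smiles = ["O=c1cc(-c2ccccc2)oc2ccccc12",       # flavone
                 "Oc1ccc(C=Cc2cc(O)cc(O)c2)cc1"]      # resveratrol
    np_labels = np.array([1, 0])

    X_nnp = np.stack([ecfp4(s) for s in nnp_smiles])
    X_np = np.stack([ecfp4(s) for s in np_smiles])

    # warm_start=True makes the second fit() resume from the learned weights,
    # a crude stand-in for the pre-train-then-fine-tune transfer learning.
    clf = MLPClassifier(hidden_layer_sizes=(256,), max_iter=300,
                        warm_start=True, random_state=0)
    clf.fit(X_nnp, nnp_labels)   # "pre-train" on the NNP chemical space
    clf.fit(X_np, np_labels)     # "fine-tune" on the NP chemical space
    print(clf.predict(X_np))

With real data, the five-fold cross-validation the abstract mentions would wrap the pre-training fit in scikit-learn's cross-validation utilities rather than a single fit.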


  • DeepDyve
  • Pubget

    Linkout box