Use my Search Websuite to scan PubMed, PubMed Central, journal hosts, and journal archives for full text.
Send your search term to multiple engines at once.
A dictionary built from aggregated review articles in nephrology, medicine, and the life sciences.
Your one-stop pathway from a search term straight to the PDF of peer-reviewed, on-topic knowledge.

Fetch the abstract from NCBI

DOI: 10.1371/journal.pcbi.1008127
Sci-Hub mirror: http://scihub22266oqcxt.onion/10.1371/journal.pcbi.1008127
Fetch the PDF via Google Scholar
PMID 33044953 · PMCID PMC7595646
Free, unlimited PDF from Europe PMC (PMID 33044953)
Free PDF from PMC
Free HTML from PMC
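The identifier line above pairs the article's PMID with a PMC accession. As a hedged illustration, such (PMID, PMCID, DOI) triples can be resolved programmatically with NCBI's PMC ID Converter service; the tool/email query parameters below are placeholder values, not part of this record.

```python
# Minimal sketch: resolve the PMID/PMCID/DOI triple for this record with
# the PMC ID Converter API. Endpoint is the documented idconv service;
# "tool" and "email" are placeholder assumptions.
import json
import urllib.request

IDCONV = ("https://www.ncbi.nlm.nih.gov/pmc/utils/idconv/v1.0/"
          "?ids=33044953&format=json&tool=example&email=you@example.org")

with urllib.request.urlopen(IDCONV) as resp:
    record = json.load(resp)["records"][0]

print(record.get("pmid"), record.get("pmcid"), record.get("doi"))
# expected (per the identifiers in this record):
# 33044953 PMC7595646 10.1371/journal.pcbi.1008127
```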

Note: the automated LinkOut lookup against NCBI E-utilities (https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi?dbfrom=pubmed&id=33044953&cmd=llinks) failed with HTTP 429 Too Many Requests, i.e. the server rate-limited the request.
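A 429 response means the E-utilities server is rate-limiting the client, so the request should be retried after a pause rather than treated as a hard failure. A minimal sketch, assuming a three-attempt budget and a one-second fallback delay (both arbitrary choices, not NCBI guidance):

```python
# Minimal sketch: retry the LinkOut request from the note above when the
# server answers HTTP 429. Standard library only.
import time
import urllib.error
import urllib.request

URL = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi"
       "?dbfrom=pubmed&id=33044953&cmd=llinks")

def fetch(url: str, attempts: int = 3) -> bytes:
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url) as resp:
                return resp.read()              # XML list of LinkOut providers
        except urllib.error.HTTPError as err:
            if err.code != 429:                 # only retry on rate limiting
                raise
            # Honour Retry-After if the server sends it, else wait 1 s.
            time.sleep(float(err.headers.get("Retry-After", 1)))
    raise RuntimeError(f"still rate-limited after {attempts} attempts")

print(fetch(URL).decode()[:500])
```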


PMID 33044953 · PLoS Comput Biol 2020; 16(10): e1008127



  • The covariance perceptron: A new paradigm for classification and processing of time series in recurrent neuronal networks
  • Gilson M; Dahmen D; Moreno-Bote R; Insabato A; Helias M
  • PLoS Comput Biol 2020 Oct; 16(10): e1008127. PMID: 33044953
  • Learning in neuronal networks has developed in many directions, in particular to reproduce cognitive tasks like image recognition and speech processing. Implementations have been inspired by stereotypical neuronal responses like tuning curves in the visual system, where, for example, ON/OFF cells fire or not depending on the contrast in their receptive fields. Classical models of neuronal networks therefore map a set of input signals to a set of activity levels in the output of the network. Each category of inputs is thereby predominantly characterized by its mean; in the case of time series, fluctuations around this mean constitute noise in this view. For this paradigm, the high variability exhibited by cortical activity may thus imply limitations or constraints, which have been discussed for many years; one example is the need to average neuronal activity over long periods or large groups of cells to assess a robust mean and to diminish the effect of noise correlations. To reconcile robust computations with variable neuronal activity, we here propose a conceptual change of perspective: the variability of activity serves as the basis for stimulus-related information to be learned by neurons, rather than merely being the noise that corrupts the mean signal. In this new paradigm, both afferent and recurrent weights in a network are tuned to shape the input-output mapping for covariances, the second-order statistics of the fluctuating activity. When including time lags, covariance patterns define a natural metric for time series that captures their propagating nature. We develop the theory for classification of time series based on their spatio-temporal covariances, which reflect dynamical properties, and demonstrate that recurrent connectivity is able to transform information contained in the temporal structure of the signal into spatial covariances. Finally, we use the MNIST database to show how the covariance perceptron can capture specific second-order statistical patterns generated by moving digits. (A toy sketch of the covariance mapping follows this record.)
  • MeSH terms (* = major topic): Models, Neurological*; Neural Networks, Computer*; Algorithms; Animals; Computational Biology; Computer Simulation; Databases, Factual; Humans; Image Processing, Computer-Assisted; Learning/physiology
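The abstract's central object can be stated concretely: for a linear readout y_t = W x_t, an input covariance P maps to the output covariance Q = W P W^T, so classes that share a mean but differ in their second-order statistics become separable in Q. Below is a minimal, illustrative Python sketch of that mapping; the toy dimensions, the untrained random W, and the nearest-pattern decision rule are assumptions for demonstration, not the paper's learning algorithm.

```python
# Toy sketch (not the paper's training rule): classify a time series by
# the covariance it induces in a linear readout, Q = W P W^T.
import numpy as np

rng = np.random.default_rng(0)
m, n, T = 10, 2, 2000              # inputs, readout units, time steps

# Two classes with identical (zero) mean but different input covariances:
# class 0 is uncorrelated, class 1 is equicorrelated (rho = 0.5).
P0 = np.eye(m)
P1 = 0.5 * np.eye(m) + 0.5 * np.ones((m, m))

W = rng.standard_normal((n, m)) / np.sqrt(m)   # fixed random readout

def propagate(P: np.ndarray) -> np.ndarray:
    """Output covariance of y = W x when x has covariance P."""
    return W @ P @ W.T

# Simulate a class-1 time series and estimate its output covariance.
x = rng.multivariate_normal(np.zeros(m), P1, size=T)   # shape (T, m)
y = x @ W.T                                            # shape (T, n)
Q_emp = np.cov(y, rowvar=False)

# Nearest-pattern decision in covariance space (illustrative rule).
dists = [np.linalg.norm(Q_emp - propagate(P)) for P in (P0, P1)]
print("classified as class", int(np.argmin(dists)))    # expect: 1
```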

