Neural Comput - Noise tolerance of attractor and feedforward memory models.

Topics

{ network(2748) neural(1063) input(814) }
{ algorithm(1844) comput(1787) effici(935) }
{ use(1733) differ(960) four(931) }
{ studi(1119) effect(1106) posit(819) }
{ perform(999) metric(946) measur(919) }
{ health(3367) inform(1360) care(1135) }
{ chang(1828) time(1643) increas(1301) }
{ data(1714) softwar(1251) tool(1186) }
{ control(1307) perform(991) simul(935) }
{ general(901) number(790) one(736) }
{ import(1318) role(1303) understand(862) }
{ imag(1947) propos(1133) code(1026) }
{ surgeri(1148) surgic(1085) robot(1054) }
{ model(2220) cell(1177) simul(1124) }
{ studi(1410) differ(1259) use(1210) }
{ activ(1138) subject(705) human(624) }
{ process(1125) use(805) approach(778) }
{ imag(2830) propos(1344) filter(1198) }
{ learn(2355) train(1041) set(1003) }
{ concept(1167) ontolog(924) domain(897) }
{ clinic(1479) use(1117) guidelin(835) }
{ howev(809) still(633) remain(590) }
{ spatial(1525) area(1432) region(1030) }
{ model(2656) set(1616) predict(1553) }
{ signal(2180) analysi(812) frequenc(800) }
{ cost(1906) reduc(1198) effect(832) }
{ sampl(1606) size(1419) use(1276) }
{ intervent(3218) particip(2042) group(1664) }
{ cancer(2502) breast(956) screen(824) }
{ result(1111) use(1088) new(759) }
{ estim(2440) model(1874) function(577) }
{ method(2212) result(1239) propos(1039) }
{ model(3404) distribut(989) bayesian(671) }
{ can(774) often(719) complex(702) }
{ data(1737) use(1416) pattern(1282) }
{ inform(2794) health(2639) internet(1427) }
{ system(1976) rule(880) can(841) }
{ measur(2081) correl(1212) valu(896) }
{ imag(1057) registr(996) error(939) }
{ bind(1733) structur(1185) ligand(1036) }
{ sequenc(1873) structur(1644) protein(1328) }
{ method(1219) similar(1157) match(930) }
{ featur(3375) classif(2383) classifi(1994) }
{ imag(2675) segment(2577) method(1081) }
{ patient(2315) diseas(1263) diabet(1191) }
{ take(945) account(800) differ(722) }
{ studi(2440) review(1878) systemat(933) }
{ motion(1329) object(1292) video(1091) }
{ assess(1506) score(1403) qualiti(1306) }
{ treatment(1704) effect(941) patient(846) }
{ framework(1458) process(801) describ(734) }
{ problem(2511) optim(1539) algorithm(950) }
{ error(1145) method(1030) estim(1020) }
{ extract(1171) text(1153) clinic(932) }
{ method(1557) propos(1049) approach(1037) }
{ design(1359) user(1324) use(1319) }
{ care(1570) inform(1187) nurs(1089) }
{ method(984) reconstruct(947) comput(926) }
{ search(2224) databas(1162) retriev(909) }
{ featur(1941) imag(1645) propos(1176) }
{ case(1353) use(1143) diagnosi(1136) }
{ data(3963) clinic(1234) research(1004) }
{ risk(3053) factor(974) diseas(938) }
{ research(1085) discuss(1038) issu(1018) }
{ system(1050) medic(1026) inform(1018) }
{ model(2341) predict(2261) use(1141) }
{ visual(1396) interact(850) tool(830) }
{ compound(1573) activ(1297) structur(1058) }
{ perform(1367) use(1326) method(1137) }
{ blood(1257) pressur(1144) flow(957) }
{ record(1888) medic(1808) patient(1693) }
{ model(3480) simul(1196) paramet(876) }
{ monitor(1329) mobil(1314) devic(1160) }
{ ehr(2073) health(1662) electron(1139) }
{ state(1844) use(1261) util(961) }
{ research(1218) medic(880) student(794) }
{ patient(2837) hospit(1953) medic(668) }
{ data(2317) use(1299) case(1017) }
{ age(1611) year(1155) adult(843) }
{ medic(1828) order(1363) alert(1069) }
{ group(2977) signific(1463) compar(1072) }
{ gene(2352) biolog(1181) express(1162) }
{ data(3008) multipl(1320) sourc(1022) }
{ first(2504) two(1366) second(1323) }
{ time(1939) patient(1703) rate(768) }
{ patient(1821) servic(1111) care(1106) }
{ use(2086) technolog(871) perceiv(783) }
{ can(981) present(881) function(850) }
{ analysi(2126) use(1163) compon(1037) }
{ health(1844) social(1437) communiti(874) }
{ structur(1116) can(940) graph(676) }
{ high(1669) rate(1365) level(1280) }
{ use(976) code(926) identifi(902) }
{ drug(1928) target(777) effect(648) }
{ implement(1333) system(1263) develop(1122) }
{ survey(1388) particip(1329) question(1065) }
{ decis(3086) make(1611) patient(1517) }
{ activ(1452) weight(1219) physic(1104) }
{ method(1969) cluster(1462) data(1082) }
{ detect(2391) sensit(1101) algorithm(908) }
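The 100 clusters above each list top stemmed terms together with what appear to be term weights or counts, which suggests they come from a topic model fit over a corpus of cleaned abstracts. The sketch below is a hypothetical illustration of how such clusters could be generated, assuming scikit-learn's LatentDirichletAllocation over bag-of-words counts; the actual model, corpus, and weighting behind this listing are not documented.

```python
# Hypothetical sketch: fit a topic model on cleaned abstracts and print the top
# terms (with their pseudo-counts) of each topic, in the style of the listing
# above. The real pipeline behind the listing is unknown; this is illustrative.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def top_terms_per_topic(cleaned_abstracts, n_topics=100, n_top=3):
    """Return, for each topic, its n_top highest-weighted terms and weights."""
    vectorizer = CountVectorizer()
    counts = vectorizer.fit_transform(cleaned_abstracts)
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
    lda.fit(counts)
    terms = vectorizer.get_feature_names_out()
    topics = []
    for weights in lda.components_:            # one row of term weights per topic
        top = weights.argsort()[::-1][:n_top]
        topics.append([(terms[i], int(weights[i])) for i in top])
    return topics

# Toy usage with placeholder documents; the listing above presumably comes from
# thousands of cleaned abstracts and 100 topics.
demo = ["network neural input signal", "imag segment method propos",
        "patient hospit care record", "model predict estim data"]
for topic in top_terms_per_topic(demo, n_topics=2):
    print("{", " ".join(f"{term}({count})" for term, count in topic), "}")
```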

Abstract

In short-term memory networks, transient stimuli are represented by patterns of neural activity that persist long after stimulus offset. Here, we compare the performance of two prominent classes of memory networks, feedback-based attractor networks and feedforward networks, in conveying information about the amplitude of a briefly presented stimulus in the presence of gaussian noise. Using Fisher information as a metric of memory performance, we find that the optimal form of network architecture depends strongly on assumptions about the forms of nonlinearities in the network. For purely linear networks, we find that feedforward networks outperform attractor networks because noise is continually removed from feedforward networks when signals exit the network; as a result, feedforward networks can amplify signals they receive faster than noise accumulates over time. By contrast, attractor networks must operate in a signal-attenuating regime to avoid the buildup of noise. However, if the amplification of signals is limited by a finite dynamic range of neuronal responses or if noise is reset at the time of signal arrival, as suggested by recent experiments, we find that attractor networks can outperform feedforward ones. Under a simple model in which neurons have a finite dynamic range, we find that the optimal attractor networks are forgetful if there is no mechanism for noise reduction with signal arrival but nonforgetful (perfect integrators) in the presence of a strong reset mechanism. Furthermore, we find that the maximal Fisher information for the feedforward and attractor networks exhibits power law decay as a function of time and scales linearly with the number of neurons. These results highlight prominent factors that lead to trade-offs in the memory performance of networks with different architectures and constraints, and suggest conditions under which attractor or feedforward networks may be best suited to storing information about previous stimuli.
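The comparison in the abstract can be illustrated with a toy simulation. The sketch below is a hypothetical, simplified stand-in for the paper's models, not the actual models or parameters: a linear N-stage feedforward chain with per-stage gain w_ff > 1 (signals and noise exit the chain after N steps) versus a single linear recurrent unit with feedback weight w_rec < 1 (the signal-attenuating attractor regime described above). Both receive a brief pulse of amplitude s plus independent gaussian noise at every unit and time step, and Fisher information for the gaussian readout is estimated as the squared derivative of the readout mean with respect to s, divided by the readout variance.

```python
# Toy sketch (not the paper's actual model): Fisher information about a brief
# pulse of amplitude s after T steps, for (a) a linear feedforward chain whose
# stages amplify and pass signals along until they exit, and (b) a single
# linear recurrent unit in a stable, attenuating regime. All parameter values
# are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = T = 20          # chain length matched to the memory delay
sigma = 0.1         # std of gaussian noise injected at every unit, every step
s = 1.0             # stimulus amplitude to be remembered
w_ff = 1.2          # assumed per-stage feedforward gain (amplifying)
w_rec = 0.95        # assumed recurrent weight (signal-attenuating attractor)
n_trials = 5000

def feedforward_readout(amp, noisy=True):
    """Activity of the last stage of an N-stage chain after T steps."""
    x = np.zeros(N)
    for t in range(T):
        x_new = np.zeros(N)
        x_new[1:] = w_ff * x[:-1]              # each stage feeds the next
        x_new[0] = amp if t == 0 else 0.0      # pulse enters the first stage
        if noisy:
            x_new += sigma * rng.standard_normal(N)
        x = x_new
    return x[-1]

def attractor_readout(amp, noisy=True):
    """Activity of a single self-connected unit after T steps."""
    x = 0.0
    for t in range(T):
        x = w_rec * x + (amp if t == 0 else 0.0)
        if noisy:
            x += sigma * rng.standard_normal()
    return x

def fisher_information(readout):
    """FI = (d mean / ds)^2 / variance for a gaussian readout of a linear network."""
    ds = 1e-3
    gain = (readout(s + ds, noisy=False) - readout(s, noisy=False)) / ds
    samples = np.array([readout(s) for _ in range(n_trials)])
    return gain ** 2 / samples.var()

print("feedforward FI:", fisher_information(feedforward_readout))
print("attractor   FI:", fisher_information(attractor_readout))
```

With these assumed parameters the feedforward readout carries noticeably more Fisher information than the attenuating attractor, in line with the purely linear case described in the abstract; the finite-dynamic-range and noise-reset effects discussed there are beyond this toy sketch.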

Cleaned Abstract

shortterm memori network transient stimuli repres pattern neural activ persist long stimulus offset compar perform two promin class memori network feedbackbas attractor network feedforward network convey inform amplitud briefli present stimulus presenc gaussian nois use fisher inform metric memori perform find optim form network architectur depend strong assumpt form nonlinear network pure linear network find feedforward network outperform attractor network nois continu remov feedforward network signal exit network result feedforward network can amplifi signal receiv faster nois accumul time contrast attractor network must oper signalattenu regim avoid buildup nois howev amplif signal limit finit dynam rang neuron respons nois reset time signal arriv suggest recent experi find attractor network can outperform feedforward one simpl model neuron finit dynam rang find optim attractor network forget mechan nois reduct signal arriv nonforget perfect integr presenc strong reset mechan furthermor find maxim fisher inform feedforward attractor network exhibit power law decay function time scale linear number neuron result highlight promin factor lead tradeoff memori perform network differ architectur constraint suggest condit attractor feedforward network may best suit store inform previous stimuli
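The cleaned abstract above appears to be the abstract lowercased, with hyphens and punctuation stripped, stopwords removed, and each remaining word stemmed. The sketch below is a hypothetical reconstruction of such a pipeline, assuming NLTK's English stopword list and Porter stemmer; the list and stemmer actually used are not documented and evidently differ in some details.

```python
# Hypothetical reconstruction of the "Cleaned Abstract" preprocessing; the exact
# stemmer and stopword list behind the text above are not documented.
import re
from nltk.corpus import stopwords      # requires nltk.download('stopwords')
from nltk.stem import PorterStemmer

def clean_abstract(text: str) -> str:
    text = text.lower().replace("-", "")           # "short-term" -> "shortterm"
    tokens = re.findall(r"[a-z]+", text)           # keep alphabetic tokens only
    stop = set(stopwords.words("english"))         # assumed stopword list
    stem = PorterStemmer().stem
    return " ".join(stem(t) for t in tokens if t not in stop)

print(clean_abstract("In short-term memory networks, transient stimuli are "
                     "represented by patterns of neural activity that persist "
                     "long after stimulus offset."))
# Produces a stemmed string close to, but not necessarily identical with, the
# corresponding span of the cleaned abstract above.
```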

Similar Abstracts

Int J Neural Syst - Modeling fluctuations in default-mode brain network using a spiking neural network. ( 0,937985864548004 )
Int J Neural Syst - Adaptation-dependent synchronization transitions and burst generations in electrically coupled neural networks. ( 0,937451357397886 )
Neural Comput - Dynamical synapses enhance neural information processing: gracefulness, accuracy, and mobility. ( 0,932925747630769 )
Neural Comput - Echo state property linked to an input: exploring a fundamental characteristic of recurrent neural networks. ( 0,932865187456374 )
Neural Comput - Nearly extensive sequential memory lifetime achieved by coupled nonlinear neurons. ( 0,919600894475749 )
Neural Comput - Fragility in dynamic networks: application to neural networks in the epileptic cortex. ( 0,918558502248412 )
Neural Comput - Neutral stability, rate propagation, and critical branching in feedforward networks. ( 0,908119596115255 )
Neural Comput - Replicating receptive fields of simple and complex cells in primary visual cortex in a neuronal network model with temporal and population sparseness and reliability. ( 0,905855486642006 )
Neural Comput - Influence of external inputs and asymmetry of connections on information-geometric measures involving up to ten neuronal interactions. ( 0,904850259672011 )
Neural Comput - Analysis of the stabilized supralinear network. ( 0,901991304813916 )
Neural Comput - The competing benefits of noise and heterogeneity in neural coding. ( 0,898142028079138 )
Neural Comput - Anatomical constraints on lateral competition in columnar cortical architectures. ( 0,890840843078897 )
Neural Comput - Synaptic scaling stabilizes persistent activity driven by asynchronous neurotransmitter release. ( 0,888665015597619 )
J Chem Inf Model - Modeling complex metabolic reactions, ecological systems, and financial and legal networks with MIANN models based on Markov-Wiener node descriptors. ( 0,88128558410088 )
Neural Comput - Traveling bumps and their collisions in a two-dimensional neural field. ( 0,877935567733653 )
Comput Math Methods Med - Voxel scale complex networks of functional connectivity in the rat brain: neurochemical state dependence of global and local topological properties. ( 0,874461213535953 )
Neural Comput - Exact event-driven implementation for recurrent networks of stochastic perfect integrate-and-fire neurons. ( 0,869778206264634 )
Neural Comput - Natural gradient learning algorithms for RBF networks. ( 0,868426238966019 )
Neural Comput - Interactions of excitatory and inhibitory feedback topologies in facilitating pattern separation and retrieval. ( 0,867116895029729 )
Neural Comput - Motor cortex microcircuit simulation based on brain activity mapping. ( 0,866442034886622 )
Neural Comput - Computing with a canonical neural circuits model with pool normalization and modulating feedback. ( 0,866129575978867 )
Neural Comput - Collective stability of networks of winner-take-all circuits. ( 0,865420311889318 )
Int J Neural Syst - Effects of extremely low-frequency magnetic fields on the response of a conductance-based neuron model. ( 0,864978012119375 )
Neural Comput - Intrinsic adaptation in autonomous recurrent neural networks. ( 0,864814712571019 )
Int J Neural Syst - A neuron-based time-optimal controller of horizontal saccadic eye movements. ( 0,863938664852438 )
Neural Comput - Chaotic exploration and learning of locomotion behaviors. ( 0,862909793299574 )
Neural Comput - How to compress sequential memory patterns into periodic oscillations: general reduction rules. ( 0,859656038235272 )
IEEE Trans Neural Netw Learn Syst - Complex-Valued Recurrent Correlation Neural Networks. ( 0,857010025719159 )
Comput Math Methods Med - Concentration-invariant odor representation in the olfactory system by presynaptic inhibition. ( 0,853546993024344 )
Neural Comput - Short-term memory capacity in networks via the restricted isometry property. ( 0,851625987256442 )
Neural Comput - Design strategies for weight matrices of echo state networks. ( 0,84638895846168 )
Neural Comput - Information-geometric measures for estimation of connection weight under correlated inputs. ( 0,843109600742229 )
Neural Comput - Neural information processing with feedback modulations. ( 0,831128930601246 )
Neural Comput - Projective clustering using neural networks with adaptive delay and signal transmission loss. ( 0,826857108844812 )
Int J Neural Syst - A new work mechanism on neuronal activity. ( 0,826831374382334 )
Neural Comput - Subthreshold membrane depolarization as memory trace for perceptual learning. ( 0,824557608666671 )
Comput Math Methods Med - Analysis of epileptic seizures with complex network. ( 0,822672709413712 )
Neural Comput - A bayesian model of polychronicity. ( 0,821706642107632 )
Comput Math Methods Med - A signal-processing-based approach to time-varying graph analysis for dynamic brain network identification. ( 0,819736610297979 )
Neural Comput - Decorrelation by recurrent inhibition in heterogeneous neural circuits. ( 0,819672567197021 )
IEEE Trans Neural Netw Learn Syst - Phase Oscillatory Network and Visual Pattern Recognition. ( 0,819556022754316 )
Neural Comput - Facilitation of neuronal responses by intrinsic default mode network activity. ( 0,816623281689075 )
Neural Comput - Mechanisms that modulate the transfer of spiking correlations. ( 0,815766918153117 )
Neural Comput - Sparseness, antisparseness and anything in between: the operating point of a neuron determines its computational repertoire. ( 0,814936902420315 )
Comput. Biol. Med. - Connectivity analysis of multichannel EEG signals using recurrence based phase synchronization technique. ( 0,813970938499157 )
J Chem Inf Model - New Markov-autocorrelation indices for re-evaluation of links in chemical and biological complex networks used in metabolomics, parasitology, neurosciences, and epidemiology. ( 0,812546840571128 )
Neural Comput - Complete classification of the macroscopic behavior of a heterogeneous network of theta neurons. ( 0,811903372193935 )
Neural Comput - A new supervised learning algorithm for spiking neurons. ( 0,810350791874395 )
Brief. Bioinformatics - BioFNet: biological functional network database for analysis and synthesis of biological systems. ( 0,808876203597794 )
Int J Neural Syst - Hardware implementation of stochastic spiking neural networks. ( 0,807470429385775 )
Neural Comput - Spontaneous slow oscillations and sequential patterns due to short-term plasticity in a model of the cortex. ( 0,807202272513178 )
Int J Neural Syst - Optimal sparse approximation with integrate and fire neurons. ( 0,80629667966874 )
IEEE Trans Neural Netw Learn Syst - Stability Analysis of Distributed Delay Neural Networks Based on Relaxed Lyapunov-Krasovskii Functionals. ( 0,80572126865323 )
Neural Comput - The hippocampus as a stable memory allocator for cortex. ( 0,804824561403509 )
IEEE Trans Neural Netw Learn Syst - Properties and Performance of Imperfect Dual Neural Network-Based k WTA Networks. ( 0,802804779315421 )
Neural Comput - A self-organized neural comparator. ( 0,800006849077253 )
Neural Comput - Encoding binary neural codes in networks of threshold-linear neurons. ( 0,798444155725002 )
Neural Comput - A no-go theorem for one-layer feedforward networks. ( 0,794100543757148 )
Neural Comput - Predicting single-neuron activity in locally connected networks. ( 0,793463712375267 )
Neural Comput - Learning rule of homeostatic synaptic scaling: presynaptic dependent or not. ( 0,787952574639244 )
Neural Comput - Unsupervised formation of vocalization-sensitive neurons: a cortical model based on short-term and homeostatic plasticity. ( 0,786475501721558 )
Neural Comput - Synchronization and redundancy: implications for robustness of neural learning and decision making. ( 0,784795321637427 )
IEEE Trans Neural Netw Learn Syst - A Spiking Neural Simulator Integrating Event-Driven and Time-Driven Computation Schemes Using Parallel CPU-GPU Co-Processing: A Case Study. ( 0,783796012301391 )
Neural Comput - Multilayer perceptron classification of unknown volatile chemicals from the firing rates of insect olfactory sensory neurons and its application to biosensor design. ( 0,782805747926813 )
Int J Neural Syst - Relationship between applicability of current-based synapses and uniformity of firing patterns. ( 0,780169785012195 )
Neural Comput - Insights from a simple expression for linear fisher information in a recurrently connected population of spiking neurons. ( 0,77904861516814 )
Neural Comput - Statistical computer model analysis of the reciprocal and recurrent inhibitions of the Ia-EPSP in a-motoneurons. ( 0,778684889842987 )
Comput Math Methods Med - Computational models of neuron-astrocyte interactions lead to improved efficacy in the performance of neural networks. ( 0,773721517375512 )
Neural Comput - Regulation of ambient GABA levels by neuron-glia signaling for reliable perception of multisensory events. ( 0,772828083288732 )
Comput Math Methods Med - Weighted phase lag index and graph analysis: preliminary investigation of functional connectivity during resting state in children. ( 0,772310931866467 )
Neural Comput - Discovering functional neuronal connectivity from serial patterns in spike train data. ( 0,770104146044238 )
J Biomed Inform - FALCON or how to compute measures time efficiently on dynamically evolving dense complex networks? ( 0,769361893184869 )
Int J Neural Syst - Perceptual suppression revealed by adaptive multi-scale entropy analysis of local field potential in monkey visual cortex. ( 0,768454027002748 )
Brief. Bioinformatics - Identifying protein complexes and functional modules--from static PPI networks to dynamic PPI networks. ( 0,766148094790846 )
Neural Comput - Spike-timing-dependent construction. ( 0,765458016725687 )
Neural Comput - A spike-timing-based integrated model for pattern recognition. ( 0,764724476472397 )
Neural Comput - Formation and regulation of dynamic patterns in two-dimensional spiking neural circuits with spike-timing-dependent plasticity. ( 0,763767758216017 )
Neural Comput - Neuronal responses below firing threshold for subthreshold cross-modal enhancement. ( 0,761241751864593 )
Int J Neural Syst - Evolving RBF neural networks for adaptive soft-sensor design. ( 0,759284173097276 )
Neural Comput - Self-organization of topographic bilinear networks for invariant recognition. ( 0,758467909844474 )
IEEE Trans Neural Netw Learn Syst - A two-layer recurrent neural network for nonsmooth convex optimization problems. ( 0,757551539602968 )
Neural Comput - Supervised learning in multilayer spiking neural networks. ( 0,75739381702995 )
Neural Comput - Dissociable forms of repetition priming: a computational model. ( 0,755697031363077 )
Comput Methods Programs Biomed - Characterizing electrical signals evoked by acupuncture through complex network mapping: a new perspective on acupuncture. ( 0,75560205972916 )
Comput Math Methods Med - Results on a binding neuron model and their implications for modified hourglass model for neuronal network. ( 0,754492945601257 )
Neural Comput - Neuronal assembly dynamics in supervised and unsupervised learning scenarios. ( 0,753410906017961 )
Neural Comput - Rewiring-induced chaos in pulse-coupled neural networks. ( 0,752384124862446 )
Neural Comput - Information recall using relative spike timing in a spiking neural network. ( 0,750417172137942 )
Neural Comput - A self-organized artificial neural network architecture for sensory integration with applications to letter-phoneme integration. ( 0,750145425615389 )
Neural Comput - Sequential activity in asymmetrically coupled winner-take-all circuits. ( 0,749358524958888 )
Neural Comput - Broken symmetries in a location-invariant word recognition network. ( 0,749297990351626 )
IEEE Trans Neural Netw Learn Syst - Exponential Stabilization of Memristor-based Chaotic Neural Networks with Time-Varying Delays via Intermittent Control. ( 0,748488688732519 )
Neural Comput - Transmission of population-coded information. ( 0,748377759302712 )
Neural Comput - On antiperiodic solutions for Cohen-Grossberg shunting inhibitory neural networks with time-varying delays and impulses. ( 0,746202507244636 )
Neural Comput - A framework for simulating and estimating the state and functional topology of complex dynamic geometric networks. ( 0,744324757264916 )
Int J Neural Syst - From sensors to spikes: evolving receptive fields to enhance sensorimotor information in a robot-arm. ( 0,742401008861474 )
Neural Comput - Tuning low-voltage-activated A-current for silent gain modulation. ( 0,735575334265733 )
Neural Comput - Neural associative memory with optimal Bayesian learning. ( 0,735118712187997 )
Neural Comput - Fisher and Shannon information in finite neural populations. ( 0,734243014944769 )
Int J Neural Syst - Real-time EEG-based detection of fatigue driving danger for accident prediction. ( 0,73410673051978 )
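The scores in the list above (roughly 0.73 to 0.94) are consistent with a cosine similarity between vector representations of the cleaned abstracts, though the representation actually used is not documented. The sketch below is a minimal, hypothetical ranking based on TF-IDF vectors and scikit-learn; the function name, inputs, and placeholder strings are illustrative assumptions.

```python
# Hypothetical sketch: rank abstracts by cosine similarity of TF-IDF vectors of
# their cleaned text, producing (title, score) pairs like the list above. The
# actual representation behind the listed scores is unknown.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def rank_similar(query_clean, corpus_clean, titles, top_k=100):
    """Return (title, score) pairs for the abstracts most similar to the query."""
    vectorizer = TfidfVectorizer()
    matrix = vectorizer.fit_transform([query_clean] + corpus_clean)
    scores = cosine_similarity(matrix[0:1], matrix[1:]).ravel()
    order = scores.argsort()[::-1][:top_k]
    return [(titles[i], float(scores[i])) for i in order]

# Toy usage with placeholder strings; real input would be the cleaned abstracts.
corpus = ["spike neural network memori", "imag segment tumor detect"]
titles = ["Paper A", "Paper B"]
print(rank_similar("attractor network memori nois fisher inform", corpus, titles))
```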