Wörgötter, Florentin Andreas
Preferred Name
Wörgötter, Florentin Andreas
Official Name
Wörgötter, Florentin Andreas
Alternative Name
Wörgötter, Florentin A.
Worgotter, Florentin
Wörgötter, Florentin
Wörgötter, F.
Woergoetter, Florentin Andreas
Worgotter, Florentin A.
Worgotter, F. A.
Wörgötter, F. A.
Woergoetter, Florentin A.
Woergoetter, F. A.
Woergoetter, Florentin
Woergoetter, F.
Worgotter, F.
Worgotter, Florentin Andreas
Publications (showing 1–8 of 8)
2012 · Journal Article
Tetzlaff, Christian; Kolodziejski, Christoph; Timme, Marc; Wörgötter, Florentin
"Analysis of synaptic scaling in combination with Hebbian plasticity in several simple networks"
Frontiers in Computational Neuroscience 6, Art. 36. DOI: 10.3389/fncom.2012.00036. PMID: 22719724.
Abstract: Conventional synaptic plasticity in combination with synaptic scaling is a biologically plausible plasticity rule that guides the development of synapses toward stability. Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks with plasticity and scaling. We show under which constraints an external input given to a feed-forward network forms an input trace similar to a cell assembly (Hebb, 1949) by enhancing synaptic weights to larger stable values than in the rest of the network. For instance, a weak input creates a weaker representation in the network than a strong input, which produces a trace along large parts of the network. These processes are strongly influenced by the underlying connectivity. For example, when embedding recurrent structures (excitatory rings, etc.) into a feed-forward network, the input trace is extended into more distant layers, while inhibition shortens it. These findings provide a better understanding of the dynamics of generic network structures in which plasticity is combined with scaling, and they make it possible to use this rule for constructing artificial networks with desired storage properties.

2010 · Journal Article
Kulvicius, Tomas; Kolodziejski, Christoph; Tamosiunaite, Minija; Porr, Bernd; Wörgötter, Florentin
"Behavioral analysis of differential hebbian learning in closed-loop systems"
Biological Cybernetics 103(4), 255-271. DOI: 10.1007/s00422-010-0396-4. PMID: 20556620.
Abstract: Understanding closed-loop behavioral systems is a non-trivial problem, especially when they change during learning. Descriptions of closed-loop systems in terms of information theory date back to the 1950s; however, there have been only a few attempts that take learning into account, mostly measuring information in the inputs. In this study we analyze a specific type of closed-loop system by looking at the output space as well as the input space. For this, we investigate simulated agents that perform differential Hebbian learning (STDP). In the first part we show that analytical solutions can be found for the temporal development of such systems in relatively simple cases. In the second part we try to answer the following question: how can we predict which system from a given class would be best for a particular scenario? This question is addressed using energy, input/output-ratio, and entropy measures, and by investigating their development during learning. In this way we show that, within well-specified scenarios, there are indeed agents that are optimal with respect to their structure and adaptive properties.

2008 · Journal Article
Kolodziejski, Christoph; Porr, Bernd; Wörgötter, Florentin
"Mathematical properties of neuronal TD-rules and differential Hebbian learning: a comparison"
Biological Cybernetics 98(3), 259-272. DOI: 10.1007/s00422-007-0209-6. PMID: 18196266.
Abstract: A confusingly wide variety of temporally asymmetric learning rules exists, related to reinforcement learning and/or to spike-timing-dependent plasticity; many of them look exceedingly similar while displaying strongly different behavior. These rules often find their use in control tasks, for example in robotics, for which rigorous convergence and numerical stability are required. The goal of this article is to review and compare these rules to provide a better overview of their different properties. Two main classes are discussed: temporal-difference (TD) rules and correlation-based (differential Hebbian) rules, together with some transition cases. In general we focus on neuronal implementations with changeable synaptic weights and a time-continuous representation of activity. In a machine-learning (non-neuronal) context, a solid mathematical theory for TD learning has existed for several years; this can partly be transferred to a neuronal framework. On the other hand, only now has a more complete theory also emerged for differential Hebbian rules. In general, the rules differ in their convergence conditions and their numerical stability, which can lead to very undesirable behavior in applications. For TD, convergence can be enforced with a certain output condition assuring that the delta-error drops on average to zero (output control). Correlation-based rules, on the other hand, converge when one input drops to zero (input control). Temporally asymmetric learning rules treat situations where incoming stimuli follow each other in time; thus it is necessary to remember the first stimulus in order to relate it to the later-occurring second one. To this end, the two types of rules use different kinds of so-called eligibility traces, which again leads to different properties of TD and differential Hebbian learning, as discussed here. This paper, while also presenting several novel mathematical results, is therefore mainly meant to provide a road map through the different neuronally emulated temporally asymmetric learning rules and their behavior, offering some guidance for possible applications.

2013 · Journal Article
Tetzlaff, Christian; Kolodziejski, Christoph; Timme, Marc; Tsodyks, Misha; Wörgötter, Florentin
"Synaptic Scaling Enables Dynamically Distinct Short- and Long-Term Memory Formation"
PLoS Computational Biology 9(10), e1003307. DOI: 10.1371/journal.pcbi.1003307. PMID: 24204240.
Abstract: Memory storage in the brain relies on mechanisms acting on time scales from minutes, for long-term synaptic potentiation, to days, for memory consolidation. During such processes, neural circuits distinguish synapses relevant for forming long-term storage, which are consolidated, from synapses of short-term storage, which fade. How time-scale integration and synaptic differentiation are achieved simultaneously remains unclear. Here we show that synaptic scaling, a slow process usually associated with the maintenance of activity homeostasis, combined with synaptic plasticity may achieve both simultaneously, thereby providing a natural separation of short- from long-term storage. The interaction between plasticity and scaling also provides an explanation for an established paradox whereby memory consolidation critically depends on the exact order of learning and recall. These results indicate that scaling may be fundamental for stabilizing memories, providing a dynamic link between early and late memory-formation processes.

2011 · Journal Article
Tetzlaff, Christian; Kolodziejski, Christoph; Timme, Marc; Wörgötter, Florentin
"Synaptic scaling generically stabilizes circuit connectivity"
BMC Neuroscience 12(Suppl 1), P372. DOI: 10.1186/1471-2202-12-S1-P372.

2013 · Journal Article
Faghihi, Faramarz; Kolodziejski, Christoph; Fiala, Andre; Wörgötter, Florentin; Tetzlaff, Christian
"An information theoretic model of information processing in the Drosophila olfactory system: the role of inhibitory neurons for system efficiency"
Frontiers in Computational Neuroscience 7, Art. 183. DOI: 10.3389/fncom.2013.00183. PMID: 24391579.
Abstract: Fruit flies (Drosophila melanogaster) rely on their olfactory system to process environmental information. This information has to be transmitted by the olfactory system, without system-relevant loss, to deeper brain areas for learning. Here we study the role of several parameters of the fly's olfactory system and of the environment, and how they influence olfactory information transmission. We designed an abstract model of the antennal lobe, the mushroom body, and the inhibitory circuitry. Mutual information between the olfactory environment, simulated in terms of different odor concentrations, and a sub-population of intrinsic mushroom-body neurons (Kenyon cells) was calculated to quantify the efficiency of information transmission. With this method we study, on the one hand, the effect of different connectivity rates between olfactory projection neurons and the firing thresholds of Kenyon cells; on the other hand, we analyze the influence of inhibition on the mutual information between environment and mushroom body. Our simulations show the expected linear relation between the antennal-lobe-to-mushroom-body connectivity rate and the Kenyon-cell firing threshold required for maximum mutual information, for both low and high odor concentrations. However, contradicting everyday experience, high odor concentrations cause a drastic, and unrealistic, decrease in mutual information for all connectivity rates compared with low concentrations. When inhibition of the mushroom body is included, in contrast, mutual information remains at high levels independent of the other system parameters. This finding points to a pivotal role of inhibition in fly information processing, without which system efficiency would be substantially reduced.

2011 · Journal Article
Tetzlaff, Christian; Kolodziejski, Christoph; Timme, Marc; Wörgötter, Florentin
"Synaptic scaling in combination with many generic plasticity mechanisms stabilizes circuit connectivity"
Frontiers in Computational Neuroscience 5, Art. 47. DOI: 10.3389/fncom.2011.00047. PMID: 22203799.
Abstract: Synaptic scaling is a slow process that modifies synapses, keeping the firing rate of neural circuits in specific regimes. Together with other processes, such as conventional synaptic plasticity in the form of long-term depression and potentiation, synaptic scaling changes the synaptic patterns in a network, ensuring diverse, functionally relevant, stable, and input-dependent connectivity. How synaptic patterns are generated and stabilized, however, is largely unknown. Here we formally describe and analyze synaptic scaling based on results from experimental studies, and demonstrate that the combination of different conventional plasticity mechanisms with synaptic scaling provides a powerful general framework for regulating network connectivity. In addition, we design several simple models that reproduce experimentally observed synaptic distributions as well as the synaptic modifications observed during sustained activity changes. These models predict that the combination of plasticity with scaling generates globally stable, input-controlled synaptic patterns, also in recurrent networks. Thus, in combination with other forms of plasticity, synaptic scaling can robustly yield neuronal circuits with high synaptic diversity, potentially enabling robust dynamic storage of complex activation patterns. This mechanism is even more pronounced in networks with a realistic degree of inhibition. Synaptic scaling combined with plasticity could thus be the basis for learning structured behavior even in initially random networks.

2012 · Journal Article
Tetzlaff, Christian; Kolodziejski, Christoph; Markelic, Irene; Wörgötter, Florentin
"Time scales of memory, learning, and plasticity"
Biological Cybernetics 106(11-12), 715-726. DOI: 10.1007/s00422-012-0529-z. PMID: 23160712.
Abstract: If we stored every bit of input, the storage capacity of our nervous system would be reached after only about ten days. The nervous system relies on at least two mechanisms that counteract this capacity limit: compression and forgetting. But the latter mechanism needs to know how long an entity should be stored: some memories are relevant only for the next few minutes, some remain important even after several years. Psychology and physiology have found and described many different memory mechanisms, and these mechanisms indeed operate on different time scales. In this prospect we review these mechanisms with respect to their time scales and propose relations between the mechanisms of learning and memory and their underlying physiological basis.
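Several of the entries above (the 2011 and 2012 Frontiers papers in particular) combine conventional Hebbian plasticity with synaptic scaling so that weights grow under correlated activity yet remain bounded. A minimal illustrative sketch for a single rate-based synapse, assuming a commonly cited combined rule dw/dt = mu*u*v + gamma*(v_target - v)*w^2 (the papers' exact formulation, parameters, and network setting may differ):

```python
# One plastic synapse: Hebbian growth plus a synaptic-scaling term.
# Assumed rule (illustrative, not the papers' exact model):
#   dw/dt = mu * u * v + gamma * (v_target - v) * w**2
# with presynaptic rate u and postsynaptic rate v = w * u.

def simulate(u=1.0, v_target=0.5, mu=0.01, gamma=0.05, w0=0.1, dt=0.1, steps=20000):
    w = w0
    for _ in range(steps):
        v = w * u                                        # simple rate neuron
        dw = mu * u * v + gamma * (v_target - v) * w**2  # Hebb + scaling
        w += dt * dw                                     # Euler integration
    return w

w_final = simulate()
# Pure Hebbian growth alone would diverge; the scaling term pushes the
# postsynaptic rate toward v_target, so w settles at a stable fixed point
# (about 0.76 for these parameters).
```

The fixed point is where the Hebbian drive and the scaling drive cancel, which is the stabilization mechanism the abstracts describe.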
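The 2010 and 2008 Biological Cybernetics papers analyze differential Hebbian learning, where a weight changes with the correlation between an input and the temporal derivative of the output, so an early input that reliably precedes a late "reflex" input acquires weight. A hypothetical sketch (the Gaussian pulse shapes, timings, and learning rate are invented for illustration, not taken from the papers):

```python
# Differential Hebbian learning sketch: dw1 = mu * u1 * dv,
# where v = w0*u0 + w1*u1 and u1 precedes the reflex input u0.
import math

def pulse(t, t0, width=5.0):
    """Smooth unit pulse centred at t0 (hypothetical filtered input)."""
    return math.exp(-((t - t0) ** 2) / (2 * width ** 2))

def learn_trials(trials=50, mu=0.05, dt=0.1, T=100.0):
    w0, w1 = 1.0, 0.0                  # fixed reflex weight, plastic weight
    for _ in range(trials):
        v_prev = 0.0
        for i in range(int(T / dt)):
            t = i * dt
            u0 = pulse(t, 60.0)        # late "reflex" input
            u1 = pulse(t, 40.0)        # early predictive input
            v = w0 * u0 + w1 * u1
            w1 += mu * u1 * (v - v_prev)   # correlate u1 with dv/dt
            v_prev = v
    return w1

w1_final = learn_trials()
# w1 becomes positive because u1 is active while v is still rising.
```

Reversing the pulse order (u1 after u0) would make the weight change negative, which is the temporal asymmetry these rules share with STDP.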
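The 2008 comparison paper contrasts those correlation-based rules with temporal-difference (TD) rules, for which convergence means the delta-error r + gamma*V(s') - V(s) drops on average to zero (output control). A standard machine-learning-style TD(0) example on a toy five-state chain, shown here only to make the delta-error concrete, not as the paper's neuronal implementation:

```python
# Tabular TD(0) on a deterministic 5-state chain with reward 1 at the end.
# As V converges, delta = r + gamma*V(s') - V(s) shrinks toward zero.

def td0_chain(n_states=5, gamma=0.9, alpha=0.1, episodes=2000):
    V = [0.0] * (n_states + 1)          # V[n_states] is the terminal state
    for _ in range(episodes):
        for s in range(n_states):       # walk left to right each episode
            r = 1.0 if s == n_states - 1 else 0.0
            delta = r + gamma * V[s + 1] - V[s]
            V[s] += alpha * delta
    return V[:n_states]

values = td0_chain()
# Converged value of state s approaches gamma**(n_states - 1 - s),
# e.g. values[0] -> 0.9**4 and values[4] -> 1.0.
```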
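The 2013 Drosophila study quantifies transmission efficiency as the mutual information between the odor environment and Kenyon-cell responses. The estimator itself is standard; here is a sketch over toy joint distributions (the probability tables are invented for illustration and are not data from the paper):

```python
# Mutual information I(S;R) in bits from a discrete joint table joint[s][r].
import math

def mutual_information(joint):
    ps = [sum(row) for row in joint]          # marginal over stimuli
    pr = [sum(col) for col in zip(*joint)]    # marginal over responses
    mi = 0.0
    for s, row in enumerate(joint):
        for r, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (ps[s] * pr[r]))
    return mi

# Perfectly informative response: each stimulus maps to a unique response.
ident = [[0.5, 0.0], [0.0, 0.5]]
# Uninformative response: response independent of the stimulus.
indep = [[0.25, 0.25], [0.25, 0.25]]
print(mutual_information(ident))  # 1.0
print(mutual_information(indep))  # 0.0
```

In the paper's setting the joint distribution would come from simulated odor concentrations and Kenyon-cell activity; high mutual information then indicates efficient transmission.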