Leabra
Leabra stands for "Local, Error-driven and Associative, Biologically Realistic Algorithm". It is a model of learning that balances Hebbian and error-driven learning with other network-derived characteristics. The model is used to mathematically predict outcomes based on inputs and previous learning influences, and it both draws on and contributes to neural network designs and models.
Leabra is the default algorithm in Emergent (the successor of PDP++) when a new project is created, and it is used extensively in a variety of simulations.
Hebbian learning is performed using the conditional principal components analysis (CPCA) algorithm, with a correction factor for sparse expected activity levels.
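As a rough illustration, CPCA drives each weight toward the conditional probability that the sender is active given that the receiver is active. Below is a minimal numpy sketch of the basic rule; the function and variable names are illustrative, and the sparse-activity correction factor is omitted.

import numpy as np

def cpca_hebbian_dw(x, y, w, lrate=0.01):
    # CPCA Hebbian weight change (sketch).
    # x: sender activations, shape (n_in,)
    # y: receiver activations, shape (n_out,)
    # w: weights in [0, 1], shape (n_in, n_out)
    # Each weight moves toward the sender activation, gated by the
    # receiver's activity: dw_ij = lrate * y_j * (x_i - w_ij).
    return lrate * y[None, :] * (x[:, None] - w)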
Error-driven learning is performed using GeneRec, a generalization of the recirculation algorithm that approximates Almeida-Pineda recurrent backpropagation. The symmetric, midpoint version of GeneRec is used, which is equivalent to the contrastive Hebbian learning (CHL) algorithm. See O'Reilly (1996; Neural Computation) for more details.
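In the symmetric midpoint case, the weight change contrasts products of sender and receiver activations between the two phases of settling. A hedged sketch (names are assumptions, not Emergent's API):

import numpy as np

def chl_dw(x_minus, y_minus, x_plus, y_plus, lrate=0.01):
    # Contrastive Hebbian learning (CHL) weight change (sketch).
    # Minus phase: the network settles with inputs only (its own answer).
    # Plus phase:  the network settles with inputs and target outputs.
    # dw_ij = lrate * (x+_i * y+_j - x-_i * y-_j)
    return lrate * (np.outer(x_plus, y_plus) - np.outer(x_minus, y_minus))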
The activation function is a point-neuron approximation with both discrete spiking and continuous rate-code output.
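A sketch of one settling step for the rate-code case, assuming a conductance-based membrane update and a saturating x/(x+1) output function; all parameter values here are illustrative defaults, not Emergent's.

import numpy as np

def point_neuron_step(v_m, g_e, g_i, dt=0.2, g_bar_e=1.0, g_bar_i=1.0,
                      g_bar_l=0.1, e_e=1.0, e_i=0.15, e_l=0.15,
                      theta=0.25, gain=600.0):
    # The membrane potential moves toward the balance of excitatory,
    # inhibitory, and leak currents (conductance times driving potential).
    i_net = (g_e * g_bar_e * (e_e - v_m) +
             g_i * g_bar_i * (e_i - v_m) +
             g_bar_l * (e_l - v_m))
    v_m = v_m + dt * i_net
    # Rate-code output: a saturating function of the above-threshold
    # potential, y = x / (x + 1) with x = gain * max(v_m - theta, 0).
    x = gain * np.maximum(v_m - theta, 0.0)
    return v_m, x / (x + 1.0)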
Inhibition at the layer or unit-group level can be computed directly using a k-winners-take-all (kWTA) function, producing sparse distributed representations.
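In the basic version, a single layer-wide inhibitory conductance is placed between the kth and (k+1)th highest threshold conductances, so at most k units end up above threshold. A sketch, where the interpolation parameter q is an assumption (typically around 0.25):

import numpy as np

def kwta_inhibition(g_theta, k, q=0.25):
    # g_theta: per-unit conductances g_i^Theta, the amount of inhibition
    # that would put each unit exactly at its firing threshold.
    sorted_g = np.sort(g_theta)[::-1]          # descending order
    g_k, g_k1 = sorted_g[k - 1], sorted_g[k]   # kth and (k+1)th highest
    return g_k1 + q * (g_k - g_k1)             # layer-wide g_i between them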
The net input is computed as an average, not a sum, over connections, based on normalized, sigmoidally transformed weight values, which are subject to scaling on a connection-group level to alter relative contributions. Automatic scaling is performed to compensate for differences in expected activity level in the different projections.
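A sketch of the average-based net input across several projections with normalized relative scaling; the automatic compensation for expected sender activity levels is omitted, and all names are illustrative.

import numpy as np

def net_input(acts_by_proj, wts_by_proj, rel_scales):
    # acts_by_proj: list of sender activation vectors, one per projection
    # wts_by_proj:  list of (n_send, n_recv) weight matrices
    # rel_scales:   relative contribution of each projection
    scales = np.asarray(rel_scales, dtype=float)
    scales = scales / scales.sum()             # normalize contributions
    total = 0.0
    for s, x, w in zip(scales, acts_by_proj, wts_by_proj):
        x = np.asarray(x, dtype=float)
        total = total + s * (x @ w) / x.size   # average, not sum
    return total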
Documentation about this algorithm can be found in the book "Computational Explorations in Cognitive Neuroscience: Understanding the Mind by Simulating the Brain", published by MIT Press, and in the Emergent documentation.
== Overview of the Leabra Algorithm ==
The pseudocode for Leabra is given here, showing exactly how the pieces of the algorithm described in more detail above fit together.

Iterate over minus and plus phases of settling for each event.
 o At start of settling, for all units:
   - Initialize all state variables (activation, v_m, etc).
   - Apply external patterns (clamp input in minus phase; input and
     output in plus phase).
   - Compute net input scaling terms (constants, computed here so the
     network can be dynamically altered).
   - Optimization: compute net input once from all static activations
     (e.g., hard-clamped external inputs).
 o During each cycle of settling, for all non-clamped units:
   - Compute excitatory net input (g_e(t), aka eta_j or net)
     -- sender-based optimization by ignoring inactive units.
   - Compute kWTA inhibition for each layer, based on g_i^Theta (the
     inhibition that would put each unit exactly at threshold):
     * Sort units into two groups based on g_i^Theta: the top k and
       the remaining k+1 -> n.
     * If basic, find the kth and (k+1)th highest;
       if avg-based, compute the average of 1 -> k and of k+1 -> n.
     * Set the inhibitory conductance g_i between g^Theta_k and
       g^Theta_(k+1).
   - Compute point-neuron activation combining excitatory input and
     inhibition.
 o After settling, for all units, record the final settling activations
   as either minus or plus phase (y^-_j or y^+_j).
After both phases, update the weights (based on linear current weight
values) for all connections (sketched in code after the pseudocode):
 o Compute error-driven weight changes with CHL, with soft weight bounding.
 o Compute Hebbian weight changes with CPCA from plus-phase activations.
 o Compute the net weight change as a weighted sum of the error-driven
   and Hebbian changes.
 o Increment the weights according to the net weight change.
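The final weight-update steps can be sketched as follows, assuming hebb_dw and err_dw are the changes produced by the CPCA and CHL rules above; the soft-bounding form and the mixing constant k_hebb are standard in Leabra descriptions, but the values here are illustrative.

import numpy as np

def leabra_weight_update(w, hebb_dw, err_dw, k_hebb=0.01):
    # Soft weight bounding on the error-driven component keeps weights
    # in [0, 1]: positive changes are scaled by (1 - w), negative by w.
    err_bounded = np.where(err_dw > 0, err_dw * (1.0 - w), err_dw * w)
    # Net change is a weighted mix of Hebbian and error-driven learning.
    dw = k_hebb * hebb_dw + (1.0 - k_hebb) * err_bounded
    return np.clip(w + dw, 0.0, 1.0)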


Source: Wikipedia, the free encyclopedia (English edition), article "Leabra".