Recoding
The smart image compression algorithm in the retina: a theoretical study of recoding inputs in neural circuits
Computation in neural circuits relies on a common set of motifs, including divergence of common inputs to parallel pathways, convergence of multiple inputs to a single neuron, and nonlinearities that select some signals over others. Convergence and circuit nonlinearities, considered individually, can lead to a loss of information about the inputs. Past work has detailed how to optimize nonlinearities and circuit weights to maximize information, but we show that selective nonlinearities, acting together with divergent and convergent circuit structure, can improve information transmission over a purely linear circuit despite the suboptimality of these components individually. These nonlinearities recode the inputs in a manner that preserves the variance among converged inputs. Our results suggest that neural circuits may be doing better than expected without finely tuned weights.
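A minimal sketch of the qualitative claim, not the study's actual analysis: two discrete inputs are either summed linearly onto two outputs with fixed, untuned weights, or first diverge into rectified ON/OFF pathways that then converge with the same weights. The input grid, the ReLU rectification standing in for a selective nonlinearity, and the equal convergence weights are all assumptions made for this illustration. Because each toy circuit is deterministic, the transmitted information I(X;Y) equals the output entropy H(Y), which can be computed exactly by enumeration.

```python
# Toy comparison (illustrative assumptions only): linear convergence with
# untuned weights vs. divergence into rectified ON/OFF pathways followed by
# the same convergence. For a deterministic circuit, I(X;Y) = H(Y).
import itertools
from collections import Counter
from math import log2


def entropy(counts, total):
    """Shannon entropy (bits) of an empirical output distribution."""
    return -sum((n / total) * log2(n / total) for n in counts.values())


def relu(v):
    return max(v, 0.0)


# Equiprobable discrete inputs standing in for two upstream signals.
levels = [-2, -1, 0, 1, 2]
pairs = list(itertools.product(levels, repeat=2))

# Purely linear circuit: both converged outputs carry the same untuned sum
# (sign-flipped in the second pathway), so they are redundant.
linear_outputs = Counter((x1 + x2, -(x1 + x2)) for x1, x2 in pairs)

# Divergent ON/OFF recoding: each input is rectified before convergence,
# so the two converged outputs are no longer redundant.
onoff_outputs = Counter(
    (relu(x1) + relu(x2), relu(-x1) + relu(-x2)) for x1, x2 in pairs
)

n = len(pairs)
print(f"Input entropy H(X)          : {log2(n):.2f} bits")
print(f"Linear circuit  I(X;Y)=H(Y) : {entropy(linear_outputs, n):.2f} bits")
print(f"ON/OFF recoding I(X;Y)=H(Y) : {entropy(onoff_outputs, n):.2f} bits")
```

In this toy setting the ON/OFF outputs still determine the linear sum (their difference recovers it), so the rectified recoding can only refine the partition of inputs and transmits at least as much information, even though each rectification on its own discards signal.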
Analogical encodings and recodings
This talk will focus on the idea that the kind of similarity driving analogical retrieval is determined by the kinds of features encoded about the source and the target (cue) situations. Emphasis will be placed on educational perspectives in order to show the influence of world semantics on learners’ problem representations and solving strategies, as well as the difficulties arising from semantic incongruence between representations and strategies. Special attention will be given to the recoding of semantically incongruent representations, a crucial step that learners struggle with, in order to illustrate a promising path for moving beyond informal strategies.
The smart image compression algorithm in the retina: recoding inputs in neural circuits
COSYNE 2022