## Neural Nets, Connectionism, Perceptrons, etc.

*03 Jun 2016 15:44*

I'm mostly interested in them as a means of machine learning or statistical inference. I am particularly interested in their role as models of dynamical systems (via recurrent nets, generally), and as models of transduction.

I need to understand better how the analogy to spin glasses works, but then, I need to understand spin glasses better too.
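
The core of the analogy, at least, is concrete: the Hopfield model's energy function is formally a spin-glass Hamiltonian, E(s) = -(1/2) Σ_ij J_ij s_i s_j over ±1 "spins", except that the couplings J are Hebbian rather than random. A minimal sketch (sizes, seeds and noise levels here are my own illustration, not drawn from any of the references below):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                                        # number of +/-1 "spins" (neurons)
patterns = rng.choice([-1, 1], size=(3, N))    # memories to store

# Hebbian couplings: J_ij = (1/N) sum_mu xi^mu_i xi^mu_j, zero self-coupling
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def energy(s):
    # The spin-glass Hamiltonian with learned (non-random) couplings
    return -0.5 * s @ J @ s

def recall(s, sweeps=10):
    """Asynchronous zero-temperature dynamics: each flip can only lower E."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Flip ~15% of one stored pattern, then let the dynamics clean it up
noisy = patterns[0] * rng.choice([1, -1], size=N, p=[0.85, 0.15])
fixed = recall(noisy)
overlap = fixed @ patterns[0] / N   # +1 means perfect retrieval
print(overlap)
```

Retrieval works here because the loading (3 patterns on 100 units) is far below the model's capacity; the spin-glass machinery earns its keep when one asks what happens as that loading grows.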

I find unconvincing the arguments that connectionist models are superior,
for purposes of cognitive science, to more "symbolic" ones. (Saying that
they're more biologically realistic is like saying that cars are better models
of animal locomotion than bicycles, because cars have four appendages in
contact with the ground and not two.)
This is not to say, of course, that some connectionist models of cognition
aren't interesting, insightful and valid; but the same is true of many symbolic
models, and there seems no compelling reason for abandoning the latter in favor
of the former. (For more on this point, see Marcus, and my forthcoming review
of his book.) --- *Of course* a cognitive model which cannot be
implemented in real brains must be rejected; connecting neurobiology to
cognition can hardly be too ardently desired. The point is that the elements
in connectionist models called "neurons" bear only the sketchiest resemblance
to the real thing, and neural nets are no more than caricatures of real
neuronal circuits. Sometimes sketchy resemblances and caricatures are enough
to help us learn, which is why Hebb, McCulloch and Neural
Computation are important for both connectionism and neurobiology.
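
For the record, the caricature itself is tiny: a McCulloch-Pitts-style threshold unit, trained with Rosenblatt's mistake-driven perceptron rule. Everything in this sketch (the teacher weights, the Gaussian data, the margin filter that guarantees separability) is an illustrative assumption of mine:

```python
import numpy as np

def step(x):
    # McCulloch-Pitts threshold unit: fire (+1) iff input exceeds threshold
    return np.where(x >= 0, 1, -1)

rng = np.random.default_rng(1)
w_true = np.array([2.0, -1.0])                 # illustrative "teacher"
raw = rng.normal(size=(500, 2))
margin = raw @ w_true + 0.5
keep = np.abs(margin) > 0.5                    # enforce a margin: data separable
X, y = raw[keep], step(margin[keep])

w, b = np.zeros(2), 0.0
for epoch in range(500):
    mistakes = 0
    for xi, yi in zip(X, y):
        if step(xi @ w + b) != yi:             # update only on mistakes
            w += yi * xi
            b += yi
            mistakes += 1
    if mistakes == 0:                          # convergence theorem: guaranteed
        break                                  # to happen on separable data

accuracy = float(np.mean(step(X @ w + b) == y))
print(accuracy)   # 1.0 once training converges
```

The perceptron convergence theorem bounds the total number of mistakes by (R/γ)², with R the data radius and γ the margin, which is why the margin filter above makes termination certain; Minsky and Papert's Perceptrons is precisely about what such units cannot represent.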

- Recommended (big picture):
- Larry Abbott and Terrence Sejnowski (eds.), Neural Codes and Distributed Representations
- Michael A. Arbib, Brains, Machines and Mathematics [1964; a model of clarity in exposition and thought]
- Michael A. Arbib (ed.), The Handbook of Brain Theory and Neural Networks
- Dana Ballard, An Introduction to Natural Computation [Review: Not Natural Enough]
- M. J. Barber, J. W. Clark and C. H. Anderson, "Neural Representation of Probabilistic Information," cond-mat/0108425
- Maureen Caudill and Charles Butler, Naturally Intelligent Systems
- Patricia Churchland and Terrence Sejnowski, The Computational Brain
- Chris Eliasmith and Charles Anderson, Neural Engineering: Computation, Representation, and Dynamics in Neurobiological Systems
- Donald O. Hebb, The Organization of Behavior: A Neuropsychological Theory
- Hinton and Sejnowski (eds.), Unsupervised Learning [A sort of "Neural Computation's Greatest Hits" compilation]
- Gary F. Marcus, The Algebraic Mind: Integrating Connectionism and Cognitive Science [On the limits of the connectionist approach to cognition, with special reference to language and grammar.]
- Warren S. McCulloch, Embodiments of Mind
- Brian Ripley, Pattern Recognition and Neural Networks
- V. N. (=Vladimir Naumovich) Vapnik, The Nature of Statistical Learning Theory [Review: A Useful Biased Estimator]
- T. L. H. Watkin, A. Rau and M. Biehl, "The Statistical Mechanics of Learning a Rule," Reviews of Modern Physics **65**(1993): 499--556
- Achilleas Zapranis and Apostolos-Paul Refenes, Principles of Neural Model Identification, Selection and Adequacy, with Applications to Financial Econometrics [Their English is less than perfect, but they've got very sound ideas about all the important topics]

- Recommended (close-ups; very misc. and small):
- Martin Anthony and Peter L. Bartlett, Neural Network Learning: Theoretical Foundations
- M. J. Barber, J. W. Clark and C. H. Anderson, "Neural Representation of Probabilistic Information", Neural Computation **15**(2003): 1843--1864 = arxiv:cond-mat/0108425
- Suzanna Becker, "Unsupervised Learning Procedures for Neural Networks", International Journal of Neural Systems **2**(1991): 17--33
- Surya Ganguli, Dongsung Huh and Haim Sompolinsky, "Memory traces in dynamical systems", Proceedings of the National Academy of Sciences (USA) **105**(2008): 18970--18975
- Anders Krogh and Jesper Vedelsby, "Neural Network Ensembles, Cross Validation, and Active Learning", NIPS 7 (1994): 231--238
- Mathukumalli Vidyasagar, A Theory of Learning and Generalization: With Applications to Neural Networks and Control Systems [Extensive discussion of the application of statistical learning theory to neural networks, along with the purely computational difficulties. Mini-review]

- To read [with abundant thanks to Osame Kinouchi for recommendations]:
- Daniel Amit, Modelling Brain Function
- V. M. Becerra, F. R. Garces, S. J. Nasuto and W. Holderbaum, "An Efficient Parameterization of Dynamic Neural Networks for Nonlinear System Identification", IEEE Transactions on Neural Networks **16**(2005): 983--988
- William Bechtel and Adele Abrahamsen, Connectionism and the Mind: Parallel Processing, Dynamics, and Evolution in Networks
- William Bechtel and Robert C. Richardson, Discovering Complexity: Decomposition and Localization as Strategies in Scientific Research
- Randall Beer, Intelligence as Adaptive Behavior: An Experiment in Computational Neuroethology [Simulated bugs!]
- Hugues Berry and Mathias Quoy, "Structure and Dynamics of Random Recurrent Neural Networks", Adaptive Behavior **14**(2006): 129--137
- Dimitri P. Bertsekas and John N. Tsitsiklis, Neuro-Dynamic Programming
- Michael Biehl, Reimer Kühn, Ion-Olimpiu Stamatescu, "Learning structured data from unspecific reinforcement," cond-mat/0001405
- D. Bollé and P. Kozlowski, "On-line learning and generalisation in coupled perceptrons," cond-mat/0111493
- Christoph Bunzmann, Michael Biehl, and Robert Urbanczik, "Efficient training of multilayer perceptrons using principal component analysis", Physical Review E **72**(2005): 026117
- Gail A. Carpenter and Stephen Grossberg (eds.), Pattern Recognition by Self-Organizing Neural Networks
- Nestor Caticha and Osame Kinouchi, "Time ordering in the evolution of information processing and modulation systems," Philosophical Magazine B **77**(1998): 1565--1574
- Axel Cleeremans, Mechanisms of Implicit Learning: Connectionist Models of Sequence Processing
- A. C. C. Coolen, "Statistical Mechanics of Recurrent Neural Networks": part I, "Statics," cond-mat/0006010 and part II, "Dynamics," cond-mat/0006011
- A. C. C. Coolen, R. Kuehn, and P. Sollich, Theory of Neural Information Processing Systems
- A. C. C. Coolen and D. Saad, "Dynamics of Learning with Restricted Training Sets," Physical Review E **62**(2000): 5444--5487
- Mauro Copelli, Antonio C. Roque, Rodrigo F. Oliveira and Osame Kinouchi, "Enhanced dynamic range in a sensory network of excitable elements," cond-mat/0112395
- Valeria Del Prete and Alessandro Treves, "A theoretical model of neuronal population coding of stimuli with both continuous and discrete dimensions," cond-mat/0103286
- M. C. P. deSouto, T. B. Ludermir and W. R. deOliveira, "Equivalence Between RAM-Based Neural Networks and Probabilistic Automata", IEEE Transactions on Neural Networks **16**(2005): 996--999
- Eytan Domany, Jan Leonard van Hemmen and Klaus Schulten (eds.), Models of Neural Networks III: Association, Generalization, and Representation
- Viktor Dotsenko, Introduction to the Theory of Spin Glasses and Neural Networks
- Liat Ein-Dor and Ido Kanter, "Confidence in prediction by neural networks," Physical Review E **60**(1999): 799--802
- Chris Eliasmith, "A Unified Approach to Building and Controlling Spiking Attractor Networks", Neural Computation **17**(2005): 1276--1314
- Elman et al., Rethinking Innateness
- Frank Emmert-Streib
  - "Self-organized annealing in laterally inhibited neural networks shows power law decay", cond-mat/0401633
  - "A Heterosynaptic Learning Rule for Neural Networks", cond-mat/0608564

- Andreas Engel and Christian P. L. Van den Broeck, Statistical Mechanics of Learning
- Magnus Enquist and Stefano Ghirlanda, Neural Networks and Animal Behavior
- Michael Feindt, "A Neural Bayesian Estimator for Conditional Probability Densities", physics/0402093
- Gary William Flake, "The Calculus of Jacobian Adaptation" [Not confined to neural nets]
- Leonardo Franco, "A measure for the complexity of Boolean functions related to their implementation in neural networks," cond-mat/0111169
- Jürgen Franke and Michael H. Neumann, "Bootstrapping Neural Networks," Neural Computation **12**(2000): 1929--1949
- Gärdenfors, Conceptual Spaces: The Geometry of Thought
- Ian Goodfellow, Yoshua Bengio and Aaron Courville, Deep Learning
- F. A. von Hayek, The Sensory Order
- Michiel Hermans and Benjamin Schrauwen, "Recurrent Kernel Machines: Computing with Infinite Echo State Networks", Neural Computation **24**(2012): 104--133
- D. Herschkowitz and M. Opper, "Retarded Learning: Rigorous Results from Statistical Mechanics," cond-mat/0103275
- Dirk Husmeier, Neural Networks for Conditional Probability Estimation
- Jun-ichi Inoue and A. C. C. Coolen, "Dynamics of on-line Hebbian learning with structurally unrealizable restricted training sets," cond-mat/0105004
- Henrik Jacobsson, "Rule Extraction from Recurrent Neural Networks: A Taxonomy and Review", Neural Computation **17**(2005): 1223--1263
- Jim W. Kay and D. M. Titterington (eds.), Statistics and Neural Networks: Advances at the Interface
- I. Kanter, W. Kinzel and E. Kanter, "Secure exchange of information by synchronization of neural networks," cond-mat/0202112
- Alon Keinan, Ben Sandbank, Claus C. Hilgetag, Isaac Meilijson and Eytan Ruppin, "Fair Attribution of Functional Contribution in Artificial and Biological Networks", Neural Computation **16**(2004): 1887--1915
- Beom Jun Kim, "Performance of networks of artificial neurons: The role of clustering", q-bio.NC/0402045
- Osame Kinouchi and Nestor Caticha, "Optimal Generalization in Perceptrons," Journal of Physics A **25**(1992): 6243--6250
- W. Kinzel
  - "Statistical Physics of Neural Networks," Computer Physics Communications **122**(1999): 86--93
  - "Phase transitions of neural networks," Philosophical Magazine B **77**(1998): 1455--1477

- W. Kinzel, R. Metzler and I. Kanter, "Dynamics of Interacting Neural Networks," Journal of Physics A **33**(2000): L141--L147
- Konstantin Klemm, Stefan Bornholdt and Heinz Georg Schuster, "Beyond Hebb: XOR and biological learning," adap-org/9909005
- G.A. Kohring, "Artificial Neurons with Arbitrarily Complex Internal Structures," cs.NE/0108009
- Kohonen, Self-organization and associative memory [Start of the huge literature on self-organizing maps, which I ought to get a grip on]
- John F. Kolen (ed.), A Field Guide to Dynamical Recurrent Networks
- Hertz, Krogh and Palmer, Introduction to the Theory of Neural Computation
- Hannes Leitgeb, "Interpreted Dynamical Systems and Qualitative Laws: From Neural Networks to Evolutionary Systems", Synthese **146**(2005): 189--202 ["Interpreted dynamical systems are dynamical systems with an additional interpretation mapping by which propositional formulas are assigned to system states. The dynamics of such systems may be described in terms of qualitative laws for which a satisfaction clause is defined. We show that the systems C and CL of nonmonotonic logic are adequate with respect to the corresponding description of the classes of interpreted ordered and interpreted hierarchical systems, respectively"]
- Andrea Loettgers, "Getting Abstract Mathematical Models in Touch with Nature", Science in Context **20**(2007): 97--124 [Intellectual history of the Hopfield model and its reception]
- Yonatan Loewenstein and H. Sebastian Seung, "Operant matching is a generic outcome of synaptic plasticity based on the covariance between reward and neural activity", Proceedings of the National Academy of Sciences (USA) **103**(2006): 15224--15229 [The abstract promises a result about all possible neural mechanisms having some fairly generic features; this is clearly the right way to do theoretical neuroscience, but rarely done...]
- Wolfgang Maass (ed.), Pulsed Neural Networks
- Wolfgang Maass and Eduardo D. Sontag, "Neural Systems as Nonlinear Filters," Neural Computation **12**(2000): 1743--1772
- M. S. Mainieri and R. Erichsen Jr, "Retrieval and Chaos in Extremely Diluted Non-Monotonic Neural Networks," cond-mat/0202097
- Daniele Marinazzo, Mario Pellicoro, Sebastiano Stramaglia, "Causal interactions and delays in a neuronal ensemble", cond-mat/0609523
- McClelland and Rumelhart (ed.), Parallel Distributed Processing
- Patrick C. McGuire, Henrik Bohr, John W. Clark, Robert Haschke, Chris Pershing and Johann Rafelski, "Threshold Disorder as a Source of Diverse and Complex Behavior in Random Nets," cond-mat/0202190
- Richard Metzler, Wolfgang Kinzel, Liat Ein-Dor and Ido Kanter, "Generation of anti-predictable time series by a Neural Network," cond-mat/0011302
- R. Metzler, W. Kinzel and I. Kanter, "Interacting Neural Networks," Physical Review E **62**(2000): 2555--2565 [abstract]
- Minsky and Papert, Perceptrons
- Seiji Miyoshi, Kazuyuki Hara, and Masato Okada, "Analysis of ensemble learning using simple perceptrons based on online learning theory", Physical Review E **71**(2005): 036116
- Javier R. Movellan, Paul Mineiro, and R. J. Williams, "A Monte Carlo EM Approach for Partially Observable Diffusion Processes: Theory and Applications to Neural Networks," Neural Computation **14**(2002): 1507--1544
- Randall C. O'Reilly, "Generalization in Interactive Networks: The Benefits of Inhibitory Competition and Hebbian Learning," Neural Computation **13**(2001): 1199--1241
- Steven Phillips, "Systematic Minds, Unsystematic Models: Learning Transfer in Humans and Networks", Minds and Machines **9**(1999): 383--398
- Patrick D. Roberts, "Dynamics of Temporal Learning Rules," Physical Review E **62**(2000): 4077--4082
- Fabrice Rossi, Brieuc Conan-Guez, "Functional Multi-Layer Perceptron: a Nonlinear Tool for Functional Data Analysis", arxiv:0709.3642
- Fabrice Rossi, Nicolas Delannay, Brieuc Conan-Guez, Michel Verleysen, "Representation of Functional Data in Neural Networks", arxiv:0709.3641
- Ines Samengo, "Independent neurons representing a finite set of stimuli: dependence of the mutual information on the number of units sampled," Network: Computation in Neural Systems **12**(2000): 21--31, cond-mat/0202023
- Ines Samengo and Alessandro Treves, "Representational capacity of a set of independent neurons," cond-mat/0201588
- Vitaly Schetinin and Anatoly Brazhnikov, "Diagnostic Rule Extraction Using Neural Networks", cs.NE/0504057
- Philip Seliger, Stephen C. Young, and Lev S. Tsimring, "Plasticity and learning in a network of coupled phase oscillators," nlin.AO/0110044
- Paul Smolensky and Géraldine Legendre, The Harmonic Mind: From Neural Computation to Optimality-Theoretic Grammar
- Dietrich Stauffer and Amnon Aharony, "Efficient Hopfield pattern recognition on a scale-free neural network," cond-mat/0212601
- Samy Tindel, "The stochastic calculus method for spin systems", Annals of Probability **33**(2005): 561--581 = math.PR/0503652 [One of the kinds of spin systems being perceptrons]
- Marc Toussaint
  - "On model selection and the disability of neural networks to decompose tasks," nlin.AO/0202038
  - "A neural model for multi-expert architectures," nlin.AO/0202039

- T. Uezu and A. C. C. Coolen, "Hierarchical Self-Programming in Recurrent Neural Networks," cond-mat/0109099
- Robert Urbanczik, "Statistical Physics of Feedforward Neural Networks," cond-mat/0201530
- Leslie G. Valiant
  - Circuits of the Mind
  - "Memorization and Association on a Realistic Neural Model", Neural Computation **17**(2005): 527--555 ["A central open question of computational neuroscience is to identify the data structures and algorithms that are used in mammalian cortex to support successive acts of the basic cognitive tasks of memorization and association. This letter addresses the simultaneous challenges of realizing these two distinct tasks with the same data structure, and doing so while respecting the following four basic quantitative parameters of cortex: the neuron number, the synapse number, the synapse strengths, and the switching times. Previous work has not succeeded in reconciling these opposing constraints, the low values of synapse strengths that are typically observed experimentally having contributed a particular obstacle. In this article, we describe a computational scheme that supports both memory formation and association and is feasible on networks of model neurons that respect the widely observed values of the four quantitative parameters. Our scheme allows for both disjoint and shared representations. The algorithms are simple, and in one version both memorization and association require just one step of vicinal or neighborly influence. The issues of interference among the different circuits that are established, of robustness to noise, and of the stability of the hierarchical memorization process are addressed. A calculus therefore is implied for analyzing the capabilities of particular neural systems and subsystems, in terms of their basic numerical parameters."]

- Frank van der Velde and Marc de Kamps, "Neural blackboard architectures of combinatorial structures in cognition", Behavioral and Brain Sciences **29**(2006): 37--70 [+ peer commentary]
- W. A. van Leeuwen and Bastian Wemmenhove, "Learning by a neural net in a noisy environment - The pseudo-inverse solution revisited," cond-mat/0205550
- Renato Vicente, Osame Kinouchi and Nestor Caticha, "Statistical mechanics of online learning of drifting concepts: A variational approach," Machine Learning **32**(1998): 179--201 [abstract]
- Hiroshi Wakuya and Jacek M. Zurada, "Bi-directional computing architecture for time series prediction," Neural Networks **14**(2001): 1307--1321
- C. Xiang, S. Ding and T. H. Lee, "Geometrical Interpretation and Architecture Selection of MLP", IEEE Transactions on Neural Networks **16**(2005): 84--96 [MLP = multi-layer perceptron]