Turned up to eleven: Fair and Balanced 
Front page
Random thoughts, sometimes deep, mostly not, about politics, war, science, religion, life in general
by Paul Orwin

Wednesday, May 01, 2002
Posted 7:50 AM by Paul
Before getting started with the neuroscience and mathematics, we need some basic definitions. Many are familiar, but some are not. I will attempt to give reasonably accessible definitions of all the terms I use, but, as always, comments are appreciated!

Neuron
The basic cellular unit of the nervous system, the neuron is composed of dendrites, a cell body (soma), and an axon (use that link for a good basic review of neuron architecture and function). An electrical signal enters the cell through the dendrites and is integrated with other incoming signals in the cell body; if the combined signal exceeds a certain threshold (that last part is very important), an Action Potential is sent down the axon to the synapse, where the neuron connects to other neurons in a neural network. Signals cross the synapse chemically, mediated by diffusion (Fick's Law). The physiology of neurons is fascinating, but what interests us here is the nature of neural networks, and how artificial neural networks (ANNs) can help us form testable hypotheses about biological ones. An aside: there is an interesting symmetry here, because understanding the function of the biological neuron led directly to the development of the artificial neural network, starting with the perceptron, which has only input and output layers, and continuing to more complex ANNs with a theoretically unlimited number of intermediate layers.

Turning to mathematics, it is useful to define a few terms as well.

Graph
A graph (as in Graph Theory) is simply a collection of vertices and the edges connecting them, which can be manipulated in certain ways. Rules governing the number of edges possible in a graph, and the paths through it, will be useful in the discussion. We will combine the notion of a graph with the notion of a neural network to try to develop a mathematical basis for intelligence and consciousness (ambitious, no?).
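To make the threshold idea concrete, here is a minimal sketch of a single perceptron-style unit: it sums its weighted inputs and "fires" only if the total crosses a threshold, loosely analogous to an action potential. The particular weights and threshold below are illustrative assumptions, not values from any real neuron or model.

```python
def neuron_fires(inputs, weights, threshold):
    """Integrate incoming signals; fire (return True) only if the
    weighted sum exceeds the threshold, like an action potential."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return total > threshold

# Three dendritic inputs with different (made-up) synaptic strengths:
weights = [0.5, -0.3, 0.8]

print(neuron_fires([1, 1, 1], weights, threshold=0.9))  # 1.0 > 0.9 -> True
print(neuron_fires([1, 1, 0], weights, threshold=0.9))  # 0.2 > 0.9 -> False
```

The all-or-nothing output is the key point: below threshold, the unit's inputs produce nothing downstream at all.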
N-dimensional Space
Not the final frontier! In a mathematical sense, a space is simply the defined region in which one can work, in which the defined axioms and subsequent theorems apply. An N-dimensional space is one that has N orthogonal axes. To illustrate, think of a piece of graph paper (not the graph described above, but the conventional kind). This paper has two axes, X and Y (if you are one of those people who likes fancy names, you can call them the abscissa and the ordinate). These two axes are orthogonal in the sense that 1) you cannot describe one in terms of the other, and 2) any point on the plane is described by a pair of numbers, one giving the value on the X axis and one the value on the Y axis. If you want a three-dimensional space, you can add a Z-axis that is perpendicular (orthogonal) to both of the axes previously described. If you want a four-dimensional space, you can do that again, and so on (we can't do this physically, but mathematics is never held back by reality!). An N-dimensional space simply has N of these axes, and thus a point in N-dimensional space is described by exactly N terms. Importantly, a vector in N-dimensional space can be described similarly.

Vector
A vector is described by a set of N terms, each giving its magnitude along one axis of the space; together, these terms determine the vector's length and direction. It is often convenient for vectors to start at the origin (i.e., [0, 0, ..., 0]), but it is not necessary. Mathematical manipulation of vectors in N-space is performed using matrix algebra. I will try to avoid using too much of this, keeping the discussion at a conceptual level rather than doing explicit math.

This discussion of terms has probably scared off most of my audience, perhaps justifiably so. Nevertheless, I will persist in trying to explain my thoughts using these tools.
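The definitions above can be sketched in a few lines of code: a point or vector in N-dimensional space is just a list of N numbers, and orthogonality can be checked with the dot product (a zero dot product means the two vectors are perpendicular). This is the matrix-algebra machinery mentioned above, in miniature; the vectors chosen are illustrative.

```python
def dot(u, v):
    """Dot product of two N-dimensional vectors: sum the products
    of their terms along each axis."""
    return sum(a * b for a, b in zip(u, v))

# The X and Y axes of graph paper, as unit vectors in 2-space:
x_axis = [1, 0]
y_axis = [0, 1]
print(dot(x_axis, y_axis))  # 0 -> the axes are orthogonal

# The same idea extends to any N; here, a vector in 4-space
# is described by exactly N = 4 terms:
v = [1, 2, 3, 4]
print(len(v))  # 4
```

Nothing changes as N grows: the lists just get longer, which is why the math carries on happily past the three dimensions we can picture.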
