Breakthrough in Biologic Intelligence Software – Humanizing Tech

By Prosyscom
March 20, 2018


Why synaptic growth & pruning, time, and the fractal nature of our universe are important for man-made Intelligence

I. Setting the Stage

Back in 2014, you may have seen a video make its way across the internet to some of the strangest places in the world. It was picked up by most major tech media outlets and reported on in countries you probably can’t pronounce. The video below has nearly 1.3 million views as of the time of this analysis. For a science video. STEM is alive and well, my friends.

It shows, for the first time, how an animal's brain and nervous system were mimicked in software, and how that Intelligence was then used to control a robotic vehicle. A 3-dimensional object moving around a 3-dimensional world. In short, the beginnings of a self-driving car.

You can see the brain and nervous system lighting up with the green boxes, and observe the robot moving and behaving like an animal. But that’s where it stops. It's just a one-to-one copy of an organism’s sensory-input-to-motor-output.

So what has happened over the last few years since then? We’ve transformed that very basic starting point into an artificial connectome, now coined Biologic Intelligence.

3D Biologic Intelligence, showing each connection between brain and nervous system neurons.

The system has gone through multiple iterations to push its efficiency as a control mechanism as far as it can go with cheap, off-the-shelf parts. Today we’re at about 6,000 lines of code, using a handful of $20 sonar sensors and six Raspberry Pis, all self-contained: no internet connection to the cloud, no open-source libraries, and no deep learning or machine learning.

An entire “Mars Rover” for less than $1,000 using an altogether different approach than the rest of the artificial intelligence community. This isn’t a math equation that you feed data through to try to minimize some error function. This is biological, and works more like nature.

We’ve already had one breakthrough in how this Biologic Intelligence learns: we observed new synaptic connections forming completely on their own. It was emergent. And, with the deep thinking that happened over the recent holiday break, we had another, equally exciting breakthrough.

II. The Breakthrough in Biologic Science

If the prior breakthrough was modeling synaptic growth, the current epiphany is about modeling synaptic pruning. In essence, the opposite.

For example, if synaptic growth (i.e., excitatory pathways) were all that existed, we would quickly find that our brains become an ever-larger collection of strong pathways. These would intertwine, growing and growing until every neuron in the brain was activated by every sensory input. The resulting creation? A network completely out of control and useless for any type of discrimination.

In short, our robot would have an epileptic seizure.

Therefore, we have mirrored the inhibitory networks that tune the excitatory network: they decrease weighted values when stimulated enough over time, and they minimize the interaction of multiple excitatory neurons along a given pathway. These two inhibitory networks allow pathways to become more refined and tuned for fine discrimination of sensory-input-to-motor-output.

Over time, the inhibitory neurons retard the excitatory neurons but never prevent the excitatory network from functioning altogether.

Biology is a game of subtleties, you see.

One inhibitory network simply diminishes the excitation of surrounding neurons, implementing what’s commonly known as winner-take-all. As a neuron fires, it sends out a low-level inhibitory signal to its neighbors so that, over time, a single excitatory neuron will dominate a specific sensory input. There is no permanence in this action.
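To make that concrete, here is a minimal sketch of winner-take-all lateral inhibition in Python. The class, constants, and drive values are illustrative assumptions for this article, not code from the actual Biologic Intelligence system.

```python
FIRE_THRESHOLD = 1.0       # accumulated input needed before a neuron fires
LATERAL_INHIBITION = 0.3   # low-level inhibitory signal sent to neighbors

class ExcitatoryNeuron:
    def __init__(self, name):
        self.name = name
        self.accumulation = 0.0

    def receive(self, value):
        # Positive input comes from a sensor; negative input from inhibiting neighbors.
        self.accumulation = max(0.0, self.accumulation + value)

    def step(self, neighbors):
        """Fire if over threshold, transiently inhibit neighbors, then rest."""
        if self.accumulation < FIRE_THRESHOLD:
            return False
        for other in neighbors:
            other.receive(-LATERAL_INHIBITION)   # low-level, non-permanent inhibition
        self.accumulation = 0.0                  # all-or-none: fire, then reset
        return True

# Usage: two neurons see overlapping sensory input; the one driven harder
# comes to dominate, firing far more often than its suppressed neighbor.
a, b = ExcitatoryNeuron("a"), ExcitatoryNeuron("b")
for _ in range(20):
    a.receive(0.6)   # stronger sensory drive
    b.receive(0.3)   # weaker sensory drive
    for neuron, neighbors in ((a, [b]), (b, [a])):
        if neuron.step(neighbors):
            print(neuron.name, "fired")
```

Because the inhibition only nudges a neighbor's accumulation down rather than rewiring anything, the suppression fades as soon as the dominant neuron stops firing, matching the "no permanence" point above.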

The second inhibitory network refines the excitatory network by diminishing its connections. From a neuroscience perspective, this inhibitory network causes synaptic pruning: as inhibitory stimulation increases, the excitatory synaptic connections diminish, either by lowering their weighted values or by removing the connections altogether.

This is a one-to-one loop: excitatory neuron to inhibitory neuron and back again to the same excitatory neuron. Every time the excitatory neuron fires, it also fires to its mirror inhibitory neuron, until the inhibitory neuron becomes excited enough to fire negative weights back to the excitatory neuron and diminish its ability to fire.

Over time, depending on the amount of activity, the inhibitory neuron can become stronger and weaken the excitatory neuron, reversing its excitatory accumulation, reducing its strength toward its postsynaptic excitatory neurons, and, over still more time, removing synaptic connections altogether.
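A minimal sketch of that one-to-one mirror loop might look like the following. The `MirroredPair` name, threshold, and decrement values are assumptions made for illustration, not the actual implementation.

```python
PRUNE_THRESHOLD = 50      # inhibitory spikes accumulated before it pushes back
WEIGHT_DECREMENT = 0.05   # negative weight fired back at the excitatory neuron

class MirroredPair:
    """One excitatory neuron wired one-to-one to its mirror inhibitory neuron."""

    def __init__(self, synaptic_weight=1.0):
        self.synaptic_weight = synaptic_weight    # strength toward postsynaptic neurons
        self.inhibitory_accumulation = 0

    def excitatory_spike(self):
        # Every excitatory spike also drives the mirror inhibitory neuron.
        self.inhibitory_accumulation += 1
        if self.inhibitory_accumulation >= PRUNE_THRESHOLD:
            # The inhibitory neuron fires negative weight back, weakening the
            # excitatory neuron's outgoing connection.
            self.inhibitory_accumulation = 0
            self.synaptic_weight = max(0.0, self.synaptic_weight - WEIGHT_DECREMENT)

pair = MirroredPair()
for _ in range(2000):            # heavy, sustained activity
    pair.excitatory_spike()
print(pair.synaptic_weight)      # trends toward 0.0: the connection is pruned
```

The key design point is that the pruning pressure is driven entirely by the excitatory neuron's own activity: the more it fires, the more its mirror pushes back, so only pathways that earn their keep survive at full strength.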

III. Synaptic Pruning & Time

Synaptic pruning is governed by the accumulation of negative weights over time in the excitatory neurons. As with learning and growth development, once the inhibitory accumulation reaches a specific threshold, the weight reduction and pruning take place. The window for pruning accumulation before it is reset to zero is much larger than the few milliseconds of excitatory accumulation (seconds instead of milliseconds), and the threshold is correspondingly large.

With higher thresholds and connectomic timing overall, inhibitory pathways do their job to prevent runaway and chaotic growth. Most people understand the all-or-none response of a neuron: a certain amount of accumulated dendritic input reaches a threshold and the neuron fires. Once it fires, the neuron immediately returns to a resting state and the process starts over again. If that accumulation is not reached within a certain timeframe (milliseconds), the neuron will not fire and will return to its resting state.
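To make the two timescales concrete, here is a minimal sketch assuming simple time-window accumulators: a millisecond-scale window and low threshold for all-or-none firing, and a second-scale window with a much larger threshold for pruning. All constants and names are illustrative assumptions, not values from the actual system.

```python
import time

FIRE_THRESHOLD = 1.0      # all-or-none firing threshold
FIRE_WINDOW = 0.005       # excitatory accumulation window (milliseconds)
PRUNE_THRESHOLD = 100.0   # much larger threshold before pruning acts
PRUNE_WINDOW = 5.0        # pruning accumulation window (seconds)

class Accumulator:
    """Accumulates input; resets to rest if its time window expires."""

    def __init__(self, threshold, window):
        self.threshold = threshold
        self.window = window
        self.value = 0.0
        self.last_input = time.monotonic()

    def add(self, amount):
        now = time.monotonic()
        if now - self.last_input > self.window:
            self.value = 0.0          # window expired: return to resting state
        self.last_input = now
        self.value += amount
        if self.value >= self.threshold:
            self.value = 0.0          # all-or-none: fire, then reset
            return True
        return False

excitatory = Accumulator(FIRE_THRESHOLD, FIRE_WINDOW)
pruning = Accumulator(PRUNE_THRESHOLD, PRUNE_WINDOW)

# Rapid input trips the excitatory accumulator within milliseconds; only
# sustained activity over seconds would ever reach the pruning threshold.
for _ in range(10):
    if excitatory.add(0.3):
        print("excitatory neuron fired")
        pruning.add(1.0)
```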

Connectomic timing, or pathway-to-pathway communication, plays just as important a role in our ability to learn. Inhibitory neurons do not immediately start inhibiting the network. It takes excitation over time to bring them to the threshold that causes them to fire to their postsynaptic neurons. Therefore, excitation and excitatory growth are immediate, but they are tempered once the inhibitory network is excited enough to bring its influence to bear.

This is probably where the phrase “old enough to know better but young enough to do it anyways” comes from?

Time is an incredibly important factor in what we’re doing. It’s a different way of thinking about how software is built and, even more fundamentally, how it works. There is no database. Instead, it’s more like a lightning bolt shooting throughout the brain and body over a set period of time, and that is what encodes the information in Biologic Intelligence.

Time 1, a lightning bolt in a particular shape. Time 2, the lightning bolt branches differently into a new shape. Time 3, the lightning bolt branches even further, back onto itself, as a new bolt commences. It’s fractal, recurrent, and temporal. It’s nature and physics.

Our excitatory networks will get us into trouble but eventually our inhibitory network catches up and allows us to realize what mistakes we made. We tune our Biologic Intelligence to make the proper decisions, and this inhibitory network is the tuner.

We have applied this to our robotic learning with great success. With only a few trials, the robots can learn to discriminate the proper actions. That’s altogether different from the deep learning approach, which sometimes requires 100,000 to 1 million or more trials to reach any sort of accuracy, because it is fine-tuning a mathematical error function.

But this is Biologic Intelligence, and what comes next in AI after deep learning has reached its fundamental limit of capability.

— Timothy Busbice
