
The AI is built with a multi-level Markov model. After converting character-level text (say) into probabilities and assigning edge weights equal to the probability that the next character follows, you go up a level and continue building upward, non-recursively, into digram frequencies.
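A minimal sketch of the character-level step, assuming plain text input and a plain dictionary of edge weights (the function name `char_transition_probs` is just illustrative):

```python
from collections import defaultdict

def char_transition_probs(text):
    """Count character bigrams, then normalize each row into
    next-character probabilities (the edge weights)."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    probs = {}
    for a, nexts in counts.items():
        total = sum(nexts.values())
        probs[a] = {b: n / total for b, n in nexts.items()}
    return probs

# P(next char | current char) over a toy corpus:
edges = char_transition_probs("the theory there")
print(edges["t"])  # {'h': 1.0} -- "t" is always followed by "h" here
```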

Start simply with 4-grams, 8-grams, etc., and see what happens. You should end up with some good whole words at the top. Then do trigrams, ..., n-grams. To get here, you have to have the probabilities at the lower levels already. The network grows fast, but you only have to train it once to build it.

Stimuli from below (like a keypress) saturate/activate a neuron, just as lower-level neurons saturate higher-level ones. When a lateral axon points to the saturated neuron, that edge's weight increases by 1.
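A hedged sketch of the saturation/lateral-update idea: each node holds an activation level, a saturated node stimulates its parents, and lateral axons pointing at a saturated node get a +1 increment. The threshold value, the `up`/`lateral_in` field names, and firing without reset are all assumptions made for illustration:

```python
class Neuron:
    def __init__(self, name, threshold=1.0):
        self.name = name
        self.threshold = threshold  # saturation point (illustrative)
        self.activation = 0.0
        self.up = []                # axons to higher-level neurons
        self.lateral_in = {}        # incoming same-level axons: source -> weight

    def stimulate(self, amount=1.0):
        """A stimulus from below (e.g. a keypress) raises activation.
        On saturation, reinforce lateral axons pointing here by +1,
        then propagate the stimulus upward."""
        self.activation += amount
        if self.activation >= self.threshold:
            for src in self.lateral_in:
                self.lateral_in[src] += 1
            for parent in self.up:
                parent.stimulate(amount)

# Toy usage: a keypress 't' saturates "t", which saturates "th";
# the lateral axon from "he" pointing at "th" gets reinforced.
t, th, he = Neuron("t"), Neuron("th"), Neuron("he")
t.up.append(th)
th.lateral_in[he] = 0
t.stimulate(1.0)
print(th.lateral_in[he])  # 1
```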

This gets you Aeon 3: self-organizing networks of knowledge capable of recognition.

Beyond this are actions (firing along the same level) and potentials (touching probabilities without firing them), which create thought. Lower-level action values can accumulate upward to build potentials at higher levels. Action and potential are essentially the same node, but the computer architecture doesn't express this directly, so you have to copy the value into the higher-level neuron and sum.
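One way to model that copy-and-sum step, as a rough sketch: keep a separate `potential` field per neuron and let a lower-level action add into the potentials of everything above it, without firing anything. The field and function names are assumptions, building on the `Neuron` sketch above:

```python
def accumulate_potential(neuron, value):
    """Copy a lower-level action value upward, summing it into each
    higher-level neuron's potential without triggering a firing."""
    for parent in neuron.up:
        parent.potential = getattr(parent, "potential", 0.0) + value
        accumulate_potential(parent, value)

# e.g. accumulate_potential(t, 0.5) raises th.potential by 0.5
```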

Primitive feelings can be added here to reinforce, positively or negatively, the things learned. This allows edge weights to become negative.
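A minimal sketch of that signed reinforcement, assuming the feeling is a scalar in [-1, 1] applied to the lateral edges into a node (the `reinforce` name and the signal range are assumptions, not anything fixed above):

```python
def reinforce(neuron, feeling):
    """Apply a primitive feeling in [-1, 1] to incoming lateral edges;
    negative feelings can push edge weights below zero."""
    for src in neuron.lateral_in:
        neuron.lateral_in[src] += feeling
```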

This gets you to Aeon 4: consciousness and speech.
