As Icarus discovered, it is difficult to imitate something when you don’t properly understand its guiding principles. Our rapidly increasing ability to process data algorithmically has so far outstripped our limited understanding of what ‘intelligence’ might actually be that we now find ourselves in a well-funded race towards utility and productivity for new machine systems capable of supervised or semi-supervised learning.

There are not many AI projects so abstract as to let a machine entity proceed from the same kind of ad hoc data ingestion by which humans familiarise themselves with their new world at the start of their lives. A free-thinking machine that has experienced, explored and reasoned its way into anything resembling ‘understanding’ or ‘consciousness’ may not actually want to stack shelves for Amazon or put your GP out of work in some Watson-powered medical application.

But a group of British mathematicians are attempting to explore just such an entity, in the form of an ‘analogue’ artificial intelligence which at least learns in a similar way to infant humans, including the learning of songs as part of its formative development.

“Memory foam” approach to unsupervised learning, Oct 2015

“Memory foam” approach to unsupervised learning [PDF], by Natalia B. Janson and Christopher J. Marsden of the School of Mathematics at Loughborough University, presents initial research into non-algorithmic learning approaches for machines. The ‘Memory Foam’ intelligence is named after the resilient orthopaedic mattresses that retain the shape of the body pressed into them but gradually restore themselves to equilibrium. The ‘indentation’ is analogous to the strong impression left by current experience, while the gradual return to equilibrium is comparable to the way accumulated ‘experience’ comes to act as a filter for new data input.

The Memory Foam model is intended to shape its vector response, and even adjust its own architecture, according to sensory input; in this sense the researchers liken its learning approach to ‘the first component of the thinking process’, whilst noting with emphasis that the model does not rely on any biological knowledge.
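The paper sets out the mathematics of this formally; purely as an intuition pump, the Python toy below (ours, not the authors’ equations, with every constant and the one-dimensional ‘state space’ invented for illustration) shows the memory-foam behaviour in miniature: each input dents a landscape, the dents slowly relax back towards equilibrium, and only repeated experiences leave lasting wells.

```python
import numpy as np

# Illustrative toy only: NOT the equations from Janson & Marsden's paper.
# It mimics the "memory foam" idea in one dimension: each input leaves a
# dent (a Gaussian well) in a landscape, and every dent slowly relaxes
# back towards the flat "equilibrium" state between inputs.

GRID = np.linspace(0.0, 1.0, 200)   # a 1-D "state space" (invented)
landscape = np.zeros_like(GRID)     # flat = no memories yet

DENT_WIDTH = 0.02    # how localised each impression is (assumed value)
DENT_DEPTH = 1.0     # how strongly one exposure deforms the landscape
RELAX_RATE = 0.01    # fraction of each dent that recovers per time step

def press(x):
    """An input at position x deepens the landscape locally (the 'indentation')."""
    global landscape
    landscape -= DENT_DEPTH * np.exp(-((GRID - x) ** 2) / (2 * DENT_WIDTH ** 2))

def relax():
    """Between inputs the foam recovers, pulling the landscape back towards flat."""
    global landscape
    landscape *= (1.0 - RELAX_RATE)

# Repeated exposure to the same position carves a persistent well,
# while one-off inputs fade: the landscape itself becomes the memory.
for _ in range(200):
    press(0.3)                  # a recurring "experience"
    press(np.random.rand())     # incidental noise, pressed once each step
    relax()

print("deepest well near:", GRID[np.argmin(landscape)])  # expected ~0.3
```

In this toy the deformed landscape is the memory: no training algorithm or weight matrix is consulted, which is the property the authors are after.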

Musical note recognition in Memory Foam model

One of the early sensory stimuli to which a Memory Foam model is exposed is the children’s song ‘Mary Had a Little Lamb’, performed by an amateur musician and recorded as an 8kHz wave file (downloadable here). The simplest iteration of the Memory Foam model is exposed to the song multiple times and eventually discovers its likeliest frequencies, assimilating and remembering both the underlying musical model and the music itself.
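For comparison only, the kind of explicitly algorithmic baseline the paper contrasts itself with might look like the framed FFT below (our sketch; the filename, frame length and threshold are assumptions), which extracts the dominant frequency of each short slice of the 8kHz recording through a fixed, pre-defined procedure rather than through repeated exposure.

```python
import numpy as np
from scipy.io import wavfile  # standard SciPy reader; the filename below is hypothetical

# Conventional point of comparison, not the paper's method: estimate the
# dominant pitch in each short frame of the 8 kHz recording with an FFT.
rate, samples = wavfile.read("mary_had_a_little_lamb.wav")  # hypothetical filename
samples = samples.astype(np.float64)
if samples.ndim > 1:                  # collapse stereo to mono if necessary
    samples = samples.mean(axis=1)

FRAME = 2048                          # ~0.25 s per frame at 8 kHz (assumed)
notes = []
for start in range(0, len(samples) - FRAME, FRAME):
    frame = samples[start:start + FRAME] * np.hanning(FRAME)
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(FRAME, d=1.0 / rate)
    peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
    if peak > 50:                               # discard sub-50 Hz peaks (silence, rumble)
        notes.append(round(peak, 1))

print("dominant frequencies per frame (Hz):", notes[:20])
```

Every step here, from frame length to windowing to thresholds, is prescribed in advance; the Memory Foam model is reported to arrive at the same note frequencies without any such pre-defined procedure.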

The researchers’ paper asserts that the memory foam approach ‘could pave the way to create a new generation of information processing machines. Unlike digital computers or discrete-state neural networks, these devices will be fully analogue and in this sense closer to biological brains.’

Whether or not the approach proves a viable route to an ‘analogue’ or ‘organic’ model of computer learning, the paper is critical of the algorithm-driven impetus of current AI research, at least insofar as progress under that kind of methodology is compared to human learning. It observes:

‘The performance of modern AI devices is based on algorithms, i.e. while fulfilling their goal they perform a sequence of pre-defined commands. Even the later generation of AI devices, that are based on neural networks, employ algorithms at least at the stage of learning… Contrary to that, it seems that a biological brain does not naturally execute a sequence of commands, although it can be trained to do so (often with some effort, e.g. when solving routine mathematical problems). In particular, the brain does not seem to learn by an algorithm.’

The researchers further describe even neural-network learning as a systematised inculcation of coupling values, or ‘weights’, made possible by established training patterns. The implication is that the rigidity of these first steps makes it impossible for AIs derived by such techniques to ever break out of the algorithm-derived patterns of ‘thought’ that follow from them. The Memory Foam model is instead designed to learn autonomously, a proposition that presents considerable complications for equivalent neural-network approaches.
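To make that contrast concrete, the sketch below (ours, using invented toy data) shows the sort of weight-adjustment loop the researchers have in mind: a fixed update rule applied, in a pre-defined sequence, over an established set of training patterns, with the coupling values ‘inculcated’ one gradient-style nudge at a time.

```python
import numpy as np

# The kind of "systematised inculcation of coupling values" the paper criticises:
# a conventional single-layer network nudging its weights by a fixed rule over a
# pre-assembled set of training patterns. Data, learning rate and loop structure
# are all prescribed in advance.

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                             # training patterns (toy data)
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)    # target labels

w = np.zeros(3)          # coupling values ("weights")
LEARNING_RATE = 0.1      # fixed, hand-chosen

for epoch in range(50):                  # a pre-defined sequence of commands
    for pattern, target in zip(X, y):
        prediction = 1.0 / (1.0 + np.exp(-pattern @ w))        # logistic unit
        w += LEARNING_RATE * (target - prediction) * pattern   # gradient-style update

print("learned weights:", w)
```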