Consciousness and Neuromorphic Chips: A Case for Embodiment

4th of August, 2015
Five years ago I wrote a paper with a few colleagues on consciousness and neuromorphic computing. With all of the news coverage of deep learning, artificial intelligence and computing lately, I thought it might be time to reproduce that paper here for fun. I’ve also included (not wisely) the video version of my talk that was presented at a grad conference on philosophy of mind at Boston University. Ah, youth…. 


Deep Learning, IoT Sensor Data…and Bats!

15th of September, 2014

At the very center of Internet of Things excitement is the sensor. Not just one sensor, mind you: a sensor that would normally send its data stream off to who knows where now has access to information from another sensor measuring something completely different. Now imagine your entire office building awash with dozens, hundreds, or even thousands of light, temperature, humidity, water, motion, image, and other sensors. That is a staggering cornucopia of data pulsating across a network at any one time.

So what do we do with all that data? Articles written on IoT from the CIO's perspective tend to focus on the headache of attempting to store it all. In other words, IoT is viewed as a Bigger Data problem. I think this line of thinking suffers from short-term memory loss: it forgets why someone would connect hundreds of sensors together in the first place. All those sensors should be interacting to solve a problem. For data geeks like myself, all that data makes me smile, but Billy the Building Manager just wants a product that makes his life easier. So how do we move beyond ex post facto statistical analysis to real-time processing, decision making, and visualization that helps Billy reduce his energy bills?
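To make the contrast between storage and real-time decision making concrete, here is a toy Python sketch. The rule, thresholds, and "dim lights" action are all invented for illustration; the point is acting on the stream as it arrives rather than warehousing it.

```python
from collections import deque

WINDOW = 5  # rolling window of recent readings (arbitrary choice)

class RoomMonitor:
    """Toy per-room monitor: flags a room that is lit, warm, and empty."""

    def __init__(self):
        self.temps = deque(maxlen=WINDOW)

    def update(self, temp_c, lights_on, motion):
        self.temps.append(temp_c)
        avg = sum(self.temps) / len(self.temps)
        # Decision, not storage: react to the live stream
        if lights_on and not motion and avg > 24.0:
            return "dim lights / lower setpoint"
        return "ok"

monitor = RoomMonitor()
readings = [(25.1, True, False), (25.4, True, False), (24.8, True, False)]
actions = [monitor.update(*r) for r in readings]
print(actions[-1])  # "dim lights / lower setpoint"
```

A real deployment would subscribe to sensor topics over something like MQTT, but the shape of the logic is the same: a small rolling state per room plus a rule that fires in real time.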

A good place to start our hunt is computational neuroscience. Researchers in this field want to know how the brain functions, so they mix mathematics with neurobiology, psychology, cognitive science, and computer science to produce models of how neural inputs lead to neural activity and, in some cases, external behavior. Google has been talking a lot about this field with its big bet on deep learning as the future of machine learning and AI. Another term to look out for is the recurrent neural network, which captures the importance of feedback within a network of biological neurons. Local competition and feedback among neurons are what make neural activity so complex and powerful.
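The feedback idea can be sketched in a few lines of NumPy. This is a toy network with made-up sizes and random weights, not a model of any real circuit: the key detail is that each update depends on the network's own previous activity.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden = 3, 8  # arbitrary toy dimensions

# Input-to-hidden and hidden-to-hidden (recurrent feedback) weights
W_in = rng.normal(scale=0.5, size=(n_hidden, n_in))
W_rec = rng.normal(scale=0.5, size=(n_hidden, n_hidden))

def step(x, h):
    """One update: new activity depends on the current input AND the
    network's own previous activity; that self-feedback is the recurrence."""
    return np.tanh(W_in @ x + W_rec @ h)

# Feed a short stream of readings through the network
h = np.zeros(n_hidden)
stream = rng.normal(size=(5, n_in))  # 5 time steps of 3 inputs each
for x in stream:
    h = step(x, h)

print(h.shape)  # (8,)
```

Because `h` carries history forward, the network's response to a reading depends on the readings that came before it, which is exactly what a feedforward network cannot do.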

This may sound all well and good for processing visual, auditory, motor, or touch inputs, but what about nonbiological inputs? The human brain has no carbon monoxide processing system! This is where IoT sensor processing and information discovery can get creepy and/or interesting. Among the basic principles developed in deep learning over the past several decades are unsupervised learning, hierarchical neural networks, and recurrence. There are certainly genetic differences across brain regions, but it's stunning how similar the fundamental mechanisms of neural computation are across the neocortex. If this uniformity holds (and there are those who take issue with it), then we can begin constructing models of nonbiological sensors in a fashion inspired by the mathematical models of computational neuroscience.
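As a toy illustration of unsupervised learning on a nonbiological sensor, here is Oja's rule, a classic neurally inspired learning rule, discovering structure in a synthetic four-channel sensor stream. The data and channel mix are invented; a single "neuron" learns, with no labels, the dominant direction of variation in its inputs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic multichannel stream (a hypothetical stand-in for, say, a bank
# of gas sensors): four channels driven by one shared latent signal
coeffs = np.array([1.0, 0.8, -0.5, 0.3])
latent = rng.normal(size=2000)
x = np.outer(latent, coeffs) + 0.1 * rng.normal(size=(2000, 4))
x -= x.mean(axis=0)

# Oja's rule: unsupervised, Hebbian learning with a decay term that
# drives the weights toward the first principal component of the data
w = np.full(4, 0.1)
eta = 0.002
for _ in range(3):                        # a few passes over the stream
    for sample in x:
        y = w @ sample                    # neuron's response
        w += eta * y * (sample - y * w)   # Hebbian growth + decay

# The learned weights should align with the latent driving direction
alignment = abs(w @ coeffs) / (np.linalg.norm(w) * np.linalg.norm(coeffs))
print(round(alignment, 3))
```

Nothing in the update rule knows the inputs are gas readings rather than retinal activity, which is the whole point: the same learning principle transfers to sensors biology never had.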

Thankfully we don't need to start from scratch. Take the bat: the neural processing of target distance by echolocating bats is a great starting point for coding up sonar sensor processing. Once you have a number of sensors processing inputs in their own hierarchical deep learning networks, you then have to learn higher-level features when you shake it all up. Enter multimodal deep learning. Hierarchical processing has shown great promise in image and language processing, yet the hurdles multiply when you start talking about multimodal processing. This is the wild, wild west of AI. And once we begin to discover which combinations of sensors produce interesting patterns and useful outputs, the realm of IoT sensor data processing will be poised to take a giant step forward.
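A minimal sketch of the "shake it all up" step: each modality gets its own feature layer, then the per-modality features are concatenated and a joint layer learns on top of both. All sizes and weights below are random placeholders, not a trained model; this only shows the fusion architecture.

```python
import numpy as np

rng = np.random.default_rng(2)

def features(x, W):
    """One layer of feature extraction (ReLU nonlinearity)."""
    return np.maximum(0.0, W @ x)

# Hypothetical raw readings from two very different sensors
sonar = rng.normal(size=16)    # e.g. echo-return energies by range bin
climate = rng.normal(size=4)   # e.g. temperature/humidity channels

# Stage 1: each modality has its own hierarchy (one layer shown here)
W_sonar = rng.normal(scale=0.3, size=(8, 16))
W_climate = rng.normal(scale=0.3, size=(8, 4))
f_sonar = features(sonar, W_sonar)
f_climate = features(climate, W_climate)

# Stage 2: concatenate the per-modality features and learn a shared
# representation across both modalities
W_joint = rng.normal(scale=0.3, size=(6, 16))
joint = features(np.concatenate([f_sonar, f_climate]), W_joint)

print(joint.shape)  # (6,)
```

The joint layer is where cross-sensor patterns live: a feature there can respond to a sonar signature only when the climate channels look a certain way, something neither single-modality hierarchy could represent on its own.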


Boston (You’re My IoT Startup Home)

23rd of October, 2013

This is my first post in over two years. Slacker. In that gap I wrapped up my PhD in computational neuroscience, helped jumpstart the "brains for bots" startup Neurala, and consulted for some amazing tech companies in Cambridge. What I've learned in the past few years is that Boston has a vibrant, thriving startup community. Whether it's the cream of the crop at TechStars and the rest of Kendall Square or the buzz around Boston's Innovation District, there is a lot happening in this city.

One especially cool tech trend is the number of VCs and angels starting to invest in hardware again. Accelerators like Bolt and R/GA Connected Devices, coupled with the popularity of Arduinos and 3D printing, are just a few good signs that beautiful new hardware products are heading our way. And now we get to the central point of why I'm writing blog posts again after a long hiatus: I've become obsessed with the Internet of Things (IoT for short). Heavy hitters like IBM, Cisco, and Xively are salivating over market forecasts in the trillions, but the real intrigue is how IoT can, ironically, make technology recede into the background.