22 July 2020
Computer chip with a human-like brain
One of the pivotal plot elements in the “Terminator” movie series is the Cyberdyne Corporation that developed the self-aware Skynet system.
In the movies, the foundational technology was pioneered by a computer scientist, Miles Dyson, who based his work on the remains of the first Terminator (played by Arnold Schwarzenegger). Crushed in a factory's hydraulic press, the Terminator left behind only a robot arm and a broken silicon chip, its CPU.
Miles Dyson said of the chip "It was a radical design, ideas we would never have thought of..."
According to the movie plot, the Terminator CPU was an artificial neural network with the ability to learn and adapt.
In the Eighties, a neural network based CPU was pure futuristic fantasy.
Terminator 1 was released in October 1984 - more than 35 years ago.
From science-fiction to reality
Australian company Brainchip Holdings is about to launch the first of a series of chips supporting brain-like neural processing that will likely change the world.
The new chip called Akida™ is designed to learn and adapt.
This could be a Skynet moment.
I commented recently that the pace of innovation had slowed, with no new “game changing” technologies developed for over a decade.
However, with Brainchip about to launch the first practical neuromorphic “system on a chip”, and with Intel, IBM, NVIDIA and others following closely behind – that’s about to change.
Neuromorphic computing is based on studying how the function of neural brain cells can be replicated in silicon to perform cognitive computing.
Powerful AI at the edge without remote data links
Brainchip's new Akida™ neuromorphic chip has a target price of $10 and will feature very low power consumption, making embedded AI applications at the edge a practical reality.
"The edge" is a term to describe devices that interface with the real world - like facial and object recognition cameras, sensors that "smell", audio sensors and others.
At the edge, sensor inputs are analyzed at the point of acquisition rather than through transmission via the cloud to a data center.
And that's a game changer, because current "powerful" sensor devices that use AI to process data (such as facial recognition, voice interpretation, or vibration analysis) rely on remote heavy-duty computing: they send data back through the internet to servers that undertake the heavy processing required for deep analysis.
This has obvious limitations: data connections are not always available or reliable, and link speed prevents real-time results.
Real time results are clearly important for systems that control motion. For collision avoidance or steering vehicles around landscape features, decision making needs to be instant. But, even for AI that doesn't need to be real-time, the grunt needed exceeds the space and electrical power available on remote devices.
Akida™ has been designed to address this problem; the new chip supports brain-like, high-speed, self-learning systems and real-time pattern recognition with power consumption of less than 1 watt.
To be clear, Akida™ supports both on-chip learning and inference, meaning that learning mechanisms are built in, as is the later use of those learnings to drive intelligent pattern recognition (inference).
The fact that Akida™ can replace power hungry and comparatively large servers (for AI applications) with one chip consuming less than 1 watt illustrates the efficiency of neural network processing and provides clues as to why the human brain is so powerful.
Compact high speed AI processing with low energy consumption enables powerful functionality in small portable devices. Low cost means widespread deployment of these devices.
Akida™ - a key enabler for the growing world of sensors
Sensors are devices that translate the real world into electrical signals.
We are familiar with simple sensors that detect (for example) voltage, current, resistance, temperature, linear or rotational speed. But, there are also complex sensors such as video cameras, vibration sensors, microphones, finger-print touch pads, and even olfactory (smell) sensors that produce complex signals.
Complex sensors generate volumes of data that require serious computing grunt for processing and analysis in order to produce actionable information. Information that can form the basis for automated decision making.
Information from multiple sensors can be collected to provide huge datasets to describe the real world.
Interpreting information from these sensors for decision making IS NOT as simple as (say) from a single voltage source which might be...
If [voltage] > 12.6 then [switch on motor]
However, when data sets become complex, recognising objects or other patterns requires first learning what those patterns are, and then scanning new data to see whether they are present.
To be clear, we aren’t just talking about recording the data (logging) or detecting simple trends, that’s relatively easy. We are talking about deeply analysing the information to develop powerful insights.
Reliably detecting the DIFFERENCE between a pedestrian and, say, shadows from overhead tree foliage or a bird flying directly into the camera, in streaming sensor data (from lidar, video, or some other sensor), is a non-trivial problem.
Processing complex data to identify patterns that are later recognized (or pattern matched) is called Artificial Intelligence, or simply AI.
Why human brains are so powerful
It's called 'Artificial' Intelligence because essentially the brain is a device that does just that - it stores (learns) patterns and later matches new data to already stored patterns, while at the same time tweaking existing patterns and identifying new ones; getting smarter through experience.
The human mind is not a very large machine and it doesn’t consume much power (comparatively) but punches above its weight in terms of information processing grunt.
It achieves this by abstracting information (simplifying it) to reduce the processing workload.
Even though we think we “see”, “hear”, “taste”, “smell” and “feel” the world clearly – what our brain is doing is converting the signals that it receives from our senses into simplified and abstract data.
The human mind is a device that models the real world, and neuromorphic computing is based on replicating this abstraction and simplification.
The human brain is inherently sparse in its approach to data processing, stripping out redundant information to increase speed and reduce storage overhead.
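This sparseness can be illustrated with a toy sketch (hypothetical code, not Brainchip's implementation): an event-based encoder transmits something only when the input changes by more than a threshold, much as neuromorphic vision sensors ignore unchanged pixels.

```python
def to_events(samples, threshold=0.5):
    """Convert a dense signal into sparse events: emit a (+1/-1) spike
    only when the signal moves more than `threshold` from the last
    transmitted value; redundant samples produce no data at all."""
    events = []
    last = samples[0]
    for i, value in enumerate(samples[1:], start=1):
        delta = value - last
        if abs(delta) >= threshold:
            events.append((i, 1 if delta > 0 else -1))
            last = value
    return events

signal = [0.0, 0.1, 0.2, 1.0, 1.1, 0.2, 0.2]
print(to_events(signal))  # only the two significant changes survive
```

Seven samples collapse to two events; downstream processing touches only those events, which is where the speed and power savings come from.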
Artificial Intelligence introduces the topic of machine learning.
It works like this. Instead of programming a video device to recognise a cat (for example), you point the device at a series of cats from various angles and tell the machine "these are all cats." Such learning usually also includes similar sized animals such as small dogs and rabbits "by the way, these are NOT cats".
The learning algorithm identifies the salient and differentiating features, from then on the machine will recognise cats on its own.
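As a toy illustration of that learning loop (the feature numbers are invented, and a simple nearest-centroid rule stands in for any real recognition algorithm):

```python
# Toy supervised learning: feature vectors (made-up scores for
# ear-pointiness, whisker length, body size) labelled by a human,
# then a nearest-centroid rule classifies new examples on its own.
def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# "these are all cats" / "by the way, these are NOT cats"
cats     = [[0.9, 0.8, 0.3], [0.8, 0.9, 0.4], [0.95, 0.7, 0.35]]
not_cats = [[0.2, 0.1, 0.4], [0.3, 0.2, 0.9], [0.1, 0.3, 0.5]]

cat_centre, other_centre = centroid(cats), centroid(not_cats)

def is_cat(features):
    return distance(features, cat_centre) < distance(features, other_centre)

print(is_cat([0.85, 0.75, 0.3]))  # close to the cat cluster -> True
```

Real systems learn far richer features from raw pixels, but the shape is the same: labelled examples in, a decision rule out.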
Such a device could be used to manufacture deluxe cat doors that let your cat through but keep other neighborhood moggies out (and, for the USA market, shoot them on sight). Today, the technology to do that, based on video recognition, exists but would be stupidly expensive. But, at $10 per chip, Akida™ will make such applications commonplace. Akida™ has the facility to receive direct video input and process pixels on the chip.
Human brains do not store pixel perfect images, they store abstractions - the minimum amount of data points needed to support future pattern recognition. Akida™ replicates this approach in silicon.
Recognising objects is nothing new in the world of AI - however, doing it "at the edge" (i.e within a small low powered device not connected to a power hungry server in the cloud) is revolutionary.
With a truly bewildering array of off-the-shelf sensors already on the market, astoundingly intelligent and useful enhancements will soon proliferate as sensor packages incorporate a neuromorphic processor like Akida™.
Hybridised sensors will also emerge that don't just look for patterns in the data generated by one sensor but process real time datasets generated by multiple sensors. This is after all, what the human brain does.
For building truly autonomous human like robots – this is the missing link.
Robophobia - robots might rule the world.
Humanoid robots are just one of thousands of potential applications.
However, mischievously I have taken a bit of creative licence here. Talking about humanoid robots connotes autonomous thinking, self-direction, and perhaps even artificial consciousness (but, if something is conscious – is it artificial?).
Robo-phobic thinking inculcated through years of imaginative fiction leads us to expect such Robots might exceed their brief to the detriment of humanity.
No need to panic; Akida™ is not in this ballpark.
This is where the reader may wish to research the difference between Narrow AI and General AI. Akida™ is designed for task-specific (or narrow) AI, where its cleverness is applied to fast learning and processing for very specific, useful applications.
While it does apply brain architecture, neuromorphic computing is nowhere near capable of emulating the power of the human brain. Although, it could well be the first crack in the lid of Pandora's box.
However, Akida™ could be applied to helping humanoid robots maintain balance, control their limbs and navigate around objects. It could also be used for very narrow autonomy like...
IF [Floor is Dirty] AND [Cat Outside] THEN START [Vacuuming Routine]
Robotics is certainly an obvious field of application for Akida™, but probably more for robots that look like this...
On their website Brainchip has identified the obvious applications of this new technology such as: person-type recognition (businessman versus delivery driver for example), hand gesture control, data-packet sniffing (looking for dodgy internet traffic), voice recognition, complex object recognition, financial transaction analysis, and autonomous vehicle control.
One can't help thinking that this list is mysteriously modest.
Broad spectrum breath testing
They also mention olfactory signal processing (smell) suggesting use for devices that can identify the presence of molecules in air. Such "bloodhound" devices could be deployed for advanced medical diagnostics that could identify an extraordinary range of diseases and medical conditions simply through sampling a person's breath.
The impact on the medical diagnostics industry could be, er - breath taking.
On 23 July 2020 the company announced that Nobel Prize Laureate Professor Barry Marshall has joined Brainchip's scientific advisory board. Prof. Marshall was awarded the Nobel Prize in 2005 for his pioneering work in the discovery of the bacterium Helicobacter pylori and its role in gastritis and peptic ulcer disease. The gold-standard test for Helicobacter pylori involves analysing a person's breath.
Detection of endogenous (originating from within a person, animal or organism) volatile organic compounds resulting from various disease states has been one of the primary diagnostic tools used by physicians since the days of Hippocrates. With the advent of blood testing, biopsies, X-rays, and CT scans the use of breath to detect medical problems fell out of clinical practice.
The modern era of breath testing commenced in 1971, when Nobel Prize winner Linus Pauling demonstrated that human breath is complex, containing well over 200 different volatile organic compounds.
It certainly works for police breath testing to measure blood alcohol levels.
Sensing and quantifying the presence of organic compounds (and other molecules) using electronic olfactory sensors feeding into an AI processing system could be the basis for low-cost, immediate medical diagnostics: helping clinicians more quickly identify conditions warranting further investigation, and providing early indications long before symptoms are apparent.
Star Trek Medical Tricorder (The Independent, How close are we to a real Star Trek medical Tricorder?)
But, who knows what applications will emerge when every tech-head from whizz-kid teenagers to seasoned industrial engineers get their hands on Akida™?
History shows the launch of a first generation chip leads to rapid development
As a moment in the history of technological achievement, the launch of Akida™ could be likened to the launch of Intel's 8008 microprocessor in 1972, the first 8 Bit CPU that spawned the development of a new generation of desktop computing, embedded device control, Industry 3.0 automation and the ubiquity of the internet.
Nearly fifty years later, we know how that has turned out. Barely a device exists that doesn't have at least one microprocessor chip in it. Recent estimates suggest the average household has 50 microprocessors tucked away in washing machines, air-conditioning systems, motor vehicles, game consoles, calculators, dish-washers and almost anything that has electricity running through it.
The clock speed of the 8008 back in 1972 was a leisurely 0.8 MHz and the chip housed 3,500 transistors. Some 45 years later, 8th-generation Intel chips run at clock speeds 5,000 times faster and house billions of transistors.
Price has been a key enabler for market proliferation. Putting a microprocessor in a washing machine (instead of an electro-mechanical cam wheel timer) is feasible because the microprocessor chip costs a few cents and is quicker and simpler to factory-program, orders of magnitude more reliable, and delivers far greater functionality (although seriously, most of us wash clothes on the same settings all the time).
Akida™ could choose the optimum settings based on scanning the clothes being put in the machine, weighing them, and perhaps even also smelling them.
IF [Machine contains 4.5 kg of ALL WHITES] THEN [Add 35 ml of bleach]
The game changing significance of Brainchip's work cannot be overstated.
AI at the edge will dramatically improve the performance of artificial limbs.
AI Chip market size and growth
Attempting to forecast future market demand is always difficult particularly for ground-breaking new technology.
"I think there is a world market for maybe five computers." (Thomas Watson, president of IBM, 1943)
However, Deloitte has given it a shot and published an article (Bringing AI to the device: Edge AI chips come into their own, TMT Predictions 2020 – refer ‘read more’ below).
"We predict that in 2020, more than 750 million edge AI chips—chips or parts of chips that perform or accelerate machine learning tasks on-device, rather than in a remote data center—will be sold, representing a cool US$2.6 billion in revenue…
"…and is more than twice the 300 million edge AI chips Deloitte predicted would sell in 2017, a three-year compound annual growth rate (CAGR) of 36 percent.
"Further, we predict that the edge AI chip market will continue to grow much more quickly than the overall chip market.
"By 2024, we expect sales of edge AI chips to exceed 1.5 billion, possibly by a great deal. This represents annual unit sales growth of at least 20 percent, more than double the longer-term forecast of 9 percent CAGR for the overall semiconductor industry.”
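The growth figures quoted can be sanity-checked with the standard compound-annual-growth-rate formula:

```python
def cagr(start, end, years):
    """Compound annual growth rate: the constant yearly growth that
    takes `start` to `end` over `years` years."""
    return (end / start) ** (1 / years) - 1

# 300M chips (2017) -> 750M (2020): Deloitte's three-year 36% CAGR
print(f"{cagr(300, 750, 3):.1%}")   # ~35.7%

# 750M (2020) -> 1.5B (2024): close to the ~20% unit growth quoted
print(f"{cagr(750, 1500, 4):.1%}")  # ~18.9%
```

Both numbers line up with the quoted predictions to within rounding.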
However, another estimate suggests the global silicon wafers market (which includes everything from the ubiquitous 555 timer chip to full sized processors) is expected to reach a total market size of US$9.56 billion in 2023, rising from US$7.88 billion in 2017 at a CAGR of 3.29% throughout the forecast period.
So, you could conclude that AI chips could grow to roughly 15 to 20% of the global silicon market.
It’s difficult to put an exact number on it, but the market will be multi-billion-dollar in size and growth rates will be double-digit.
However, like the 1943 IBM example, until a new technology is fully understood and its applications reach maturity, it is impossible to predict future market size. New technologies create new markets. This author (that's me) thinks the above estimates are modest. Edge AI processing chips spawned by Akida™ and closely followed by IBM, Intel and others will see exponential growth.
Note: an article in the Sydney Morning Herald (28 September 2020: Moment of truth coming for the billion-dollar BrainChip, by Alan Kruger) stated "This is why giants like Intel and IBM are investing in the sector which is expected to grow to a $US65 billion ($92 billion) market by 2025."
These estimates are wildly different. I'll do some more research on this and see if I can get more clarity. It illustrates the point - nobody really knows, other than to say - it would be a mistake not to realise neuromorphic chips will be BIG.
Conventional computer systems process data completely differently to biological brains in a way which can only support useful AI through brute force i.e. high powered, big computers.
Neuromorphic processing is based on observing that the human brain supports massive 'actual' Intelligence in a small space and at very modest power consumption. The approach is to mimic the sparse spiking neuronal synaptic architecture and organic self-organisation of the human brain.
Easier said than done, and Brainchip has been working on it for nearly two decades.
The ability to capture masses of data, discern the salient features of that data and later to instantaneously recognise those features – is central to high speed decision making. The sense of this can be summed-up in a familiar phrase we have all uttered “I’ve seen this before”.
"Salient features" is the operative term in the above paragraph. It introduces the concept of Spike Neural Processing, from which the name "Akida" derives: Akida is Greek for spike. Whereas brute-force AI (using traditional von Neumann architecture) must process all data looking for patterns, brains cut the workload down through spiking events: a neuron transmits a signal to a cascading sequence of subsequent neurons (a neural network) only when it reaches a set threshold in response to new sensor data.
The neural spike thresholds are the triggers for recognising the salient features.
The difference between the two architectures can be illustrated using fitting geometric shaped blocks into a board as an analogy...
Traditional (von Neumann) computing works sequentially – try every shape, one at a time, in the remaining holes until they are all fitted.
Spike Neural processing – pick them all up at once and insert them at the same time – parallel processing.
Shape boards are used by cognitive psychologists to measure brain development in children. At a certain age (usually by 2 years) a child switches from trial and error to recognising which shapes fit in which holes. They only fit them one at a time because it's physically difficult to handle them all at once.
Thus both human brains and Akida™ process only the spike events, and process them in parallel (simultaneously), which dramatically cuts the workload, speeds response times, and reduces power consumption.
Learning is about setting and fine-tuning spike thresholds and setting up a pattern of links to other neurons. This is how the brain operates, and Akida™ mimics this structure in the neuromorphic fabric at the core of the chip.
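For readers who like code, the threshold-and-fire idea can be sketched as a "leaky integrate-and-fire" neuron, a textbook model of spiking behaviour (an illustrative toy, not Akida's actual circuitry; the weights and inputs are made up):

```python
def lif_neuron(inputs, weights, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire: membrane potential accumulates
    weighted input, decays by `leak` each time step, and the neuron
    emits a spike (then resets) only when the threshold is crossed."""
    potential = 0.0
    spikes = []
    for t, x in enumerate(inputs):
        potential = potential * leak + sum(w * xi for w, xi in zip(weights, x))
        if potential >= threshold:
            spikes.append(t)   # the spike event downstream neurons see
            potential = 0.0    # reset after firing
    return spikes

# Two input lines over five time steps; only sustained, strongly
# weighted activity pushes the potential over threshold.
inputs = [(1, 0), (1, 0), (1, 1), (0, 0), (0, 1)]
print(lif_neuron(inputs, weights=(0.4, 0.3)))
```

Five time steps of input reduce to a single output spike; "learning" in such a network amounts to adjusting the weights and thresholds so the right patterns cause firing.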
Even as I write this, I realise the above explanation is a bit like explaining motor cars as simply four tyres and a steering wheel. Obviously, the devil is in the detail.
Suffice to say Brainchip have worked the problem and the result is Akida™.
Akida™ is not an AI accelerator
A new class of devices called "neural accelerators" is emerging; they act as enhancers to existing microprocessor-based digital circuits, with the heavy AI processing tasks (machine learning and subsequent machine pattern recognition) handed off to the accelerator.
However, systems based on this architecture lack flexibility, are inherently power hungry, require a larger footprint, and are not an elegant solution.
The approach with Akida™ is to change all that by providing everything needed in one integrated circuit package to support an optimised application. With Akida™ all of the system components needed (CPU, memory, sensor interfaces, and data interfaces) to build an application are in one package.
Akida™ is NOT an accelerator; it is a processing system. Everything needed for AI in one chip.
This is leading edge technology and will enable future innovation to both improve existing devices and spawn an era of new thinking that will lead to - who knows?
Hand gesture control is one of many possible applications.
Brainchip's Akida™ System on a chip
The chip contains all of the elements required to interface conventional digital systems to the neuromorphic fabric, thus providing a “system on a chip” rather than an inconvenient stand-alone neuromorphic device that might have engineers scratching their heads trying to figure out how to work with it.
Akida™ supports programming in existing high-level languages like Python (an open source language gaining huge popularity due to its universality, simplicity and power) and has onboard standard digital interfaces - PCI-Express 2.1, USB 3.0, I3S and I2S as well as external memory expansion interfaces.
Brainchip has taken the heavy lifting out of interfacing standard digital signals with the analogue, "spike-event" neuromorphic core, and supports ganging chips together with built-in serial connectivity, allowing up to 64 devices to be arrayed for a single solution.
At a target price of roughly $10 per chip, the cost for tinkerers to play with the new technology will be very affordable as will subsequent deployment in working devices at scale when manufactured.
Brainchip announced to the market on 2 July 2020 that it had successfully produced the first Akida™ silicon wafers, and the company is now testing and evaluating the first batch of devices. Pending the need for further manufacturing fine-tuning, Akida™ is not far from commercial release.
Applications are already in development
Brainchip has seeded the market by making available a development environment for software engineers to begin "playing" with the system and skilling-up using the application development tools. Brainchip has been working with leading global technology companies through its Early Access Program (EAP). The first batch of Akida™ chips will soon be fitted to evaluation boards and shipped to Early Access Partners.
The complete application development environment toolkit, guides and examples can be downloaded here at no cost.
Before the first chips are available, there are likely already many applications in final design stages waiting for the hardware.
The First Generation Akida™ chip isn't theory; it's practical reality.
And with other manufacturers' Spike Neural Processing chips nearing final development, powerful AI applications will soon become commonplace.
The world is about to change fast.
"Is the cat outside?"
- Brainchip Inc. official website
- Neuromorphic Chip Maker Takes Aim At The Edge [The Next Platform]
- Spiking Neural Networks, the Next Generation of Machine Learning
- Deep Learning With Spiking Neurons: Opportunities and Challenges
- How close are we to a real Star Trek style medical Tricorder? [Despina Moschou, The Independent]
- If only AI had a brain [Lab Manager]
- Intel’s Neuromorphic Chip Scales Up [and It Smells] [HPC Wire]
- Neuromorphic Computing - Beyond Today’s AI - New algorithmic approaches emulate the human brain’s interactions with the world. [Intel Labs]
- Neuromorphic Chips Take Shape [Communications of the ACM]
- Why Neuromorphic Matters: Deep Learning Applications - A must read.
- The VC’s Guide to Machine Learning
- What is AI? Everything you need to know about Artificial Intelligence.
- BRAINCHIP ASX: Revolutionising AI at the edge. Could this be the future SKYNET? [YouTube Video - a very good summary]
- In depth analysis of Brainchip as a stock [with discussion about the technology] [YouTube Video]
- Intel’s Neuromorphic System Hits 8 Million Neurons, 100 Million Coming by 2020 [The Spectrum]
- Bringing AI to the device: Edge AI chips come into their own TMT Predictions 2020 [Deloitte Insights]
- Inside Intel’s billion-dollar transformation in the age of AI [Fastcompany]
- Spiking Neural Networks for more efficient AI - [Chris Eliasmith Centre for Theoretical Neuroscience University of Waterloo], really fantastic explanation - Jan 21, 2020 (predates launch of Akida)
- Software Engineering in the era of Neuromorphic Computing - "Why is NASA interested in Neuromorphic Computing?" [Michael Lowry NASA Senior Scientist for Software Reliability]
- Could Breathalyzers Make Covid Testing Quicker and Easier? [Keith Gillogly, Wired - 15 Sept 2020]
- Artificial intelligence creates perfumes without being able to smell them [DW Made For Minds]
- Epileptic Seizure Detection Using a Neuromorphic-Compatible Deep Spiking Neural Network [Zarrin P.S., Zimmer R., Wenger C., Masquelier T. ]
- Flux sur Autoroute, processing d’image [Highway traffic, image processing] [Simon Thorpe, CerCo (Centre de Recherche Cerveau & Cognition, UMR 5549) & SpikeNet Technology SARL, Toulouse]
- Exclusive: US and UK announce AI partnership [AXIOS] 26 September 2020
- Breath analysis could offer a non-invasive means of intravenous drug monitoring if robust correlations between drug concentrations in breath and blood can be established. From: Volatile Biomarkers, 2013 [Science Direct]
- Moment of truth coming for the billion-dollar BrainChip [Sydney Morning Herald by Alan Kruger - note: might be pay-walled]
- Introducing a Brain-inspired Computer - TrueNorth's neurons to revolutionize system architecture [IBM website]
- Assessment of breath volatile organic compounds in acute cardiorespiratory breathlessness: a protocol describing a prospective real-world observational study [BMJ Journals]
- What’s Next in AI is Fluid Intelligence [IBM website]
- ARM sold to Nvidia for $40billion [Cambridge Independent]
- Interview with Brainchip CEO Lou Dinardo [16 September 2020] [Video: Pitt Street Research]
- BrainChip Holdings [ASX:BRN]: Accelerating Akida's commercialisation & collaborations [1 September 2020:Video: tcntv]
- BrainChip Awarded New Patent for Artificial Intelligence Dynamic Neural Network [Business Wire]
- BrainChip and VORAGO Technologies agree to collaborate on NASA project [Tech Invest]
- Intel inks agreement with Sandia National Laboratories to explore neuromorphic computing [Venture Beat - The Machine - Making Sense of AI]
- Congress Wants a 'Manhattan Project' for Military Artificial Intelligence [Military.com]
- Future Defense Task Force: Scrap obsolete weapons and boost AI [US Defense News]
- Is Brainchip A Buy? [ASX: BRN] | Stock Analysis | High Growth Tech Stock [YouTube - Project One]
- Is Brainchip Holdings [ASX: BRN] still a buy after its announcement today? | ASX Growth Stocks [YouTube ASX Investor]
- Hot AI Chips To Look Forward To In 2021 [Analytics India Magazine]
- Scientists linked artificial and biological neurons in a network — and it worked [The Mandarin]
- Scary applications of AI - "Slaughterbot" Autonomous Killer Drones | Technology [YouTube Video]
- BrainChip aims to cash in on industrial automation with neuromorphic chip Akida [Tech Channel - Naushad K. Cherrayil]
- How Intel Got Blindsided and Lost Apple’s Business [Marker]
- Neuromorphic Revolution - Will Neuromorphic Architectures Replace Moore’s Law? [Kevin Morris - Electronic Engineering Journal]
- Insights and approaches using deep learning to classify wildlife [Scientific Reports]
- Artificial Neural Nets Finally Yield Clues to How Brains Learn [Quanta Magazine]
- Rob May, General Partner PJC Venture Capital Conversation 181 with Inside Market Editor, Phil Carey - discuss edge computing on YouTube.
- Why Amazon, Google, and Microsoft Are Designing Their Own Chips [Bloomberg Businessweek: Ian King and Dina Bass]