The Internet of

Artificially Intelligent Things

Internet of Things Conference, Malmö, Sweden

MobiCycle Ltd, May 2017

AI  &  \(A^2 I\)

That is why machine learning in IoT is vital. A Tesla, fresh off the production line, will have all the information collected by all the other Teslas that are currently on the road. Any new variable that is encountered can then be learned and shared with all connected cars, making the autopilot mode that much safer for all Tesla drivers.


-Megan Ray Nichols, Special Correspondent, Science Writer,  28 March 2017

Show of Hands, Please

That's news to me, TELL ME MORE

I pretty much already knew that...


  1. Why MobiCycle?
  2. Machine Learning
  3. \(A^2 I\)
  4. IoT, AI and Machine Learning
  5. Future Considerations

Why MobiCycle for

AI in IoT?

The Smart Home

E(lectronic) Advisor, by MobiCycle Ltd, is a

digital voice assistant for Amazon Alexa,

Google Home et al.

E-Advisor helps you repair, recycle or sell your abandoned electronics.

Smart City Infrastructure

AI Caramba! The IoT Challenge

from Reactive...

We currently respond to user prompts and sensor triggers

...to Anticipatory

Instead, we should make informed contextual decisions and proactively offer suggestions

Predict when and where an electronic is going to be discarded


Engage our customers earlier in their disposal process.



Machine Learning Works

(Forward Propagation)

Human Neural Network

The human brain has a network of neurons.  Any neuron can talk to another neuron. Together, they work to find the right answer or produce a desired outcome.

Artificial Neural Network (1/2)

Artificial networks have layers.


Neurons live within these layers.


Input layers accept text, audio or video data.


Hidden layer neurons then analyse the data.

Artificial Neural Network (2/2)

Hidden layer neurons assign an accuracy weighting to the input.  


If the weighting passes a set threshold, the input graduates to the next layer for further analysis.


The output layer finally delivers the results.
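The mechanics described above, weighted inputs passed layer by layer through an activation, can be sketched in plain JavaScript. The layer sizes, weights and biases below are invented purely for illustration:

```javascript
// Forward propagation in plain JavaScript. All weights, biases and layer
// sizes here are invented for illustration.
const sigmoid = (x) => 1 / (1 + Math.exp(-x));

// One layer: each neuron takes a weighted sum of the inputs, adds a bias,
// and applies the activation (the 'accuracy weighting' step).
function layer(inputs, weights, biases) {
  return weights.map((row, i) =>
    sigmoid(row.reduce((sum, w, j) => sum + w * inputs[j], biases[i]))
  );
}

// Forward pass: input layer -> hidden layer -> output layer.
function forward(input) {
  const hidden = layer(input, [[0.5, -0.2], [0.3, 0.8]], [0.1, -0.1]);
  return layer(hidden, [[1.2, -0.7]], [0.0])[0]; // single output neuron
}

console.log(forward([1, 0])); // ≈ 0.60 with these made-up weights
```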

The Peace & Love Bus

Input Layer: Peace&Love.png

Layer 1, Neuron 1:

Is there a 'wheel'? Yes, with 70% certainty

Layer 2, Neuron 1:

Is there a 'door'? Affirmative, I am 60% confident

Output Layer:

I think it's a bus!

The success of neural networks depends on how well you construct your layers. Many focus on

width or depth...

Choose the source of your input data carefully in order to

construct robust layers

Rule #1: Know Thy Customer

consumers want artificially intelligent things to engage them in a naturally occurring way, i.e., not insincere or awkward

Other Customer Types (B2B)

Retailers: mine customer data for brand preferences

Insurance: profile individual risk

Oil & Gas: find the optimal drilling and extraction sites

Pharmaceuticals: reproduce a drug’s effects on a molecule

Our IoT Focus

"\(A^2 I\)"

Poor Quality Consumer Experiences

Why \(A^2\) ?

the term "artificial" has two meanings

1. artificial

/ɑːtɪˈfɪʃ(ə)l/

made or produced by human beings rather than occurring naturally, especially as a copy of something natural.

2. artificial

/ɑːtɪˈfɪʃ(ə)l/

feigned, insincere, false, mannered, unnatural, stilted, contrived, pretended, put-on, exaggerated, actorly, overdone

Artificial and Artificial (\(A^2 I\))

occurs when IoT developers leverage machine learning in a way that creates a

false user experience

\(A^2\) in Action

I felt as if I was talking to a primitive robot; in the case of [a competitor], however, it felt natural, as if you were talking to a person.


Swapnil Bhartiya, Thought Leader, CIO

What MobiCycle's Customers Said

Quality Matters

Brands...have to ensure they deliver quality intelligent experiences, or

risk consumers seeking smarter options elsewhere.

Consumers are Open to Artificial Intelligence, but Quality Experiences are Key, Advertising Week

Sam Costello, Associate Director, Digitas LBI

Artificial Intelligence (AI)?

A recommendation algorithm, model or technique; e.g., seq2seq, word2vec?

A voice interface that

can understand a small set of phrases?


machine learning that evolves on its own, based on its experiences

\(A^2 I\)

machine learning that fails to evolve on its own, based on its experiences, resulting in a false user experience

Growth is Key

Users should see improvements in quality with each new interaction.

Barriers for Digital Assistants

  • high investment required to build an open-domain voice agent such as Apple's Siri, Google Assistant or Amazon's Alexa
  • long 'right tail' of interesting, rarely uttered words
  • fixed input sizes for conversation strings
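The third barrier can be made concrete: many sequence models accept a fixed number of tokens, so every conversation string must be truncated or padded to fit. The token limit and pad marker below are invented for illustration:

```javascript
// 'Fixed input sizes' in practice: the model accepts exactly MAX_TOKENS
// tokens, so every conversation string is truncated or padded to fit.
const MAX_TOKENS = 8;    // illustrative size, not from any real model
const PAD = '<pad>';

function toFixedSize(tokens) {
  const out = tokens.slice(0, MAX_TOKENS);       // truncate long input
  while (out.length < MAX_TOKENS) out.push(PAD); // pad short input
  return out;
}

console.log(toFixedSize(['repair', 'my', 'phone']));
// -> [ 'repair', 'my', 'phone', '<pad>', '<pad>', '<pad>', '<pad>', '<pad>' ]
```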

Growth Blockers in AI/Machine Learning

  • blind acceptance of seq2seq, word2vec
  • vague quality standards (HCI)
  • Python fixation
  • cultural biases abound

Barriers in IoT

  • no root access to base stations
  • being forced to stream my sensor data to the hardware provider's platform
  • developer fixation on Java
  • size, weight, and power (SWaP) tradeoffs

Let's address problems in IoT, then AI/Machine Learning and finally Digital Assistants

IoT Architecture

Accelerator Wars


6 to 10 ASICs

equal one FPGA by Xilinx or Altera (Intel)


One Google TPU is

15-30x faster than Nvidia’s K80 GPU

upon closer inspection of the market and products, you may come up against...

FPGA-based applications are characterized by parallelization, with performance bound by compute rather than by data transfer



The First Question to Ask:

Is My IoT Use Case 

I/O Bound?

CPU Bound?

I/O Bound vs CPU Bound

I/O bound: read data from disk, transfer data; e.g., streams

CPU bound: perform calculations, crunch numbers; e.g., SHA-1 checksums

Why IoT is not CPU bound (generally)

Computers that predominantly use peripherals* are characterized as I/O bound.

*able to be attached to and used with a computer, though not an integral part of it.

I/O Operation Problems

the programme waits while communication is in progress

your processor remains idle, waiting for I/O operations to complete


A Solution: Non Blocking I/O

asynchronous I/O, or non-sequential I/O, permits other processing to continue before the transmission has finished.

Why NodeJS?

  • a runtime for JavaScript that encourages non-blocking I/O
  • npm is the largest ecosystem of open source libraries in the world
  • written in C++, executes JavaScript using Chrome's V8 engine

Why JavaScript?

JavaScript and Java retain their respective positions atop our rankings.

-RedMonk Language Rankings (2017)

Atwood's Law

Any application that can be written in JavaScript will eventually be written in JavaScript. -Jeff Atwood

Machine Learning Tools

Mind, Synaptic, ConvNetJS, Machine Learning, Machine Learning Tools, RedTrail*

However, in Node.js, JavaScript code executes synchronously on a single thread


If your algorithm spawns 10,000 operations per second, your application will hang while the computation runs
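A small demonstration of that hang, assuming nothing beyond Node's standard library: a busy loop monopolises the single JavaScript thread, so even a zero-delay timer must wait for it to finish:

```javascript
// A CPU-bound loop monopolises Node's single JavaScript thread: queued
// timers and I/O callbacks cannot run until it returns.
function spinFor(ms) {
  const end = Date.now() + ms;
  let n = 0;
  while (Date.now() < end) n++; // pure computation, never yields
  return n;
}

setTimeout(() => console.log('timer fired'), 0); // queued immediately...
const iterations = spinFor(50); // ...but must wait ~50 ms for this loop
console.log('loop finished after ' + iterations + ' iterations');
```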

Open Computing Language

OpenCL is a framework for writing programs that run across heterogeneous platforms (CPUs, GPUs, FPGAs and other accelerators)

It solves the 'one algorithm at a time' problem by offloading compute-heavy work to an accelerator

Other Frameworks

  • CUDA
  • Vulkan
  • OpenGL
  • Caffe
  • Theano

Revised Architecture

So, rather than a CPU alone...

...pair 'I/O bound' CPUs with CPU-bound accelerators (GPU, FPGA, ASIC)

Case in Point

Siemens Smart Grid uses HTML5, JavaScript, Node.js and MongoDB on Linux

What's Up with All the Acronyms?

ASIC 1/3

application specific integrated circuits

any custom designed chip

$$$ high non-recurring engineering development costs


ASSP 2/3

application-specific standard parts

  • essentially the same as ASICs
  • more general-purpose devices
  • intended for use by multiple system design houses


SoC 3/3

  • system on a chip
  • ASIC or ASSP + processor core
  • integrates as many components of a computer as possible in a chip
  • Raspberry Pi, Intel Edison on Arduino, et al.


FPGA

field-programmable gate array

you programme your algorithm(s) into the board's blocks

All Hail Queen GPU

A standard graphics processor (GPU) receives the compute-intensive portions of the code.

The CPU runs the remainder of the code.

*typically, not suitable for IoT

Training + Inference 

Hardware solves the 'growth' problem, because growth in ML means you must...

Train in the Cloud...

and infer on the device

*cloud GPUs are 10x faster than IoT GPUs

*run AI algorithms locally on the FPGA

*use less power
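A hedged sketch of the idea: the model below, its weights, and the 'discard risk' interpretation are all invented for illustration. The point is that once training has happened in the cloud, on-device inference reduces to a cheap dot product:

```javascript
// 'Train in the cloud, infer on the device' in miniature. The weights and
// the discard-risk interpretation are invented for illustration.
const trainedModel = {
  // imagine these arrived from the cloud after training completed there
  weights: [0.8, -0.5, 0.3],
  bias: -0.2,
};

// On-device inference: one dot product plus a sigmoid; no training loop,
// so it is cheap enough for constrained IoT hardware.
function infer(model, features) {
  const z = model.weights.reduce((s, w, i) => s + w * features[i], model.bias);
  return 1 / (1 + Math.exp(-z));
}

console.log(infer(trainedModel, [1, 0, 1])); // ≈ 0.71 with these weights
```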

Tomorrow's Workshop:

Train in the Cloud, Infer on the Device [TCID]

TCID Stated Another Way


Embeds Machine Learning Algorithms at the Edge


A Word on Algorithms

Algorithm development is increasingly becoming open source.


You can leverage the efforts of others. 

Open Source Algorithms

  • TensorFlow (Google)
  • OpenAI (backed by Amazon, E. Musk)
  • Torch (Facebook)

It's time to go deeper

Deep Neural Networks

unsupervised learning

more hidden layers

parallel computing

less manual intervention

DNN & Backward Propagation

sensor data is voluminous with complex patterns!

so, continually adjust the weights

and 'propagate' or push the error value backwards through the network

corrected weights lead to better inferences
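In miniature, and with invented numbers, the weight correction looks like this: a single sigmoid neuron trained by gradient descent on squared error, with the error's gradient pushed backwards to update the weights:

```javascript
// Backward propagation in miniature: one sigmoid neuron, squared error,
// and weights corrected by pushing the error's gradient backwards.
// All numbers are invented for illustration.
const sigmoid = (x) => 1 / (1 + Math.exp(-x));

function trainStep(w, b, input, target, lr) {
  // forward pass
  const z = w.reduce((s, wi, i) => s + wi * input[i], b);
  const out = sigmoid(z);
  // backward pass: gradient of squared error through the sigmoid
  const grad = (out - target) * out * (1 - out);
  // corrected weights lead to better inferences
  return {
    w: w.map((wi, i) => wi - lr * grad * input[i]),
    b: b - lr * grad,
    error: (out - target) ** 2,
  };
}

// Repeat the adjustment; the error shrinks with each pass.
let state = { w: [0.1, -0.3], b: 0, error: Infinity };
for (let i = 0; i < 500; i++) {
  state = trainStep(state.w, state.b, [1, 1], 1, 0.5);
}
console.log(state.error); // far smaller than on the first step
```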

To Summarise

1.) \(A^2 I\) asks you to put the customer first

2.) Node.js + OpenCL + Eyeriss (or similar)

3.) Deep learning enables error correction of the weights, which leads to better conversations

And finally,

The Future:

Quantum AI

Is Quantum Computing faster than Machine Learning or

Deep Learning?

Within 5 years: small devices emerge with quantum-classical hybrid algorithms (but without full error correction)

Within 10+ years: quantum AI algorithms will replace Boolean logic with quantum law at the algorithmic level

Within 20+ years: artificial superintelligence triggers abrupt, runaway technological growth, resulting in unfathomable changes to human civilization.