Tuesday, 22 August 2017

Artificial Intelligence: what will they think of next?



Back in the 1980s, computers started to reach the masses with products like the Sinclair ZX80, the Commodore 64 and the BBC Micro. Many kids at school caught the computing bug. For many of us, this had nothing to do with gaming; it was all to do with coding.

There was a saying which neatly summed up the interactions with those early computers:

Computers don't always do what you want. They always do what you tell them.

The apparent "intelligence" of a computer was not derived from the computer itself but from the intelligence of the coders who wrote the programmes which it ran. So if you wrote bad code, the computer appeared to misbehave. My computer spent a lot of time on the naughty step.

I remember being impressed and amused when I came across an early computer running a program called Eliza (originally developed in the mid 1960s). Communicating via the keyboard, you could actually converse with an imaginary person inside the computer. The responses implied some form of intelligence, but after a short while you realised Eliza was pretty dumb, though nevertheless likeable. Eliza was simply a set of rules which matched patterns in your text and fed parts of them back in canned responses, mimicking intelligent conversation.
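For the curious, the essence of the idea can be sketched in a few lines of Python. This is not Weizenbaum's original program, just a minimal, illustrative version of the same trick: match a pattern in the user's text and echo part of it back in a canned reply.

```python
# A minimal, illustrative sketch of Eliza-style pattern matching (not the original program).
import re

RULES = [
    (re.compile(r"\bI need (.+)", re.IGNORECASE), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bbecause (.+)", re.IGNORECASE), "Is that the real reason?"),
]
DEFAULT_REPLY = "Please tell me more."

def respond(text: str) -> str:
    """Return a canned reply based on the first matching pattern."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(*match.groups())
    return DEFAULT_REPLY

print(respond("I need a holiday"))     # -> Why do you need a holiday?
print(respond("The weather is nice"))  # -> Please tell me more.
```

A handful of rules like these is enough to keep a conversation going for a surprisingly long time, which is exactly why Eliza felt intelligent at first and dumb soon after.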

One of the most significant definitions of thinking machines came from Alan Turing (best known for code breaking in the Second World War, and as the main character in the film The Imitation Game). He proposed a test in which a machine converses with a human remotely, via a screen and text only. If the human cannot reliably tell that they are conversing with a machine, the so-called Turing Test is passed.

Eliza was an early example of a machine fooling people in just this way, but it was certainly not an example of artificial intelligence.

There is no universal definition of artificial intelligence, but here are some characteristics which commonly apply:

A machine, computer system or software which can:
  • carry out tasks normally requiring human intelligence
  • think for itself and make informed decisions
  • learn for itself, adding to its pool of knowledge beyond that of the original program
  • generate original ideas
  • make predictions based on analysis of past (or perhaps simulated) experiences

Philosophical debates about whether man-made machines can truly "think" go back to the ancient Greeks. Questions of sentience, feelings and whether a machine could possess a soul have been explored by science fiction writers from Philip K. Dick in "Do Androids Dream of Electric Sheep?" through to the adventures of Data in Star Trek: The Next Generation.

The challenge we are about to face is more down to earth. Philosophy aside, if a machine can accurately mimic the behaviour of a human, think and carry out tasks for itself, then why not have it do those jobs which humans are doing today?

We are living in an information age, and the vast majority of us are "information workers". Robots in this context are not restricted by physical form, and artificial intelligence can already carry out information worker tasks as well as or better than humans, and certainly more cheaply. Eliza is arguably coming back to haunt us in the guise of the latest generation of chatbots. These 21st century editions are not constrained by keyboards and text (although you will still find that mode used on websites); nowadays they can converse with customers in natural speech and even translate between languages. We are even seeing round-the-clock shopkeepers and helpers being installed in homes, with names like Alexa, Cortana and, unimaginatively, Google Home.

These forms of artificial intelligence are still very basic and certainly have no feelings, yet they will quite happily make you redundant. Predictions about how many jobs are truly at risk vary enormously, but they have one thing in common: big numbers. I am heartened by one of the lessons of history. Back in the 60s, around the time Eliza and I were being conceived, household appliances like washing machines, blenders and toasters were set to make life easy. Similar advances at work, including the rise of computers, were going to lead to three-day weeks and a leisurely life of luxury. This did not come to pass. Instead, ever-increasing demand for more machines, computers, blenders and all manner of other things has led to shops open on Sundays, 60-hour weeks and a whole lot more stress in our lives. My point is that demand for human labour does not diminish, but the roles in demand will change.

Artificial intelligence is already moving beyond the bounds of the digital world and into the physical one. Driverless cars are a reality, and if you are thinking "well, I haven't seen one", I can assure you it won't be too long. Southern Rail staff in the UK have been through a long, painful battle against the introduction of trains with driver-operated doors, on the grounds that not having a guard close them reduces passenger safety. A time will come when driverless trains prove safer than human-operated ones and, unfortunately, train driving as a career will be heading into a siding.

If you are thinking this topic is getting a bit dark, you ain't seen nothing yet. In 2014, one of the greatest thinkers alive today had this to say about AI:

"The primitive forms of artificial intelligence we already have, have proved very useful. But I think the development of full artificial intelligence could spell the end of the human race,"  Stephen Hawking in an interview with the BBC.

Nick Bostrom's book "Superintelligence: Paths, Dangers, Strategies" has become a bestseller. Bostrom is the founding director of Oxford University's Future of Humanity Institute, so he spends a lot of time thinking about this stuff. Not everyone agrees on when computer intelligence will match that of a human, but sometime well before the end of this century seems to be the consensus. Bostrom does not predict that it will all go wrong, but he highlights a number of ways in which it could.


In the meantime, and trying not to have nightmares about Terminator coming true, artificial intelligence is on the cusp of making a real difference to our lives, both at home and at work. Computers may start doing more than we tell them and actually start showing some initiative. And that has surely got to be a good thing.

Gartner's latest view on AI - here

Wednesday, 9 August 2017

What is Docker containerisation?

Docker is an open source platform which encapsulates applications into highly efficient “containers”. Many containers can run on a host without interfering with each other. The community edition is available for free.

A 3-minute video version of this post is here.


Docker Inc. is the commercial organisation which sells Docker Enterprise Edition to organisations that need tools which are fully tested, validated and supported for use in an enterprise setting.

You may be familiar with server virtualisation from vendors like VMware and Microsoft. It enables the creation of virtual machines. Many virtual machines can run on a single host without interfering with each other.

It sounds a bit like Docker containers, but server virtualisation has a different origin from containers, and on the inside they are different too.

The adoption of server virtualisation has been driven by the need of IT departments to drastically reduce infrastructure running costs. Historically, physical servers each ran a single application to ensure no conflicts occurred between applications. With lots of applications required, that meant lots of expensive servers, all needing space, power and cooling. Advances in computing power and the development of virtualisation technology mean that a single host server can replace many physical servers, hosting many virtual machines, each with its own operating system and application.

Containerisation, on the other hand, has been driven by the needs of application developers. As businesses increasingly use digital technology to become more competitive, the need to update and add new applications faster has become critical. The technology is linked to the rise of DevOps, whereby businesses put dedicated teams in place to develop new applications. These projects must migrate easily from the test and development phase into production, so that they can run reliably and at scale. Ongoing application development then requires the code to keep moving between environments seamlessly.

A big overhead when developing applications turns out to be the IT infrastructure itself. Software is typically developed by teams using a range of platforms, including PCs, laptops, servers and even cloud-based systems, and these may not be compatible. Each move between platforms requires customisation and testing for reliable operation. This really slows down progress and hinders collaboration across developer teams.

Containerisation starts with the application. An application does not need an entire dedicated operating system; it can make do with a cut-down set of files, its "bins and libs" (binaries and libraries), and together these can be put in a container. The container then needs access to a shared operating system kernel plus the Docker software. The operating system is typically Linux, with Microsoft Windows a more recent option.

Containers don’t each hold an operating system so they are much smaller than virtual machines and many more can fit into a single host. A container is much faster to start up and easier to maintain than a virtual machine, helping developers be more productive.
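To give a feel for how lightweight this is in practice, here is a minimal sketch using the Docker SDK for Python (the docker package). It assumes the SDK is installed and a local Docker engine is running; the alpine image is just an example.

```python
# Minimal sketch using the Docker SDK for Python ("pip install docker").
# Assumes a local Docker engine is running; the image name is illustrative.
import docker

client = docker.from_env()  # connect to the local Docker engine

# Run a small container from the public "alpine" image and capture its output.
output = client.containers.run("alpine", ["echo", "hello from a container"])
print(output.decode().strip())

# Many containers can share the same host; list whatever is currently running.
for container in client.containers.list():
    print(container.name, container.image.tags)
```

The container above starts, runs its command and exits in well under a second, which is the kind of turnaround that makes containers attractive to developers.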

Virtual machines can be moved around between hosts, or even out to the cloud, but this is typically under the control of IT departments, and the functionality often comes at a cost. With containers, developers don't need to ask IT for help: sharing containers, jointly developing applications and moving them from test to production to the cloud, and back again, is quick and easy.
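To illustrate the "moving between environments" point, here is a hedged sketch of the same SDK being used to build an image and push it to a shared registry; the registry address, directory and image name below are hypothetical.

```python
# Hedged sketch: building an image and pushing it to a (hypothetical) registry
# with the Docker SDK for Python, so the same image can run in test, production
# or the cloud.
import docker

client = docker.from_env()

# Build an image from a local project directory containing a Dockerfile.
image, build_logs = client.images.build(path="./my-app", tag="my-app:latest")

# Re-tag it for a registry shared by the development and operations teams...
image.tag("registry.example.com/team/my-app", tag="1.0")

# ...and push it, so exactly the same container image can be pulled anywhere.
client.images.push("registry.example.com/team/my-app", tag="1.0")
```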

So when comparing containers with virtual machines, it's less to do with what they are and more to do with why they came about in the first place.

 - Server virtualisation is helping IT departments do more for less.
 - Containerisation, led by Docker, is helping developers do more, faster.

Luckily, they can also co-exist: containers can run inside virtual machines, so organisations can benefit from the best of both.

Although Docker is not the only container technology, it is the most widely adopted and the most talked about. More information is available at www.docker.com

Sunday, 19 February 2017

Internet of Things: the WHAT and WHY

There is a lot of hype about the Internet of Things. There must be something in it - but what?

50 billion devices connected to the Internet. A 10 trillion dollar opportunity. 17 zettabytes of data storage.

I can read (or make up) statistics as well as the next person, but do these figures actually mean anything? I suspect they are somewhat missing the point.

In order to make some sense of IoT, we need to look at it from two different but interdependent perspectives:

  • What it is
  • Why it is


Watch the video version of this post here

The WHY needs to be understood for each and every IoT project. Without some sort of business return, projects will never get off the ground. However, without looking at the WHAT, it's very hard to start imagining how it can help us in the first place. So let's start there.

Most definitions of IoT start with something like "the interconnection via the Internet of devices (things), enabling them to send and receive data". This is not the whole picture. We also need to include in our definition what can be done with this new data through analytics and data visualisation: potentially discovering new business insights that help us reduce costs, provide new services or revenue streams, or improve our competitive edge.

This all sounds a bit fluffy until you apply it to an example:

  • Car insurance is highly competitive. Premiums are affected by a driver's history, with factors like length of driving experience, no-claims periods and previous accidents. IoT has enabled insurers to adjust premiums based on actual driving behaviour. The addition of a "black box" to the insured car enables tracking of acceleration, speed and position, with the device connected to the internet over the mobile network. Not only can the insurer monitor individual drivers, they can also use data collected across all their customers to build a much richer picture of risk, for example by identifying high-risk areas for collisions. This data could potentially be shared in anonymised form with external parties for a fee: the police, who may have an interest in areas where speeding is rife, or the Highways Agency, interested in accident blackspots. With a better understanding of risk, the insurer can potentially undercut the competition or devise new policies tailored to specific demographics. (A simple sketch of the kind of reading such a black box might transmit is shown below.)
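As a rough illustration only, here is what a single black-box reading might look like when transmitted. This is a minimal Python sketch using only the standard library; the endpoint URL and field names are hypothetical, not those of any real insurer.

```python
# A minimal sketch (standard library only) of the kind of reading a telematics
# "black box" might send; the endpoint URL and field names are hypothetical.
import json
import urllib.request
from datetime import datetime, timezone

reading = {
    "device_id": "blackbox-0042",                        # illustrative identifier
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "latitude": 51.5074,                                 # example position (London)
    "longitude": -0.1278,
    "speed_kmh": 46.3,
    "acceleration_ms2": -2.1,                            # hard braking shows as negative
}

request = urllib.request.Request(
    "https://telematics.example-insurer.com/api/readings",  # hypothetical endpoint
    data=json.dumps(reading).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# In a real deployment this would go out over a mobile data connection;
# here we simply show the call that would transmit the reading.
with urllib.request.urlopen(request) as response:
    print("Server responded with status", response.status)
```

Collected across thousands of cars, readings like this are the raw material for the risk analytics described above.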

There are some technology trends which are combining to enable organisations to realise value from IoT projects which would not have been feasible in the past. 

These trends are why IoT has become such a hot topic:
  • Sensors can be wirelessly connected to the internet within small, battery-powered modules which can last for years without replacement.
  • Mobile carriers are using IP-based protocols and developing low-cost narrowband connections, supporting use cases which are becoming commercially feasible for the first time.
  • Real-time and historical analytics capability has become affordable and readily available to a wide audience, and can be consumed in multiple ways, including on-premise or in the cloud.
  • A number of vendors are productising these solutions, making it easier and quicker for customers to leverage the Internet of Things.

The two main challenges for anyone wanting to cash in on IoT are directly related to our two perspectives:
  • WHAT is it
    Projects span a huge range of technologies, from "things" through to data analytics. It will be hard for a single supplier to provide expertise and services capability across the whole spectrum. A partnership approach will often be the only workable option.
  • WHY is it
    Even with expertise across the IoT spectrum, projects will never fly without a strong business case. Suppliers will need to couple their IoT skills with a deep understanding of their customer's business and engage using a consultative approach.

Finally, IoT projects can be bespoke and costly. The secret to success will be identifying verticals where solutions can be developed with wider market appeal.