Tuesday 22 August 2017

Artificial Intelligence: what will they think of next?



Back in the 1980s, computers started to hit the masses with products like the Sinclair ZX80, the Commodore 64 and the BBC Micro. Many kids at school got the computing bug. This had little to do with gaming (still in its infancy at the time); it was all to do with coding.

There was a saying which neatly summed up the interactions with those early computers:

Computers don't always do what you want. They always do what you tell them.

The apparent "intelligence" of a computer was not derived from the computer itself but from the intelligence of the coders who wrote the programs it ran. So if you wrote bad code, the computer appeared to misbehave. My computer spent a lot of time on the naughty step.

I remember being impressed and amused when I came across an early computer running a program called Eliza (originally developed in the mid-1960s). Communicating via the keyboard, you could actually converse with an imaginary person inside the computer. The responses implied some form of intelligence, but after a short while you realised Eliza was pretty dumb, though nevertheless likeable. Eliza was simply a set of rules and pattern-matching algorithms applied to your text in order to mimic intelligent conversation.
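
To give a feel for how simple the trick really was, here is a toy sketch in Python - my own illustration, not Weizenbaum's original code - using a handful of pattern-matching rules to turn whatever you type into a canned, therapist-style reply.

    import re

    # A few illustrative rules: a pattern to look for in the user's text,
    # and a template which bounces part of it back as a question.
    RULES = [
        (r"\bI need (.+)", "Why do you need {0}?"),
        (r"\bI am (.+)", "How long have you been {0}?"),
        (r"\bmy (mother|father|family)\b", "Tell me more about your {0}."),
    ]

    def eliza_reply(text):
        # Return a canned response based on the first rule that matches.
        for pattern, template in RULES:
            match = re.search(pattern, text, re.IGNORECASE)
            if match:
                return template.format(*match.groups())
        return "Please, go on."  # fallback when nothing matches

    print(eliza_reply("I am feeling rather nostalgic"))
    # -> How long have you been feeling rather nostalgic?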

One of the most significant definitions of a thinking machine came from Alan Turing (best known for his code breaking in the Second World War and as the subject of the film The Imitation Game). He proposed a test in which a machine converses with a human remotely, via a screen and text only. If the human cannot tell that they are conversing with a machine, the so-called Turing Test is passed.

Eliza is often credited with fooling some of its users into believing they were chatting to a real person, but it was certainly not an example of artificial intelligence.

There is no universal definition of artificial intelligence, but here are some characteristics which commonly apply:

A machine, computer system or software which can:
  • carry out tasks normally requiring human intelligence
  • think for itself and make informed decisions
  • learn for itself, adding to its pool of knowledge beyond that of the original program
  • generate original ideas
  • make predictions based on analysis of past (or perhaps simulated) experiences

Philosophical debates about whether man-made machines can truly "think" go back to the ancient Greeks. Questions of sentience, feelings and whether a machine could possess a soul have been explored by science fiction writers from Philip K Dick in "Do Androids Dream of Electric Sheep?" through to the adventures of Data in Star Trek: The Next Generation.

The challenge we are about to face is more down to earth. Philosophy aside, if a machine can accurately mimic the behaviour of a human, think and carry out tasks for itself, then why not have it do those jobs which humans are doing today?

We are living in an information age, and the vast majority of us are "information workers". Robots in this context are not restricted by physical form, and artificial intelligence can already carry out information worker tasks as well as or better than humans, and certainly more cheaply. Eliza is arguably coming back to haunt us in the guise of the latest generation of chatbots. These 21st century editions are not constrained by keyboards and text (although you will still find that mode used on websites); nowadays they can use natural speech to converse with customers and even translate between languages. We are even seeing 24-hour shopkeepers and helper systems being put into homes, with names like Alexa, Cortana and, unimaginatively, Google Home.

These forms of artificial intelligence are still very basic and certainly have no feelings, yet they will quite happily make you redundant. Predictions about how many jobs are truly at risk vary enormously but have one thing in common - big numbers. I am heartened by one of the lessons of history. Back in the 1960s, around the time Eliza and I were being conceived, household appliances like washing machines, blenders and toasters were set to make life easy. Similar advances at work, including the rise of computers, were going to lead to three-day weeks and a leisurely life of luxury. This did not come to pass. Instead, ever increasing demand for more machines, computers, blenders and all manner of other things has led to shops open on Sundays, 60-hour weeks and a whole lot more stress in our lives. My point is that demand for human labour does not diminish, but the roles in demand will change.

Artificial intelligence is already moving beyond the bounds of the digital world and into the physical one. Driverless cars are a reality, and if you are thinking "well, I haven't seen one", I can assure you it won't be too long. Southern Rail staff in the UK have been going through a long, painful battle against the introduction of trains with driver-operated doors, on the grounds that not having a guard do this reduces passenger safety. A time will come when driverless trains prove to be safer than human-operated ones and, unfortunately, train driving as a career will be heading into a siding.

If you are thinking this topic is getting a bit dark, you ain't seen nothing yet. In 2014, one of the greatest thinkers alive today had this to say about AI:

"The primitive forms of artificial intelligence we already have, have proved very useful. But I think the development of full artificial intelligence could spell the end of the human race." - Stephen Hawking, in an interview with the BBC.

Nick Bostrom's book "Superintelligence: Paths, Dangers, Strategies" has become a bestseller. Bostrom is the founding director of Oxford University's Future of Humanity Institute, so he spends a lot of time thinking about this stuff. Not everyone agrees on when computer intelligence will match that of a human, but sometime well before the end of this century seems to be the consensus. Bostrom does not predict that it will all go wrong, but he highlights a number of ways in which it could.


In the meantime, while we try not to have nightmares about Terminator coming true, artificial intelligence is on the cusp of making a real difference to our lives, both at home and at work. Computers may start doing more than what we tell them and actually start showing some initiative. And that has surely got to be a good thing.

Gartner's latest view on AI - here

Wednesday 9 August 2017

What is Docker containerisation?

Docker is an open-source platform which packages applications into highly efficient “containers”. Many containers can run on a single host without interfering with each other. The Community Edition is available for free.

A three-minute video version of this post is here.


Docker Inc is the commercial organisation behind the platform; it sells Docker Enterprise Edition to organisations which need tools that are fully tested, validated and supported for use in an enterprise setting.

You may be familiar with server virtualisation from vendors like VMware and Microsoft. It enables the creation of virtual machines. Many virtual machines can run on a single host without interfering with each other.

That sounds a bit like Docker containers, but server virtualisation has a different origin from containerisation, and on the inside the two are different as well.

The adoption of server virtualisation has been driven by the need of IT departments to drastically reduce infrastructure running costs. Physical servers historically ran a single application each, to ensure no conflicts occurred between applications. With lots of applications required, that meant lots of expensive servers, all needing space, power and cooling. Advances in computing power and the development of virtualisation technology mean that a single host server can replace many physical servers and contain many virtual machines - each with its own operating system and application.

Containerisation, on the other hand, has been driven by the needs of application developers. As businesses increasingly use digital technology to become more competitive, the need to update existing applications and add new ones faster has become critical. Containerisation is closely linked to the rise of DevOps, whereby businesses put dedicated teams in place to develop new applications. These projects must migrate easily from the test and development phase into production so that they can run reliably and at scale, and ongoing development requires the code to keep moving between environments seamlessly.

A big overhead when developing applications turns out to be the IT infrastructure itself. Software is typically developed by teams using a range of platforms - PCs, laptops, servers and even cloud-based systems - which may not be compatible with each other. Each move between platforms requires customisation and testing before the application runs reliably. This really slows down progress and hinders collaboration across developer teams.

Containerisation starts with the application. An application does not need an entire dedicated operating system; it can make do with a cut-down set of files known as “bins and libraries”, and together these are packaged into a container. The container then only needs access to a shared operating system kernel plus the Docker software. The operating system is typically Linux, with Microsoft Windows a more recent option.

Containers don’t each hold an operating system so they are much smaller than virtual machines and many more can fit into a single host. A container is much faster to start up and easier to maintain than a virtual machine, helping developers be more productive.

Although virtual machines can easily be moved between hosts or even out to the cloud, this is typically under the control of IT departments and the functionality often comes at a cost. With containers, developers don't need to ask IT for help. Sharing containers, jointly developing applications and moving them from test to production to the cloud – and back – is quick and easy.
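
To make the "quick and easy" part concrete, the sketch below (same Python SDK, with "registry.example.com/myapp" standing in as a made-up registry and image name) builds an image from an application directory and pushes it to a shared registry, so a teammate's laptop, a test server or a cloud host can pull and run exactly the same thing.

    import docker

    client = docker.from_env()

    # Build an image from ./myapp, which is assumed to contain the application code
    # plus a Dockerfile describing its "bins and libraries". Depending on the SDK
    # version the call returns either the image or an (image, logs) pair, so the
    # return value is simply ignored here.
    client.images.build(path="./myapp", tag="registry.example.com/myapp:1.0")

    # Push the image to a shared registry; any other Docker host can now pull it
    # and run the application unchanged – from test to production and back.
    for line in client.images.push("registry.example.com/myapp", tag="1.0",
                                   stream=True, decode=True):
        print(line)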

So when comparing containers with virtual machines, it's less to do with what they are and more to do with why they came about in the first place.

 - Server virtualisation is helping IT departments do more for less.
 - Containerisation, led by Docker, is helping developers do more, faster.

Luckily the two can also co-exist - containers can run inside virtual machines - so organisations can benefit from the best of both.

Although Docker is not the only container technology, it is the most widely adopted and the most talked about. More information is available at www.docker.com