I was always going to end up in robotics – this was probably clear to my horrified parents from the moment their five-year-old appeared in the kitchen having skinned the fur off an electric dog. I promise you this is not as ghoulish as it sounds. It's just that the fur and cute eyes got in the way of what was going on underneath. How did this thing walk? What made it segue from walking to sitting to barking? But most of all, what trickery could give the impression of intent in 1978 via a couple of D-cell batteries?
As a child and through my teens I was utterly fascinated by how to synthesise something that acts in the world on our behalf and with independence. This interest was fed, as it was for many of us, by icons of kids' science-fiction films. That influence was fun then, but now I worry about what it obscures in a grownup world. It gets in the way of things adults need to talk about.
Our perspective on robotics is often polluted by easy-to-write scare stories adorned with easy-to-source pictures of science-fiction baddy-robots. Of course we all know science-fiction characters are, by design, wild flights of fancy. Yet many articles go out of their way to muddle science fiction, fantasy and fears. I've lost count of the times I've read stories of doom and obsolescence adorned with a charming photo of Arnie's Terminator. I groan. Why so negative? We don't routinely put pictures of wild uncontrolled explosions next to articles on chemistry, or ray-gun pictures next to text on electronics.
The truth is there is a vast disparity between what we might believe or fear from a position informed by fiction-science and the reality of robotics-science. Don't get me wrong. We've made amazing progress, and so much good is now within our reach – but our best machines are not even close to the cognitive and mobile ability of a mole. Ants outperform our best endeavours.
So why are everyday tasks so hard? Looking at coffee-making is helpful. I'm writing this at my kitchen table, having just made a coffee. Let's deconstruct the "add milk to taste" part and reflect on the complexity at play. I had to jolt open a fridge with a sticky seal, then stop the door from crashing into my three-year-old by catching it with my foot. I then peered past yoghurt pots, precariously balanced soup cartons and two bottles of I-know-not-what to spot the milk. I reached in, steering past all this to grasp the condensation-covered milk carton with my fingertips.
Of course, by now my arm was obscuring my view, so I used the changes of torque and load perceived by my grasp to infer its collisions with fridge items and my short-term spatial memory to navigate the milk out. And all that before the pouring bit. Please believe me when I say achieving that with a low failure rate is fantastically hard – many, many years away for a machine.
This state of play then is not very threatening – in fact, isn't it all a bit underwhelming? Not at all. You see, we can build somewhat smart machines that do our bidding – they can work for us, help us, keep us safe, even repair us. Not so for moles and ants – they are always self-interested and particularly hard to communicate with. From this simple observation much good comes. Robots are, for now, just smart tools, devices built by us to make life better.
We need to be able to have a grownup conversation about robotics. We need to make sure we consider the full gamut of what robotics and automation will bring us – the brilliant, "obvious win" stuff as well as the stuff that needs us to be careful. We certainly talk about the obvious example: drones. Like all weapons (longbow included), these non-thinking tools that we can use to project force at a distance impose a moral duty to think about how we use and develop them.
This is down to all of us. If we have a view we can get involved, but to be helpful we must be informed by reality and not science fiction. Furthermore, if only for balance, we really owe it to ourselves to look at and anticipate some of the astounding and wholly positive outcomes of robotics and automation.
Robotics and its sibling disciplines like computer vision and machine learning will, I am sure, offer us better lives. The word "offer" is important as we do get to make choices. Engineers will create the building blocks and innovations that will underpin new artefacts and services – but only if it makes sense to do so. We will be offered better surgery, better security, better warehousing, more efficient ports, safer mines, curated farms with reduced pesticide requirements and higher yields. We'll do science on other planets and at the bottom of our oceans. We'll have smarter and more flexible manufacturing creating new jobs supported by new tools.
We'll offer ourselves smart prosthetics and, maybe, home care – not a replacement for human care but an extra presence for when mum falls over in the night. We'll have cars that can drive you when you are exhausted, sick, ageing or simply unwilling to waste your time in traffic. We'll get new help in cleaning up our nuclear legacy, fighting fires, and rescuing families from collapsed buildings. I could go on.
Although major technical hurdles remain, these options and offers of help are coming. Yes, in many cases they are decades away, but we should anticipate, welcome and, of course, regulate. We can drop the droid talk and replace it with a proper sense of opportunity, benefit and maybe childlike wonder at what our creations will offer us. This is going to be good.
• Professor Paul Newman will deliver this year's Oxford London lecture, in association with the Guardian, on Tuesday 12 March at 6.45pm in Church House, Westminster. Tickets are available on the door, priced £15 for adults, £8 concessions