“Unleashing the power of AI is a top priority in our plan…”

No, these aren’t the maniacal words of Oscar Isaac’s Nathan in Ex Machina or some sleazy Omni Consumer Products (OCP) slogan from RoboCop.

Instead, it’s from Digital Secretary Oliver Dowden’s Ten Tech Priorities: a plan to power a golden age of tech in the UK. Fitting, then, that this new strategy to harness the power of artificial intelligence will be released in 2021.

Because it’s a big year for the bots already.

2021: A blockbuster birthday for the bot

Released in 1991, Terminator 2: Judgment Day turns a whopping 30 years old this year. Just enough time for my John Connor-style curtains to recede into oblivion.

But if you can believe it, A.I. Artificial Intelligence is celebrating its 20th birthday as well. It feels like only yesterday that we were left wondering what could have been if Kubrick had directed the project as originally planned. Well, whatever your take is on Spielberg’s version, its impact on cinema - not to mention the way we think about machine learning - can’t be denied.

But are these once fantastical visions of technology really so far-fetched, or are we closer to the future depicted in science fiction than we think? We’re revisiting two of the most momentous times that movies did machines to find out.

T2: Judgment Day – Learning the value of human life

It’s 2029: pilotless planes hover above blackened skies. They’re searching with one primary objective: to exterminate all human life. At least that’s how James Cameron sees the future in Terminator 2: Judgment Day, anyway.

But we might not be as far off as you think.

Human annihilation aside, Amazon won FAA approval to launch its Prime Air drone delivery fleet last year. The fleet isn’t ready to fly just yet, but it might be a different story in eight years’ time.

In T2, the protagonists are fighting to change a future where Cyberdyne Systems has monopolised the market for military software, with Skynet the pinnacle of its technology. But when the film was released in 1991, there was another technological battle going on: the “console wars”. With edgy marketing campaigns and the creation of Sonic, the antithesis of Mario, Sega was actually leading rival Nintendo at the time. In fact, the Genesis (or Mega Drive) was outselling the Super NES at a rate of 2:1, with 1.6 million consoles sold over the year. But console games, like the Terminator franchise, hadn’t just become a technological and cultural phenomenon by 1991; Sega is significant for another reason.

Sega stands for Service Games: the company was originally set up to make entertainment solutions for the military. Because in science fiction, art imitates life.

But it can also predict the future.

Drones aside, Skynet is probably more Google than Amazon. Although Google has promised not to use its artificial intelligence in military weapons, its algorithms use machine learning in much the same way the Terminator does.

In his own words, the T-800 is a “learning computer” - much like Google’s RankBrain. Essentially, the two harness data to become more human. The only difference is that the T-800 uses data to fit in; RankBrain uses it to solve queries.
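
To make the idea of a “learning computer” a little more concrete, here’s a minimal sketch in Python using scikit-learn. Everything in it is invented for illustration - the queries, the intent labels, the whole setup - and whatever RankBrain actually does is far more sophisticated. But it shows the basic trick: a model trained on labelled examples can route a phrasing it has never seen to the closest learned intent.

```python
# A toy "learning computer": trained on a handful of labelled queries
# (all invented for this example), it can still route a query it has
# never seen before to the closest learned intent.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

queries = [
    "open the pod bay doors",
    "unlock the front door",
    "what year was terminator 2 released",
    "when did a.i. artificial intelligence come out",
]
intents = ["smart_home", "smart_home", "film_trivia", "film_trivia"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(queries, intents)

# An unseen phrasing lands on the intent with the most similar wording.
print(model.predict(["when was terminator 2 out"]))  # -> ['film_trivia']
```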

But are we any closer to a real cybernetic organism 30 years on?

Mikhail Lebedev, academic supervisor at HSE University’s Centre for Bioelectric Interfaces, seems to think so. He spoke recently to Robotics and Automation News about advances in this field.

“Progress may be held back by the lack of computer power, but development over the past ten years has also been enormous here. It is likely that we will soon see people around us using light, comfortable exoskeletons rather than wheelchairs or strollers to get around,” he said.

For example, the Russian ExoAtlet project is already developing exoskeletons for the rehabilitation of people with motor disabilities. With that in mind, maybe the future depicted in T2 isn’t as far off as we think. Lebedev is adamant: “Human cyborgs will become commonplace,” he said.

But developing human-like artificial intelligence from scratch is a different story. And 10 years on from T2, cinema would begin to take a more cerebral look into machine learning: less interested in machines overpowering the human race, more obsessed with how they can fulfil our roles instead.

A.I. Artificial Intelligence – do androids dream of augmented reality?

Set in a future “...after the ice caps had melted... because of the greenhouse gases, and the oceans had risen…”, A.I. Artificial Intelligence feels more relevant today than it did 20 years ago.

In A.I.’s future, robots - or “mechas” - exist to clean up the planet: the ideal choice, as they don’t consume any of our depleted resources. But in reality, these plant-preserving robots - much like WALL-E or the gardening droids from Silent Running - aren’t as implausible as they once seemed.

In fact, it was recently reported in Interesting Engineering that engineers have been using artificial intelligence to ‘communicate’ with plants. By monitoring how a plant responds to its environment, their device can transmit movement instructions to help it flourish. In theory, green-fingered robots could use this technology to pick up delicate objects, nurture plants and help combat food scarcity.

But much like real life, where an Apple update renders the “latest” device obsolete, the technology in A.I. grows useless and outdated just as fast. If you’re lucky, that means living out your days in a box at the top of the cupboard like the ‘super toy’ Teddy; at worst, you end up at the Flesh Fair.

This macabre event pits robots against each other for human entertainment. It’s colourful and over the top, but not as far-fetched as it looks. In fact, the concept already existed in 2001: the British public was so infatuated with Robot Wars that the show ran twice (from 1998 to 2004, and again from 2016 to 2018).

But A.I. predicted the future in more subtle ways, too.

David and Gigolo Joe struggle to get the answers they’re looking for when they question Dr. Know - an animated professor-cum-search engine - about the Blue Fairy.

“You must take care not to raise your voice up at the end of a sentence,” the gigolo tells David when he accidentally wastes one of his questions. Fast-forward ten years, and this fictitious exchange would become a reality; A.I. predicted the real-world headache of accidentally questioning Siri long before voice assistants became mainstream.

It was only when David refined his search and showed his true intentions that he was given the answer he was looking for. Google may have launched in 1998, but it wasn’t until its Hummingbird update fifteen years later that the search engine started displaying results based on intent as well as language.

But although David's intentions might feel true, they’re still only a pre-programmed objective. And much like HAL in 2001: A Space Odyssey or Alexa giving you the "I'm sorry, I didn't quite get that" treatment, there’s just no reasoning with a machine.

Open the pod bay doors, please, Alexa

In T2, both the T-1000 and T-800 use facial recognition technology to identify their targets. Thirty years later, this technology has become a reality.

At the time, we could never have imagined that facial recognition would unlock phones or automatically tag people and places in images; now it’s part of our day-to-day lives.
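
In fact, the building blocks are now hobbyist-grade. Here’s a hedged sketch using the open-source face_recognition Python library; the filenames are placeholders invented for the example, and the systems inside phones and photo apps are considerably more involved.

```python
# Sketch: does any face in a new photo match a known face?
# The filenames below are placeholders; any two portrait photos will do.
import face_recognition

known_image = face_recognition.load_image_file("sarah_connor.jpg")
unknown_image = face_recognition.load_image_file("crowd_shot.jpg")

# Each detected face is encoded as a 128-dimensional vector.
known_encoding = face_recognition.face_encodings(known_image)[0]
unknown_encodings = face_recognition.face_encodings(unknown_image)

# Compare every face in the crowd shot against the known face.
for encoding in unknown_encodings:
    match = face_recognition.compare_faces([known_encoding], encoding)[0]
    print("Target identified!" if match else "No match.")
```
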
However, the real-world application of this technology has come with its own set of challenges. In certain scenarios, such as police screenings, evidence suggests there’s still work to be done to get things right.

“Some experts see the potential of artificial intelligence to bypass human error and biases. But algorithms used in artificial intelligence are only as good as the data used to create them—data that often reflect racial, gender, and other human biases,” experts warn in ‘Facing Bias in Facial Recognition Technology’ from The Regulatory Review.
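
That warning is easy to demonstrate on toy data. In the sketch below - every number is synthetic and invented purely for the illustration - a classifier is trained on a sample that badly under-represents one group, and its accuracy drops for exactly that group.

```python
# Synthetic demonstration that a model is "only as good as its data":
# group B is under-represented in training, so the learned decision
# boundary ends up suiting group A instead.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Each group's features (and true decision boundary) differ slightly.
    X = rng.normal(shift, 1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    return X, y

Xa, ya = make_group(1000, shift=0.0)  # group A: well represented
Xb, yb = make_group(50, shift=1.5)    # group B: barely represented
model = LogisticRegression(max_iter=1000)
model.fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))

# Fresh, equally sized test samples expose the accuracy gap.
for name, shift in [("A", 0.0), ("B", 1.5)]:
    X_test, y_test = make_group(1000, shift)
    print(f"accuracy, group {name}: {model.score(X_test, y_test):.2f}")
```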

But in some scenarios, artificial intelligence has proven that it can bypass human error. Ready Player One and its vibrant portrayal of virtual reality may have felt futuristic when the book was released in 2011, but only a year later the Oculus Rift was in the works with its very own Kickstarter. And by the time the movie adaptation hit cinemas in 2018, a computer program had already defeated the world champion at Go - the first time that had ever happened.

Although sci-fi has toyed with ideas of virtual realities for years - from Tron and The Lawnmower Man to Demolition Man and The Matrix - we’ve never been more tuned into these virtual spaces than we are now.

They’re just not always the ones you’d expect from the movies.

Released in 2013, Hinge is the digital matchmaker that’s quite literally “designed to be deleted”. But is it that easy for us to let go? In the same year Hinge was released, Spike Jonze’s Her looked at what happens when the app is more appealing than its human recommendations.

Worryingly, the fact that we’re now developing apps to limit our screen time makes you wonder if this isn’t already the case.

So, what can we expect next from machine learning in the movies? And, perhaps more importantly, how will it affect us?

In recent years, Alex Garland’s sci-fi thriller Ex Machina saw an artificial life form pass the Turing Test, make a human examiner fall in love with her and, consequently, escape into the world. Seven years later, we’re still no closer to walking amongst machines. But if the movies have taught us anything, it’s not to count technology out just yet.
