Five movies that got technology wrong

Given that our phones are glued to our hands almost 24/7 and computer screens dominate every school, workplace and home, you’d think that most people in the 21st century would be fairly tech-literate.

But that hasn’t stopped Hollywood sometimes getting things hopelessly wrong. Here are five tech mistakes from silver screen classics.

1. Swordfish (2001)

What makes the hacking scene even less believable is that Hugh Jackman only has 60 seconds in which to complete it.

Hacking in the movies always looks incredibly exciting. A good example is tech-thriller Swordfish in which Hugh Jackman plays a white-hat (good) hacker trying to crack a terrorist network.

Sadly, in real life hacking just looks like someone typing endless lines of code into a computer. Or, more likely, the computer just running on its own, trying to find a way into whatever the hacker wants to access.

It hardly makes for great cinema, however, so it’s no wonder that movies like Swordfish feature frenzied keyboard-bashing, ticking timers, and file systems represented on screen like video games.

2. Jurassic Park (1993)

Given how it turned out, maybe it would have been better to have had access denied to the whole of Jurassic Park.

We’ve all seen films where the hacker works feverishly to gain access to a network only to be met with an ominous 'ACCESS DENIED' screen. More often than not, this will be in HUGE letters on a scary red background. Then, a bit of Olympics-worthy typing later . . . success and an equally prominent 'ACCESS GRANTED' screen. (1993’s Jurassic Park contains one of the earliest examples.)

But when did you last get a message like that? Never, that’s when. A friendly 'Have you forgotten your password?' prompt is more likely, without any red-screened drama.

3. Blade Runner (1982)

Blade Runner didn't just invent a new technological system, it also invented a language called Cityspeak.

This common technology error started with Ridley Scott’s classic science fiction movie Blade Runner in 1982, but now turns up incredibly frequently. The main character stares intently at a blurry picture, often a suspect’s face or a piece of unreadable text. “Enhance the image!” they demand, and a pin-sharp picture is revealed.

In reality, digital pictures are made of tiny dots of colour called pixels, and low-resolution images have far fewer pixels than high-resolution ones. If the image is low-resolution there’s no way of ‘adding pixels’ that weren’t caught by the camera to make it a high-resolution one. While imaging software has improved massively in recent years, it can only guess at missing detail, not recover it, so the zoom-and-enhance trick we see in Blade Runner and other films is still not possible: that blurry face is going to remain blurry.
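The point above can be shown with a toy example (the ‘image’, the helper names and the 2x2 block size are all made up for illustration): once fine detail has been averaged away into fewer pixels, no amount of upscaling brings it back.

```python
# A toy 4x4 'image': brightness values forming a fine checkerboard pattern.
original = [
    [0, 255, 0, 255],
    [255, 0, 255, 0],
    [0, 255, 0, 255],
    [255, 0, 255, 0],
]

def downsample(img):
    """Average each 2x2 block into one pixel (4x4 -> 2x2)."""
    return [
        [(img[2 * r][2 * c] + img[2 * r][2 * c + 1]
          + img[2 * r + 1][2 * c] + img[2 * r + 1][2 * c + 1]) // 4
         for c in range(len(img[0]) // 2)]
        for r in range(len(img) // 2)
    ]

def enhance(img):
    """'Zoom and enhance': nearest-neighbour upscaling (2x2 -> 4x4).
    It makes the picture bigger, not sharper."""
    out = []
    for row in img:
        stretched = [p for p in row for _ in (0, 1)]
        out.append(stretched)
        out.append(list(stretched))
    return out

low_res = downsample(original)   # every 2x2 block averages to a flat 127
restored = enhance(low_res)      # still flat: the checkerboard is gone
print(low_res)                   # [[127, 127], [127, 127]]
print(restored == original)      # False
```

The checkerboard detail is destroyed by the averaging step, so the ‘enhanced’ version is just a bigger grey square, exactly why the movie trick can’t work.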

4. Independence Day (1996)

Writing a deadly virus that destroys an unknown, alien operating system. Hmm...

One of the classic techno-goofs in movies occurs in 1996’s alien invasion film Independence Day. Jeff Goldblum plays a brilliant scientist who writes a deadly computer virus in just a few hours (itself pretty much impossible), before uploading it to the hostile beasties’ mothership. The result? The invaders are destroyed and the Earth is saved. Hooray!

The problem? Computer viruses are written for specific operating systems, so Windows viruses are written for machines using Windows, viruses that affect Macs are written specifically for them and so on.

Unless Jeff Goldblum’s character had a good, intimate knowledge of the aliens’ computer systems well in advance, it would be nigh on impossible to write a virus that would corrupt a completely unknown, totally alien operating system. Epic tech-fail!
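The mismatch can be sketched in a few lines (the function and platform names here are purely illustrative, not real malware behaviour): a virus built for one system simply does nothing anywhere else.

```python
def virus_can_run(virus_target, host_os):
    """A virus is written against one platform's system calls, file
    formats and binaries; on any other platform it can't execute.
    (Both parameters are hypothetical labels, for illustration only.)"""
    return virus_target == host_os

print(virus_can_run("windows", "windows"))              # True
print(virus_can_run("windows", "macos"))                # False
print(virus_can_run("windows", "alien mothership OS"))  # False
```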

5. 2001: A Space Odyssey (1968)

It may look beautiful, but in reality it just wouldn't happen.

Stanley Kubrick’s 2001: A Space Odyssey (1968), Ridley Scott’s Alien (1979) and Steven Spielberg’s Jurassic Park (1993) all feature computer monitors so bright that you can almost read what’s on them projected onto the face of whoever’s using the computer. It’s a visually stunning idea, so you can see why directors would like it.

But in reality, a screen so bright would firstly be so dazzling it would be very difficult to use, and secondly wouldn’t project words or images sharply enough for anyone watching to be able to read them.

Have you spotted any more tech fails in films? Tweet @BBCBitesize with the hashtag #TechFails and tell us about the worst ones you’ve seen on the big screen.
