Five movies that got technology wrong
Given that we have our phones glued to our hands almost 24/7, and computer screens dominate every school, workplace and home, you’d think that in the 21st century most people would be fairly tech-literate.
But that hasn’t stopped Hollywood sometimes getting things hopelessly wrong. Here are five tech mistakes from silver screen classics.
1. Swordfish (2001)
Hacking in the movies always looks incredibly exciting. A good example is tech-thriller Swordfish in which Hugh Jackman plays a white-hat (good) hacker trying to crack a terrorist network.
Sadly, in real life hacking just looks like someone typing endless lines of code into a computer. Or, more likely, the computer just running on its own, trying to find a way into whatever the hacker wants to access.
It hardly makes for great cinema, however, so it’s no wonder that movies like Swordfish feature frenzied keyboard-bashing, ticking timers, and file systems represented on screen like video games.
2. Jurassic Park (1993)
We’ve all seen films where the hacker works feverishly to gain access to a network only to be met with an ominous 'ACCESS DENIED' screen. More often than not, this will be in HUGE letters on a scary red background. Then, after a bit of Olympics-worthy typing... success, and an equally prominent 'ACCESS GRANTED' screen. (1993’s Jurassic Park contains one of the earliest examples.)
But when did you last get a message like that? Never, that’s when. A friendly 'Have you forgotten your password?' prompt is more likely, without any red-screened drama.
3. Blade Runner (1982)
This common technology error started with Ridley Scott’s classic science fiction movie Blade Runner in 1982, but now turns up incredibly frequently. The main character stares intently at a blurry picture, often a suspect’s face or a piece of unreadable text. “Enhance the image!” they demand, and a pin-sharp picture is revealed.
In reality, digital pictures are made of tiny dots of colour called pixels, and low-resolution images have far fewer pixels than high-resolution ones. If the image is low-resolution, there’s no way of ‘adding pixels’ that weren’t captured by the camera to make it a high-resolution one. While imaging software has improved massively in recent years, the ability to zoom and enhance as we see in Blade Runner and other films is still not possible: that blurry face is going to remain blurry.
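To see why ‘enhancing’ can’t conjure up new detail, here’s a minimal sketch in Python (the tiny 2x2 ‘image’ and the function name are made up for illustration). Upscaling only copies or blends the pixels that are already there; it never recovers information the camera didn’t capture.

```python
# A toy 2x2 greyscale "image": each number is one pixel's brightness.
low_res = [
    [10, 200],
    [200, 10],
]

def upscale_nearest(image, factor):
    """Enlarge an image by repeating each pixel `factor` times
    horizontally and each row `factor` times vertically
    (nearest-neighbour upscaling)."""
    result = []
    for row in image:
        stretched = [pixel for pixel in row for _ in range(factor)]
        for _ in range(factor):
            result.append(list(stretched))
    return result

big = upscale_nearest(low_res, 2)
# The 4x4 result contains exactly the same four brightness values as
# the original - every "new" pixel is a copy, not recovered detail.
for row in big:
    print(row)
```

Fancier resampling filters (or modern AI upscalers) produce smoother or more plausible-looking results, but they are still guessing: the missing pixels are interpolated or invented, not restored.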
4. Independence Day (1996)
One of the classic techno-goofs in movies occurs in 1996’s alien invasion film Independence Day. Jeff Goldblum plays a brilliant scientist who writes a deadly computer virus in just a few hours (itself pretty much impossible), before uploading it to the hostile beasties’ mothership. The result? The invaders are destroyed and the Earth is saved. Hooray!
The problem? Computer viruses are written for specific operating systems, so Windows viruses are written for machines using Windows, viruses that affect Macs are written specifically for them and so on.
Unless Jeff Goldblum’s character had a good, intimate knowledge of the aliens’ computer systems well in advance, it would be nigh on impossible to write a virus that would corrupt a completely unknown, totally alien operating system. Epic tech-fail!
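Even perfectly ordinary software runs into this problem. Here’s a small, hypothetical Python sketch (the function name is made up for illustration): even a harmless cross-platform script has to branch on the operating system, because commands, paths and executable formats differ. A virus is far more dependent on those internals than this, so one written blind for an alien OS has no chance.

```python
import platform

def list_files_command():
    """Return the shell command that lists files on the current OS.

    Even this trivial task is OS-specific: Windows uses the `dir`
    shell built-in, while Unix-like systems (macOS, Linux) use `ls`.
    """
    if platform.system() == "Windows":
        return "dir"
    return "ls"

print(list_files_command())
```

If something as simple as listing a directory needs per-OS handling, malware - which pokes at executable formats, system calls and memory layout - is tied even more tightly to one specific platform.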
5. 2001: A Space Odyssey (1968)
Stanley Kubrick’s 2001: A Space Odyssey (1968), Ridley Scott’s Alien (1979) and Steven Spielberg’s Jurassic Park (1993) all feature computer monitors so bright that you can almost read what’s on them projected onto the face of whoever’s using the computer. It’s a visually stunning idea, so you can see why directors would like it.
But in reality, a screen so bright would firstly be so dazzling it would be very difficult to use, and secondly wouldn’t project words or images sharply enough for anyone watching to be able to read them.