I have been asked by Science & Film to review the realism of EYE IN THE SKY in terms of the new technologies we see deployed in the film. Most of the technologies employed in the film narrative have some basis in reality, though many are still at very early, proof-of-concept stages and remain far from the reliable, useful technologies depicted in the film.
EYE IN THE SKY is a contemporary military drama starring Helen Mirren and the late Alan Rickman, in his last on-screen appearance, in the respective roles of a United Kingdom colonel and general. It was written by Guy Hibbert and directed by Gavin Hood, who also directed Ender’s Game. The narrative of the film begins with an attempt to capture terrorist suspects in Kenya, but evolves into a tense drama over whether to launch a drone strike in order to avert a terror plot, and over the morally challenging questions of proportionality given the risk to civilians in the area of the strike. The accuracy of the film’s depiction of the military chain-of-command and political control over this fictional joint U.K.-U.S. drone strike in Kenya is also an interesting question of realism, but one for another review.
Of course, the armed General Atomics MQ-9 Reaper drone technologies have been around for more than a decade, and are depicted with a great deal of accuracy in the film. These drones are remotely operated by pilots and sensor operators who can be thousands of miles from where the drones are flying. In this case, a U.S. drone pilot, played by Aaron Paul, is stationed at Creech Air Force Base outside of Las Vegas. The film depicts these operators using touch screens, which were not available in the older ground control stations depicted, but are found in the recently updated ground control stations that include multiple screens and videogame-style joysticks.
Much of the drone imagery depicted in the film, the footage we see collected by drone cameras, appears realistic. The actual resolution of those advanced surveillance cameras is classified, but we know that Reaper drones carry the DARPA-funded ARGUS-IS and Gorgon Stare systems, which capture 1.8-gigapixel images at 12 frames per second. That is equal to an array of 368 five-megapixel smartphone cameras, allowing an incredible digital zoom while collecting imagery over a large geographic area.
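As a back-of-the-envelope check on those published figures (the 368-sensor count and 5-megapixel size come from public descriptions of ARGUS-IS; this is just the arithmetic, nothing about the classified systems themselves):

```python
# Sanity-check the sensor-array figures cited above.
CAMERA_MEGAPIXELS = 5   # one smartphone-class image sensor
NUM_CAMERAS = 368       # sensors in the ARGUS-IS mosaic
FRAMES_PER_SECOND = 12

total_pixels = NUM_CAMERAS * CAMERA_MEGAPIXELS * 1_000_000
print(f"Total: {total_pixels / 1e9:.2f} gigapixels")  # 1.84 gigapixels

# At 12 frames per second, the raw pixel throughput is enormous,
# which is why wide-area surveillance is as much a data problem
# as an optics problem.
pixels_per_second = total_pixels * FRAMES_PER_SECOND
print(f"Raw throughput: {pixels_per_second / 1e9:.1f} gigapixels/second")
```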
You can see in the film’s trailer several of the more futuristic technologies: a small bird-like drone; an even smaller insect-like drone; and advanced face-recognition technologies. Each of these has a basis in recent and ongoing engineering research, mostly funded by the Defense Advanced Research Projects Agency (DARPA) for the Pentagon, but none yet performs at anything like the level necessary to be used as it is in the film. DARPA is known for funding research into advanced topics that may, or may not, wind up being useful for the military or civilians, such as the ARPANET that evolved into the internet, and the DARPA Grand Challenge that led to Google’s self-driving car.
Actual research into bird-like drones includes a DARPA-funded hummingbird drone, AeroVironment’s Nano Hummingbird, which was demonstrated in 2011 as a proof-of-concept for very small drones using wing-flapping for flight control and propulsion. While this biologically-inspired mechanical drone is fascinating, it faces many practical challenges for covert missions, including carrying enough battery power to fly for more than a few minutes with its camera and other sensors aboard, and a flight noise loud enough that it would likely draw attention. DARPA, however, has continued funding work on small high-speed drones capable of obstacle avoidance, which look like the quadrotors we are now accustomed to seeing.
The film also depicts a bug-like drone that is much smaller and flies inside a house without drawing attention. There have been a variety of projects that either attempt to build biologically-inspired mechanical insect drones, or establish remote control over organic insect cyborgs.
Among the most advanced mechanical insects is Harvard’s RoboBee project. This insect-sized drone is capable of flight, and even swimming, though it cannot carry its own battery power, and remains tethered by power cables. It will also be difficult for such small drones to carry much in the way of sensors like cameras and microphones.
DARPA has also been funding research into using living insects as drones by remotely controlling their nervous systems. MIT pioneered this work with large moths back in 2012, while labs at North Carolina State continue the research. UC Berkeley had a similar project using large beetles.
The other technology that is frequently employed in the film is face recognition. We see several examples where a full frontal view of a face is matched to a mug shot or ID photo. This technology has been around for a while, and is even used by Facebook to identify your friends in your photos. The technology is far from perfect, however: a recent National Institute of Standards and Technology (NIST) study found error rates in the single digits for the best of the commercial algorithms it tested, and even that error rate is attainable only when the database contains photos taken under controlled, standards-compliant conditions. The same algorithms don’t work well on webcam images, and would not work well if the terrorist suspect you are looking for has not sat down for an ANSI/NIST ITL 1-2011 Type 10 standard photo portrait. Even more difficult is recognizing someone whose face is partially obscured by sunglasses or a veil, or is seen only in profile. Under such conditions, and without databases containing profile shots, face recognition is highly unreliable, which makes it all the more worrisome that such technologies might be used to confirm the identities of individuals on terrorist targeting lists.
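To see why image quality matters so much, consider how gallery matching works in many face-recognition systems: each face is reduced to a numeric embedding vector, and a probe image is matched against enrolled photos by similarity score against a threshold. The sketch below is purely illustrative, with made-up random embeddings standing in for the output of a trained deep network; it only demonstrates the thresholding logic, not any real system:

```python
import math
import random

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def identify(probe, gallery, threshold=0.6):
    """Return the best-matching enrolled identity, or None if no
    enrolled face scores above the decision threshold."""
    scores = {name: cosine_similarity(probe, emb) for name, emb in gallery.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

random.seed(0)

def rand_vec(n=128, scale=1.0):
    return [random.gauss(0, scale) for _ in range(n)]

# Toy gallery of enrolled "mug shot" embeddings (random for the demo).
gallery = {"suspect_a": rand_vec(), "suspect_b": rand_vec()}

# A probe taken under good, standards-compliant conditions stays close
# to its enrolled embedding; a degraded probe (profile view, occlusion)
# drifts far from it -- modeled here as small vs. large added noise.
clean_probe = [g + n for g, n in zip(gallery["suspect_a"], rand_vec(scale=0.1))]
degraded_probe = [g + n for g, n in zip(gallery["suspect_a"], rand_vec(scale=3.0))]

print(identify(clean_probe, gallery))     # confident match
print(identify(degraded_probe, gallery))  # likely falls below threshold
```

The point of the sketch is the failure mode: the degraded probe still belongs to the same person, but its similarity score drops below the threshold, which in a targeting context means either a missed match or, with a looser threshold, a false one.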
In the end, the film is compelling because of the moral questions it poses, and the brilliant acting work. A key part of the moral calculus that the characters in the film struggle with is the estimate of civilian casualties likely to result from the drone strike. We even see the colonel in charge of the drone operation, played by Helen Mirren, demand that an underling adjust the parameters to shift the likelihood of civilian deaths to a more desirable percentage in order to appease her commanders. That bit of technology is called BugSplat, and it is a real software tool that has been used since 2003 to estimate the collateral damage from Air Force bombs. BugSplat actually got its name from the shape of the probabilistic damage-area map it produces, but has been much maligned as the term has come to be used by military personnel to refer to the collateral damage victims of drone strikes.
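To make the idea of a probabilistic damage-area estimate concrete, here is a toy Monte Carlo sketch. Every number in it, the impact scatter, the effect radius, is invented for illustration; the real BugSplat models and their inputs are classified and far more sophisticated than a circle around a Gaussian:

```python
import math
import random

random.seed(42)

def p_inside_effect_radius(offset_m, impact_sigma_m=10.0,
                           effect_radius_m=15.0, trials=100_000):
    """Monte Carlo estimate of the chance that a point offset_m metres
    from the aim point falls within the effect radius of an impact that
    scatters around the aim point with the given standard deviation."""
    hits = 0
    for _ in range(trials):
        ix = random.gauss(0, impact_sigma_m)  # impact scatter, x
        iy = random.gauss(0, impact_sigma_m)  # impact scatter, y
        if math.hypot(ix - offset_m, iy) <= effect_radius_m:
            hits += 1
    return hits / trials

# Moving the assessed point farther from the aim point lowers the
# estimated risk -- exactly the parameter sensitivity the film's
# characters exploit when they shift the strike point.
for offset in (0, 20, 40):
    print(f"{offset} m from aim point: "
          f"{p_inside_effect_radius(offset):.1%} estimated risk")
```

The film’s uncomfortable scene follows directly from this structure: small changes to the assumed aim point or scatter move the output percentage, so the number can be nudged without any change in the underlying reality on the ground.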
EYE IN THE SKY is a film that is definitely worth viewing. As you watch it, keep in mind that while some of the advanced technologies depicted are not yet out in the field, many are only a few years away from being a reality.