GangWang GPS Navigation Attack Leads Unsuspecting Drivers Astray

A proof-of-concept attack that feeds realistic but fake turn-by-turn directions to in-car GPS navigation systems has managed to fool drivers into following them a full 95 percent of the time in testing.

Mobile navigation services are used by billions of people around the globe today. While GPS spoofing is a known threat, it has not been clear what the real-world danger is: Randomly shifting a user’s location can easily trigger a fake routing instruction, but human drivers will typically either quickly realize they’re being led in the wrong direction or receive instructions that contradict physical road constraints (e.g., being asked to turn left in the middle of a highway).

However, fresh experiments from a team of researchers at Virginia Tech, the University of Electronic Science and Technology of China and Microsoft showed that carefully crafting the spoofed GPS inputs (with cheap, readily available hardware) with an eye to the actual physical environment can successfully lead human drivers astray most of the time.

“Compared to spoofing a drone or a ship, there are unique challenges to manipulate the road navigation systems,” the researchers said in a paper on the work. “First, a road-navigation attack has strict geographical constraints. It is far more challenging to perform GPS spoofing attacks in real-time while coping with road maps and vehicle speed limits. In addition, human drivers are in the loop of the attack, which makes a stealthy attack necessary.”

The new attack, which the researchers dubbed “GangWang,” spoofs a route that mimics the shape of the route displayed on the map, and triggers navigation instructions only where they make physical sense (e.g., issuing the “turn right” prompt only when there is an actual right turn ahead), so as not to alert users.
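That stealth condition can be illustrated with a small sketch. The code below is purely hypothetical — the function names, angle thresholds and data are ours, not the researchers’ — and it only shows the core idea: a spoofed prompt is acceptable only if the victim’s real, physical intersection offers a matching maneuver.

```python
# Hypothetical sketch of the stealth condition: an instruction taken from the
# spoofed ("ghost") route is only safe to deliver if the victim's real road
# offers a matching maneuver. Names, thresholds and data are illustrative.

def heading_change(bearing_in: float, bearing_out: float) -> float:
    """Signed turn angle in degrees, normalized to (-180, 180]."""
    return (bearing_out - bearing_in + 180) % 360 - 180

def classify_turn(delta: float) -> str:
    """Map a turn angle to a coarse navigation prompt."""
    if abs(delta) < 30:
        return "straight"
    return "right" if delta > 0 else "left"

def instruction_is_plausible(spoofed_prompt: str,
                             real_exit_bearings: list[float],
                             real_approach_bearing: float) -> bool:
    """True if the victim's actual upcoming intersection supports the spoofed prompt."""
    real_prompts = {
        classify_turn(heading_change(real_approach_bearing, b))
        for b in real_exit_bearings
    }
    return spoofed_prompt in real_prompts

# Example: the spoofed route says "turn right", and the real intersection ahead
# has exits roughly straight ahead (10 degrees) and to the right (95 degrees).
print(instruction_is_plausible("right", [10.0, 95.0], real_approach_bearing=0.0))  # True
print(instruction_is_plausible("left",  [10.0, 95.0], real_approach_bearing=0.0))  # False
```

In the real attack this kind of check has to hold at every decision point along the spoofed route, which is what makes crafting one non-trivial.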

To test the idea, the team designed an algorithm based on 600 taxi routes in Manhattan and Boston. The code searches in real time for attack routes that match the targeted victim’s location and remain consistent with the physical road network. On average, it identified 1,547 potential attack routes per target trip for a would-be attacker to choose from.
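A drastically simplified sketch of that kind of search is shown below. This is hypothetical code, not the paper’s algorithm: the road graph, node names and prompt labels are invented, and the real system also has to account for segment lengths, speeds and the victim’s live position.

```python
# Hypothetical, heavily simplified sketch of searching a road graph for "ghost"
# routes whose turn prompts match the victim's real route.

from typing import Dict, List, Tuple

# Road graph: node -> list of (next_node, prompt_needed_to_take_this_edge).
RoadGraph = Dict[str, List[Tuple[str, str]]]

def find_ghost_routes(graph: RoadGraph,
                      prompt_sequence: List[str],
                      start: str) -> List[List[str]]:
    """Return every path from `start` whose edge prompts equal `prompt_sequence`."""
    routes: List[List[str]] = []

    def dfs(node: str, depth: int, path: List[str]) -> None:
        if depth == len(prompt_sequence):
            routes.append(path)
            return
        for nxt, prompt in graph.get(node, []):
            if prompt == prompt_sequence[depth]:
                dfs(nxt, depth + 1, path + [nxt])

    dfs(start, 0, [start])
    return routes

# Toy example: the victim's real route produces the prompts right -> left.
toy_graph: RoadGraph = {
    "A": [("B", "right"), ("C", "straight")],
    "B": [("D", "left"), ("E", "right")],
    "C": [("F", "left")],
}
print(find_ghost_routes(toy_graph, ["right", "left"], start="A"))
# [['A', 'B', 'D']] -- one candidate ghost route matching the prompt sequence
```

The point is only that many distinct paths through a dense road network can yield the same sequence of believable prompts, which is consistent with the large number of candidate routes (1,547 on average) the researchers report.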

“If the attacker aims to endanger the victim, the algorithm can successfully craft a special attack route that contains wrong-ways for 99.8 percent of the trips,” the researchers explained. “The algorithm also allows the attacker to pre-define a target destination area to lead the victim to.”

Meanwhile, they were able to build a portable spoofer for $223 using off-the-shelf hardware; its signal easily penetrates car bodies, allowing it to take control of in-car GPS navigation systems.

Civilian GPS systems are, they explained, very vulnerable to spoofing attacks. Attackers can lure a victim GPS receiver into migrating from the legitimate signal to a spoofed signal in two ways: either by transmitting false signals at high power, causing the victim receiver to lose track of the satellite feeds and lock on to the stronger spoofed signals; or by transmitting signals synchronized with the legitimate ones, then gradually overpowering the originals to cause the migration.

From there, an attacker can manipulate the victim GPS receiver by either shifting the signals’ arrival times or modifying the navigation messages. For the experiments, the researchers used the algorithm they created from the taxi traces to do the latter.
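To see why shifting arrival times alone is enough to move a receiver’s computed fix, consider the illustrative sketch below. It is not from the paper — the satellite geometry, delays and solver are simplified stand-ins, and a real receiver does far more (tracking, filtering, integrity checks) — but it shows that the position solution is a function of the measured pseudoranges, so biasing each satellite’s apparent arrival time biases the solved position. Modifying the navigation messages, the lever the researchers actually used, works by changing other inputs to the same calculation.

```python
# Illustrative sketch (not from the paper): a GPS receiver solves its position
# from pseudoranges, so an attacker who delays each satellite's apparent
# arrival time shifts the solved position. Geometry and numbers are made up.

import numpy as np

C = 299_792_458.0  # speed of light, m/s

def solve_position(sat_positions: np.ndarray, pseudoranges: np.ndarray,
                   iterations: int = 10) -> np.ndarray:
    """Gauss-Newton solve for receiver position (x, y, z) and clock bias (meters)."""
    state = np.zeros(4)  # [x, y, z, clock_bias_m]
    for _ in range(iterations):
        diffs = state[:3] - sat_positions          # (N, 3)
        ranges = np.linalg.norm(diffs, axis=1)     # (N,)
        predicted = ranges + state[3]
        H = np.hstack([diffs / ranges[:, None], np.ones((len(ranges), 1))])
        delta, *_ = np.linalg.lstsq(H, pseudoranges - predicted, rcond=None)
        state += delta
    return state

# Four made-up satellites roughly 20,000 km up, and a receiver at the origin.
sats = np.array([
    [ 20e6,   0.0, 15e6],
    [-15e6,  12e6, 18e6],
    [  5e6, -20e6, 16e6],
    [ -8e6,  -5e6, 22e6],
])
true_receiver = np.array([0.0, 0.0, 0.0])
clean = np.linalg.norm(sats - true_receiver, axis=1)

print(solve_position(sats, clean)[:3])   # ~ [0, 0, 0]: true position recovered

# Attacker delays each signal by a few hundred nanoseconds (different per
# satellite), which the receiver interprets as a different position.
delays_s = np.array([300e-9, -150e-9, 450e-9, -250e-9])
spoofed = clean + C * delays_s
print(solve_position(sats, spoofed)[:3]) # a fix displaced hundreds of meters
                                         # from the truth with this toy geometry
```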

“Our measurement shows that effective spoofing range is 40–50 meters, and the target device can consistently latch onto the false signals without losing connections,” the researchers said. “The results suggest that adversaries can either place the spoofer inside/under the target car and remotely control the spoofer, or tailgate the target car in real time to perform spoofing.”

In testing with a driving simulator, GangWang succeeded in getting drivers to follow the directions to the wrong location in 95 percent of cases. Out of 40 participants doing four rounds of driving each, only one U.S. volunteer and one Chinese participant recognized the attack.

In the former case, the driver realized there were inconsistencies: He was driving on a highway with a gas station on his right, but Google Maps showed him on a local road with no gas station nearby. He also checked the street signs and saw that the road names didn’t match.

The Chinese participant recognized the attack during the first round, tipped off by the same highway-versus-local-road inconsistency.

The ramifications of the trials are clear: In a stalking or random criminal scenario, the ability to guide someone to an out-of-the-way, isolated location could be a precursor to kidnapping or worse. There’s also a nuisance aspect – sheer malicious mischief.

Of course, this is likely to succeed only when the user is in an unfamiliar location – which, arguably, is the only time someone would be using navigation in the first place.

“The key intuition is that users are more likely to rely on GPS services when navigating in unfamiliar areas (confirmed via user study),” researchers wrote. “In addition, most navigation systems display the ‘first-person’ view, which forces users to focus on the current road and the next turn.”

The other takeaway is that such spoofed directions would be much more successful in an autonomous driving scenario.

“Misdirection…only works if the human is fully dependent on the directions of the GPS system and does not use any level of human intuition on what they are being told,” said Chris Morales, head of security analytics at Vectra, in an email to Threatpost. “This might be more of a problem with the rise of autonomous systems, but currently it feels like it would cause more of an inconvenience to a person dependent on GPS than it would be a serious threat that can cause harm. If anything, I see it as an argument as to why fully autonomous systems might be considered dangerous and it is important to keep human judgement in the decision loop.”