Thursday, June 9, 2016

Why self-driving cars are a huge risk

Because people are already subverting their navigation systems:

"People who don't want Waze routing cars through their neighborhoods are feeding it false data. It was here that Connor learned that some Waze warriors had launched concerted campaigns to fool the app. Neighbors filed false reports of blockages, sometimes with multiple users reporting the same issue to boost their credibility."

This is technically called a "denial-of-service attack via resource poisoning," and it is really, really hard to stop motivated people from poisoning an information source (see Wikipedia on any politically controversial topic). Admittedly, Waze is an app, not a self-driving car's nav system, but while the specific techniques for getting at the database are different, the attack methodology is precisely the same.
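To make the mechanics concrete, here is a minimal sketch of the attack in Python. It assumes a crowdsourced map that treats agreement among distinct users as credibility and trusts a blockage report once some confirmation threshold is met; the HazardMap class, its methods, and the threshold of 3 are all invented for illustration and are not Waze's actual logic.

from collections import defaultdict

# Assumed: a report becomes "credible" once this many distinct users agree.
CONFIRMATION_THRESHOLD = 3

class HazardMap:
    """Toy model of a crowdsourced hazard database (hypothetical, not Waze's)."""

    def __init__(self):
        # road segment -> set of user ids who reported a blockage there
        self.reports = defaultdict(set)

    def report_blockage(self, user_id, segment):
        # Each distinct user counts once; repeat reports from one account don't stack.
        self.reports[segment].add(user_id)

    def is_blocked(self, segment):
        # Agreement among distinct users is treated as credibility.
        return len(self.reports[segment]) >= CONFIRMATION_THRESHOLD

nav = HazardMap()

# One honest report is not enough to reroute traffic...
nav.report_blockage("honest_driver", "Elm St")
print(nav.is_blocked("Elm St"))   # False

# ...but a few sock-puppet accounts in the same household clear the bar,
# which is exactly the "multiple users reporting the same issue" tactic above.
for puppet in ("neighbor_alt1", "neighbor_alt2", "neighbor_alt3"):
    nav.report_blockage(puppet, "Elm St")
print(nav.is_blocked("Elm St"))   # True: routing now avoids Elm St

The defense problem is that the aggregator cannot distinguish three genuine witnesses from three fake accounts run by one motivated neighbor, which is why this kind of poisoning is so hard to stop.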
Sounds like people will be needing more independently owned and operated full-service auto repair facilities. http://goo.gl/jhsG1Q
One of the running jokes you hear about auto navigation systems is, "Why don't they add an 'avoid ghetto' feature?" I actually experienced that in Chattanooga. It seems that if enough Waze users warned about problems in those areas, it would work just as well.
No reason we can't use the feature for good.
Resource poisoning is exactly what I recommended to prevent the Obama administration from settling Section 8 housing in your neighborhood: just claim you are black on all HUD and census forms. Once your neighborhood has "enough," you are safe.