Seven Ways Technology Can Get You Arrested for Something You Didn't Do
Millions of people get arrested every year—about 7.36 million in 2022 alone. For most of us, that number seems abstract: we aren’t criminals, and we assume that if we haven’t done anything wrong, we have nothing to worry about.
But there are at least two things wrong with that assumption. First, almost everyone does something in their lives that they view as a victimless crime or simply don’t realize is illegal, so your chances of being arrested someday may be higher than you suppose. And second, law enforcement and corporations are increasingly cutting out human beings and relying on technologies like automation, facial recognition, and artificial intelligence (often in combination), and those technologies are flawed—like, really flawed. Interacting with them can lead to false accusations, and even arrest, when you’ve done absolutely nothing wrong. Here are seven ways you could be arrested today without so much as thinking about a crime.
Self-checkout
Self-checkout kiosks have been controversial, and many retailers are rethinking them, but they’re still pretty common. And if they make a mistake, you could find yourself in a lot of trouble. Consider the example of Olympic athlete Meagan Pettipiece, who bought $176 worth of groceries at a Walmart self-checkout. The kiosk missed two items: ham and asparagus. Pettipiece did scan them, but the machine failed to register them—she did nothing wrong. But the police were called, and when they searched her bag they found marijuana and prescription medication; she was arrested and charged with theft and possession of controlled substances.
The charges were later dropped, but not before Pettipiece’s life was ruined: She resigned from her coaching job and suffered damage to her reputation that will follow her forever. So next time you’re buying groceries at the self-checkout, make sure that every single thing gets scanned.
Facial recognition
Facial recognition technology is buggy and unreliable (often in very racist ways), but that doesn’t stop corporations and law enforcement agencies from using it, with unsurprisingly disastrous results. For example, Harvey Murphy Jr. was arrested when facial recognition software used by Houston retailers Macy’s and Sunglass Hut identified him as the perpetrator of an armed robbery. Murphy was in jail for two weeks, during which he was allegedly assaulted multiple times by other prisoners. But Murphy wasn’t just innocent: He wasn’t even in Texas when the robbery took place. This actually happens a lot, and it could happen to you if a facial recognition tool glitches and offers up your name for no reason whatsoever.
License plate cameras
Police departments use automated license plate readers to identify cars involved in crimes. If a car is involved in, say, a robbery or a shooting, the readers can spot its plate number and flag it, and officers can then be told to watch for the car’s make, model, and license number.
You can guess where this is going: License plate readers make mistakes. In North Carolina, for example, Jacqueline McNeill was arrested on suspicion of involvement in a shooting after an automated license plate reader mistakenly identified her car as the one involved. She was held for several hours and interrogated, then released, and she ultimately settled a lawsuit against the city for $60,000.
Incorrect databases
If you’ve ever had a run-in with the law that was resolved—a dismissed case, a settled lawsuit—you might think your nightmare is over and you can get back to your life. But case management is increasingly automated these days—and the software that handles it is as buggy and unreliable as, well, all software. In California a few years ago, a new case management system suddenly began treating old arrest warrants as current, and a spate of false arrests followed because police were acting on incorrect information. In other words, if a single piece of data in a complex database flips from a 1 to a 0, something you handled years ago could result in a fresh arrest.
Flawed photo analysis
There’s really no such thing as online privacy—your files, photos, voicemails, and messages are all stored somewhere, and someone has access to them even if they’re supposedly protected. Companies like Google, which see enormous amounts of media pass through their servers, often use automatic scanning to identify and flag material that might be illegal—but when that scanning gets it wrong, it can leave a ruined life or two in its wake.
In 2021, for example, a father took photos of his toddler and sent them to his physician for analysis. Google’s review algorithm flagged the photos and referred the man to law enforcement on suspicion of trading in child sexual abuse images. The police quickly cleared him of any wrongdoing—but Google refused to reinstate his accounts. The lesson: nothing you post, store, email, or create on an internet-connected platform is truly private, and it can easily be misinterpreted by a soulless algorithm, leading to your arrest—and possibly worse.
Field drug tests
Police often use field drug tests when they suspect someone is in possession of controlled substances—about 773,000 out of 1.5 million drug arrests in this country are based on evidence gathered via a field test. But these tests are considered “presumptive” because they aren’t reliable enough to stand on their own; their results have to be confirmed by a lab. The tests are cheap (about $2 each) and disposable, and they’re so laughably terrible at their jobs that they frequently misidentify everything from cotton candy to vitamins as drugs.
Clarice Doku was arrested in 2018 after a field drug test misidentified the folic acid she was taking in hopes of getting pregnant as ecstasy. She and her husband spent two weeks in jail, she lost her job, he missed his citizenship ceremony, and the charges were, of course, eventually dismissed.
Technology makes our lives easier. Until it makes them much, much more difficult, especially when it leads to a false arrest based on nothing but some phantom data you can’t control.