Safety Last: AI Weapons Scanners Sold To US Schools Routinely Fail To Detect Knives

from the haphazardly-thinking-of-the-children dept

We’ve done all we can, or rather, all we’re willing to do, to make schools safer. We’ve added more cops, something that sounds like safety but really just means we’ve offloaded school discipline to people trained in the art of violence. We’ve locked more doors, added more machinery, and opened our students up to all sorts of pervasive surveillance.

And yet, we still lead the world in school shootings. Maybe that’s where we’re going wrong. Maybe we need to look towards our nearest analogue, the home of the founding fathers we told to pack up and go home back in 1776. When the dust and bullets cleared, we got the Second Amendment. They got Knifecrime Island.

The biggest threat in the US is guns. Guns are easy. Guns allow people to kill without having to generate much physical effort. The UK has clamped down on guns, but still finds itself dealing with plenty of violence, mostly of the knife variety.

Let’s let the Brits train our AI. Maybe that way we can catch weapons that aren’t guns before they’re wielded against children. Whoever’s training AI to catch weapons isn’t doing the best they can to prevent weapons from being brought into schools. This report, coming to us from our former landlords at the BBC, says weapons are eluding school entry checkpoints at an alarming rate.

Evolv Technology is a security firm that wants to replace traditional metal detectors with AI weapons scanners.

Instead of simply detecting metal, Evolv says its scanner “combines powerful sensor technology with proven artificial intelligence” to detect weapons.

Cool shit, non?

Non. Non, indeed.

Evolv claims its system is “highly accurate” and utilizes AI capable of detecting weapons ranging from improvised explosives to guns to knives. But it apparently ain’t all that great when it comes to the last item on that list.

However, a BBC investigation last year revealed that testing had found the system could not reliably detect large knives – after Evolv’s scanner missed 42% of large knives in 24 walk-throughs.

That information was passed on to Evolv. BBC testers urged Evolv to inform its clients that its AI had failed to detect large knives more than 40% of the time. Its clients at that time included stadiums across the US, as well as the Manchester Arena in the UK.

It appears Evolv believes no news (delivered to its customers) is good news. Or, at the very least, if it’s not good news, then it’s definitely good business. Keeping its customers in the dark has allowed Evolv to expand its market base, despite selling a faulty product that misses one form of deadly weapon more than 40% of the time.

Despite this, the company has been expanding into schools, and now claims to be in hundreds of them across the US.

Knives have rarely been a factor in mass killings in the United States. We have always had better violence options.

But that fact shouldn’t be used to excuse a company’s unwillingness to inform current and potential customers of its shortcomings. Sure, guns are a bigger problem in the United States, but Evolv isn’t just plying its wares in the Land of the Free and home of the Mass Shooting. It’s selling apparently faulty tech to customers elsewhere in the world, where gun rights are more limited and knives have become the most efficient way to engage in mass violence.

And Evolv knows it’s falling down on the job. That knowledge isn’t deterring it from pitching its products while it attempts to find a solution. In fact, it appears Evolv isn’t looking for solutions. It’s just asking its copywriters to create more exculpatory sales pitches.

Following a high-profile stabbing at a New York school that used Evolv’s tech, Evolv began rewording the pitch pages on its website, distancing itself from its previous promises of impervious defense in favor of something much vaguer. The shift mirrors the sales pitches of cop tech purveyors, who abandoned the term “non-lethal” after deadly deployments of their products in favor of phrases that actively distance them from legal liability, like “less lethal.”

After the stabbing, the wording on Evolv’s website changed.

Up until October last year, Evolv’s homepage featured a headline that boasted of “Weapons-Free Zones”. The company then removed that wording, and changed the text to “Safe Zones”. It has now been changed again and reads “Safer Zones”.

But “safer” than what? Staying at home? Using plain old metal detectors? Belated attempts at plausible deniability?

The BBC has yet again asked Evolv to explain itself and inform its customers it’s not all that great at detecting certain weapons. And yet again, the company has refused to engage directly with the BBC and its investigative journalists. Instead of responding directly, the company has directed people to an exonerative blog post written by its CEO, Peter George, in which the company claims it won’t answer questions about its faulty tech because it doesn’t want those who pose violent threats to children to exploit these details to thwart its obviously faulty system.

But who needs to thwart anything? It’s pretty much a coin toss whether or not someone heading into a school will be caught with a knife. Why bother with the dissembling when you can just roll the dice on tech whose promise has slid from “will keep schools free of weapons” to “slightly less free of weapons” to “may catch some weapons”?

If that’s the best Evolv can do, schools would be better served by cheaper tech sold by companies that actually know how to detect weapons, rather than by a company that claims it’s doing some sort of sci-fi shit with its over-priced scanners and then directs people to non-apologies every time it’s pointed out that it has failed to deliver on its promises.

Companies: evolv


Comments on “Safety Last: AI Weapons Scanners Sold To US Schools Routinely Fail To Detect Knives”

Anathema Device (profile) says:

Shades of this nonsense:

Fake bomb detectors still being used in Baghdad

A corrupt British businessman has been convicted of fraud after a jury found him guilty of a multi-million pound scam selling bogus bomb detecting equipment around the world.

Iraq spent more than $40m (£26.2m) on 6,000 of Jim McCormick’s fake devices between 2008 and 2010.

Despite the authorities in the UK establishing the bomb detectors were useless in 2012, the BBC’s Ben Brown discovered they were still in widespread use in Baghdad while filming there in March 2013.

https://en.wikipedia.org/wiki/ADE_651
