
Stop Expecting Tech Companies To Provide ‘Consequences’ For Criminal Behavior; That’s Not Their Job

DATE POSTED: May 6, 2024

Whose job is it to provide consequences when someone breaks the law?

It seems like this issue shouldn’t be that complicated. We expect law enforcement to deal with it when someone breaks the law. Not private individuals or organizations. Because that’s vigilantism.

Yet, on the internet, over and over again, we keep seeing people set the expectation that tech companies need to provide the consequences, even when those who actually violate the law already face legal consequences.

None of this is to say that tech companies shouldn’t be focused on trying to minimize the misuse of their products. They have trust & safety teams for a reason. They know that if they don’t, they will face all sorts of reasonable backlash: advertisers and users leaving, negative media coverage, and more. But demanding that the companies face legal consequences, while ignoring the legal consequences already facing the actual users who violated the law… is weird.

For years, one of the cases that we kept hearing about as an example of why Section 230 was bad and needed to be done away with was Herrick v. Grindr. In that case, a person who was stalked and harassed sued Grindr for supposedly enabling such harassment and stalking.

What’s left out of the discussion is that the guy who stalked Herrick was arrested and ended up pleading guilty to criminal contempt, identity theft, falsely reporting an incident, and stalking. He was then sentenced to over a year in prison. Indeed, it appears he was arrested a few weeks before the lawsuit was filed against Grindr.

So, someone broke the law and faced the legal consequences. Yet some people are still much more focused on blaming the tech companies for not somehow “dealing” with these situations. Hell, much of the story around the Herrick case was about how there were no other remedies that he could find, even as the person who wronged him was, for good reason, in prison.

We’re now seeing a similar sort of thing with a new case you might have heard about recently. A few weeks ago, a high school athletic director, Dazhon Darien, was arrested in Baltimore after using AI tools to mimic the voice of Pikesville High School’s principal, Eric Eiswert. Now Darien may need to use his AI tools to conjure up a lawyer.

A Maryland high school athletic director is facing criminal charges after police say he used artificial intelligence to duplicate the voice of Pikesville High School Principal Eric Eiswert, leading the community to believe Eiswert said racist and antisemitic things about teachers and students.

“We now have conclusive evidence that the recording was not authentic,” Baltimore County Police Chief Robert McCullough told reporters during a news conference Thursday. “It’s been determined the recording was generated through the use of artificial intelligence technology.”

Dazhon Darien, 31, was arrested Thursday on charges of stalking, theft, disruption of school operations and retaliation against a witness after a monthslong investigation from the Baltimore County Police Department.

This received plenty of attention as an example of the kind of thing people worry about regarding “deepfakes”: someone being accused of something they didn’t do because another person faked the proof with AI tools.

However, every time this comes up, the person responsible seems to get caught. And, in this case, he has been arrested and could face some pretty serious consequences, including prison time and a conviction on his record.

And yet, in that very same article, NPR quotes professor Hany Farid complaining about the lack of consequences.

After following this story, Farid is left with the question: “What is going to be the consequence of this?”

[….]

Farid said there remains, generally, a lackluster response from regulators reluctant to put checks and balances on tech companies that develop these tools or to establish laws that properly punish wrongdoers and protect people.

“I don’t understand at what point we’re going to wake up as a country and say, like, why are we allowing this? Where are our regulators?”

I guess “getting arrested and facing a prison sentence” doesn’t count as a consequence? Sure, maybe it doesn’t have the same ring to it as “big tech bad!!” but, really, how could anyone say with a straight face that there are no consequences here? How could anyone in the media print that claim without noting that those consequences are the focus of the very story being reported?

This behavior already breaks the law and is a criminal matter, and we let law enforcement handle those. If there really were no consequences, and we as a society were simply allowing this, Darien would not have been arrested and would not be facing trial next month.

I understand that there’s anger from some corners that this happened in the first place, but this is the nature of society. Some people break the law, and we treat them accordingly. Wishing to live in a world in which no one could ever break the law, or in which companies were somehow magically responsible for guaranteeing that no one would ever misuse their products, is not a good outcome. It would lead to a horrific mess of mostly useless tools, ruined by the small group of people who might misuse them.

We have a system to deal with criminals. We can use it. We shouldn’t be deputizing tech companies, which are problematic enough already, to take on Minority Report-style “pre-crime” policing as well.

I understand that this is kinda Farid’s thing. Last year we highlighted him blaming Apple for CSAM online. Farid constantly wants to blame tech for the fact that some people will misuse the tech. And, I guess that gets him quoted in the media, but it’s pretty silly and disconnected from reality.

Yes, tech companies can put some safeguards in place, but people will always find ways around them. If we’re talking about criminal behavior, the way to deal with it is through the criminal justice system, not by magically making tech companies go into excessive surveillance mode to make sure no one is ever bad.