Tuesday, March 5, 2019

Tech, Personal Information, and the Justice System

Social media has become an integral part of our lives.
Electronic attacks are becoming more common, and many people don't fully understand how to secure their personally identifiable information (PII). To complicate matters, hackers are becoming more sophisticated.
It's for this reason that people should be more aware of what kinds of information they put online.
Social media is one of the most common places where people feel the need to share what's going on in their lives, often without regard for the kinds of sensitive information they may be broadcasting to the world.
If you haven’t seen the “Mayhem” commercial from Allstate Insurance, it’s kind of like this…

Comical as it is, this commercial underscores an important issue with how people are using technology and the dangers it presents for unsuspecting victims of cybercrime.

In the article below from FindLaw, William Vogeler urges people to understand that the problem of sharing personal information is a real issue. Vogeler offers some advice to help keep your information as safe as possible as our lives become more integrated with technology:
If we learned anything from the Terminator movies, it's that technology can be really scary. Not only did Arnold Schwarzenegger come back, he became governor. In the real world, it's even scarier. The Internet of Things means that everything with an electronic pulse can be hacked: your security system, your thermostat, your refrigerator! Not that anybody really wants to know what's in your refrigerator, but the experts are right about cyber-insecurity. Here's what the smart ones at Harvard Business Review say to do about the inevitable cyber-invasion:
– Selectively digitize data
– Back up critical systems
– Don’t use devices that can’t be updated
– Make software companies accountable
These are all great points and seem simple enough to follow, but are they enough to keep your information safe?
Massive data breaches seem to happen more frequently, and consumers are the ones taking the fallout as their personal information becomes compromised.
For example, Apple announced that it was pulling Facebook's enterprise developer certificate over questionable use of consumer data.

Apple’s Empty Grandstanding About Privacy

The speech is worth revisiting in light of an emerging fight between Apple and Facebook. Earlier this week, TechCrunch reported that Facebook had been paying people, including teens 13 to 17 years old, to install a “research” app that extracted huge volumes of personal data from their iPhones—direct messages, photos, emails, and more. Facebook uses this information partly to improve its data profiles for advertisement, but also as a business-intelligence tool to help paint a picture of competitor behavior.
After the story broke, Facebook said it would shut down the iOS version of the program. That wasn’t enough for Apple, which canceled Facebook’s ability to distribute custom iPhone apps for internal use by Facebook employees. That might look like a severe punishment that will send a strong message to Facebook, and to other companies. But it’s mostly a slap on the wrist. It gives Apple moral cover while doing little to change the data economy the company claims to oppose.
Though this was done publicly, it seems to have been more of a show than real action against Facebook over privacy; Apple's response looks like a temporary "scolding." It may be cynical to think so, but things could very easily go back to the way they were, with personal privacy in the crosshairs of big data.
Beyond people’s information being leaked, there’s an even bigger issue when it comes to criminal defense.
What happens when we start relying on tech to the point that we view it as infallible? What happens to the justice system?
William Vogeler of FindLaw makes an interesting assertion, suggesting that technology could even be compromised, resulting in wrongful convictions. Here's an excerpt from the article:
TechDirt points out the fly in the ointment of machine-learning. The algorithms use statistics to find patterns in data.
“So if you feed it historical crime data, it will pick out the patterns associated with crime,” TechDirt says. “But those patterns are statistical correlations — nowhere near the same as causations.”
It’s the old adage: garbage in, garbage out. But that should be applied to machines, not people.
We still need cops and judges to do their jobs, even with human errors.
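To make the "garbage in, garbage out" point concrete, here is a minimal sketch in Python (using numpy and scikit-learn, with entirely made-up synthetic data; the neighborhoods, rates, and model choice are hypothetical illustrations, not anything from the article):

# Illustrative sketch only: synthetic data, hypothetical setup.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# The true underlying crime rate is the same (5%) in both neighborhoods.
neighborhood = rng.integers(0, 2, size=n)   # 0 = neighborhood A, 1 = neighborhood B
true_crime = rng.random(n) < 0.05

# But neighborhood B is patrolled far more heavily, so offenses there are
# recorded 80% of the time versus only 20% of the time in neighborhood A.
detection_rate = np.where(neighborhood == 1, 0.8, 0.2)
recorded_crime = true_crime & (rng.random(n) < detection_rate)

# Train a simple model on the biased historical records.
model = LogisticRegression().fit(neighborhood.reshape(-1, 1), recorded_crime)

# The model concludes that neighborhood B is several times "riskier,"
# even though the real crime rates are identical -- correlation, not causation.
print(model.predict_proba([[0], [1]])[:, 1])

In this toy example the underlying crime rate is identical in both neighborhoods; the model only "learns" that one is riskier because it was patrolled more heavily, which is exactly the kind of statistical correlation, rather than causation, that TechDirt is warning about.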
Relying on machine learning is truly a double-edged sword. On one hand, it could help law enforcement catch criminals more effectively. On the other hand, do we really want to live in a police state?
How many personal freedoms are we willing to give up in order to rest easy thinking we’re safe and secure?

Additional resources:


via Marvin Rodriguez Blog https://marvintrodriguez.wordpress.com/2019/01/31/technology-and-justice/
