The Lede, Tuesday, March 13, 2018
By David Royse
Let’s Face a Coming Issue
Two stories in recent days lead me to giving you an update here on surveillance and facial recognition.
The first involves the increasing interest (including some pretty positive coverage in this newsletter) in the idea of using “smart technology” to improve how our cities work. This basically just means connecting our devices electronically so they don’t work in a vacuum. Street-light repair crews know when streetlights go out because the lights are connected to the internet of things. Firefighters know when there’s smoke even before someone can hit an alarm, because the smoke detector is connected to the internet of things. Police know when shots are fired even if no one calls them, because shot-detecting audio devices are connected to the internet of things.
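To make the “connected devices” idea concrete, here’s a minimal sketch of the publish/subscribe pattern these systems are typically built on, where a sensor announces an event and anyone listening reacts automatically. Everything here (the EventBus class, the topic name, the location) is illustrative, not a real smart-city API:

```python
# Toy stand-in for an IoT message network: a sensor publishes an event,
# and any subscribed service reacts without a human in the loop.
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal in-process pub/sub bus (a real system might use MQTT)."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(message)

bus = EventBus()
dispatched = []

# The fire department "listens" for smoke events...
bus.subscribe("smoke", lambda msg: dispatched.append(msg["location"]))

# ...so a connected detector can raise the alarm before anyone calls it in.
bus.publish("smoke", {"location": "4th and Main", "ppm": 310})
print(dispatched)  # ['4th and Main']
```

The same pattern covers the streetlight and gunshot examples: swap the topic and the subscriber.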
But when LED lights began showing up and some communities began putting cameras in them, some civil rights activists worried about the proliferation of cameras and called for oversight of their use.
“I think rather than call them smart bulbs in smart cities I’d call them surveillance bulbs in surveillance cities,” Chad Marlow, advocacy and policy counsel for the ACLU, said in the above-linked City Lab story from last week.
But it seems to me the question of photo and video surveillance has long been answered. Horse out of barn. We’re being watched and photographed pretty widely.
From airports to city streets to schools and highways, images of us are being captured more than you might think. It’s not just government – private entities, from Google to 7-Eleven to Facebook (in that case because we give it to them), have pictures of us. And again – the interconnectivity of our networks makes all of that potentially one big network.
“If you’re in public, you do not have an expectation of privacy,” New Orleans Mayor Mitch Landrieu says in a recent article in The Intercept. “I think that’s just the new day in age that we’re in, and people should conduct themselves accordingly.”
If we don’t have the ability to put the toothpaste back in the tube, or we don’t want to because of the advantages of widespread photographic surveillance in public spaces, arguments now have to turn to what’s next: what can be done with those photographs and videos, and how they may be used.
One ongoing step in the debate – also not totally new, but still not completely ubiquitous – is how authorities may use facial recognition software. That technology – linked with camera networks, and operating close to real time in some cases – is here.
Obviously, it’s a technical question – how reliable is it? – and also, a civil rights debate. Do people have some right to not be recognized, and if so, what are the details?
One very small part of this question of how the data should be used is in the news, because a guy convicted of selling drugs in Jacksonville, Fla., is challenging that city’s police department’s use of a facial recognition database to identify him as the man who sold drugs to undercover officers. He was convicted as a result of the identification and sentenced to eight years in prison.
The man, Willie Allen Lynch, isn’t asking an appellate court to invalidate the use of the statewide database his photo was matched against, but is challenging his conviction based on how its results were used in his case. (The Florida system, used by several agencies, is called FACES. The database matches query photographs against photos of 22 million people from Florida driver’s licenses and more than 11 million law enforcement photographs.)
When police sent cell phone photos of Lynch to a Jacksonville crime lab technician, she ran them through the database and got back some hits for possible matches. She then forwarded one match to the officers, who said that was their guy.
Lynch wanted to see the other matches as a way of showing jurors that it could have been a case of mistaken facial recognition. He wasn’t allowed to. He also didn’t get to have the crime lab technician – or anyone else – testify about how the facial recognition software even works to find matches and how reliable those matches actually might be.
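To make concrete why those other matches mattered, here’s a toy sketch (not the FACES system’s actual method – every name, vector, and number below is invented) of how face-matching searches generally work: photos are reduced to feature vectors, a query is scored against the gallery by similarity, and the system returns a ranked list of candidates, not a single certain identification.

```python
# Hypothetical face-matching search: score a query vector against a gallery
# and return the top-k candidates. Illustrative only; real systems use
# learned embeddings from neural networks, not hand-picked 3-D vectors.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Made-up gallery: enrolled photo ID -> face feature vector.
gallery = {
    "license_001": [0.9, 0.1, 0.3],
    "license_002": [0.2, 0.8, 0.5],
    "mugshot_417": [0.88, 0.15, 0.33],
}

def top_matches(query, gallery, k=2):
    """Return the k most similar gallery entries as (score, id) pairs."""
    scored = [(cosine_similarity(query, vec), name)
              for name, vec in gallery.items()]
    scored.sort(reverse=True)
    return scored[:k]

query = [0.87, 0.12, 0.31]  # feature vector from the cell phone photo
for score, name in top_matches(query, gallery):
    print(f"{name}: {score:.3f}")
```

Note that two of the invented gallery entries score nearly identically against the query – which is exactly the situation a defendant would want a jury to see.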
The case, which is pending before an appeals court in Tallahassee, brings up an interesting question that law enforcement and the law will have to grapple with: when a machine says you are someone in a photo, how much trust should we place in that machine’s judgment? (More on the case from the T-U)
So how widespread is the use of this kind of technology?
About half of us are in a law enforcement face recognition network, according to The Georgetown Law Center on Privacy and Technology. More than 25 states allow police to search driver’s license databases, as Florida does, to compare license images to suspect photos.
Meanwhile, questions arise about whether systems may be used to target certain communities. Black people in New Orleans are certainly worried about the system there. “Darker skin has less color contrast. And these algorithms rely on being able to pick out little patterns and color to be able to tell people apart,” computer scientist Jonathan Frankle, who was one of the authors of the Georgetown report linked above, notes in this NPR story.
It’s not just dark-complexioned people who need to worry about this technology; all kinds of things can make it less reliable, and research has shown that women and young people also can be easily misidentified. At least some of the research, though, indicates that the problem lies less with the technology itself and more with the training on how to use it.
Even the Georgetown authors aren’t saying that the technology shouldn’t be used — only that lawmakers need to create standards.
As we move forward, we will have to face (pun intended) the very real need to make sure that if we use these recognition programs, they’re closely watched for inaccuracies.
So it may soon be that the only place we can have some privacy is in our homes (right, Alexa?).
But more than a billion people in the world don’t have an adequate home.
On that note, a company at SXSW has proposed a possible solution.
Austin-based construction startup ICON and nonprofit housing organization New Story say they can 3D print small homes in 24 hours at half the cost. That could, truly, revolutionize affordable home building in the places with the worst needs for basic shelter.
They’ve actually already built (printed) one in Austin, and are planning to start soon on a whole community in El Salvador.
Check this out. Let’s hope it works.
NOTES FROM THE AGE OF DISRUPTION:
Apple sets June 4 for new iPhone and Mac software debut. CNBC
BJ’s Wholesale and Instacart offer same-day delivery. CNBC
Wants to make space travel cheaper. And eats an iguana. Bloomberg
Plans to Start Carsharing Program, Dubbed Airbnb for Cars. Bloomberg TV
As always, I welcome your thoughts. @daveroyse on Twitter or firstname.lastname@example.org
I’ll let Bob Dylan take us out with his song about facial recognition technology.