We are doing 'big brother' to ourselves and on our own devices
First Amendment threats and defenses have, for much of the past 100 years, largely focused on protecting individual speech — the rights of any one of us to express ourselves without interference or punishment by the government.
Not to be too glib but, oh, those were the days! That wistfulness is due, in no small part, to the degree to which individual speech and press rights triumphed in that era. But looking into this new year, that situation — and those victories — may be more nostalgia than norm. There is increasing danger to our core freedoms from what I'll call "systemic" challenges, which often appear focused on other issues, but which carry a First Amendment impact, if not wallop.
The increasing public and commercial use of drones raises issues of noise, public safety and airspace congestion — but also questions about what on-board cameras see and record that go far beyond earlier "peeping Tom" worries.
Consider a new network of drones crisscrossing the skies over your hometown, constantly sending video of the passing scene to the insatiable maw of computer storage. Combine that record with facial recognition software, vehicle tracking devices and surveillance cameras that can ID license plates from miles away, and it's but a small step to government discovery of who we meet, where and when, with resulting impact on the right of assembly or association.
There's a running joke in national security and spy circles, in this country and elsewhere, that we're now doing most of the surveillance work they used to do simply by living our lives on social media. Add the abilities of artificial intelligence to collect, collate and match social media and online data about any one of us, and the kind of "anonymous" speech that produced the Federalist Papers is all but nonexistent.
Put another way, George Orwell's draconian "Big Brother" presence was predicated on government installing a device in every home — and life — to observe each of us. In 2019, we're the ones installing the devices. Not just at home, but 24/7 in pockets and purses through smartphones, watches and the like.
In 2018, in two decisions involving GPS and cell phones, the U.S. Supreme Court pushed back on this new technological threat.
Chief Justice John Roberts said that cell phone location information is a "near perfect" tool for government surveillance, analogous to an electronic monitoring ankle bracelet. "The time-stamped data provides an intimate window into a person's life, revealing not only his particular movements, but through them his 'familial, political, professional, religious and sexual associations,'" Roberts wrote.
Try being a reporter, under such involuntary transparency in the future, attempting to meet secretly with a source about government corruption or official misconduct or a botched criminal investigation or an undisclosed, invasive national security policy. Good luck.
Let's round up this Pandora's box assembly of threats with a look at the 2020 election cycle. Not only will legitimate reports by a free press be mixed in with mis- and disinformation, but a new technological threat challenges the adage that "seeing is believing."
What's included in "involuntary synthetic imagery" (a mouthful of a title) is the sinister possibility of videos that take real situations and seamlessly "paste" faces of politicians and others onto actual participants. Imagine misleading or embarrassing video that's nearly impossible for most to distinguish from the real thing. Tragically, such fakery already has invaded our lives thanks to what's known as "deepfake" porn.
How do we square such "deepfake" videos with First Amendment law, which — except when such fake video clearly is being used for extortion or blackmail — would tend to side with free expression and with those who create such works? When would satire cross the line into defamation or intentional infliction of emotional distress — two traditional, but often expensive and time-consuming, legal tools available to those who claim injury from such fakery?
And what of news consumers, already besieged by fakery on social media, claims of bias in news reporting by outlets old and new, and photo and video edits that distort, who already deeply distrust much of what they see, hear and read?
Despite all this, the look into 2019 is not entirely glum. News consumers have more tools to identify misleading items. The fact-checking industry can be paired with "trust" projects and background programs — such as (self-promotion alert!) the Freedom Forum Institute's "Newstrition" tool.
More of us appear concerned about our First Amendment rights than at any time in the past 25 years. Let's keep that concern and attention going and growing in the new year.