  • I Ran the DHS Intelligence Unit. Its Reports on Journalists Are Concerning.

    The intelligence arm of the Department of Homeland Security, known as the Office of Intelligence and Analysis (DHS I&A), has been the subject of extensive criticism recently, first for questionable intelligence support to law enforcement in Portland, Oregon, and then for its deeply problematic intelligence reports naming U.S. journalists who were reporting on I&A’s own actions. Gen. Francis X. Taylor (USAF, ret.), who served as under secretary for intelligence and analysis at the Department of Homeland Security from 2014 to January 2017, writes that the investigation of the mistakes DHS I&A made in Portland and in reporting on journalists “should focus not only on personnel on the ground, but—more importantly—on those who demanded that the intelligence agency depart from its guidelines,” and he adds that “it is important to distinguish between the danger of I&A acting beyond its authority and the value that the office can provide when it works well.”

  • How the DHS Intelligence Unit Sidelined the Watchdogs

    Several months ago, the leadership of the Office of Intelligence and Analysis asked DHS’s second-in-command, Ken Cuccinelli, to curtail a department watchdog’s regular review of the intelligence products the office produces and distributes. Cuccinelli signed off on the move, according to two sources familiar with the situation, which constrained the role of the department’s Office for Civil Rights and Civil Liberties in approving the intelligence office’s work. Benjamin Wittes writes that “It is no wonder, under these circumstances, that there has been a rash of cases in which the office [DHS I&A] seems to have collected and disseminated ‘intelligence’ on absurd subjects (including but not limited to me).”

  • What if J. Edgar Hoover Had Been a Moron?

    Benjamin Wittes, founder and co-editor of Lawfare, writes that it was on the ninth day of the Trump presidency, writing in response to the new president’s travel ban executive order, that he coined the phrase “malevolence tempered by incompetence.” But he never imagined in doing so that the phrase might aptly describe the Trump administration’s behavior toward him personally. In his detailed article, Wittes looks both at the incompetence, “which is simple and easy to understand and genuinely amusing,” and at the malevolence beneath it, “which is more complicated and is not amusing at all.”

  • DHS Authorizes Domestic Surveillance to Protect Statues and Monuments

    You might not imagine that the U.S. intelligence community would have much stake in local protests over monuments and statues, Steve Vladeck and Benjamin Wittes write, but you’d be wrong. An unclassified DHS memo, provided to Lawfare, makes clear that the authorized intelligence activity by DHS personnel covers significantly more than protecting federal personnel or facilities. It appears also to extend to planned vandalism of Confederate (and other historical) monuments and statues, whether federally owned or not. “[W]e do not accept that graffiti and vandalism are remotely comparable threats to the homeland [as attacks on federal buildings] — or that they justify this kind of federal response even if, in the right circumstances, such activity would technically constitute a federal crime,” Vladeck and Wittes conclude.

  • EFF Launches Searchable Database of Police Use of Surveillance Technologies

    The Electronic Frontier Foundation (EFF), in partnership with the Reynolds School of Journalism at the University of Nevada, Reno, recently launched what the EFF describes as “the largest-ever collection of searchable data on police use of surveillance technologies,” created as a tool for the public to learn about facial recognition, drones, license plate readers, and other devices law enforcement agencies are acquiring to spy on our communities.

  • Large-Scale Facial Recognition Is Incompatible with a Free Society

    In the U.S., tireless opposition to state use of facial recognition algorithms has recently won some victories. Outside the U.S., however, the tide is heading in the other direction. To decide whether to expand or limit the use of facial recognition technology, nations will need to answer fundamental questions about the kind of people, and the kind of society, they want to be. Face surveillance is based on morally compromised research, violates our rights, is harmful, and exacerbates structural injustice, both when it works and when it fails. Its adoption harms individuals and makes our society as a whole more unjust and less free. A moratorium on its use is the least we should demand.

  • Privacy Risks of Home Security Cameras

    Researchers have used data from a major home Internet Protocol (IP) security camera provider to evaluate potential privacy risks for users. The researchers found that the traffic generated by the cameras could be monitored by attackers and used to infer when a house is occupied and when it is not.
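    The underlying risk is a form of traffic analysis: motion-triggered cameras upload far more data when they detect activity, so even an observer who sees only encrypted traffic volumes can guess when a home is occupied. The sketch below is a hypothetical illustration, not code from the study; the window size, threshold, and synthetic data are assumptions. It shows how simply thresholding per-minute upload volume could expose that occupancy signal.

```python
# Hypothetical sketch of rate-based traffic analysis (not the study's code).
# Assumptions: one-minute windows, a fixed byte threshold, and synthetic
# (timestamp, bytes_sent) records standing in for observed camera uploads.

from collections import defaultdict

WINDOW_SECONDS = 60                # group traffic into one-minute windows (assumption)
ACTIVE_THRESHOLD_BYTES = 500_000   # uploads above this suggest motion-triggered streaming (assumption)


def bytes_per_window(records):
    """Sum upload bytes into fixed-size time windows.

    records: iterable of (unix_timestamp, bytes_sent) pairs; only sizes and
    times are needed, never payloads.
    """
    totals = defaultdict(int)
    for ts, size in records:
        totals[int(ts) // WINDOW_SECONDS] += size
    return totals


def likely_active_windows(records):
    """Return window start times whose upload volume exceeds the threshold."""
    return sorted(
        window * WINDOW_SECONDS
        for window, total in bytes_per_window(records).items()
        if total > ACTIVE_THRESHOLD_BYTES
    )


if __name__ == "__main__":
    # Synthetic hour of traffic: low keep-alive chatter throughout,
    # plus a one-minute upload burst starting at the 30-minute mark.
    keep_alive = [(t, 2_000) for t in range(0, 3600, 10)]
    burst = [(1800 + t, 80_000) for t in range(0, 60, 2)]
    print("Likely-active windows (seconds from start):",
          likely_active_windows(keep_alive + burst))
```

    Because only traffic volume and timing are needed for this kind of inference, encrypting the video stream by itself would not conceal the occupancy signal.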

  • Coronavirus opens door to company surveillance of workers

    Employers are rushing to use digital tracking technology to reduce virus transmission in the workplace. Mohana Ravindranath writes in Politico that privacy experts worry that businesses will start using their newfound surveillance capabilities for purposes far beyond public health. The data could be used to evaluate workers’ productivity, see which colleagues are holding meetings or even flag an employee who unexpectedly ducks out of the office during work hours.

  • How Much Control Would People Be Willing to Grant to a Personal Privacy Assistant?

    CyLab’s Jessica Colnago believes that in the future, the simple act of walking down the street is going to be a little weird. “You know how every time you enter a website, and it says: ‘We use cookies. Do you consent?’ Imagine that same thing walking down the street, but for a light pole, or a surveillance camera, or an energy sensor on a house,” Colnago says.

  • Yes, Big Brother IS Watching: Russian Schools Installing Surveillance Systems Called “Orwell”

    You might think governments seeking digital oversight of their citizens would avoid invoking the author who coined the phrase “Big Brother is watching you” and implanted the nightmare of total state surveillance in the imaginations of millions of readers. Think again: Russian officials appear to disagree. In the first phase of the project, the “total surveillance” system will be installed in 43,000 schools across Russia.

  • Was the Coronavirus Outbreak an Intelligence Failure?

    As the coronavirus pandemic continues to unfold, it’s clear that having better information sooner, and acting more quickly on what was known, could have slowed the spread of the outbreak and saved lives. Initial indications are that the U.S. intelligence community did well in reporting on the virus once news of the outbreak in China became widely known by early January. Whether it could have done more before then, and why the Trump administration did not act more decisively early on, are questions a future national coronavirus commission will have to help sort out.

  • High-Tech Surveillance Amplifies Police Bias and Overreach

    Local, state and federal law enforcement organizations use an array of surveillance technologies to identify and track protesters, from facial recognition to military-grade drones. Police use of these national security-style surveillance techniques – justified as cost-effective tools that avoid human bias and error – has grown hand-in-hand with the increased militarization of law enforcement. Extensive research, including my own, has shown that these expansive and powerful surveillance capabilities have exacerbated rather than reduced bias, overreach and abuse in policing, and that they pose a growing threat to civil liberties.

  • Calls for New Federal Authority to Regulate Facial Recognition Tech

    A group of artificial intelligence experts — citing profiling, breach of privacy and surveillance as potential societal risks — recently proposed a new model for managing facial recognition technologies at the federal level. The experts propose an FDA-inspired model that categorizes these technologies by degrees of risk and would institute corresponding controls.

  • IoT: Which Devices Are Spying on You?

    When hungry consumers want to know how many calories are in a bag of chips, they can check the nutrition label on the bag. When those same consumers want to check the security and privacy practices of a new IoT device, they aren’t able to find even the most basic facts. Not yet, at least.

  • A.C.L.U. Warns Against Fever-Screening Tools for Coronavirus

    Airports, office buildings, warehouses and restaurant chains are rushing to install new safety measures like fever-scanning cameras and infrared temperature-sensing guns. But the American Civil Liberties Union warned on Tuesday against using the tools to screen people for possible coronavirus symptoms, saying the devices were often inaccurate, ineffective and intrusive. Natasha Singer writes in the New York Times that in a new report, “Temperature Screening and Civil Liberties During an Epidemic,” the A.C.L.U. said that such technologies could give people a false sense of security, potentially leading them to be less vigilant about health measures like wearing masks or social distancing. The group also cautioned that the push for widespread temperature scans during the pandemic could usher in permanent new forms of surveillance and social control.