  • It’s a Bird... It’s a Plane... It’s Superman? No: It’s a Flapping-Wing Drone

    A drone prototype that mimics the aerobatic maneuvers of one of the world’s fastest birds, the swift, is being developed by an international team of aerospace engineers in the latest example of biologically inspired flight. The 26-gram ornithopter (flapping-wing aircraft) can hover, dart, glide, brake and dive just like a swift, making it more versatile, safer and quieter than existing quadcopter drones.

  • How to Hide from a Drone – the Subtle Art of “Ghosting” in the Age of Surveillance

    Drones of all sizes are being used by environmental advocates to monitor deforestation, by conservationists to track poachers, and by journalists and activists to document large protests. But when the Department of Homeland Security redirects large, fixed-wing drones from the U.S.-Mexico border to monitor protests, and when towns experiment with using drones to test people for fevers, it’s time to think about how many eyes are in the sky and how to avoid unwanted aerial surveillance. One way that’s within reach of nearly everyone is learning how to simply disappear from view.

  • DHS Authorizes Domestic Surveillance to Protect Statues and Monuments

    You might not imagine that the U.S. intelligence community would have much stake in local protests over monuments and statues, Steve Vladeck and Benjamin Wittes write, but you’d be wrong. An unclassified DHS memo, provided to Lawfare, makes clear that the authorized intelligence activity by DHS personnel covers significantly more than protecting federal personnel or facilities. It appears to also include planned vandalism of Confederate (and other historical) monuments and statues, whether federally owned or not. “[W]e do not accept that graffiti and vandalism are remotely comparable threats to the homeland [as attacks on federal buildings] — or that they justify this kind of federal response even if, in the right circumstances, such activity would technically constitute a federal crime,” Vladeck and Wittes conclude.

  • EFF Launches Searchable Database of Police Use of Surveillance Technologies

    The Electronic Frontier Foundation (EFF), in partnership with the Reynolds School of Journalism at the University of Nevada, Reno, recently launched what the EFF describes as “the largest-ever collection of searchable data on police use of surveillance technologies.” The database was created as a tool for the public to learn about facial recognition, drones, license plate readers, and other devices law enforcement agencies are acquiring to spy on our communities.

  • Large-Scale Facial Recognition Is Incompatible with a Free Society

    In the U.S., tireless opposition to state use of facial recognition algorithms has recently won some victories. Outside the U.S., however, the tide is heading in the other direction. To decide whether to expand or limit the use of facial recognition technology, nations will need to answer fundamental questions about the kind of people, and the kind of society, they want to be. Face surveillance is based on morally compromised research, violates our rights, is harmful, and exacerbates structural injustice, both when it works and when it fails. Its adoption harms individuals and makes our society as a whole more unjust and less free. A moratorium on its use is the least we should demand.

  • Accurately Pinpointing Malicious Drone Operators

    Researchers have determined how to pinpoint the location of a drone operator who may be operating maliciously or harmfully near airports or protected airspace by analyzing the flight path of the drone.
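
    The summary does not describe the researchers’ technique in detail beyond “analyzing the flight path.” As a purely illustrative sketch of that general idea (not the study’s method), one could exploit the fact that a pilot flying within visual line of sight tends to keep maneuvers centered on their own position, so the points where the drone turns sharply give a crude hint of where the operator stands. The waypoints and heuristic below are assumptions for illustration only.

      # Toy sketch of "flight-path analysis": guess where a line-of-sight pilot
      # might be standing by averaging the waypoints where the drone turns
      # sharply. Heuristic and data are illustrative assumptions, not the
      # researchers' method.
      import math

      def turn_points(path, min_turn_deg=45.0):
          """Return waypoints where the heading changes by more than min_turn_deg."""
          turns = []
          for (x0, y0), (x1, y1), (x2, y2) in zip(path, path[1:], path[2:]):
              h1 = math.atan2(y1 - y0, x1 - x0)
              h2 = math.atan2(y2 - y1, x2 - x1)
              change = abs(math.degrees(h2 - h1))
              change = min(change, 360.0 - change)   # wrap into [0, 180]
              if change > min_turn_deg:
                  turns.append((x1, y1))
          return turns

      def rough_operator_estimate(path):
          points = turn_points(path) or path         # fall back to the whole path
          xs, ys = zip(*points)
          return (sum(xs) / len(xs), sum(ys) / len(ys))

      # Hypothetical flight path: out-and-back legs flown around a pilot near the origin.
      path = [(0, 0), (40, 10), (80, 0), (40, -15), (0, -5), (-35, 10), (0, 20)]
      print(rough_operator_estimate(path))           # crude estimate, here (22.5, 5.0)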

  • Privacy Risks of Home Security Cameras

    Researchers have used data from a major home Internet Protocol (IP) security camera provider to evaluate potential privacy risks for users. They found that the traffic generated by the cameras could be monitored by attackers and used to infer whether a house is occupied.
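
    The underlying risk is classic traffic analysis: motion-triggered cameras upload far more data when something moves in front of them, so even fully encrypted streams leak an activity pattern. A minimal sketch of the idea, assuming an attacker who sees only per-minute upstream byte counts (the threshold and numbers are hypothetical, not taken from the study):

      # Toy illustration of occupancy inference from encrypted camera traffic.
      # Assumes only per-minute upstream byte counts are observable; the
      # quiet_factor threshold and the capture are hypothetical.
      from statistics import median

      def infer_activity(bytes_per_minute, quiet_factor=5.0):
          """Flag minutes whose upload volume is far above the idle baseline."""
          baseline = median(bytes_per_minute)      # typical keep-alive traffic
          threshold = baseline * quiet_factor      # motion-triggered uploads dwarf it
          return [volume > threshold for volume in bytes_per_minute]

      # Hypothetical capture: mostly idle keep-alives with a burst of motion uploads.
      capture = [12_000] * 10 + [480_000, 510_000, 150_000] + [12_000] * 5
      flags = infer_activity(capture)
      print([minute for minute, active in enumerate(flags) if active])  # [10, 11, 12]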

  • Coronavirus Opens Door to Company Surveillance of Workers

    Employers are rushing to use digital tracking technology to reduce virus transmission in the workplace. Mohana Ravindranath writes in Politico that privacy experts worry that businesses will start using their newfound surveillance capabilities for purposes far beyond public health. The data could be used to evaluate workers’ productivity, see which colleagues are holding meetings or even flag an employee who unexpectedly ducks out of the office during work hours.

  • How Much Control Would People Be Willing to Grant to a Personal Privacy Assistant?

    CyLab’s Jessica Colnago believes that in the future, the simple act of walking down the street is going to be a little weird. “You know how every time you enter a website, and it says: ‘We use cookies. Do you consent?’ Imagine that same thing walking down the street, but for a light pole, or a surveillance camera, or an energy sensor on a house,” Colnago says.

  • Yes, Big Brother IS Watching: Russian Schools Installing Surveillance Systems Called “Orwell”

    You might think governments seeking digital oversight of their citizens would avoid invoking the author who coined the phrase “Big Brother is watching you” and implanted the nightmare of total state surveillance in the imaginations of millions of readers. Think again, because Russian officials appear to disagree. In the first phase of the project, the “total surveillance” system will be installed in 43,000 schools across Russia.

  • Was the Coronavirus Outbreak an Intelligence Failure?

    As the coronavirus pandemic continues to unfold, it’s clear that having better information sooner, and acting more quickly on what was known, could have slowed the spread of the outbreak and saved more lives. Initial indications are that the U.S. intelligence community did well in reporting on the virus once news of the outbreak in China became widely known by early January. Whether it could have done more before that time, and why the Trump administration did not act more decisively early on, are questions a future national coronavirus commission will have to help sort out.

  • High-Tech Surveillance Amplifies Police Bias and Overreach

    Local, state and federal law enforcement organizations use an array of surveillance technologies to identify and track protesters, from facial recognition to military-grade drones. Police use of these national security-style surveillance techniques – justified as cost-effective techniques that avoid human bias and error – has grown hand-in-hand with the increased militarization of law enforcement. Extensive research, including my own, has shown that these expansive and powerful surveillance capabilities have exacerbated rather than reduced bias, overreach and abuse in policing, and they pose a growing threat to civil liberties.

  • Calls for New Federal Authority to Regulate Facial Recognition Tech

    A group of artificial intelligence experts — citing profiling, breach of privacy and surveillance as potential societal risks — recently proposed a new model for managing facial recognition technologies at the federal level. The experts propose an FDA-inspired model that categorizes these technologies by degrees of risk and would institute corresponding controls.

  • IoT: Which Devices Are Spying on You?

    When hungry consumers want to know how many calories are in a bag of chips, they can check the nutrition label on the bag. When those same consumers want to check the security and privacy practices of a new IoT device, they aren’t able to find even the most basic facts. Not yet, at least.
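
    Those basic facts are exactly what a “nutrition label” for connected devices would surface. As a sketch of the kind of information such a label might carry, here is one hypothetical structure; the fields are illustrative assumptions, not a published label specification.

      # Hypothetical privacy/security "nutrition label" for an IoT device,
      # sketched as a plain data structure. Field names are illustrative
      # assumptions, not any existing standard.
      from dataclasses import dataclass

      @dataclass
      class IoTPrivacyLabel:
          device: str
          sensors: list[str]              # what the device can observe
          data_shared_with: list[str]     # third parties that receive data
          retention: str                  # how long recordings or logs are kept
          security_updates_until: str     # guaranteed patch window
          encrypted_by_default: bool

      label = IoTPrivacyLabel(
          device="Example smart doorbell",
          sensors=["camera", "microphone", "motion"],
          data_shared_with=["cloud provider", "analytics partner"],
          retention="90 days",
          security_updates_until="2026-01",
          encrypted_by_default=True,
      )
      print(label)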

  • A.C.L.U. Warns Against Fever-Screening Tools for Coronavirus

    Airports, office buildings, warehouses and restaurant chains are rushing to install new safety measures like fever-scanning cameras and infrared temperature-sensing guns. But the American Civil Liberties Union warned on Tuesday against using the tools to screen people for possible coronavirus symptoms, saying the devices were often inaccurate, ineffective and intrusive. Natasha Singer writes in the New York Times that in a new report, “Temperature Screening and Civil Liberties During an Epidemic,” the A.C.L.U. said that such technologies could give people a false sense of security, potentially leading them to be less vigilant about health measures like wearing masks or social distancing. The group also cautioned that the push for widespread temperature scans during the pandemic could usher in permanent new forms of surveillance and social control.