  • Crack Down on Genomic Surveillance

    Across the world, DNA databases that could be used for state-level surveillance are steadily growing. Yves Moreau writes that “Now the stakes are higher for two reasons. First, as technology gets cheaper, many countries might want to build massive DNA databases. Second, DNA-profiling technology can be used in conjunction with other tools for biometric identification — and alongside the analysis of many other types of personal data, including an individual’s posting behavior on social networks.”

  • Facial-Recognition Technology: Closer to Utopia Than Dystopia

    Is facial recognition technology ushering in the age of Big Brother, allowing the government to monitor what we do everywhere we do it? “This is the image that the American Civil Liberties Union, the Electronic Frontier Foundation (EFF), and a host of other alarmists are attempting to conjure in the minds of the media, elected officials, and the American public,” Robert Atkinson writes. But with the right regulations, “Americans can be safer and have more convenience with little or no reduction of our precious civil liberties.”

  • Victory: Pennsylvania Supreme Court Rules Police Can’t Force You to Tell Them Your Password

    The Pennsylvania Supreme Court issued a forceful opinion on Wednesday holding that the Fifth Amendment to the U.S. Constitution protects individuals from being forced to disclose their device passcodes to the police. The court found that disclosing a password is “testimony” protected by the Fifth Amendment’s privilege against self-incrimination.

  • Saudi “Twitter Spies” Broke No Federal Privacy Laws -- Because There Are None

    Privacy expert Mike Chapple of the University of Notre Dame says that the Saudi “Twitter Spies,” who were charged last week by the Justice Department with spying on behalf of Saudi Arabia, committed espionage — but broke no federal privacy laws because there are no such laws. He adds that Twitter failed to live up to industry-standard cybersecurity practices.

  • Why Adding Client-Side Scanning Breaks End-To-End Encryption

    Recent attacks on encryption have diverged. On the one hand, we’ve seen Attorney General William Barr call for “lawful access” to encrypted communications, using arguments that have barely changed since the 1990s. On the other, Erica Portnoy writes, we’ve seen suggestions from a different set of actors for more purportedly “reasonable” interventions, particularly the use of client-side scanning to stop the transmission of contraband files, most often child exploitation imagery (CEI).

  • The DNA Database Used to Find the Golden State Killer Is a National Security Leak Waiting to Happen

    A private DNA ancestry database that police have used to catch criminals poses a security risk: a nation-state could exploit it to steal DNA data on a million Americans, according to security researchers. Antonio Regalado writes that spies could use a crowdsourced genetic ancestry service to compromise your privacy—even if you’re not a member.

  • Why Did Microsoft Fund an Israeli Firm that Surveils West Bank Palestinians?

    Microsoft has invested in AnyVision, an Israeli startup that has developed facial recognition technology used by Israel’s military and intelligence services to surveil Palestinians throughout the West Bank, despite the tech giant’s public pledge to avoid facial recognition technology that encroaches on democratic freedoms. AnyVision’s technology lets customers identify individuals and objects in any live camera feed, such as a security camera or a smartphone, and then track targets as they move between different feeds. The Israeli surveillance project is similar to China’s surveillance of its Uighur minority population: China is using artificial intelligence and facial recognition technology for pervasive, intrusive monitoring of the Uighurs, a Muslim group living in western China.

  • AI Could Be a Disaster for Humanity. A Top Computer Scientist Thinks He Has the Solution.

    Stuart Russell is a leading AI researcher who co-authored the top textbook on the topic. He has also, for the last several years, been warning that his field has the potential to go catastrophically wrong. In a new book, Human Compatible, he explains how. AI systems, he notes, are evaluated by how good they are at achieving their objective: winning video games, writing humanlike text, solving puzzles. If they hit on a strategy that fits that objective, they will run with it, without explicit human instruction to do so.

  • Rethinking Encryption

    In the face of congressional inaction, and in light of the magnitude of the threat, it is time for governmental authorities—including law enforcement—to embrace encryption, because it is one of the few mechanisms that the United States and its allies can use to more effectively protect themselves from existential cybersecurity threats, particularly from China. This is true even though encryption will impose costs on society, especially on victims of other types of crime.

  • Will Canada Weaken Encryption with Backdoors?

    Imagine you wake up one morning and discover that the federal government is requiring everyone to keep their back doors unlocked. First responders need access to your house in an emergency, they say, and locked doors are a significant barrier to urgent care. For the good of the nation, public health concerns outweigh the risk to your privacy and security. Sounds crazy, right? Byron Holland writes that, unfortunately, a number of governments are considering a policy just like this for the internet, and there’s growing concern that the Canadian government could soon follow suit.

  • AI Could Be a Force for Positive Social Change – but We’re Currently Heading for a Darker Future

    Artificial Intelligence (AI) is already reconfiguring the world in conspicuous ways. Data drives our global digital ecosystem, and AI technologies reveal patterns in data. Smartphones, smart homes, and smart cities influence how we live and interact, and AI systems are increasingly involved in recruitment decisions, medical diagnoses, and judicial verdicts. Whether this scenario is utopian or dystopian depends on your perspective.

  • The FISA Oversight Hearing Confirmed That Things Need to Change

    Section 215, the controversial law at the heart of the NSA’s massive telephone records surveillance program, is set to expire in December. Last week the House Committee on the Judiciary held an oversight hearing to investigate how the NSA, FBI, and the rest of the intelligence community are using and interpreting Section 215 and other expiring national security authorities. “If last week’s hearing made anything clear, it’s this: there is no good reason for Congress to renew the CDR [call detail records] authority,” McKinney writes, adding: “Despite repeated requests from the members of the panel to describe some way of measuring how effective these surveillance laws are, none of the witnesses could provide a framework. Congress must be able to determine whether any of the programs have real value and if the agencies are respecting the foundational rights to privacy and civil liberties that protect Americans from government overreach.”

  • Privacy Flaw Found in E-Passports

    Researchers have discovered a flaw in the security standard for biometric e-passports that has been in use worldwide since 2004. This standard, ICAO 9303, allows e-passport readers at airports to scan the chip inside a passport and identify the holder.

  • What Data Hackers Can Get about You from Hospitals

    When hospitals are hacked, the public hears about the number of victims – but not what information the cybercriminals stole. New research uncovers the specific data leaked through hospital breaches, sounding alarm bells for nearly 170 million people.

  • Science Fiction Has Become Dystopian Fact

    So which dystopia are we living in? Most educated people have read George Orwell’s Nineteen Eighty-Four and Aldous Huxley’s Brave New World. So influential have these books been that we are inclined to view all disconcerting new phenomena as either “Orwellian” or “Huxleyan”. If you suspect we shall lose our freedom to a brutally repressive state, grinding its boot into our faces, you think of George. If you think we shall lose it to a hedonistic consumer culture, complete with test-tube designer babies, you quote Aldous. “My own belief is that the ruling oligarchy will find less arduous and wasteful ways of governing and of satisfying its lust for power,” Huxley wrote in a letter to Orwell in 1949. Niall Ferguson agrees: “As I reflect on the world in 2019, I am struck by the wisdom of [Huxley’s] words. In Xi Jinping’s China, we see Totalitarianism 2.0. The boot on the face remains a possibility, of course, but it is needed less and less as the system of social credit expands, aggregating and analyzing all the digital data that Chinese citizens generate.”