WORLD ROUNDUP
No End to the Pandemic-Origins Debate | AI “Should Be Subject to Nuclear-Level Regulation” | Chinese Labs Are Selling Fentanyl, and more

Published 24 May 2023

·  There Is No Evidence Strong Enough to End the Pandemic-Origins Debate
The recent fight over wet-market raccoon dogs underscores just how much prior beliefs can affect interpretation

·  Germans Fear Democracy Is in Danger from Extremists
Increasing concern about the spread of various forms of extremism and a growing minority of voters alienated from conventional politics

·  AI “Should Be Subject to Nuclear-Level Regulation”
AI poses such a risk to humanity that it must be subject to regulation similar to that governing nuclear power

·  Ukraine-Backed Sabotage Group Gains Ground in “Liberation” of Russia
Second day of fighting in cross-border raid as partisan troops occupy villages

·  Will Russia Use Nuclear Weapons? Putin’s Options Explained
Fears are growing that he will resort to his most deadly arsenal, but he has many reasons to hold back

·  Chinese Malware Hits Systems on Guam. Is Taiwan the Real Target?
The Chinese malware attack on Guam may be a practice run for an attack on Taiwan

·  Chinese Labs Are Selling Fentanyl Ingredients for Millions in Crypto
And it’s happening in plain sight

·  Leaked Government Document Shows Spain Wants to Ban End-to-End Encryption
For years, EU states have debated how to regulate end-to-end encrypted communication platforms

There Is No Evidence Strong Enough to End the Pandemic-Origins Debate  (Katherine J. Wu, The Atlantic)
Three and a half years since the start of a pandemic that has killed millions of people and debilitated countless more, the world is still stuck at the start of the COVID-19 crisis in one maddening way: No one can say with any certainty how, exactly, the outbreak began. Many scientists think the new virus spilled over directly from a wild animal, perhaps at a Chinese wet market; some posit that the pathogen leaked accidentally from a local laboratory in Wuhan, China, the pandemic’s likely epicenter. All of them lack the slam-dunk evidence to prove one hypothesis and rule out the rest.

Germans Fear Democracy Is in Danger from Extremists  (Oliver Moody, The Times)
Almost four out of five Germans believe that their democracy is increasingly under siege and want the state to do more to help prop it up, a survey has found.
After the exposure of an alleged far-right plot to topple the government and install a military dictatorship led by a minor aristocrat, there is much concern about the spread of various forms of extremism and a growing minority of voters who are alienated from conventional politics.
The country’s highest prosecutor has announced that police have detained another three suspected right-wing extremists linked to the alleged conspiracy, following the arrest of 25 people shortly before Christmas and raids targeting another five suspects in March.

AI “Should Be Subject to Nuclear-Level Regulation”  (Keiran Southern, The Times)
The creators of ChatGPT have suggested that artificial intelligence poses such a risk to humanity that it must be subject to regulation similar to that governing nuclear power.
OpenAI founder Sam Altman, president Greg Brockman and chief scientist Ilya Sutskever said that within a decade it is possible that AI systems will be capable of exceeding “expert skill level in most domains.”
Superintelligence, the scientists wrote in a blog post, “will be more powerful than other technologies humanity has had to contend with in the past.”
“Given the possibility of existential risk, we can’t just be reactive,” they said. “Nuclear energy is a commonly used historical example of a technology with this property; synthetic biology is another example.”