Researchers have unearthed never-before-seen wiper malware tied to the Kremlin and to an operation two years ago that took out more than 10,000 satellite modems, located mainly in Ukraine, on the eve of Russia’s invasion of its neighboring country.
AcidPour, as researchers from security firm Sentinel One have named the new malware, has stark similarities to AcidRain, a wiper discovered in March 2022 that Viasat has confirmed was used in the attack on its modems earlier that month. Wipers are malicious applications designed to destroy stored data or render devices inoperable. Viasat said AcidRain was installed on more than 10,000 Eutelsat KA-SAT modems used by the broadband provider seven days prior to the March 2022 discovery of the wiper. AcidRain was installed on the devices after attackers gained access to the company’s private network.
Sentinel One, which also discovered AcidRain, said at the time that the earlier wiper had enough technical overlaps with malware the US government attributed to the Russian government in 2018 to make it likely that AcidRain and the 2018 malware, known as VPNFilter, were closely linked to the same team of developers. In turn, Sentinel One’s report Thursday noting the similarities between AcidRain and AcidPour provides evidence that AcidPour was also created by developers working on behalf of the Kremlin.
On Thursday, the United Nations General Assembly unanimously adopted what some call the first global resolution on AI, reports Reuters. The resolution aims to foster the protection of personal data, enhance privacy policies, ensure close monitoring of AI for potential risks, and uphold human rights. It emerged from a proposal by the United States and received backing from China and 121 other countries.
Being a nonbinding agreement and thus effectively toothless, the resolution seems broadly popular in the AI industry. On X, Microsoft Vice Chair and President Brad Smith wrote, "We fully support the @UN's adoption of the comprehensive AI resolution. The consensus reached today marks a critical step towards establishing international guardrails for the ethical and sustainable development of AI, ensuring this technology serves the needs of everyone."
The resolution, titled "Seizing the opportunities of safe, secure and trustworthy artificial intelligence systems for sustainable development," resulted from three months of negotiation, and the stakeholders involved seem pleased at the level of international cooperation. "We're sailing in choppy waters with the fast-changing technology, which means that it's more important than ever to steer by the light of our values," one senior US administration official told Reuters, highlighting the significance of this "first-ever truly global consensus document on AI."
In a medical triumph, the US Food and Drug Administration on Monday approved a gene therapy that appears to trounce a rare, tragic disease that progressively steals children's ability to talk, move, and think, leading to a vegetative state and death. Those who begin to slip away in infancy often die by age 5. But with the new therapy, all 37 children in an initial trial were still alive at age 6. Most could still talk, walk on their own, and perform normally on IQ tests, outcomes unseen in untreated children. Some of the earliest children treated have now been followed for up to 12 years, and they continue to do well.
But the triumph turned bittersweet today, Wednesday, as the company behind the therapy, called Lenmeldy, set its US price at $4.25 million, making it the most expensive drug in the world. The price is $310,000 higher than what experts calculated to be the maximum fair price for the lifesaving drug; the nonprofit Institute for Clinical and Economic Review, or ICER, gave a range last October of $2.29 million to $3.94 million.
The price raises questions about whether state, federal, and private health insurance plans will be able to shoulder the costs. "Unless states have allocated appropriately for it, and looked at the drug pipeline, they may not be prepared for what could be significant cost spikes," Edwin Park, a research professor at the McCourt School of Public Health at Georgetown University, told CNN.
"Overwhelming evidence" shows that Australian computer scientist Craig Wright is not bitcoin creator Satoshi Nakamoto, a UK judge declared Thursday.
In what Wired described as a "surprise ruling" at the closing of Wright's six-week trial, Justice James Mellor abruptly ended years of speculation by saying:
Dr. Wright is not the author of the Bitcoin white paper. Dr. Wright is not the person that operated under the pseudonym Satoshi Nakamoto. Dr. Wright is not the person that created the Bitcoin system. Nor is Dr. Wright the author of the Bitcoin software.
Wright was not in the courtroom for this explosive moment, Wired reported.
Democratic lawmakers are probing SpaceX over Russia's reported use of Starlink in Ukraine, saying that recent developments raise questions about SpaceX's "compliance with US sanctions and export controls."
SpaceX CEO Elon Musk last month denied what he called "false news reports [that] claim that SpaceX is selling Starlink terminals to Russia," saying that, "to the best of our knowledge, no Starlinks have been sold directly or indirectly to Russia." But Musk's statement didn't satisfy US Reps. Jamie Raskin (D-Md.) and Robert Garcia (D-Calif.), who sent a letter to SpaceX President Gwynne Shotwell yesterday.
"Starlink is an invaluable resource for Ukrainians in their fight against Russia's brutal and illegitimate invasion. It is alarming that Russia may be obtaining and using your technology to coordinate attacks against Ukrainian troops in illegally occupied regions in Eastern and Southern Ukraine, potentially in violation of US sanctions and export controls," Raskin and Garcia wrote.
A core developer of Nginx, currently the world's most popular web server, has quit the project, stating that he no longer sees it as "a free and open source project… for the public good." His fork, freenginx, is "going to be run by developers, and not corporate entities," writes Maxim Dounin, and will be "free from arbitrary corporate actions."
Dounin is one of the earliest and still most active coders on the open source Nginx project and one of the first employees of Nginx, Inc., a company created in 2011 to commercially support the steadily growing web server. Nginx is now used on roughly one-third of the world's web servers, ahead of Apache.
Nginx Inc. was acquired by Seattle-based networking firm F5 in 2019. Later that year, two of Nginx's leaders, Maxim Konovalov and Igor Sysoev, were detained and interrogated in their homes by armed Russian state agents. Sysoev's former employer, Internet firm Rambler, claimed that it owned the rights to Nginx's source code because it was developed during Sysoev's tenure at Rambler (where Dounin also worked). While neither the criminal charges nor the ownership claims appear to have gone anywhere, the implications of a Russian company's intrusion into a popular open source piece of the web's infrastructure caused some alarm.
Russian forces are using Starlink terminals on the front line in Ukraine, according to the Ukrainian military, which said the adoption of Elon Musk’s satellite Internet service by Moscow’s troops was becoming “systemic.”
Ukraine’s GUR military intelligence unit said on Telegram on Sunday that radio intercepts confirmed the use of Starlink terminals by Russian units operating in the occupied Donetsk region of eastern Ukraine.
“Yes, there have been recorded cases of the Russian occupiers using these devices,” Andriy Yusov, a GUR officer, told RBC-Ukraine. “This is starting to take on a systemic nature.”
Before Neanderthals and Denisovans, before vaguely humanoid primates, proto-mammals, or fish that crawled out of the ocean to become the first terrestrial animals, our earliest ancestors were microbes.
More complex organisms like ourselves descend from eukaryotes, which have a nuclear membrane around their DNA (as opposed to prokaryotes, which don’t). Eukaryotes were thought to have evolved a few billion years ago, during the late Palaeoproterozoic era, and to have started diversifying by around 800 million years ago. How and when they diversified, however, has not been well understood. Now, a team of researchers led by UC Santa Barbara paleontologist Leigh Ann Riedman has discovered eukaryote microfossils that are 1.64 billion years old yet had already diversified and had surprisingly sophisticated features.
“High levels of eukaryotic species richness and morphological disparity suggest that although late Palaeoproterozoic [fossils] preserve our oldest record of eukaryotes, the eukaryotic clade has a much deeper history,” Riedman and her team said in a study recently published in Papers in Paleontology.
Adobe has abandoned its proposed $20 billion acquisition of product design software company Figma, as there was “no clear path to receive necessary regulatory approvals” from UK and EU watchdogs.
The deal had faced probes from both the UK and EU competition regulators for fears it would have an impact on the product design, image editing, and illustration markets.
Adobe refused to offer remedies to satisfy the UK Competition and Markets Authority’s concerns last week, according to a document published by the regulator on Monday, arguing that a divestment would be “wholly disproportionate.”
Ukrainian civilians on Wednesday grappled with a second day of widespread cellular phone and Internet outages after a cyberattack, purportedly carried out by Kremlin-supported hackers, hit the country’s biggest mobile phone and Internet provider a day earlier.
Two separate hacking groups with ties to the Russian government took responsibility for Tuesday’s attack striking Kyivstar, which has said it serves 24.3 million mobile subscribers and more than 1.1 million home Internet users. One group, calling itself Killnet, said on Telegram that “an attack was carried out on Ukrainian mobile operators, as well as on some banks,” but didn’t elaborate or provide any evidence. A separate group known as Solntsepek said on the same site that it took “full responsibility for the cyberattack on Kyivstar” and had “destroyed 10,000 computers, more than 4,000 servers, and all cloud storage and backup systems.” The post was accompanied by screenshots purporting to show someone with control over the Kyivstar systems.
In the city of Lviv, street lights remained on after sunrise and had to be disconnected manually, because Internet-dependent automated power switches didn’t work, according to NBC News. Additionally, the outage prevented shops throughout the country from processing credit payments and many ATMs from functioning, the Kyiv Post said.
This summer, a Vancouver car mechanic named Max got a perplexing ping on his phone: Betty White was in Ukraine and needed his help. This was surprising because she had died on a Canadian highway back in January.
When Max last saw Betty White, his nickname for his Tesla Model Y Performance, they were both in rough shape after getting sideswiped on the highway. Max’s rotator cuff was torn in several places. The small SUV had bounced off multiple concrete barriers at high speed and was bashed in on all four corners, its wheels ripped to pieces. Coolant appeared to be leaking into the battery chamber. From his own work on EVs in the garage, Max knew that Betty was done for. “No auto shop would put a repair person at risk with that kind of damage,” says Max, whose last name isn’t being used out of doxing concerns. A damaged EV battery can become dangerous due to the risk of shocks, fire, and toxic fumes. His insurer agreed, and Betty was written off and sent to a salvage yard.
On Wednesday, the UK hosted an AI Safety Summit attended by 28 countries, including the US and China, which gathered to address potential risks posed by advanced AI systems, reports The New York Times. The event included the signing of "The Bletchley Declaration," which warns of potential harm from advanced AI and calls for international cooperation to ensure responsible AI deployment.
"There is potential for serious, even catastrophic, harm, either deliberate or unintentional, stemming from the most significant capabilities of these AI models," reads the declaration, named after Bletchley Park, the site of the summit and a historic World War II location linked to Alan Turing. Turing wrote influential early speculation about thinking machines.
Rapid advancements in machine learning, including the appearance of chatbots like ChatGPT, have prompted governments worldwide to consider regulating AI. Their concerns led to the meeting, which has drawn criticism for its invitation list. In the tech world, representatives from major companies included those from Anthropic, Google DeepMind, IBM, Meta, Microsoft, Nvidia, OpenAI, and Tencent. Civil society groups, like Britain's Ada Lovelace Institute and the Algorithmic Justice League in Massachusetts, also sent representatives.
Jeremy Wright was the first of five UK ministers charged with pushing through the British government’s landmark legislation on regulating the Internet, the Online Safety Bill. The current UK government likes to brand its initiatives as “world-beating,” but for a brief period in 2019 that might have been right. Back then, three prime ministers ago, the bill—or at least the white paper that would form its basis—outlined an approach that recognized that social media platforms were already de facto arbiters of what was acceptable speech on large parts of the Internet, but that this was a responsibility they didn’t necessarily want and weren’t always capable of discharging. Tech companies were pilloried for things that they missed, but also, by free speech advocates, for those they took down. “There was a sort of emerging realization that self-regulation wasn’t going to be viable for very much longer,” Wright says. “And therefore, governments needed to be involved.”
The bill set out to define a way to handle “legal but harmful” content—material that wasn’t explicitly against the law but which, individually or in aggregate, posed a risk, such as health care disinformation, posts encouraging suicide or eating disorders, or political disinformation with the potential to undermine democracy or create panic. The bill had its critics—notably, those who worried it gave Big Tech too much power. But it was widely praised as a thoughtful attempt to deal with a problem that was growing and evolving faster than politics and society were able to adapt. Of his 17 years in parliament, Wright says, “I’m not sure I’ve seen anything by way of potential legislation that’s had as broadly based a political consensus behind it.”
The United Kingdom is experiencing a dramatic outbreak, unprecedented in scale, of diarrheal illnesses from the intestinal parasite Cryptosporidium, aka Crypto.
According to a rapid communication published Thursday in the journal Eurosurveillance, UK health officials report that Crypto cases have exceeded the upper bounds of expected cases since mid-September, and an October peak saw cases roughly threefold above what is usual for this time of year. The outbreak is still ongoing.
Laboratory notifications of Cryptosporidium species in England, Wales, and Northern Ireland, by week of specimen, 2023. (credit: Eurosurveillance | Peake et al.)
So far, it's unclear what's driving the extraordinary burst in cases. The outbreak has splattered into almost every region of all four UK nations. "Given the scale and geographical spread of the [case] exceedance across regions and nations of the UK, a single local exposure is an unlikely cause," the authors, led by officials at the United Kingdom Health Security Agency in London, wrote in the rapid report.