Report Finds Few Open Source Projects are Actively Maintained

“A recent analysis accounting for nearly 1.2 million open source software projects primarily across four major ecosystems found that only about 11% of projects were actively maintained,” reports InfoWorld:

In its 9th Annual State of the Software Supply Chain report, published October 3, software supply chain management company Sonatype assessed 1,176,407 projects and reported an 18% decline this year in actively maintained projects. Just 11% of projects — 118,028 — were receiving active maintenance.

The report also found that some projects which were unmaintained in 2022 are now being maintained.

The four ecosystems included JavaScript, via NPM; Java, via the Maven project management tool; Python, via the PyPI package index; and .NET, through the NuGet gallery. Some Go projects also were included. According to the report, 18.6% of Java and JavaScript projects that were being maintained in 2022 are no longer being maintained today.
Other interesting findings:

Nearly 10% of those surveyed reported security breaches due to open source vulnerabilities in the past 12 months.
Use of AI and machine learning software components within corporate environments surged 135% over the last year.

Read more of this story at Slashdot.



Source: Slashdot – Report Finds Few Open Source Projects are Actively Maintained

T2 Linux Discovers (Now Patched) AMD Zen 4 Invalid Opcode Speculation Bug

T2 SDE is not just a Linux distribution, but “a flexible Open Source System Development Environment or Distribution Build Kit,” according to a 2022 announcement of its support for 25 CPU architectures, variants, and C libraries. (“Others might even name it Meta Distribution. T2 allows the creation of custom distributions with state of the art technology, up-to-date packages and integrated support for cross compilation.”)

And while working on it, Berlin-based T2 Linux developer René Rebe (long-time Slashdot reader ReneR) discovered random illegal instruction speculation on AMD Ryzen 7000-Series and Epyc Zen 4 CPUs.

ReneR writes:

Merged to Linux 6.6 Git is a fix for the bug now known at AMD as Erratum 1485.

The discovery was possible through continued high CPU load cross-compiling the T2 Linux distribution with support for all CPU architectures from ARM, MIPS, PowerPC, RISC-V to x86 (and more) for 33 build variants. With sustained high CPU load and various instruction sequences being compiled, pseudo random illegal instruction errors were observed and subsequently analyzed.
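As a rough illustration (this is not ReneR's actual tooling), a reproduction harness for this class of failure only needs to keep every core busy with builds and watch for runs that die with an illegal-instruction crash; the build command and counts below are placeholders:

```python
# Hypothetical reproduction harness, not ReneR's actual tooling: keep all cores
# busy with builds and count runs that end in an illegal-instruction crash.
# The build command and run counts are placeholders.
import signal
import subprocess
from concurrent.futures import ThreadPoolExecutor

BUILD_CMD = ["make", "-C", "/path/to/source", "clean", "all"]  # placeholder build
RUNS = 100        # total build attempts
PARALLEL = 8      # concurrent builds, to sustain high CPU load

def hit_sigill(_run: int) -> bool:
    """Run one build and report whether it showed an illegal-instruction failure."""
    result = subprocess.run(BUILD_CMD, capture_output=True)
    # A process killed by signal N comes back with returncode == -N; a build
    # driver like make instead exits non-zero and mentions the crash on stderr.
    return (result.returncode == -signal.SIGILL
            or b"illegal instruction" in result.stderr.lower())

def main() -> None:
    with ThreadPoolExecutor(max_workers=PARALLEL) as pool:
        hits = sum(pool.map(hit_sigill, range(RUNS)))
    print(f"{hits}/{RUNS} builds showed an illegal-instruction failure")

if __name__ == "__main__":
    main()
```

With a genuinely pseudo-random hardware fault, the failing runs would not correlate with any particular source file, which is what distinguishes this from an ordinary compiler bug.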

ExactCODE Research GmbH CTO René Rebe is thrilled that working with AMD engineers led to a timely mitigation to increase system stability of the still-new, highest-performance Zen 4 platform.

“I found real-world code that might be similar or actually trigger the same bugs in the CPU that are also used for all the Spectre Meltdown and other side-channel security vulnerability mitigations,” Rebe says in a video announcement on YouTube.

It took Rebe a tremendous amount of research, and he says now that “all the excessive work changed my mind. Mitigations equals considered harmful… If you want stable, reliable computational results — no, you can’t do this. Because as Spectre Meltdown and all the other security issues have proven, the CPUs are nowadays as complex as complex software systems…”

Read more of this story at Slashdot.



Source: Slashdot – T2 Linux Discovers (Now Patched) AMD Zen 4 Invalid Opcode Speculation Bug

To 'Evolve' Windows Authentication, Microsoft Wants to Eventually Disable NTLM in Windows 11

An anonymous reader shared this report from Neowin:

The various versions of Windows have used Kerberos as their main authentication protocol for over 20 years. However, in certain circumstances, the OS has to use another method, NTLM (NT LAN Manager). Today, Microsoft announced that it is expanding the use of Kerberos, with the plan to eventually ditch the use of NTLM altogether.

In a blog post, Microsoft stated that NTLM continues to be used by some businesses and organizations for Windows authentication because it “doesn’t require local network connection to a Domain Controller.” It also is “the only protocol supported when using local accounts” and it “works when you don’t know who the target server is.” Microsoft states:
These benefits have led to some applications and services hardcoding the use of NTLM instead of trying to use other, more modern authentication protocols like Kerberos. Kerberos provides better security guarantees and is more extensible than NTLM, which is why it is now a preferred default protocol in Windows.
The problem is that while businesses can turn off NTLM for authentication, those hardwired apps and services could experience issues. That’s why Microsoft has added two new authentication features to Kerberos.

Microsoft’s blog post calls it “the evolution of Windows authentication,” arguing that “As Windows evolves to meet the needs of our ever-changing world, the way we protect users must also evolve to address modern security challenges…” So, “our team is building new features for Windows 11.”

Initial and Pass Through Authentication Using Kerberos, or IAKerb, “a public extension to the industry standard Kerberos protocol that allows a client without line-of-sight to a Domain Controller to authenticate through a server that does have line-of-sight.”
A local Key Distribution Center (KDC) for Kerberos, “built on top of the local machine’s Security Account Manager so remote authentication of local user accounts can be done using Kerberos.”
“We are also fixing hard-coded instances of NTLM built into existing Windows components… shifting these components to use the Negotiate protocol so that Kerberos can be used instead of NTLM… NTLM will continue to be available as a fallback to maintain existing compatibility.”
“We are also introducing improved NTLM auditing and management functionality to give your organization more insight into your NTLM usage and better control for removing it.”
“Reducing the use of NTLM will ultimately culminate in it being disabled in Windows 11. We are taking a data-driven approach and monitoring reductions in NTLM usage to determine when it will be safe to disable.”
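The practical upshot of the changes Microsoft describes above is "stop pinning the NTLM scheme and ask for Negotiate instead." As a loose sketch from outside the Windows API surface, using third-party Python HTTP libraries (the pip packages requests-ntlm and requests-kerberos, my choice of example rather than anything Microsoft names), the difference between a hardcoded NTLM client and a Negotiate-speaking one looks roughly like this:

```python
# Illustrative sketch only: the libraries below (pip packages requests-ntlm and
# requests-kerberos) are my choice of example, not anything Microsoft names,
# and the URL and credentials are placeholders.
import requests
from requests_ntlm import HttpNtlmAuth                      # pins the NTLM scheme
from requests_kerberos import HTTPKerberosAuth, OPTIONAL    # speaks Negotiate/SPNEGO

URL = "https://intranet.example.com/report"                 # placeholder endpoint

# The anti-pattern Microsoft describes: the client insists on NTLM even where
# Kerberos is available.
ntlm_session = requests.Session()
ntlm_session.auth = HttpNtlmAuth("EXAMPLE\\alice", "s3cret")

# The preferred pattern: advertise the Negotiate scheme and supply a
# Kerberos/SPNEGO token, so NTLM is no longer hardcoded into the client.
nego_session = requests.Session()
nego_session.auth = HTTPKerberosAuth(mutual_authentication=OPTIONAL)

response = nego_session.get(URL)
print(response.status_code)
```

The same idea applies to Windows-native code: requesting the Negotiate security package, rather than NTLM by name, is what lets the new IAKerb and local-KDC plumbing take over while NTLM remains only a fallback.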

Read more of this story at Slashdot.



Source: Slashdot – To ‘Evolve’ Windows Authentication, Microsoft Wants to Eventually Disable NTLM in Windows 11

GNU's 40th Anniversary: the FSF's Meeting with Old and New Friends

Devin Ulibarri, the Free Software Foundation’s outreach and communications coordinator, writes up an event he describes as meeting with some old and new friends:
On Sunday, October 1, the Free Software Foundation (FSF) hosted a hackday to celebrate the fortieth anniversary of the GNU Project. Folks came from both near and far to join in the festivities at FSF headquarters, Boston, MA… Sadi moma bela loza, the Bulgarian melody to which The Free Software Song is set, could be heard faintly playing in a nearby room, its distinctive odd-metered tune performed by a fully-liberated X200…

All in all, the event succeeded in our goal of both welcoming long-time members and introducing new people to free software and our cause. A few college students from local universities, for example, were able to ask questions seeking to better understand free software licenses and GNU Project history. We received multiple requests from attendees to host similar events again in the near future. And one parent, whose son played NetHack at the event, reported that, the following morning, his son asked to go to the FSF office after school to play it again. While playing, he mastered the “vi” movement keys immediately. We hope they serve him well…!

Happy hacking and please stay tuned for more FSF-hosted events, including LibrePlanet 2024!

Read more of this story at Slashdot.



Source: Slashdot – GNU’s 40th Anniversary: the FSF’s Meeting with Old and New Friends

Climate-Driven Heat Extremes May Make Earth Too Hot for Billions of Humans

An anonymous reader shared this report from Phys.org:
If global temperatures increase by 1 degree Celsius (C) or more above current levels, each year billions of people will be exposed to heat and humidity so extreme they will be unable to naturally cool themselves, according to interdisciplinary research from the Penn State College of Health and Human Development, Purdue University College of Sciences and Purdue Institute for a Sustainable Future… Humans can only withstand certain combinations of heat and humidity before their bodies begin to experience heat-related health problems, such as heat stroke or heart attack. As climate change pushes temperatures higher around the world, billions of people could be pushed beyond these limits…

Results of the study indicate that if global temperatures increase by 2 degrees C above pre-industrial levels, the 2.2 billion residents of Pakistan and India’s Indus River Valley, the one billion people living in eastern China and the 800 million residents of sub-Saharan Africa will annually experience many hours of heat that surpass human tolerance… Troublingly, researchers said, these regions are also in lower-to-middle income nations, so many of the affected people may not have access to air conditioning or any effective way to mitigate the negative health effects of the heat.
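The excerpt doesn't name the metric, but this line of research is usually framed in terms of wet-bulb temperature, with roughly 35 degrees C often cited as the theoretical survivability ceiling and about 31 degrees C as the empirical limit reported in recent lab work. A hedged sketch using the Stull (2011) approximation (my choice of conversion, not necessarily the study's) shows how moderate-sounding heat and humidity combine to cross those limits:

```python
# Hedged illustration: the thresholds (35 C theoretical, ~31 C empirical) are
# figures commonly cited in this literature, not numbers from the excerpt, and
# the Stull (2011) approximation is one convenient wet-bulb estimate
# (roughly valid for 5-99% relative humidity and -20 to 50 C).
import math

def wet_bulb_c(temp_c: float, rh_pct: float) -> float:
    """Approximate wet-bulb temperature (Stull 2011) from air temperature and RH."""
    return (
        temp_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
        + math.atan(temp_c + rh_pct)
        - math.atan(rh_pct - 1.676331)
        + 0.00391838 * rh_pct**1.5 * math.atan(0.023101 * rh_pct)
        - 4.686035
    )

THEORETICAL_LIMIT_C = 35.0   # survivability bound often quoted in the literature
EMPIRICAL_LIMIT_C = 31.0     # lower limit reported in recent lab studies

for temp, rh in [(40, 30), (38, 60), (35, 80)]:
    tw = wet_bulb_c(temp, rh)
    flag = "exceeds empirical limit" if tw > EMPIRICAL_LIMIT_C else "below limits"
    print(f"{temp} C at {rh}% RH -> wet bulb {tw:.1f} C ({flag})")
```

Note how 38 C at 60% humidity already crosses the empirical limit even though the dry-bulb reading sounds survivable, which is the point the researchers make about humid regions.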

Read more of this story at Slashdot.



Source: Slashdot – Climate-Driven Heat Extremes May Make Earth Too Hot for Billions of Humans

C# Challenges Java in Programming Language Popularity

“The gap between C# and Java never has been so small,” according to October’s update for TIOBE’s “Programming Community Index”.

“Currently, the difference is only 1.2%, and if the trends remain this way, C# will surpass Java in about 2 months’ time.”
Java shows the largest decline of -3.92% and C# the largest gain of +3.29% of all programming languages (annually).
The two languages have always been used in similar domains and thus have been competitors for more than 2 decades now. Java’s decline in popularity is mainly caused by Oracle’s decision to introduce a paid license model after Java 8. Microsoft took the opposite approach with C#. In the past, C# could only be used as part of the commercial tool Visual Studio. Nowadays, C# is free and open source and it’s embraced by many developers.

There are also other reasons for Java’s decline. First of all, the Java language definition has not changed much in the past few years and Kotlin, its fully compatible direct competitor, is easier to use and free of charge.

“Java remains a critical language in enterprise computing,” argues InfoWorld, “with Java 21 just released last month and Java 22 due next March. And free open source binaries of Java still are available via OpenJDK.” InfoWorld also notes TIOBE’s ranking is different than other indexes. TIOBE’s top 10:

Python (14.82%)
C (12.08%)
C++ (10.67%)
Java (8.92%)
C# (7.71%)
JavaScript (2.91%)
Visual Basic (2.13%)
PHP (1.9%)
SQL (1.78%)
Assembly (1.64%)

And here’s the Pypl Popularity of Programming Language (based on searches for language tutorials on Google):

Python, with a 28.05% share
Java (15.88%)
JavaScript (9.27%)
C# (6.79%)
C/C++ (6.59%)
PHP (4.86%)
R (4.45%)
TypeScript (2.93%)
Swift (2.69%)
Objective-C (2.29%)

Read more of this story at Slashdot.



Source: Slashdot – C# Challenges Java in Programming Language Popularity

Is Glass the Future of Storage?

“If we carry on the way we’re going, we’re going to have to concrete the whole planet just to store the data that we’re generating,” explains a deputy lab director at Microsoft Research Cambridge in a new video.

Fortunately, “A small sheet of glass can now hold several terabytes of data, enough to store approximately 1.75 million songs or 13 years’ worth of music,” explains a Microsoft Research web page about “Project Silica”. (Data is retrieved by a high-speed, computer-controlled microscope from a library of glass disks storing data in three-dimensional pixels called voxels):
Magnetic storage, although prevalent, is problematic. Its limited lifespan necessitates frequent re-copying, increasing energy consumption and operational costs over time. “Magnetic technology has a finite lifetime,” says Ant Rowstron, Distinguished Engineer, Project Silica. “You must keep copying it over to new generations of media. A hard disk drive might last five years. A tape, well, if you’re brave, it might last ten years. But once that lifetime is up, you’ve got to copy it over. And that, frankly, is both difficult and tremendously unsustainable if you think of all that energy and resource we’re using.”

Project Silica aims to break this cycle. Developed under the aegis of Microsoft Research, it can store massive amounts of data in glass plates roughly the size of a drink coaster and preserve the data for thousands of years. Richard Black, Research Director, Project Silica, adds, “This technology allows us to write data knowing it will remain unchanged and secure, which is a significant step forward in sustainable data storage.” Project Silica’s goal is to write data in a piece of glass and store it on a shelf until it is needed. Once written, the data inside the glass is impossible to change.

Project Silica is focused on pioneering data storage in quartz glass in partnership with the Microsoft Azure team, seeking more sustainable ways to archive data. This relationship is symbiotic, as Project Silica uses Azure AI to decode data stored in glass, making reading and writing faster and allowing more data storage… The library is passive, with no electricity in any of the storage units. The complexity is within the robots that charge as they idle inside the lab, awakening when data is needed…

Initially, the laser writing process was inefficient, but after years of refinement, the team can now store several TB in a single glass plate that could last 10,000 years. For a sense of scale, each plate could store around 3,500 movies. Or enough non-stop movies to play for over half a year without repeating. A glass plate could hold the entire text of War and Peace — one of the longest novels ever written — about 875,000 times.
And most importantly, it can store data in a fraction of the space of a datacenter…
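Those comparisons hang together if you assume roughly 7 terabytes per plate (a capacity Microsoft has cited elsewhere; the page above only says "several") and typical media sizes, as this back-of-envelope sketch shows:

```python
# Back-of-envelope check of the figures quoted above, assuming a 7 TB plate
# (a capacity Microsoft has cited elsewhere; the page only says "several TB")
# and typical media sizes. All assumptions are marked in the comments.
TB = 10**12  # decimal terabyte, in bytes

plate_bytes = 7 * TB

items = {
    "1.75 million songs": 1_750_000,
    "3,500 movies": 3_500,
    "875,000 copies of War and Peace": 875_000,
}

for label, count in items.items():
    print(f"{label:<32} implies ~{plate_bytes / count / 10**6:,.0f} MB per item")

# Playback-time claims, assuming ~4-minute songs and ~100-minute movies:
minutes_per_year = 365 * 24 * 60
print(f"non-stop music:  {1_750_000 * 4 / minutes_per_year:.1f} years")
print(f"non-stop movies: {3_500 * 100 / minutes_per_year:.2f} years")
```

The implied sizes (a few MB per song, about 2 GB per movie) and playback times (roughly 13 years of music, two-thirds of a year of movies) line up with the article's claims.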

Thanks to long-time Slashdot reader Kirschey for sharing the article.

Read more of this story at Slashdot.



Source: Slashdot – Is Glass the Future of Storage?

How a Series of Air Traffic Control Lapses Nearly Killed 131 People

Due to an air traffic control mistake in February, a FedEx cargo plane flew within 100 feet of a Southwest Airlines flight. The New York Times reports that the flight’s 128 passengers “were unaware that they had nearly died.”
In a year filled with close calls involving US airlines, this was the one that most unnerved federal aviation officials: A disaster had barely been averted, and multiple layers of the vaunted US air-safety system had failed… But the errors by the controller — who has continued to direct some plane traffic in Austin, Texas — were far from the whole story, according to 10 current and former controllers there, as well as internal Federal Aviation Administration documents reviewed by the Times. Austin-Bergstrom, like the vast majority of US airports, lacks technology that allows controllers to track planes on the ground and that warns of imminent collisions. The result is that on foggy days, controllers can’t always see what is happening on runways and taxiways. Some have even resorted to using a public flight-tracking website in lieu of radar.

In addition, for years Austin has had a shortage of experienced controllers, even as traffic at the airport has surged to record levels. Nearly three-quarters of shifts have been understaffed. Managers and rank-and-file controllers have repeatedly warned that staffing levels pose a public danger. The controller on that February morning was working an overtime shift. In June, Stephen B. Martin, then Austin’s top manager, and a local union representative wrote a memo pleading for more controllers. “Drastic steps are needed to allow the facility to adequately staff for existing traffic,” they wrote to FAA and union officials.
Austin is a microcosm of a systemic crisis. The safety net that underpins air travel in America is fraying, exposing passengers to potential tragedies like the episode in February.

And yet the chair of America’s National Transportation Safety Board calls the February incident “just one of seven serious close calls and near misses involving commercial airlines that we have initiated investigations on this year.”
Thanks to long-time Slashdot reader schwit1 for sharing the article.

Read more of this story at Slashdot.



Source: Slashdot – How a Series of Air Traffic Control Lapses Nearly Killed 131 People

First 'Doctor Who' Writer Honored. His Son Contests BBC's Rights to 'Unearthly Child'

The BBC reports:
Doctor Who’s first writer could finally be recognised 60 years after he helped launch the hugely-popular series. Anthony Coburn penned the first four episodes of the sci-fi drama in 1963 — a story called An Unearthly Child. But after his second story did not air, the writer has been seen as a minor figure among some Doctor Who fans.

However, a campaign to erect a memorial to Coburn in his home town of Herne Bay, Kent, is gathering pace a month ahead of the show’s 60th anniversary.
A local elected councillor told the BBC they’re working to find a location for the memorial.

The BBC writes that Coburn’s episode — broadcast November 23, 1963 — “introduced the character of The Doctor, his three travelling companions, and his time and space machine, the TARDIS, stuck in the form of a British police box.”
Richard Bignell, a Doctor Who historian, believes Coburn played a significant role in sowing the seeds of the programme’s success. He said: “Although the major elements that would go on to form the core of the series were devised within the BBC, as the scriptwriter for the first story, Coburn was the one who really put the flesh on the bones of the idea and how it would work dramatically. Many opening episodes of a new television series can be very clunky as they attempt to land their audience with too much information about the characters, the setting and what’s going to happen, but Coburn was very reserved in how much he revealed, preserving all the wonder and mystery.”

In 2013, the Independent reported:
Mr Coburn’s son claims that the BBC has been in breach of copyright since his father’s death in 1977. He has demanded that the corporation either stop using the Tardis in the show or pay his family for its every use since then. Stef Coburn claims that upon his father’s death, any informal permission his father gave the BBC to use his work expired and the copyright of all of his ideas passed to his widow, Joan. Earlier this year she passed it on to him.

He said: “It is by no means my wish to deprive legions of Doctor Who fans (of whom I was never one) of any aspect of their favourite children’s programme. The only ends I wish to accomplish, by whatever lawful means present themselves, involve bringing about the public recognition that should by rights always have been his due, of my father James Anthony Coburn’s seminal contribution to Doctor Who, and proper lawful recompense to his surviving estate.”

Today jd (Slashdot reader #1,658) notes that Stef Coburn apparently has a Twitter feed, where this week Stef claimed he’d cancelled the BBC’s license to distribute his father’s episodes after being offered what he complained was “a pittance” to relicense them.
In response to someone who asked “What do you actually gain from doing this though?” Stef Coburn replied: “Vengeance.” But elsewhere Stef Coburn writes “There are OTHER as yet unfulfilled projects & aspirations of Tony’s (of one of which, I was a significant part, in his final year), which I would like to see brought to fruition. If Doctor Who is my ONLY available leverage. So be it!”

Stef Coburn also announced plans to publish his father’s “precursor draft-scripts (At least one very different backstory; sans ‘Timelords’) plus accompanying notes, for the story that became ‘The Tribe of Gum’.”

Read more of this story at Slashdot.



Source: Slashdot – First ‘Doctor Who’ Writer Honored. His Son Contests BBC’s Rights to ‘Unearthly Child’

Australian Student Invents Affordable Electric Car Conversion Kit

“Australian design student Alexander Burton has developed a prototype kit for cheaply converting petrol or diesel cars to hybrid electric,” reports Dezeen magazine, “winning the country’s national James Dyson Award in the process.”

Titled REVR (Rapid Electric Vehicle Retrofits), the kit is meant to provide a cheaper, easier alternative to current electric car conversion services, which Burton estimates cost AU$50,000 (£26,400) on average and so are often reserved for valuable, classic vehicles.

Usually, the process would involve removing the internal combustion engine and all its associated hardware, like the gearbox and hydraulic brakes, to replace them with batteries and electric motors. With REVR, those components are left untouched. Instead, a flat, compact, power-dense axial flux motor would be mounted between the car’s rear wheels and disc brakes, and a battery and controller system placed in the spare wheel well or boot. Some additional off-the-shelf systems — brake and steering boosters, as well as e-heating and air conditioning — would also be added under the hood. By taking this approach, Burton believes he’ll be able to offer the product for around AU$5,000 (£2,640) and make it compatible with virtually any car…

With REVR, people should be able to get several more years of life out of their existing cars. The kit would transform the vehicle into a hybrid rather than a fully electric vehicle, with a small battery giving the car 100 kilometres of electric range before the driver has to switch to the internal combustion engine… Borrowing a trick from existing hybrid vehicles, the kit uses a sensor to detect the position of the accelerator pedal to control both acceleration and braking. That means no changes have to be made to the car’s hydraulic braking system, which Burton says “you don’t want to have to interrupt”.
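A single pedal-position signal driving both acceleration and regenerative braking is a standard one-pedal-driving pattern; purely as a hypothetical sketch (the thresholds and torque figures below are made up, not REVR's), the mapping might look like this:

```python
# Hypothetical single-pedal torque map in the spirit of the approach described
# above: one accelerator-position sensor drives both acceleration and
# regenerative braking, leaving the hydraulic brakes untouched. The thresholds
# and torque figures are made-up illustrations, not REVR's.

COAST_LOW, COAST_HIGH = 0.15, 0.25   # pedal band where the motor neither drives nor brakes
MAX_DRIVE_NM = 120.0                 # peak drive torque (illustrative)
MAX_REGEN_NM = -60.0                 # peak regenerative torque (illustrative)

def motor_torque(pedal: float) -> float:
    """Map pedal position in [0, 1] to a signed motor torque in newton-metres."""
    pedal = min(max(pedal, 0.0), 1.0)
    if pedal < COAST_LOW:
        # Pedal nearly released: blend in regenerative braking.
        return MAX_REGEN_NM * (COAST_LOW - pedal) / COAST_LOW
    if pedal <= COAST_HIGH:
        return 0.0                    # coast: no drive, no regen
    # Above the coast band: scale drive torque with the remaining pedal travel.
    return MAX_DRIVE_NM * (pedal - COAST_HIGH) / (1.0 - COAST_HIGH)

if __name__ == "__main__":
    for p in (0.0, 0.1, 0.2, 0.5, 1.0):
        print(f"pedal {p:.1f} -> {motor_torque(p):6.1f} Nm")
```

Because the motor handles deceleration on its own, a retrofit like this can sidestep the hydraulic braking system entirely, which is exactly the part Burton says "you don't want to have to interrupt."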

Thanks to Slashdot reader FrankOVD for sharing the news.

Read more of this story at Slashdot.



Source: Slashdot – Australian Student Invents Affordable Electric Car Conversion Kit

Musicians Are Angry About Venues Taking T-shirt Money

The singer known as Tomberlin says her first five years in the music industry may have been a net loss, according to MarketWatch. Selling “merch[andise]” like t-shirts “is what really is covering your costs and hopefully helping you make, like, an actual profit.”

And then…

After being told she would have to hand over more than 40% of the money she collected from selling T-shirts and other items, Tomberlin refused to sell her merchandise at the venue and publicly spoke about a practice she calls robbery — venues taking cuts from bands’ merchandise sales… Other musicians are also speaking out about the practice, and their complaints seem to be having an effect. Industry giant Live Nation Entertainment Inc. announced recently that it would stop collecting merch fees at nearly 80 of the smaller clubs it owns and operates and provide all bands that play at those venues with an additional $1,500 in gas cards and cash.

Musicians who spoke with MarketWatch remain unsatisfied, however. Because of the way the announcement is phrased, many think merch fees at Live Nation clubs are only being paused until the end of the year. The musicians said they also wonder about the roughly 250 other Live Nation concert facilities, as well as the hundreds of venues owned by other companies. A Live Nation spokesperson told MarketWatch the change is “open-ended.”

[…] As Tomberlin continues on her current tour, she wonders if she will be able to make a profitable career in music. Of all her ways of earning money, streaming services like Spotify and Apple Music provide “the least amount of money,” she said, and with tours not leaving her with any cash at the end, she feels that even modest ambitions are out of reach.

Musician Laura Jane Grace is even soliciting signers for an online petition demanding venues stop taking cuts of the musicians’ merchandise sales…
Thanks to Slashdot reader quonset for sharing the news.

Read more of this story at Slashdot.



Source: Slashdot – Musicians Are Angry About Venues Taking T-shirt Money

Startup Aims to Build Hundreds of Chip Factories with Prefab Parts and AI

“To meet the world’s growing hunger for chips, a startup wants to upend the costly semiconductor fabrication plant with a nimbler, cheaper idea…” reports Fast Company, “an AI-enabled chip factory that can be assembled and expanded modularly with prefab pieces, like high-tech Lego bricks.”

In other words, they want to enable what is literally a fast company…
“We’re democratizing the ownership of semiconductor fabs,” says Matthew Putman, referring to chip fabrication plants. Putman is the founder and CEO of Nanotronics, a New York City-based industrial AI company that deploys advanced optical solutions for detecting defects in manufacturing procedures. Its new system, called Cubefabs, combines its modular inspection tools and other equipment with AI, allowing the proposed chip factories to monitor themselves and adapt accordingly — part of what Putman calls an “autonomous factory.” The bulk of the facility can be preassembled, flat-packed and put in shipping containers so that the facilities can be built “in 80% of the world,” says Putman.

Eventually, the company envisions hundreds of the flower-shaped fabs around the world, starting with a prototype in New York or Kuwait that it hopes to start building by the end of the year… Nanotronics says a single Cubefab installation could start at one acre with a single fab, and grow to a four-fab, six-acre footprint. Each fab could be built in under a year, the company says, with a four-fab installation estimated to cost under $100 million. Nanotronics declined to disclose how much it has raised for the project, but Putman says the company has previously raised $170 million from investors, including Peter Thiel and Jaan Tallinn, the Skype cofounder…

A single automated Cubefab will need only about 30 people to operate, “and they don’t have to be semiconductor experts,” says Putman. “AI takes away that need for that specialization that you would normally need in a fab.” […] Putman also hopes automation will help further reduce the environmental impact of an industry that’s notoriously resource-intensive and produces thousands of tons of waste a year, much of it hazardous. “Because you have the AI fixing the material and the device before it’s manufactured, you have less waste of the final material,” he says.

Thanks to Slashdot reader tedlistens for sharing the news.

Read more of this story at Slashdot.



Source: Slashdot – Startup Aims to Build Hundreds of Chip Factories with Prefab Parts and AI

Chinese Scientists Claim Record-Smashing Quantum Computing Breakthrough

From the South China Morning Post:

Scientists in China say their latest quantum computer has solved an ultra-complicated mathematical problem within a millionth of a second — more than 20 billion years quicker than the world’s fastest supercomputer could achieve the same task. The JiuZhang 3 prototype also smashed the record set by its predecessor in the series, with a one million-fold increase in calculation speed, according to a paper published on Tuesday by the peer-reviewed journal Physical Review Letters…

The series uses photons — tiny particles that travel at the speed of light — as the physical medium for calculations, with each one carrying a qubit, the basic unit of quantum information… The fastest classical supercomputer Frontier — developed in the US and named the world’s most powerful in mid-2022 — would take over 20 billion years to complete the same task, the researchers said.
The article notes the number of photons used rose from 76 in the first version to 113 in the second, and to 255 in the latest iteration.
Thanks to long-time Slashdot reader hackingbear for sharing the news.

Read more of this story at Slashdot.



Source: Slashdot – Chinese Scientists Claim Record-Smashing Quantum Computing Breakthrough

US Antitrust Enforcer Continues Fighting Microsoft/Activision Deal, Calls it 'A Threat to Competition'

Yesterday America’s Federal Trade Commission said it remained focused on its appeal opposing Microsoft’s deal to buy Activision, reports Reuters.

Reuters notes that Microsoft and Activision closed their transaction Friday “after winning approval from Britain on condition that they sell the streaming rights to Activision’s games to Ubisoft Entertainment.” But the U.S. Federal Trade Commission “has also fought the deal, and has an argument scheduled before an appeals court on December 6. The agency said on Friday that it remained focused on that appeal.” An FTC spokesperson had this comment for Reuters:
“The FTC continues to believe this deal is a threat to competition.”

Read more of this story at Slashdot.



Source: Slashdot – US Antitrust Enforcer Continues Fighting Microsoft/Activision Deal, Calls it ‘A Threat to Competition’

Third-party Reddit App Narwhal Hopes To Survive Reddit's App Purge With Subscriptions

An anonymous reader shared this report from TechCrunch:

After a nasty battle between the developers of third-party apps and Reddit management, ultimately resulting in a site-wide protest, many app makers were put out of business due to Reddit’s price increases related to the usage of its API. Though the changes meant the loss of popular apps like Apollo, RIF (Reddit is Fun), ReddPlanet, Sync and BaconReader, one app, Narwhal, is attempting to make a comeback. The company announced this week that it will implement a subscription-based version of its app at $3.99 per month, promising an ad-free and privacy-focused experience.

The new app will also include a Tip Jar to solicit donations to help keep the app afloat beyond the subscription fees and fund additional development work. The app’s developer, Rick Harrison (u/det0ur on Reddit and CTO at Meadow by day), says he’s considering adding a small fee, perhaps $1 per month, to allow users to also check their notifications and messages, though that feature won’t be available at launch… Narwhal’s developer notes that Reddit’s fee will be “tens of thousands if not hundreds of thousands a month depending on how many people subscribe.” To work, the app will need a critical mass of subscribers to cover its costs, but Harrison says he’s fairly confident the model will work.
“Also, with a simpler plan like this, I can offer a subscription on a Narwhal website for 30% less (no Apple cut),” Harrison wrote…

Narwhal isn’t the only Reddit client to attempt to remain in business despite Reddit’s API pricing changes. Another, Relay, announced a multi-tier subscription plan where users have to choose one of six price points, each of which caps them at a certain number of API calls.

Read more of this story at Slashdot.



Source: Slashdot – Third-party Reddit App Narwhal Hopes To Survive Reddit’s App Purge With Subscriptions

Could The Next Big Solar Storm Fry the Grid?

Long-time Slashdot reader SonicSpike shared the Washington Post’s speculation about the possibility of a gigantic solar storm leaving millions without phone or internet access, and requiring months or years of rebuilding:
The odds are low that in any given year a storm big enough to cause effects this widespread will happen. And the severity of those impacts will depend on many factors, including the state of our planet’s magnetic field on that day. But it’s a near certainty that some form of this catastrophe will happen someday, says Ian Cohen, a chief scientist who studies heliophysics at the Johns Hopkins Applied Physics Laboratory.

Long-time Slashdot reader davidwr remains skeptical. “I’ve only heard of two major events in the last 1300 years, one estimated to be between A.D. 744 and A.D. 993, and the other being the Carrington Event in 1859.”

But efforts are being made to improve our readiness, reports the Washington Post:
To get ahead of this threat, a loose federation of U.S. and international government agencies, and hundreds of scientists affiliated with those bodies, have begun working on how to make predictions about what our Sun might do. And a small but growing cadre of scientists argue that artificial intelligence will be an essential component of efforts to give us advance notice of such a storm…

At present, no warning system is capable of giving us more than a few hours’ notice of a devastating solar storm. If it’s moving fast enough, it could be as little as 15 minutes. The most useful sentinel — a sun-orbiting satellite launched by the U.S. in 2015 — is much closer to Earth than the sun, so that by the time a fast-moving storm crosses its path, an hour or less is all the warning we get. The European Space Agency has proposed a system to help give earlier warning by putting a satellite dubbed Vigil into orbit around the Sun, positioned roughly the same distance from the Earth as the Earth is from the Sun. It could potentially give us up to five hours of warning about an incoming solar storm, enough time to do the main thing that can help preserve electronics: switch them all off.
But what if there were a way to predict this better, by analyzing the data we’ve got? That’s the idea behind a new, AI-powered model recently unveiled by scientists at the Frontier Development Lab — a public-private partnership that includes NASA, the U.S. Geological Survey, and the U.S. Department of Energy. The model uses deep learning, a type of AI, to examine the flow of the solar wind, the usually calm stream of particles that flow outward from our sun and through the solar system to well beyond the orbit of Pluto. Using observations of that solar wind, the model can predict the “geomagnetic disturbance” an incoming solar storm observed by sun-orbiting satellites would cause at any given point on Earth, the researchers involved say. This model can predict just how big the flux of the Earth’s magnetic field will be when the solar storm arrives, and thus how big the induced currents in power lines and undersea internet cables will be…

Already, the first primitive ancestor of future AI-based solar-weather alert systems is live. The DstLive system, which debuted on the web in December 2022, uses machine learning to take data about the state of Earth’s magnetic field and the solar wind and translate both into a single measure for the entire planet, known as DST. Think of it as the Richter scale, but for solar storms. This number is intended to give us an idea of how intense a storm’s impact will be on earth, an hour to six hours in advance.
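The quoted warning windows fall straight out of the geometry: the 2015 sentinel (presumably DSCOVR, stationed near the Sun-Earth L1 point roughly 1.5 million km sunward of Earth) can only warn for the time a storm takes to cover that last leg. A quick sketch with typical published speeds (assumptions on my part, not figures from the article):

```python
# Rough arithmetic behind the quoted warning windows: a monitor near the
# Sun-Earth L1 point sits roughly 1.5 million km sunward of Earth, so the
# warning it can give is that distance divided by the storm's speed.
# The speeds are typical published ranges, not measurements from the article.
L1_DISTANCE_KM = 1_500_000

storm_speeds_km_s = {
    "quiet solar wind (~400 km/s)": 400,
    "fast solar wind (~750 km/s)": 750,
    "fast CME (~1,700 km/s)": 1_700,
}

for label, speed in storm_speeds_km_s.items():
    minutes = L1_DISTANCE_KM / speed / 60
    print(f"{label:<30} ~{minutes:.0f} minutes of warning")
```

That works out to about an hour for ordinary solar wind and roughly 15 minutes for the fastest coronal mass ejections, matching the article's range, and it makes clear why models that forecast from upstream data, rather than waiting for the storm to reach L1, are attractive.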

Unfortunately, we may not know how useful such systems are until we live through a major solar storm.

Read more of this story at Slashdot.



Source: Slashdot – Could The Next Big Solar Storm Fry the Grid?

FTX Thief Cashes Out Millions During Bankman-Fried Trial

An anonymous reader quotes a report from the BBC: A thief who stole more than $470 million in cryptocurrency when FTX crashed is trying to cash it out while the exchange’s founder is on trial. Sam Bankman-Fried’s high-profile court case began last week. The former crypto mogul denies fraud. After lying dormant for nine months, experts say $20 million of the stolen stash is being laundered into traditional money every day. New analysis shows how the mystery thief is trying to hide their tracks. […] On the day FTX collapsed, hundreds of millions of dollars of cryptocurrency controlled by the exchange were stolen by an unidentified thief that is believed to still have control of the funds. No one knows how the thief — or thieves — was able to get digital keys to FTX crypto wallets, but it is thought it was either an insider or a hacker who was able to steal the information. The criminal moved 9,500 Ethereum coins, then worth $15.5 million, from a wallet belonging to FTX, to a new wallet. Over the next few hours, hundreds of other cryptoassets were taken from the company’s wallets, in transactions eventually totaling $477 million.

According to researchers from Elliptic, a cryptocurrency investigation firm, the thief lost more than $100 million in the weeks following the hack as some was frozen or lost in processing fees as they frantically moved the funds around to evade capture. But by December around $70 million was successfully sent to a cryptocurrency mixer — a criminal service used to launder Bitcoin, making it difficult to trace. […] Although mixers make it difficult to trace Bitcoin, Elliptic was able to follow a small amount of the funds — $4 million — that was sent to an exchange. The rest of the stolen FTX stash — around $230 million — remained untouched until 30 September — the weekend before Mr Bankman-Fried’s trial began. Nearly every day since then chunks worth millions have been sent to a mixer for laundering and then presumably cashing out. Elliptic has been able to trace $54 million of Bitcoin being sent to the Sinbad mixer after which the trail has gone cold for now. “Crypto launderers have been known to wait for years to move and cash out assets once public attention has dissipated, but in this case they have begun to move just as the world’s attention is once again directed towards FTX and the events of November 2022,” said Tom Robinson, Elliptic’s co-founder.

Read more of this story at Slashdot.



Source: Slashdot – FTX Thief Cashes Out Millions During Bankman-Fried Trial

Audit Calls NASA's Goal To Reduce Artemis Rocket Costs 'Highly Unrealistic,' Threat To Deep Space Exploration

Richard Tribou reports via Phys.org: NASA’s goal to reduce the costs of the powerful Space Launch System rocket for its Artemis program by 50% was called “highly unrealistic” and a threat to its deep space exploration plans, according to a report by NASA’s Office of Inspector General released (PDF) on Thursday. The audit says the costs to produce one SLS rocket through its proposed fixed-cost contract will still top $2.5 billion, even though NASA thinks it can shrink that through “workforce reductions, manufacturing and contracting efficiencies, and expanding the SLS’s user base.”

“Given the enormous costs of the Artemis campaign, failure to achieve substantial savings will significantly hinder the sustainability of NASA’s deep space human exploration efforts,” the report warns. The audit looked at NASA’s plans to shift from its current setup among multiple suppliers for the hardware to a sole-sourced services contract that would include the production, systems integration and launch of at least five SLS flights beginning with Artemis V currently slated for as early as 2029. NASA’s claim it could get those costs to $1.25 billion per rocket was taken to task by the audit.

“NASA’s aspirational goal to achieve a cost savings of 50% is highly unrealistic. Specifically, our review determined that cost saving initiatives in several SLS production contracts were not significant,” the audit reads. It does find that rocket costs could approach $2 billion through the first 10 SLS rockets under the new contract, a reduction of 20%. […] Through 2025, the audit stated, NASA’s Artemis missions will have topped $93 billion, which includes billions more than originally announced in 2012 as years of delays and cost increases plagued the leadup to Artemis I. The SLS rocket represents 26% of that cost to the tune of $23.8 billion. The inspector general makes several recommendations to NASA, the most striking of which is that NASA consider using commercial heavy-lift rockets, such as SpaceX’s Starship and Super Heavy or Blue Origin’s New Glenn, as an alternative to the SLS rocket for future Artemis missions.

“Although the SLS is the only launch vehicle currently available that meets Artemis mission needs, in the next 3 to 5 years other human-rated commercial alternatives that are lighter, cheaper, and reusable may become available,” the audit reads. “Therefore, NASA may want to consider whether other commercial options should be a part of its mid- to long-term plans to support its ambitious space exploration goals.”

Read more of this story at Slashdot.



Source: Slashdot – Audit Calls NASA’s Goal To Reduce Artemis Rocket Costs ‘Highly Unrealistic,’ Threat To Deep Space Exploration

Hydro Dams Are Struggling To Handle the World's Intensifying Weather

Saqib Rahim reports via Wired: It’s been one of the wettest years in California since records began. From October 2022 to March 2023, the state was blasted by 31 atmospheric rivers — colossal bands of water vapor that form above the Pacific and become firehoses when they reach the West Coast. What surprised climate scientists wasn’t the number of storms, but their strength and rat-a-tat frequency. The downpours shocked a water system that had just experienced the driest three years in recorded state history, causing floods, mass evacuations, and at least 22 deaths.

Swinging between wet and dry extremes is typical for California, but last winter’s rain, potentially intensified by climate change, was almost unmanageable. Add to that the arrival of El Niño, and more extreme weather looks likely for the state. This is going to make life very difficult for the dam operators tasked with capturing and controlling much of the state’s water. Like most of the world’s 58,700 large dams, those in California were built for yesterday’s more stable climate patterns. But as climate change taxes the world’s water systems — affecting rainfall, snowmelt, and evaporation — it’s getting tough to predict how much water gets to a dam, and when. Dams are increasingly either water-starved, unable to maintain supplies of power and water for their communities, or overwhelmed and forced to release more water than desired — risking flooding downstream.

But at one major dam in Northern California, operators have been demonstrating how to not just weather these erratic and intense storms, but capitalize on them. Management crews at New Bullards Bar, built in 1970, entered last winter armed with new forecasting tools that gave unprecedented insight into the size and strength of the coming storms — allowing them to strategize how to handle the rain. First, they let the rains refill their reservoir, a typical move after a long drought. Then, as more storms formed at sea, they made the tough choice to release some of this precious hoard through their hydropower turbines, confident that more rain was coming. “I felt a little nervous at first,” says John James, director of resource planning at Yuba Water Agency in northern California. Fresh showers soon validated the move. New Bullards Bar ended winter with plumped water supplies, a 150 percent boost in power generation, and a clean safety record. The strategy offers a glimpse of how better forecasting can allow hydropower to adapt to the climate age.

Read more of this story at Slashdot.



Source: Slashdot – Hydro Dams Are Struggling To Handle the World’s Intensifying Weather