Cybersickness Could Spell an Early Death For the Metaverse

An anonymous reader quotes a report from the Daily Beast: Luis Eduardo Garrido couldn’t wait to test out his colleague’s newest creation. Garrido, a psychology and methodology researcher at Pontificia Universidad Catolica Madre y Maestra in the Dominican Republic, drove two hours between his university’s campuses to try a virtual reality experience that was designed to treat obsessive-compulsive disorder and different types of phobias. But a couple of minutes after he put on the headset, he could tell something was wrong. “I started feeling bad,” Garrido told The Daily Beast. He was experiencing an unsettling bout of dizziness and nausea. He tried to push through but ultimately had to abort the simulation almost as soon as he started. “Honestly, I don’t think I lasted five minutes trying out the application,” he said.

Garrido had contracted cybersickness, a form of motion sickness that can affect users of VR technology. It was so severe that he worried about his ability to drive home, and it took hours for him to recover from the five-minute simulation. Though motion sickness has afflicted humans for thousands of years, cybersickness is a much newer condition. This means that while some of its causes and symptoms are understood, other basic questions — like how common cybersickness is, and whether there are ways to fully prevent it — are only just starting to be studied. After Garrido’s experience, a colleague told him that only around 2 percent of people feel cybersickness. But at a presentation for prospective students, Garrido watched as volunteers from the audience walked to the front of an auditorium to demo a VR headset — only to return shakily to their seats. “I could see from afar that they were getting sweaty and kind of uncomfortable,” he recalled. “I said to myself, ‘Maybe I’m not the only one.'”

As companies like Meta (née Facebook) make big bets that augmented reality and virtual reality technology will go mainstream, the tech industry is still trying to figure out how to better recruit users to the metaverse, and get them to stay once there. But experts worry that cybersickness could derail these plans for good unless developers find some remedies soon. “The issue is actually something of a catch-22: In order to make VR more accessible and affordable, companies are making devices smaller and running them on less powerful processors,” adds the report. “But these changes introduce dizzying graphics — which inevitably causes more people to experience cybersickness.”

“At the same time, a growing body of research suggests cybersickness is vastly more pervasive than previously thought — perhaps afflicting more than half of all potential users.” When Garrido conducted his own study of 92 people, the results indicated that more than 65 percent of people experienced symptoms of cybersickness — a sharp contrast to the 2 percent estimate Garrido had been told.

He says that these results should be concerning for developers. “If people have this type of bad experience with something, they’re not going to try it again,” Garrido said.

Read more of this story at Slashdot.

Source: Slashdot – Cybersickness Could Spell an Early Death For the Metaverse

Newsom Vetoes 'Premature' Crypto Oversight Bill For California

California Governor Gavin Newsom vetoed a bill that would require crypto financial-service businesses to get a special license to operate, calling it premature and costly. Bloomberg reports: Newsom on Friday declined to sign the legislation known as the Digital Financial Assets Law, which was passed by the state assembly and senate last month. While the governor said he shares the bill’s intent to protect Californians from financial harm and provide clear rules for the industry, his administration has been conducting research and gathering input on the right approach. The bill would require a loan of “tens of millions of dollars” from the general fund during the first several years, a “significant” commitment that needs to be accounted for in the state’s annual budget process, Newsom added.

“It is premature to lock a licensing structure in statute without considering both this work and forthcoming federal actions,” Newsom said in a statement. “A more flexible approach is needed to ensure regulatory oversight can keep up with rapidly evolving technology and use cases, and is tailored with the proper tools to address trends and mitigate consumer harm.”

Read more of this story at Slashdot.

Source: Slashdot – Newsom Vetoes ‘Premature’ Crypto Oversight Bill For California

DOT To Map Out Nation's Time Zones After Report Shows No Official Map Exists

A person may take knowing the local time for granted, but an official review revealed that there is no single, accurate map showing the nation’s time zones and local observance of Daylight Saving Time. CNN reports: Federal transportation officials are now at work creating an accurate map of the nation’s time zones, according to a report by the inspector general for the Department of Transportation. The issue came up, the inspector general’s office said, after the US Senate passed legislation this year to end the twice-yearly clock change by making Daylight Saving Time permanent.

Investigators found no single map accurately showing the boundaries nationwide and said several sources of time information on the DOT website contained errors, such as inaccurately noting the time practices in some localities. For example, one map incorrectly identifies a deviation in Nevada: “Elko County, NV is shown as the location that changed time zones rather than the correct location, the city of West Wendover.”

“The official boundaries are narratively described [in federal regulations] with various types of coordinates and geographic features such as lines of longitude, State or county lines, and rivers,” the report stated. The inspector general report said the Transportation Department is responsible for keeping the clock because of the importance of time to travel. It said the original five time zones have expanded to nine.
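In software, those narratively described boundaries end up encoded as named zones in the IANA tz database, which is where a deviation like West Wendover's becomes visible to programs. A minimal sketch in Python's standard `zoneinfo` module; the zone names chosen for the two localities are assumptions for illustration:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# West Wendover, NV observes Mountain Time while most of Nevada is Pacific.
# The IANA zone names below are illustrative assumptions.
when = datetime(2022, 7, 1, 12, 0)
pacific = when.replace(tzinfo=ZoneInfo("America/Los_Angeles"))
mountain = when.replace(tzinfo=ZoneInfo("America/Denver"))

# Mountain Time runs one hour ahead of Pacific year-round.
diff_hours = (mountain.utcoffset() - pacific.utcoffset()).total_seconds() / 3600
print(diff_hours)  # 1.0
```

Checks like this against authoritative boundary data are essentially what an accurate official map would make possible.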

Read more of this story at Slashdot.

Source: Slashdot – DOT To Map Out Nation’s Time Zones After Report Shows No Official Map Exists

Is Plant-Based Meat Fizzling In the US?

Citing McDonald’s shelved meat-free burger trial and a 70% dip in Beyond Meat’s stock, The Guardian suggests plant-based meats may not interest Americans as much as investors thought. From the report: Getting meat eaters in the US to adopt plant-based alternatives has proven a challenge. Beyond Meat, which produces a variety of plant-based products, including imitations of ground beef, burgers, sausages, meatballs and jerky, has had a rough 12 months, with its stock dipping nearly 70%. Multiple chains that partnered with the company, including McDonald’s, have quietly ended trial launches. In August, the company laid off 4% of its workforce after a slowdown in sales growth. Last week, its chief operating officer was reportedly arrested for biting another man on the nose during a road rage confrontation. It’s a dramatic reversal of fortune. Just two years ago, Beyond Meat, its competitor Impossible Foods and the plant-based meat industry at large seemed poised to start a food revolution.

For a time, Wall Street went vegetarian. In 2019 Beyond Meat was valued at over $10 billion, more than Macy’s or Xerox. The most bullish investors believed that plant-based meat would make up 15% of all meat sales by 2030. But the reality of Americans’ interest in plant-based meat has proven more complicated than investors thought, and the adoption of meat alternatives has been slower than what was once hoped. Today Beyond Meat is valued at just over $900 million. The sobering story is similar to those experienced by many new ventures that see exhilarating hype after a flood of Silicon Valley venture capital cash, fueled by excitement about innovation. Bill Gates backed Beyond Meat, and a number of venture capital firms that typically invest in tech startups funneled money to startups making plant-based meat. Even the meat industry’s biggest players have, ironically, invested in companies coming up with plant-based meat. While eating plant-based meat (or no meat at all) has been shown to be the most effective thing individual consumers can do to fight climate change, “consumers seem hesitant to adapt their behavior when the environment — not their health or wallets — is the sole beneficiary,” reports The Guardian. “Despite the increasing alarm over climate change, the number of Americans who are vegetarian or vegan has remained relatively stable over the last 20 years.”

“Even when participants in a study conducted at Purdue University in Indiana were given information about the carbon footprint of meat production, participants were more likely to go with regular meat over a plant-based alternative.”

Read more of this story at Slashdot.

Source: Slashdot – Is Plant-Based Meat Fizzling In the US?

AV1 Update Reduces CPU Encoding Times By Up To 34 Percent

According to Phoronix, Google has released a new AOM-AV1 update, version 3.5, which drastically improves encode times when streaming, rendering, or recording on the CPU. At its best, the update can improve encoding times by up to 34%. Tom’s Hardware reports: It is a fantastic addition to AV1’s capabilities, with the encoder becoming very popular among powerful video platforms such as YouTube. In addition, we are also seeing significant support for AV1 hardware acceleration on modern discrete GPUs now, such as Intel’s Arc Alchemist GPUs and, most importantly – Nvidia’s RTX 40-series GPUs. Depending on the resolution, encoding times with the new update have improved by 20% to 30%. For example, at 1080p, encode times featuring 16 threads of processing are reduced by 18% to 34%. At 4K, render times improved by 18% to 20% with 32 threads. Google accomplished this by adding frame-parallel encoding to heavily multi-threaded configurations. Google has also added several other improvements contributing to AV1’s performance uplifts in other areas – specifically in real-time encoding.

In other words, CPU utilization in programs such as OBS has been reduced, primarily on systems with 16 CPU threads. As a result, users can devote those CPU resources to other tasks or push video quality even higher without any additional performance cost. If you are editing video and rendering it out in AV1, processing times will be vastly reduced if you have a CPU with 16 threads or more.
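As a rough sketch of what those percentages mean in practice, here is how an 18% to 34% reduction in wall-clock encode time translates into finished renders and effective throughput; the 60-minute baseline is a made-up example, not a benchmark from the report:

```python
def reduced_time(minutes: float, pct: float) -> float:
    """Wall-clock encode time after a given percentage reduction."""
    return minutes * (1 - pct / 100)

def speedup(pct: float) -> float:
    """Equivalent throughput multiplier for a percentage time reduction."""
    return 1 / (1 - pct / 100)

# Hypothetical 60-minute 1080p encode on a 16-thread CPU:
print(round(reduced_time(60, 34), 1))  # 39.6 minutes at the best-case 34%
print(round(speedup(34), 2))           # 1.52x effective throughput
```

Note that a 34% time reduction is worth roughly a 1.5x throughput gain, which is why the update matters for real-time use cases like streaming.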

Read more of this story at Slashdot.

Source: Slashdot – AV1 Update Reduces CPU Encoding Times By Up To 34 Percent

Controversial Artist Matches Influencer Photos With Surveillance Footage

An anonymous reader quotes a report from Smithsonian Magazine: It’s an increasingly common sight on vacation, particularly in tourist destinations: An influencer sets up in front of a popular local landmark, sometimes even using props (coffee, beer, pets) or changing outfits, as a photographer or self-timed camera snaps away. Others are milling around, sometimes watching. But often, unbeknownst to everyone involved, another device is also recording the scene: a surveillance camera. Belgian artist Dries Depoorter is exploring this dynamic in his controversial new online exhibit, The Followers, which he unveiled last week. The art project places static Instagram images side-by-side with video from surveillance cameras, which recorded footage of the photoshoot in question.

To make The Followers, Depoorter started with EarthCam, a network of publicly accessible webcams around the world, to record a month’s worth of footage in tourist attractions like New York City’s Times Square and Dublin’s Temple Bar Pub. Then he enlisted an artificial intelligence (A.I.) bot, which scraped public Instagram photos taken in those locations, and facial-recognition software, which paired the Instagram images with the real-time surveillance footage. Depoorter calls himself a “surveillance artist,” and this isn’t his first project using open-source webcam footage or A.I. Last year, for a project called The Flemish Scrollers, he paired livestream video of Belgian government proceedings with an A.I. bot he built to determine how often lawmakers were scrolling on their phones during official meetings. “On its face, The Followers is an attempt, like many other studies, art projects and documentaries in recent years, to expose the staged, often unattainable ideals shown in many Instagram and influencer photos posted online,” writes Smithsonian’s Molly Enking. “But The Followers also tells a darker story: one of increasingly worrisome privacy concerns amid an ever-growing network of surveillance technology in public spaces. And the project, as well as the techniques used to create it, has sparked both ethical and legal controversy.”

Depoorter told Vice’s Samantha Cole that he got the idea when he “watched an open camera and someone was taking pictures for like 30 minutes.” He wondered if he’d be able to find that person on Instagram.

Read more of this story at Slashdot.

Source: Slashdot – Controversial Artist Matches Influencer Photos With Surveillance Footage

Crypto-Mixing Service Tornado Cash Code Is Back On GitHub

Code repositories for the Ethereum-based mixer Tornado Cash were relisted on GitHub on Thursday. CoinDesk reports: The U.S. Treasury Department’s Office of Foreign Assets Control (OFAC) banned Americans last month from using Tornado Cash, a decentralized privacy service that mixes cryptocurrencies together to obfuscate the original address. The mixer was blacklisted and designated under the Specially Designated Nationals list because the North Korean hacking group Lazarus had used it in the past.

GitHub is a centralized internet hosting service for software development often used by Ethereum developers. Within hours of the OFAC announcement, GitHub, along with other platforms, removed Tornado Cash from their sites in order to comply with the new U.S. regulation. Ethereum developers — believing that computer code is protected speech under the First Amendment of the U.S. Constitution — have called for platforms that host the Tornado Cash code to reverse their bans. In particular, Ethereum core developer Preston Van Loon asked for GitHub to relist the mixer’s code on Sept. 13. Further reading: Treasury Says Sanctions On Tornado Cash Don’t Stop People From Sharing Code

Read more of this story at Slashdot.

Source: Slashdot – Crypto-Mixing Service Tornado Cash Code Is Back On GitHub

Ask Slashdot: What High-End Smartphone Is Best For Privacy?

New submitter cj9er writes: Considering all the privacy issues in today’s online climate (all the issues with Meta right now), what is the best high-end smartphone to select? Apple: No way they don’t sell your data… Sure, they have privacy for third-party apps, but what about the data they collect from the phone itself? Consider the profit on a single smartphone (say $150); how do you think they have all that cash on hand?

Google: Yeah right, Pixel is probably collecting [data] 24/7 considering their main business is selling ads on Search. They have developed the Pixel line because they probably realized they were missing out on the direct collection of data from their own hardware (cut out the middle players using Android).
Samsung: Their TVs even collect and sell data on you. I don’t really understand the price premium on Galaxy phones anyways. I have kept my data and Wi-Fi turned off on my phones for years. Initially it was for battery reasons but now add in data collection. Ultimately, if we could turn off the GPS feature at will on our phones, maybe we could prevent all tracking (except for cellular triangulation). If we then think about safety, GPS is great and now with satellite-tracking on Apple phones, even better. But then what is going on behind the scenes 99.99% of the rest of the time when you don’t require those options for safety reasons? What phone manufacturer can be trusted?

Read more of this story at Slashdot.

Source: Slashdot – Ask Slashdot: What High-End Smartphone Is Best For Privacy?

Senators Introduce a Bill To Protect Open-Source Software

An anonymous reader quotes a report from the Washington Post: When researchers discovered a vulnerability in the ubiquitous open-source log4j system last year that could’ve affected hundreds of millions of devices, the executive branch snapped into action and major tech companies huddled with the White House. Now, leaders of the Senate Homeland Security and Governmental Affairs Committee are introducing legislation to help secure open-source software, first reported by The Cybersecurity 202. Chairman Gary Peters (D-Mich.) and top ranking Republican Rob Portman (Ohio) plan to hold a vote next week on the bill they’re co-sponsoring.

The Peters/Portman legislation would direct the Cybersecurity and Infrastructure Security Agency to develop a way to evaluate and reduce risk in systems that rely on open-source software. Later, CISA would study how that framework could apply to critical infrastructure. The log4j “incident presented a serious threat to federal systems and critical infrastructure companies — including banks, hospitals, and utilities — that Americans rely on each and every day for essential services,” Peters said in a written statement. “This common-sense, bipartisan legislation will help secure open source software and further fortify our cybersecurity defenses against cybercriminals and foreign adversaries who launch incessant attacks on networks across the nation.” Here’s how the Peters-Portman legislation works, as outlined in the report:
– It directs CISA to hire open-source experts “to the greatest extent practicable.”
– It gives the agency a year to publish a framework on open-source code risk. A year later and periodically thereafter, CISA would perform an assessment of open-source code components that federal agencies commonly use.
– Also, two years after publishing the initial framework, CISA would have to study whether it could be used in critical infrastructure outside the government and potentially work with one or more critical infrastructure sectors to voluntarily test the idea.
– Other agencies would have roles as well, such as the Office of Management and Budget publishing guidance to federal chief information officers on secure use of open-source software.

Read more of this story at Slashdot.

Source: Slashdot – Senators Introduce a Bill To Protect Open-Source Software

Microsoft Edge Found Serving Malicious Tech Support Scam Ads

AmiMoJo shares a report from Neowin: Anti-malware solutions maker Malwarebytes has recently uncovered a campaign which is serving tech support scams via malicious ads in Microsoft Edge’s ‘My Feed’ section. They provided an image that shows a screenshot of a malvertising campaign where a fake browser-locker page is displayed to dupe potential victims. The adware is smart in the way it operates: Malwarebytes found that the malicious ad banner redirects only potential targets to the tech support scam page, while bots, VPN users, and visitors from non-targeted geolocations are shown the actual ad page powered by the Taboola ad network. The firm notes that the differentiation is made with the help of a base64-encoded JavaScript string.

In the span of just 24 hours, Malwarebytes managed to collect over 200 different hostnames. Somewhat unsurprisingly perhaps, one of the associated domains is linked to an individual who appears to be the director of a software company operating in Delhi, India. You can find more details about this malvertising campaign on Malwarebytes’ blog post about the topic.
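As a hedged illustration of the cloaking technique described above (not the campaign's actual code), base64 encoding simply hides a JavaScript payload from casual inspection of the ad markup, and recovering it is a single decode step for an analyst or scanner; the URL below is a placeholder:

```python
import base64

# Hypothetical cloaking payload: the scam redirect is obscured from
# casual inspection by base64-encoding it inside the ad's JavaScript.
payload = "window.location='https://example.com/fake-support'"
encoded = base64.b64encode(payload.encode()).decode()

# An analyst (or a security scanner) recovers it with one decode step:
decoded = base64.b64decode(encoded).decode()
print(decoded == payload)  # True
```

Because the encoded string looks like opaque noise, it can slip past naive keyword-based ad review, which is part of why this trick is common in malvertising.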

Read more of this story at Slashdot.

Source: Slashdot – Microsoft Edge Found Serving Malicious Tech Support Scam Ads

Coinbase Sued For Patent Infringement Over Crypto Transfer Technology

Coinbase is being sued by Veritaseum Capital LLC, which alleges that the crypto exchange has infringed on a patent awarded to Veritaseum founder Reggie Middleton. CoinDesk reports: According to Veritaseum, Coinbase has used the patent for some of its blockchain infrastructure, and the company is seeking at least $350 million in damages. Middleton and Veritaseum in 2019 settled a case with the U.S. Securities and Exchange Commission (SEC), paying nearly $9.5 million over charges surrounding the initial coin offering (ICO) for the company’s VERI token. “Veritaseum’s website says it ‘builds blockchain-based, peer-to-peer capital markets as software on a global scale,'” adds Reuters, which first reported the lawsuit. “Thursday’s lawsuit accuses Coinbase features including its website, mobile app and Coinbase Cloud, Pay, and Wallet services of infringing a patent covering a secure method for processing digital-currency transactions.”

“Veritaseum Capital’s attorney Carl Brundidge of Brundidge Stanger said Friday that Coinbase was ‘uncooperative’ when they tried to settle out of court.”

Read more of this story at Slashdot.

Source: Slashdot – Coinbase Sued For Patent Infringement Over Crypto Transfer Technology

Hunga Tonga Eruption Put Over 50 Billion Kilograms of Water Into Stratosphere

An anonymous reader quotes a report from Ars Technica: In January this year, an undersea volcano in Tonga produced a massive eruption, the largest so far this century. The mixing of hot volcanic material and cool ocean water created an explosion that sent an atmospheric shockwave across the planet and triggered a tsunami that devastated local communities and reached as far as Japan. The only part of the crater’s rim that extended above water was reduced in size and separated into two islands. A plume of material was blasted straight through the stratosphere and into the mesosphere, over 50 km above the Earth’s surface. We’ve taken a good look at a number of past volcanic eruptions and studied how they influence the climate. But those eruptions (most notably that of Mount Pinatubo) all came from volcanoes on land. Hunga Tonga may be the largest eruption we’ve ever documented that took place under water, and the eruption plume contained unusual amounts of water vapor — so much of it that it actually got in the way of satellite observations at some wavelengths. Now, researchers have used weather balloon data to reconstruct the plume and follow its progress during two circuits around the globe.

Your vocabulary word of the day is radiosonde, which is a small instrument package and transmitter that can be carried into the atmosphere by a weather balloon. There are networks of sites where radiosondes are launched as part of weather forecasting services; the most relevant ones for Hunga Tonga are in Fiji and Eastern Australia. A balloon from Fiji was the first to take instruments into the eruption plume, doing so less than 24 hours after Hunga Tonga exploded. That radiosonde saw increasing levels of water as it climbed through the stratosphere from 19 to 28 kilometers of altitude. The water levels had reached the highest yet measured at the top of that range when the balloon burst, bringing an end to the measurements. But shortly after, the plume started showing up along the east coast of Australia, which again registered very high levels of water vapor. Again, water reached to 28 km in altitude but gradually settled to lower heights over the next 24 hours.

The striking thing was how much of it there was. Compared to normal background levels of stratospheric water vapor, these radiosondes were registering 580 times as much water even two days after the eruption, after the plume had some time to spread out. There was so much there that it still stood out as the plume drifted over South America. The researchers were able to track it for a total of six weeks, following it as it spread out while circling the Earth twice. Using some of these readings, the researchers estimated the total volume of the water vapor plume and then used the levels of water present to come up with a total amount of water put into the stratosphere by the eruption. They came up with 50 billion kilograms. And that’s a low estimate, because, as mentioned above, there was still water above the altitudes where some of the measurements stopped. The recent findings appear in a new study published in the journal Science.

Read more of this story at Slashdot.

Source: Slashdot – Hunga Tonga Eruption Put Over 50 Billion Kilograms of Water Into Stratosphere

CIA Launches First Podcast, 'The Langley Files'

The Central Intelligence Agency (CIA) is launching a podcast called “The Langley Files.” As the agency explains, “The mission of ‘The Langley Files: A CIA Podcast’ is to educate and connect with the general public, sharing insight into the Agency’s core mission, capabilities and agility as an intelligence leader… and to share some interesting stories along the way!” Variety reports: The podcast features suspenseful intro music and a narrator explaining that CIA will be “sharing what we can” with stories that go “beyond those of Hollywood scripts and shadowed whispers.” CIA Director Bill Burns is the featured guest on Episode 1 of “The Langley Files.” “We do usually operate in the shadows, out of sight and out of mind,” Burns said in the premiere. However, he continued, “in our democracy, where trust in institutions is in such short supply… it’s important to try to explain ourselves the best we can and to demystify a little bit of what we do.”

According to Burns, one of the biggest misconceptions people have about the CIA stems from Hollywood’s depictions of intelligence field agents. Many people think CIA is a “glamorous world” of “heroic individuals who drive fast cars and defuse bombs and solve world crises all on their own” — a la Jason Bourne, James Bond and Jack Ryan. (Bond is a British spy, but you get the drift.) On the podcast, Burns shared that he drives a 2013 Subaru Outback “at posted speed limits.” […] The CIA says each episode of the podcast will be about 15-30 minutes long and will “feature our hosts leading conversations with a range of special guests.” The series is distributed on major audio platforms including Apple Podcasts, Spotify, Google Podcasts and Amazon Music. “From all of us here at CIA — we’ll be seeing you,” said one of the hosts before signing off the inaugural episode.

Read more of this story at Slashdot.

Source: Slashdot – CIA Launches First Podcast, ‘The Langley Files’

Alien-Hunting Astronomer Says There May Be a Second Interstellar Object On Earth In New Study

A pair of researchers who previously identified what may be the first known interstellar meteor to impact Earth have now presented evidence of a second object that could have originated beyond the solar system, before it burned up in our planet’s skies and potentially fell to the surface, according to a new study. Motherboard reports: Amir Siraj, a student in astrophysics at Harvard University, and astronomer Avi Loeb, who serves as Harvard’s Frank B. Baird Jr. Professor of Science, suggest that a fast-moving meteor that burst into a fireball hundreds of miles off the coast of Portugal on March 9, 2017, is an “additional interstellar object candidate” that they call interstellar meteor 2 (IM2) in a study posted to the preprint server arXiv this week. The paper has not been peer-reviewed. In addition to their potential origin beyond the solar system, these objects appear to be extraordinarily robust, as they rank as the first- and third-highest meteors in material strength in a NASA catalog that has collected data about hundreds of fireballs.

“We don’t have a large enough sample to say how much stronger interstellar objects are than solar system objects, but we can say that they are stronger,” Siraj said in an email. “The odds of randomly drawing two objects in the top 3 out of 273 is 1 in 10 thousand. And when we look at the specific numbers relative to the distribution of objects, we find that the Gaussian odds are more like 1 in a million.” This makes IM2 “an outlier in material strength,” Loeb added in a follow-up call with Siraj. “To us, it means that the source is different from planetary systems like the solar system.”
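Siraj's "1 in 10 thousand" figure can be sanity-checked with basic combinatorics. The sketch below assumes a uniform random draw of two objects from the 273-fireball catalog:

```python
from math import comb

# Probability that two objects drawn at random from a 273-fireball
# catalog both fall in the top 3 by material strength:
p = comb(3, 2) / comb(273, 2)
print(round(1 / p))  # 12376, i.e. roughly "1 in 10 thousand"
```

The exact value is about 1 in 12,400, consistent with the order-of-magnitude claim quoted above.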

Loeb has attracted widespread attention in recent years over his speculation that the first interstellar object ever identified, known as ‘Oumuamua, was an artifact of alien technology. Spotted in 2017, ‘Oumuamua sped through the solar system and was up to a quarter-mile in scale, making it much larger than the interstellar meteor candidates identified by Siraj and Loeb, which are a few feet across. Loeb’s claims of an artificial origin for ‘Oumuamua have provoked substantial pushback from many scientists who do not consider a technological explanation to be likely. Loeb also thinks these interstellar meteor candidates could be alien artifacts, though he and Siraj present a mind-boggling natural explanation for the strangely robust objects in the study: The meteors may be a kind of interstellar shrapnel produced by the explosions of large stars, called supernovae. […] Loeb, of course, is keeping his mind open. “We don’t say, necessarily, that it is artificial,” Loeb said in the call, referring to the supernovae explanation. But, he added, “obviously, there is a possibility that a spacecraft was designed to sustain such harsh conditions as passing through the Earth’s atmosphere, so we should allow for that.”

Read more of this story at Slashdot.

Source: Slashdot – Alien-Hunting Astronomer Says There May Be a Second Interstellar Object On Earth In New Study

Vultures Prevent Tens of Millions of Metric Tons of Carbon Emissions Each Year

An anonymous reader quotes a report from Scientific American: Vultures are hard birds for humans to love. They are obligate scavengers, meaning they get all their food from already dead prey — and that association has cast them as a harbinger of death since ancient times. But in reality, vultures are nature’s flying sanitation crew. And new research adds to that positive picture by detailing these birds’ role in a surprising process: mitigating greenhouse gas emissions. With their impressive vision and the range they can cover in their long, soaring flights, the 22 species of vultures found around the world are often the first scavengers to discover and feed on a carcass. This cleanup provides a vital service to both ecosystems and humans: it keeps nutrients cycling and controls pathogens that could otherwise spread from dead animals to living ones.

Decaying animal bodies release greenhouse gases, including carbon dioxide and methane. But most of these emissions can be prevented if vultures get to the remains first, a new study in Ecosystem Services shows. It calculates that an individual vulture eats between 0.2 and one kilogram (kg) of carcass per day, depending on the vulture species. Left uneaten, each kg of naturally decomposing carcass emits about 0.86 kg of CO2 equivalent. This estimate assumes that carcasses not eaten by vultures are left to decay naturally. But many carcasses are composted or buried by humans, methods that produce more emissions than natural decay, so vulture consumption averts even more emissions when it replaces them. The avoided emissions may not sound like much, but multiply those estimates by the estimated 134 million to 140 million vultures around the world, and the number becomes more impressive: tens of millions of metric tons of emissions avoided per year.
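The study's headline number can be reproduced with back-of-the-envelope arithmetic; the sketch below assumes midpoint values for the ranges quoted above:

```python
# Back-of-the-envelope check of the study's figures (midpoints assumed).
vultures = 137e6            # 134-140 million vultures worldwide
carcass_per_day_kg = 0.6    # 0.2-1 kg of carcass per bird per day
co2e_per_kg = 0.86          # kg CO2e emitted per kg of uneaten carcass

avoided_tonnes = vultures * carcass_per_day_kg * 365 * co2e_per_kg / 1000
print(f"{avoided_tonnes / 1e6:.0f} million metric tons CO2e per year")
```

With midpoint inputs this lands around 26 million metric tons per year, squarely in the "tens of millions" range the study reports.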

But this ecosystem service is not evenly distributed around the world. It occurs mostly in the Americas, says the study’s lead author Pablo Plaza, a biologist at the National University of Comahue in Argentina. Three species found only in the Americas — the Black, Turkey and Yellow-headed vultures — are responsible for 96 percent of all vulture-related emissions mitigation worldwide, Plaza and his colleagues found. Collectively, vultures in the Americas keep about 12 million metric tons of CO2 equivalent out of the atmosphere annually. Using estimates from the U.S. Environmental Protection Agency, that is akin to taking 2.6 million cars off the road each year. The situation outside of the Americas stands in stark contrast. “The decline in vulture populations in many regions of the world, such as Africa and Asia, has produced a concomitant loss of the ecosystem services vultures produce,” Plaza says.

Read more of this story at Slashdot.

Source: Slashdot – Vultures Prevent Tens of Millions of Metric Tons of Carbon Emissions Each Year

The World's Largest Carbon Removal Project Yet Is Headed For Wyoming

A couple of climate tech startups plan to suck a hell of a lot of carbon dioxide out of the air and trap it underground in Wyoming. The Verge reports: The goal of the new endeavor, called Project Bison, is to build a new facility capable of drawing down 5 million metric tons of carbon dioxide annually by 2030. The CO2 can then be stored deep within the Earth, keeping it out of the atmosphere, where it would have continued to heat up the planet. A Los Angeles-based company called CarbonCapture is building the facility, called a direct air capture (DAC) plant, that is expected to start operations as early as next year. It’ll start small and work up to 5 million metric tons a year. If all goes smoothly by 2030, the operation will be orders of magnitude larger than existing direct air capture projects.

CarbonCapture’s equipment is modular, which is what the company says makes the technology easy to scale up. The plant itself will be made of modules that look like stacks of shipping containers with vents that air passes through. At first, the modules used for Project Bison will be made at CarbonCapture’s headquarters in Los Angeles. In the first phase of the project, expected to be completed next year, around 25 modules will be deployed in Wyoming. Those modules will collectively have the capacity to remove about 12,000 tons of CO2 a year from the air. The plan is to deploy more modules in Wyoming over time and potentially manufacture the modules there one day, too.
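The phase-one figures imply how far the modular design has to scale. A quick sketch (assuming per-module capacity stays constant as the plant grows, which the article does not state):

```python
# Implied per-module capacity and module count for the 2030 target.
PHASE_ONE_MODULES = 25
PHASE_ONE_TONS_PER_YEAR = 12_000     # CO2 removed by the first deployment
TARGET_TONS_PER_YEAR = 5_000_000     # Project Bison's 2030 goal

tons_per_module = PHASE_ONE_TONS_PER_YEAR / PHASE_ONE_MODULES  # 480 t/yr
modules_needed = TARGET_TONS_PER_YEAR / tons_per_module

print(f"{tons_per_module:.0f} t/yr per module; "
      f"~{modules_needed:,.0f} modules to hit the 2030 target")
```

At roughly 480 tons per module per year, the full build-out would need on the order of 10,000 modules, which is why the company leans on modularity as its scaling story.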

Inside each of the 40-foot modules are about 16 “reactors” with “sorbent cartridges” that essentially act as filters that attract CO2. The filters capture about 75 percent of the CO2 from the air that passes over them. Within about 30 to 40 minutes, the filters have absorbed all the CO2 they can. Once the filters are fully saturated, the reactor goes offline so that the filters can be heated up to separate out the CO2. There are many reactors within one module, each running at its own pace so that they’re constantly collecting CO2. Together, they generate concentrated streams of CO2 that can then be compressed and sent straight to underground wells for storage. DAC is still very expensive: it can cost upwards of $600 to capture a ton of carbon dioxide. That figure is expected to come down with time as the technology advances. But for now, it takes a lot of energy to run DAC plants, which contributes to the big price tag. The filters need to reach around 85 degrees Celsius (185 degrees Fahrenheit) for a few minutes, and getting to those kinds of high temperatures can get pretty energy-intensive. Eventually, […] Bison plans to get enough power from new wind and solar installations. When the project is running at its full capacity in 2030, it’s expected to use the equivalent of about 2 GW of solar energy per year. For comparison, about 3 million photovoltaic panels together generate a gigawatt of solar energy, according to the Department of Energy. But initially, the energy used by Project Bison might have to come from natural gas, according to CarbonCapture CEO Adrian Corless. So Bison would first need to capture enough CO2 to cancel out the emissions it generates by burning that gas before it can go on to reduce the amount of CO2 in the atmosphere. “The geology in Wyoming allows Project Bison to store the captured CO2 on-site near the modules,” adds The Verge. “Project Bison plans to permanently store the CO2 it captures underground. Specifically, project leaders are looking at stowing it 12,000 feet underground in ‘saline aquifers’ — areas of rock that are saturated with salt water.”
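The per-reactor throughput can also be roughed out from the numbers above. This sketch treats the 30-to-40-minute saturation window as the entire cycle time and ignores the offline heating and regeneration phase, so real per-cycle figures would be somewhat different:

```python
# Rough per-reactor throughput implied by the article's phase-one numbers.
TONS_PER_MODULE_PER_YEAR = 12_000 / 25  # from the 25-module first phase
REACTORS_PER_MODULE = 16
AVG_CYCLE_MINUTES = 35                  # midpoint of the 30-40 minute range

tons_per_reactor_year = TONS_PER_MODULE_PER_YEAR / REACTORS_PER_MODULE  # 30 t/yr
kg_per_reactor_day = tons_per_reactor_year * 1000 / 365                 # ~82 kg
cycles_per_day = 24 * 60 / AVG_CYCLE_MINUTES                            # ~41
kg_per_cycle = kg_per_reactor_day / cycles_per_day

print(f"~{kg_per_cycle:.1f} kg of CO2 per reactor cycle")
```

Each saturation cycle works out to only a couple of kilograms of CO2 per reactor, which illustrates why reaching millions of tons a year takes thousands of modules running continuously.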

Read more of this story at Slashdot.

Source: Slashdot – The World’s Largest Carbon Removal Project Yet Is Headed For Wyoming

Accused Russian RSOCKS Botmaster Arrested, Requests Extradition To US

A 36-year-old Russian man recently identified by KrebsOnSecurity as the likely proprietor of the massive RSOCKS botnet has been arrested in Bulgaria at the request of U.S. authorities. At a court hearing in Bulgaria this month, the accused hacker requested and was granted extradition to the United States, reportedly telling the judge, “America is looking for me because I have enormous information and they need it.” From the report: On June 22, KrebsOnSecurity published Meet the Administrators of the RSOCKS Proxy Botnet, which identified Denis Kloster, a.k.a. Denis Emelyantsev, as the apparent owner of RSOCKS, a collection of millions of hacked devices that were sold as “proxies” to cybercriminals looking for ways to route their malicious traffic through someone else’s computer. A native of Omsk, Russia, Kloster came into focus after KrebsOnSecurity followed clues from the RSOCKS botnet master’s identity on the cybercrime forums to Kloster’s personal blog, which featured musings on the challenges of running a company that sells “security and anonymity services to customers around the world.” Kloster’s blog even included a group photo of RSOCKS employees.

The Bulgarian news outlet reports that Kloster was arrested in June at a co-working space in the southwestern ski resort town of Bansko, and that the accused asked to be handed over to the American authorities. “I have hired a lawyer there and I want you to send me as quickly as possible to clear these baseless charges,” Kloster reportedly told the Bulgarian court this week. “I am not a criminal and I will prove it in an American court.” 24Chasa said the defendant’s surname is Emelyantsev and that he only recently adopted the last name Kloster, which is his mother’s maiden name. As KrebsOnSecurity reported in June, Kloster also appears to be a major player in the Russian email spam industry. […] Kloster turned 36 while awaiting his extradition hearing, and may soon be facing charges that carry punishments of up to 20 years in prison.

Read more of this story at Slashdot.

Source: Slashdot – Accused Russian RSOCKS Botmaster Arrested, Requests Extradition To US

Compute North Files For Bankruptcy As Cryptomining Data Center Owes Up To $500 Million

Compute North, one of the largest operators of crypto-mining data centers, filed for bankruptcy and revealed that its CEO stepped down as the rout in cryptocurrency prices weighs on the industry. CoinDesk reports: The company filed for Chapter 11 in the U.S. Bankruptcy Court for the Southern District of Texas and owed as much as $500 million to at least 200 creditors, according to a filing. Compute North in February announced a capital raise of $385 million, consisting of an $85 million Series C equity round and $300 million in debt financing. But it fell into bankruptcy as miners struggle to survive amid slumping bitcoin (BTC) prices, rising power costs and record difficulty in mining bitcoin. The filing is likely to have negative implications for the industry. Compute North is one of the largest data center providers for miners, and has multiple deals with other larger mining companies.

“The Company has initiated voluntary Chapter 11 proceedings to provide the company with the opportunity to stabilize its business and implement a comprehensive restructuring process that will enable us to continue servicing our customers and partners and make the necessary investments to achieve our strategic objectives,” a spokesperson told CoinDesk in an emailed statement. CEO Dave Perrill stepped down earlier this month but will continue to serve on the board, the spokesperson added. Drake Harvey, who has been chief operating officer for the last year, has taken the role of president at Compute North, the spokesperson said. Compute North has four facilities in the U.S. — two in Texas and one in both South Dakota and Nebraska, according to its website.

Read more of this story at Slashdot.

Source: Slashdot – Compute North Files For Bankruptcy As Cryptomining Data Center Owes Up To $500 Million

Bosses Think Workers Do Less From Home, Says Microsoft

An anonymous reader quotes a report from the BBC: A major new survey from Microsoft shows that bosses and workers fundamentally disagree about productivity when working from home. Bosses worry about whether working from home is as productive as being in the office. While 87% of workers felt they worked as, or more, efficiently from home, 80% of managers disagreed. The survey questioned more than 20,000 staff across 11 countries. Microsoft chief executive Satya Nadella told the BBC this tension needed to be resolved as workplaces were unlikely to ever return to pre-pandemic work habits. “We have to get past what we describe as ‘productivity paranoia,’ because all of the data we have that shows that 80% plus of the individual people feel they’re very productive — except their management thinks that they’re not productive. That means there is a real disconnect in terms of the expectations and what they feel.”

Both Mr Nadella and Ryan Roslansky, the boss of Microsoft-owned LinkedIn, said employers were grappling with perhaps the biggest shift in working patterns in history. The number of fully-remote jobs advertised on LinkedIn soared during the pandemic, but Mr Roslansky said data suggested that type of role might have peaked. He told the BBC that of the 14 to 15 million job listings typically live on LinkedIn, about 2% involved remote working before the pandemic. Some months ago that figure stood at 20%, and it has since come down to 15% this month. At a time of acute labour shortages, employers are having to work harder to recruit, enthuse and retain staff. That even includes Microsoft itself, according to Mr Nadella. “We had 70,000 people who joined Microsoft during the pandemic, they sort of saw Microsoft through the lens of the pandemic. And now when we think about the next phase, you need to re-energize them, re-recruit them, help them form social connections.”

An unprecedented number of people have also changed jobs since the start of the pandemic, a phenomenon Microsoft has dubbed “the great reshuffle,” with workers born after 1997 (so-called Generation Z) nearly twice as likely to switch jobs. “At the peak of our ‘great reshuffle’ we saw a year-on-year increase of 50% of LinkedIn members changing jobs. Gen Z was at 90%,” the report said. By 2030, Generation Z will make up about 30% of the entire workforce, so managers need to understand them, according to LinkedIn’s boss.

Read more of this story at Slashdot.

Source: Slashdot – Bosses Think Workers Do Less From Home, Says Microsoft

Fitbit Accounts Are Being Replaced By Google Accounts

From next year, new Fitbit users will be required to sign up with a Google account, and it appears one will also be needed to access some new features in the years to come. Trusted Reviews reports: Google has been slowly integrating Fitbit into the fold since buying the company back in November 2019. Indeed, the latest products are now known as “Fitbit by Google.” As it currently stands, however, device owners have been able to maintain separate Google and Fitbit accounts. Google has now revealed it is bringing Google accounts to Fitbit in 2023, enabling a single login for both services. From that point on, all new sign-ups will go through Google. Fitbit accounts will only be supported until 2025; after that, a Google account will be the only way to go. To aid the transition, once the introduction of Google accounts begins, it’ll be possible to move existing devices over while maintaining all of the recorded data.

Read more of this story at Slashdot.

Source: Slashdot – Fitbit Accounts Are Being Replaced By Google Accounts