Safari Is More Private Than Ever in iOS 26

The internet isn’t private, but that doesn’t mean you have to willingly give up all of your information to use it. By default, trackers steal a lot of your data while you browse the web, but a few simple settings can block many of those trackers from functioning. As it turns out, updating to iOS 26 gives your iPhone an upgraded tool against tracking.

The feature, called Advanced Tracking and Fingerprinting Protection, is an option in Safari that hides certain browser and device data that trackers typically use to build a digital “fingerprint” of users. In this case, a fingerprint is a profile companies use to track your movements across the web. Your fingerprint may be made up of data like your device’s IP address, the device model, your browser, installed fonts, plug-ins, and screen resolution. All of these data points come together to build a unique picture of who you are; the more unique your fingerprint, the easier it is to track you, since the odds that someone else shares your exact same details are slim.
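To make the idea concrete, here is a minimal, hypothetical sketch of how a tracker might hash a handful of those signals into a single identifier. The field names and values are illustrative assumptions, not any real tracker’s code.

```python
# Minimal illustrative sketch: combine browser/device signals into one stable hash.
# All field names and values below are hypothetical examples.
import hashlib
import json

def fingerprint(signals: dict) -> str:
    """Hash a canonical (sorted) serialization of the collected signals."""
    canonical = json.dumps(signals, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

visitor = {
    "ip": "203.0.113.7",  # documentation-range example address
    "user_agent": "Mozilla/5.0 (iPhone; CPU iPhone OS like Mac OS X)",
    "screen": "1179x2556",
    "fonts": ["Helvetica Neue", "SF Pro"],
    "plugins": [],
}

# The rarer this exact combination of values, the more uniquely the hash identifies you.
print(fingerprint(visitor))
```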

Fingerprinting has a number of applications across the web, but perhaps the largest is targeted advertising. Many of the ads you see are so relevant to your interests and browsing history because of fingerprinting: Trackers know your internet habits, and can identify and deliver the advertisements you’re most likely to click or tap on. Companies can also sell these fingerprinting profiles to brokers and advertisers, making money on both your engagement with ads and your raw data.

Advanced Tracking and Fingerprinting Protection simplifies each Safari user’s data, so our fingerprints aren’t quite so unique. That makes it harder for trackers to identify individuals, and, as such, makes it possible to browse the web more privately.

This feature existed on iPhone before iOS 26, but by default it was limited to Private Browsing. If you didn’t do all your browsing in a private tab, you likely lost some of these privacy protections, unless you knew to manually change the setting to cover all browsing. The same goes for iPadOS and macOS. Most of us probably didn’t change the setting, since it’s buried fairly deep in each OS’s settings.

Once you update your iPhone, iPad, or Mac to the latest OS, however, your browsing gets these added tracking protections by default. Whether you browse in a private or a normal window, trackers will have a difficult time identifying your device.

How to check your Advanced Tracking and Fingerprinting Protection settings

This setting should be enabled by default when you update your device. But if you want to make sure it’s on, here’s where to find it. On your iPhone or iPad, open Settings, then head to Apps > Safari > Advanced. On your Mac, open Safari, press Command + , to pull up Safari’s settings, then choose the Advanced tab.

From here, ensure Advanced Tracking and Fingerprinting Protection is set to “All Browsing” (iOS/iPadOS) or “in all browsing” (macOS).

US Plans 1:1 Chip Production Rule To Curb Overseas Reliance

The U.S. is considering a rule requiring chipmakers to match, through domestic production, the volume of semiconductors that their customers import from overseas providers, or face tariffs. Reuters reports: President Donald Trump has doubled down on his efforts to reshore semiconductor manufacturing, offering exemptions from tariffs of roughly 100% on chips to firms that produce domestically. Companies that fail to sustain a 1:1 domestic-to-import ratio over time would face tariffs, the Wall Street Journal said.
U.S. Commerce Secretary Howard Lutnick floated the idea with semiconductor executives, telling them it might be necessary for economic security, the Journal said.

“America cannot be reliant on foreign imports for the semiconductor products that are essential for our national and economic security,” the newspaper quoted White House spokesperson Kush Desai as saying. Desai added that any reporting about policymaking should be treated as speculative unless officially announced. […] Under the proposal, a company pledging to make chips in the U.S. would receive credit for that pledged volume, allowing imports without tariffs until the plant is complete, with initial relief to help it ramp capacity, according to the report.



DJI loses lawsuit over Pentagon’s ‘Chinese military company’ list

It’s been nearly a year since DJI sued the Department of Defense over its designation as a “Chinese military company.” On Friday, a judge ruled against the drone maker. US District Judge Paul Friedman said the DoD presented enough evidence that DJI contributes to the Chinese military.

“Indeed, DJI acknowledges that its technology can and is used in military conflict but asserts that its policies prohibit such use,” Friedman wrote in his opinion. “Whether or not DJI’s policies prohibit military use is irrelevant. That does not change the fact that DJI’s technology has both substantial theoretical and actual military application.”

DJI challenged the designation in October 2024. It told the court it is “neither owned nor controlled by the Chinese military.” The company claimed in its filing that it suffered “ongoing financial and reputational harm” as a result of the inclusion. The designation can prevent companies from accessing grants, contracts, loans and other programs.

The drone maker has a contentious history with the US government. The Department of Commerce added it and 77 other companies to its Entity List in 2020, effectively blocking US businesses from dealing with them. A year later, the Treasury Department included DJI on its “Chinese military-industrial complex companies” list. That designation was for its alleged involvement in the surveillance of Uyghur Muslim people in China. Last year, US customs began holding up DJI’s consumer drones at the border.

The company now faces a potential import ban in the US by the end of this year. The ban was initially scheduled for 2024. But a clause in the $895 billion US Defense Bill gave it a year to prove that its products don’t pose a national security risk. In March, DJI pleaded with five national security agencies (DHS, DoD, FBI, NSA, and ODNI) to begin evaluating its products “right away.”

This article originally appeared on Engadget at https://www.engadget.com/big-tech/dji-loses-lawsuit-over-pentagons-chinese-military-company-list-204804617.html?src=rss

xAI Offers Grok To Federal Government For 42 Cents

xAI struck a deal with the U.S. General Services Administration to sell its chatbot Grok to federal agencies under the executive branch for 42 cents over 18 months, undercutting OpenAI and Anthropic’s $1 offerings. TechCrunch reports: The steep discount for federal agencies includes access to xAI engineers to help integrate the technology. The price point is either part of a running joke Musk has of using variations of 420, a marijuana reference, or a nod to one of Musk’s favorite books, “The Hitchhiker’s Guide to the Galaxy,” which references the number 42 as the answer to the meaning of life and the universe.

… In late August, internal emails obtained by Wired revealed the White House had instructed the GSA to add xAI’s Grok to the approved vendor list “ASAP.” The company was also one of several AI firms, including Anthropic, Google, and OpenAI, to be selected for a $200 million contract with the Pentagon. A GSA spokesperson told TechCrunch that Musk was not directly involved in negotiating the agreement.



Sinclair gets nothing it asked for, puts Jimmy Kimmel back on anyway

Conservative broadcaster Sinclair is putting Jimmy Kimmel Live! back on the air. In a statement today, Sinclair said it will end its preemption of the show on its ABC affiliates starting tonight, even though ABC and owner Disney haven’t accepted its request for an ombudsman and other changes. Facing the threat of lost advertising dollars, Sinclair said it “received thoughtful feedback from viewers, advertisers, and community leaders representing a wide range of perspectives.”

Sinclair also said its decision to preempt Kimmel “was independent of any government interaction or influence.” Sinclair’s preemption of Kimmel last week came just as Federal Communications Commission Chairman Brendan Carr said TV station owners that didn’t preempt the show could lose their FCC licenses.

Sinclair last week said it wouldn’t air Kimmel on its stations “until formal discussions are held with ABC regarding the network’s commitment to professionalism and accountability.” Sinclair at the time praised Carr for his stance against Kimmel and urged the FCC to “take immediate regulatory action to address control held over local broadcasters by the big national networks.”


You should care more about the stabilizers in your mechanical keyboard—here’s why

While most people don’t spend a lot of time thinking about the keys they tap all day, mechanical keyboard enthusiasts certainly do. As interest in DIY keyboards expands, there are plenty of things to obsess over, such as keycap sets, layout, knobs, and switches. But you have to get deep into the hobby before you realize there’s something more important than all that: the stabilizers.

Even if you have the fanciest switches and a monolithic aluminum case, bad stabilizers can make a keyboard feel and sound like garbage. Luckily, there’s a growing ecosystem of weirdly fancy stabilizers that can upgrade your typing experience, packing an impressive amount of innovation into a few tiny bits of plastic and metal.

What is a stabilizer, and why should you care?

Most keys on a keyboard are small enough that they go up and down evenly, no matter where you press. That’s not the case for longer keys: Space, Enter, Shift, Backspace, and, depending on the layout, a couple more on the number pad. These keys have wire assemblies underneath called stabilizers, which help them go up and down when the switch does.


SFMTA Scrambles To Shut Down Viral Parking Ticket Tracker

An anonymous reader quotes a report from SFGATE: It had all the makings of a viral X post, and viral it did go, with over 8 million views in under 24 hours. The message was straightforward: “I reverse engineered the San Francisco parking ticket system. I can see every ticket seconds after it’s written.” Underneath it was a familiar image for any iPhone user — an Apple map of the city dotted with gray, initialed bubbles, and an explanation: “So I made a website. Find My Friends?” No. “AVOID THE PARKING COPS.” The anarchy, however, was short-lived. […]

Given the potential lost revenue at stake, the San Francisco Municipal Transportation Agency caught on like the rest of the internet, and by Tuesday afternoon the site had been rendered obsolete. Undeterred, [the site’s creator, Riley Walz] restored it after 10 p.m., though this, too, didn’t last. By his estimation, it was only active for a few more hours. “We made sure that all access to citation data was via authorized routes,” said Erica Kato, a spokesperson for SFMTA, in an email to SFGATE. “But when our staff’s safety, and personal information of people who have received parking citations, is at risk, we must act on that swiftly.”

Yet the saga wasn’t over. By Wednesday, the official SFMTA ticket payment site was also down, citing “maintenance.” “I’m curious what was going on there,” said Walz over the phone. “If it is even because of me.” As of Wednesday afternoon, that site is functional and the chaos seems over for now. According to SFMTA, there is no need for a site like Walz’s. “The official way to access our parking citation data is via our public website on DataSF,” Kato said. “Anyone is still able to see [the] type of citation, date of issuance and data that can be mapped and analyzed on DataSF daily.”
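For context, DataSF datasets are served through the Socrata open data platform, which exposes a simple JSON API. Below is a rough, hypothetical sketch of pulling recent citation records; the dataset ID and field name are placeholders you would need to look up on DataSF, not the real values.

```python
# Hypothetical sketch of querying a DataSF (Socrata) dataset for citation records.
# "xxxx-xxxx" is a placeholder dataset ID and "citation_issued_datetime" a guessed
# field name; look up the actual values on data.sfgov.org before running.
import requests

BASE_URL = "https://data.sfgov.org/resource/xxxx-xxxx.json"

params = {
    "$limit": 100,                              # SODA paging parameter
    "$order": "citation_issued_datetime DESC",  # newest records first
}

rows = requests.get(BASE_URL, params=params, timeout=30).json()
for row in rows[:5]:
    print(row)
```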



Apple reportedly made a ChatGPT-clone to test Siri’s new capabilities

In the pursuit of actually releasing the updated version of Siri the company promised way back at WWDC 2024, Apple is taking a page out of OpenAI’s book. According to Bloomberg, the company has created a ChatGPT-inspired app to test Siri’s new capabilities ahead of the release of the improved voice assistant next year.

This new app, called “Veritas” internally, will likely never make its way to the public in its current form, but offers Apple employees a faster way to test Siri’s new skills. That includes letting users search through personal data stored on their phone, like their emails and messages, or taking action in apps, like editing photos. The new app is apparently also a way for Apple to “gather feedback on whether the chatbot format has value,” Bloomberg writes.

While an internal app doesn’t make it any clearer how useful Apple’s updated Siri will be, it does suggest the project is in a more advanced stage than before. Given the difficulty the company’s faced actually releasing its various AI products — including publicly delaying the Siri update back in March 2025 — that’s meaningful.

Apple’s original promise for Apple Intelligence was that it could offer a curated selection of AI-powered features with a level of privacy and polish that its competitors couldn’t muster. The reality is that Apple shipped a collection of so-so features that worked, but couldn’t pull off its truly impressive demo: a Siri informed on the context of your life and with the ability to actually do things on your phone.

Apple is only realizing that vision in 2026, Bloomberg reports, through a combination of its own AI models, and at least one third-party model from its competitors. In June, the company was reportedly considering using a model from either OpenAI or Anthropic, but as of August, the company is now apparently circling a partnership with Google.

This article originally appeared on Engadget at https://www.engadget.com/ai/apple-reportedly-made-a-chatgpt-clone-to-test-siris-new-capabilities-194902560.html?src=rss

Abu Dhabi Royal Family To Take Stake In TikTok US

Abu Dhabi’s MGX (chaired by Sheikh Tahnoon bin Zayed Al Nahyan) is set to take a 15% stake in TikTok’s U.S. business after Donald Trump signed an executive order Thursday night brokering a deal that puts the social media company under U.S. ownership. “Larry Ellison’s Oracle, the private equity group Silver Lake and Abu Dhabi’s MGX will control roughly 45% of TikTok US,” adds The Guardian. “Overall, American companies are expected to control just over 65% of the company, with Trump also naming the personal computer pioneer Michael Dell and Rupert Murdoch’s Fox as other investors.” From the report: “[TikTok US] will be majority-owned and controlled by United States persons and will no longer be controlled by any foreign adversary,” Trump said. “We have American investors taking it over, running it [who are] highly sophisticated, including Larry Ellison. Great investors, the biggest. They don’t get bigger. This is going to be American-operated all the way.”

TikTok’s Chinese owner, ByteDance, will retain a 19.9% stake in the US operation. China has not publicly made clear whether it will approve the deal, although Trump said that he “had a good talk” with the Chinese president, Xi Jinping, who “gave us the go-ahead.”

JD Vance, the US vice-president, said the deal valued TikTok US at $14 billion. “There was some resistance on the Chinese side,” Vance said. “But the fundamental thing that we wanted to accomplish is that we wanted to keep TikTok operating but we wanted to make sure that protected Americans’ data privacy as required by law.” He added: “This deal really does mean that Americans can use TikTok, but actually use it with more confidence than in the past. Because their data is going to be secure and it’s not going to be used as a propaganda weapon against our fellow citizens.”



The Social Network 2 is coming next fall and stars Jeremy Strong as Mark Zuckerberg

The long-awaited sequel to The Social Network will hit theaters next fall, according to a report by Deadline. The official release date is set for October 9, 2026, which is just about 16 years after the first film dropped.

We also have plenty of other information, including the full cast and the actual name of the movie. The official name is The Social Reckoning, which makes sense as the movie follows recent events in which Facebook got into legal and political trouble when a whistleblower alleged that the company knew the platform was harming society but did nothing about it.

The cast is being led by Jeremy Strong from Succession, who takes over Zuckerberg duties from actor Jesse Eisenberg. Mikey Madison is playing the aforementioned whistleblower, Frances Haugen, and The Bear’s Jeremy Allen White portrays Wall Street Journal reporter Jeff Horwitz.

Bill Burr is also appearing in this flick, though we don’t know in what capacity. The Hollywood Reporter has suggested he will play a fictional character invented for the film that will be an amalgamation of several people. Aaron Sorkin is both writing and directing this one. He wrote the first movie, but David Fincher directed it.

This article originally appeared on Engadget at https://www.engadget.com/entertainment/tv-movies/the-social-network-2-is-coming-next-fall-and-stars-jeremy-strong-as-mark-zuckerberg-191021848.html?src=rss

Electronic Arts Nears Roughly $50 Billion Deal To Go Private

According to the Wall Street Journal, the videogame giant Electronic Arts is nearing a $50 billion deal to go private. A group of investors, including private-equity firms Silver Lake and Saudi Arabia’s Public Investment Fund, may announce a deal for Electronic Arts as soon as next week. The report says it “would likely be the largest leveraged buyout of all time.”

Developing…



Chinese Hackers Breach US Software and Law Firms Amid Trade Fight

An anonymous reader quotes a report from CNN: A team of suspected Chinese hackers has infiltrated US software developers and law firms in a sophisticated campaign to collect intelligence that could help Beijing in its ongoing trade fight with Washington, cybersecurity firm Mandiant said Wednesday. The hackers have been rampant in recent weeks, hitting the cloud-computing firms that numerous American companies rely on to store key data, Mandiant, which is owned by Google, said. In a sign of how important China’s hacking army is in the race for tech supremacy, the hackers have also stolen US tech firms’ proprietary software and used it to find new vulnerabilities to burrow deeper into networks, according to Mandiant.

[…] In some cases, the hackers have lurked undetected in the US corporate networks for over a year, quietly collecting intelligence, Mandiant said. The disclosure comes after the Trump administration escalated America’s trade war with China this spring by slapping unprecedented tariffs on Chinese exports to the United States. The tit-for-tat tariffs set off a scramble in both governments to understand each other’s positions. Mandiant analysts said the fallout from the breaches — the task of kicking out the hackers and assessing the damage — could last many months. They described it as a milestone hack, comparable in severity and sophistication to Russia’s use of SolarWinds software to infiltrate US government agencies in 2020.



Microsoft’s fix for PC shader compilation stutter could take years to fully implement

Microsoft has a fix for long shader compilation wait times. The system is called Advanced Shader Delivery, and it’s being introduced first for ASUS ROG Xbox Ally handhelds and games listed on the Xbox app.

Just about every PC gamer knows the feeling of booting up a highly anticipated new AAA title, excited to explore its sprawling environments or open world, only to be hit with “compiling shaders” and a progress bar that seems to move at a snail’s pace. Depending on what specs you’re rocking and what game you’ve just installed, the wait could be as much as one to two hours for those with slower CPUs and older systems.

While it seems increasingly common that huge games are using these shader compilation screens before even getting to the main menu (looking at you, Hogwarts Legacy), games that choose not to use them still need to load and compile shaders. If they aren’t done ahead of time, then they must be done during gameplay, which can lead to in-game stuttering that many gamers are also familiar with.

Advanced Shader Delivery would preempt this by doing the entire compilation process ahead of time and storing those compiled shaders in the cloud. The catch is that shader compilation is hardware-specific, and since there are myriad GPU and driver combos, it would take a few dozen sets of compiled shaders to cover all the most common setups, and that’s per game. Extrapolate that out even just to all the AAA titles released yearly, and you’ve got yourself a massive database.

This is similar to how shader compilation works on consoles, but there you’re talking about at most two or three versions per console, or even fewer in the case of the Nintendo Switch. In fact, that’s precisely why Microsoft is starting with the ASUS ROG Xbox Ally handhelds, which comprise only two hardware configurations.
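To make the combinatorics concrete, here is a purely illustrative sketch (not the Agility SDK or Microsoft’s actual design) of why a cloud shader cache has to be keyed by the exact game, GPU, and driver combination; every name and version string below is a made-up placeholder.

```python
# Illustrative sketch only: a cloud cache of precompiled shaders keyed by the exact
# (game, GPU, driver) combination. All identifiers below are hypothetical placeholders.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class ShaderCacheKey:
    game_id: str   # store/package identifier for the game
    gpu: str       # GPU model the shaders were compiled for
    driver: str    # driver version they were compiled against

# One entry per combination -- dozens per game once you cover common PC setups.
cloud_cache = {
    ShaderCacheKey("example-aaa-title", "ExampleGPU 8GB", "driver-31.0.1"): "shaders_v1.bin",
}

def fetch_precompiled(key: ShaderCacheKey) -> Optional[str]:
    """Return a precompiled bundle for this exact combo, or None, in which case
    the client falls back to compiling shaders locally (the slow path)."""
    return cloud_cache.get(key)

# Same game and GPU, but a newer driver: the key changes and the cache misses.
print(fetch_precompiled(ShaderCacheKey("example-aaa-title", "ExampleGPU 8GB", "driver-31.0.2")))
```

A fixed-hardware device like the ROG Xbox Ally keeps that table tiny, which is presumably part of why it makes a convenient starting point.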

Microsoft’s Agility SDK for game developers now supports Advanced Shader Delivery, meaning devs could start building it into new games already. In practice, it can take years to fully capitalize on new technologies like this.

That’s exactly what we’ve seen with DirectStorage, another Microsoft technology meant to reduce asset load times. Three years after its release, we still see only a handful of big titles incorporating DirectStorage. It might be a long time before we see Advanced Shader Delivery incorporated into most popular games and available on different storefronts like Steam.

This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/microsofts-fix-for-pc-shader-compilation-stutter-could-take-years-to-fully-implement-183904449.html?src=rss

The Complicated Ethics (and Laws) of Smart Glasses

The nearly universal adoption of smartphones in the late 2000s changed more than how we waste time while waiting in lines. With nearly everyone carrying a high-quality camera and microphone in their pocket—and the ability to instantly broadcast anything to a potential audience of millions—our collective concept of privacy has been permanently altered. If you’re not a little concerned with how what you do in public would play on YouTube, you’re not paying attention. 

As smart glasses equipped with cameras and mics edge closer to mainstream adoption, we’re facing another, subtler shift. Unlike smartphones, where it’s obvious when someone is recording, smart glasses can capture video or audio nearly invisibly—raising fresh legal, ethical, and moral concerns. Here’s what you should be aware of, whether you’re currently rocking smart glasses or plan to in the future.

The legality of filming in public

What the general public thinks of as “privacy” may have shifted, but the law may not have kept pace. “Current laws do not provide the protection that most people would probably expect that they should,” says David B. Hoppe, an international transactional lawyer who specializes in emerging legal issues in media and technology.

Some statutes have been written to account for new technology—prohibitions on revenge porn, for instance—but the overarching legal framework concerning privacy was developed for a pre-smartphone, pre-smart glasses world. So let’s dig into it.

A primer on public photography

State and federal laws have criminalized some kinds of recordings in public, like shooting videos up people’s skirts, but in general, the First Amendment provides broad protection of people’s right to take photos and videos of whatever they can see. “In general, our presumption is that capturing photos, videos, or other data from public spaces is unrestricted,” says Eric Goldman, a professor at Santa Clara University School of Law and Co-Director of the High Tech Law Institute.

That presumption applies to smart glasses, so if you’re in a public space, you can usually record what you’d like. “As a general matter, the video function could be used in a public setting,” Hoppe says. 

How you use a recording matters, though. “An issue that could arise is whether or not there’s a commercial aspect to its use,” Hoppe says. “In many states there could be an obligation to have cleared the publicity rights from any individuals who are identifiable in the video.”

The meaning of “commercial,” though, can be tricky. Something like filming an advertisement would likely be considered commercial speech and have less legal protection, in terms of privacy, than something like making an art movie for your film class. Somewhere in the middle is earning money from a social media video. Monetizing doesn’t automatically remove legal free speech protection, but it could shift content toward commercial speech, and local filming laws could apply to what you shoot as well. It’s complicated, so if you have any doubts, talk to a lawyer.

Private businesses are a bit different, though

Courts have largely held that a patron in a private business that is open to the public, like a store or a restaurant, can expect more privacy than they have while on a public sidewalk, but less than they’d have if they were somewhere really private, like their home. “It gets into expectations of privacy,” explains Goldman. “A restaurant could be anywhere from family-seating, where that expectation would be unreasonable, to a private booth that has 50 feet in any direction from any other seat, which might be a more reasonable expectation of privacy.”

While a person can generally legally capture images in a business that’s open to the public, it’s within the owners’ rights to prohibit filming. “Normally businesses can set rules for how their customers engage with each other,” Goldman says. “The recourse would be banning you from their premises.”

So if you turn on your Ray-Ban Metas in the gym, you probably won’t be arrested, but the gym could/should have a “no photography” policy that it could enforce by banning you from the premises and calling the cops if you won’t leave. Of course, recording in private areas of any business, like the locker room of said gym, is illegal everywhere in the U.S.

Video vs. audio recording

Recording sound from a pair of smart glasses could expose you to legal risks that shooting video may not. While images taken in public of anything in plain view are generally legal, audio is a different story. Just like a conversation in a restaurant, the key factor is the “reasonable expectation of privacy.” Two people having a quiet conversation on a park bench likely expect a level of privacy that a guy shouting on a street corner does not.

Courts have largely agreed that recording conversations in public is protected by the Constitution, as long as everyone in the conversation knows they are being recorded and agrees to it. The opposite situation—a third party recording a private conversation without the participants’ knowledge—would often be considered “eavesdropping,” and that’s often a crime. 

It gets tricky when only one party consents to a recording. “In general, there are some states that have required that any recording of a conversation between two parties requires the consent of both parties,” Goldman says. “So if the glasses are being used in those conversations, without consent from the other party, that would be a violation in those states.”

Here’s a breakdown of one-party consent states and all-party consent states. If you have any doubts about the legality of a recording, consult with a lawyer, or just don’t hit record.

The other side of the coin: what about the users’ privacy?

Maybe you bought a pair of smart glasses to record your life, but make no mistake: you are the one being recorded. When you click “agree” on that terms of service screen, you could be allowing a big data company to collect your GPS data, biometric data (like eye movements and health information), contact lists, messages, political views, what you see, what you say, who you talk to, and more. And it’s legal because you agreed to it. Usually.

“Some [data collected by your smart glasses] is controlled by contract,” Goldman says. “So Meta would disclose its privacy policies in some disclosure to the consumer, and then those might be the rules that apply. There are some places where there may be limits on the ability of Meta to access that data.”

Bottom line: you have some protections over your personal data that aren’t necessarily signed away with a click. A patchwork of federal laws provides specific protections: HIPAA protects the privacy of your medical records, FCRA protects your credit reports, and other federal laws protect financial information and children’s privacy. But more meaningful consumer privacy protection comes from California state law. In the last 10 years, Cali has enacted relatively robust privacy protection laws that give Californians the right to know what personal data companies collect, the right to delete that data, and the right to opt out of their data being sold.

“But I live in Ohio,” you might be saying. First, sorry about that. Secondly, we have your back anyway! Big tech companies have largely adopted California’s privacy laws as their baseline for data collection. So while the amount of data being collected from your glasses isn’t ideal, at least you can claw some of it back.

Exciting new frontiers in privacy invasion

Consider what O.G. trip hop band Massive Attack did at a recent concert:

The band turned facial recognition technology on its audience, displaying audience members along with what appeared to be their professions. The ability to instantly identify a stranger and scrape publicly available databases on that person already exists in smart glasses technology, and it is, in theory, perfectly legal, even if the person being filmed doesn’t know you’re doing it. Again, though, how you use the information you collect might not be legal.

According to Hoppe, the laws in place just weren’t written with smart glasses in mind. “The basic standard, that comes from common law times, was that if you’re in a public place, you don’t have a reasonable expectation of privacy, but at that point in time—and up until the last two decades—being in a public place meant you could be observed, but that you would simply be a memory in a human mind somewhere. It wouldn’t be recorded in video format that could immediately be published to the entire world,” Hoppe said.

Where does the law go from here?

Right now, privacy laws in the U.S. are largely reactive and evolve after new technology has reshaped how we live. But what might it look like if we got ahead of the curve (or at least tried harder to catch up)? Like everything, it’s complicated.

Hoppe imagines one extreme: a “privacy maximalist” set of laws, where no one could be recorded without their consent, even in public. “That would make sense, right? But the challenge you then have is things like security cameras and other stationary devices that are simply recording everything. Is that really a privacy threat?” Hoppe says. “And if so, isn’t it outweighed by the beneficial effects to society as a whole, in terms of crime prevention and protection of property and so forth?”

And there’s that whole “freedom” thing. “The idea that there is a public sphere where we are free to capture and record and share our views about what we see, is an essential part of free speech,” Goldman says. “And if privacy laws were to overly restrict that, it would take away our ability not only to express ourselves and react to the world that we see, but it would have significant power implications on the ability of people to control conversations in a way that would ultimately take power away from us as people…We cannot let the concerns about people’s desire to control what people know about them override the ability of people to have organic, healthy, pro-social conversations.”

The social norms of smart glasses recording

If you’re living your life in a halfway ethical manner (and you’re not providing cultural commentary in concert form like Massive Attack), you probably aren’t keen to privately dox everyone on the bus, and social norms are probably more important to you than potential legal penalties. Maybe you won’t be hauled away in cuffs for recording people eating dinner on the outdoor patio of a restaurant, but you will be met with scorn from just about every diner—especially if you’re sticking a phone in their face. Smart glasses, being less obvious than iPhones, change the equation somewhat. The etiquette around their use is evolving, leaving us all in a gray area where what’s legal and what’s socially acceptable don’t always line up.

Even if they’re not encoded in law, we’ve (mostly) collectively agreed upon some norms when it comes to cell phones—don’t film others in the gym, don’t stick your phone in a stranger’s face, etc.—and we’re getting there with smart glasses, but until we arrive, it’s going to be a bit tricky. 

Smart glasses make recording less obtrusive and more natural-feeling, but they also make it easier to cross lines without realizing it. So it’s best to err on the side of courtesy: respect people in public, respect private spaces, and be thoughtful about what you’re recording—taking pictures of your meal and friends is cool; taking pictures of strangers is not. Getting it wrong probably won’t land you in jail, but being known as “that creep with the damn Meta glasses” might ultimately be a worse fate.

Just using open source software isn’t radical any more. Europe needs to dig deeper

Companies must realize they can be more than pure consumers, and public sector ought to go beyond ‘promotion’

Feature It is 2025. Linux will turn 34 and the Free Software Foundation (FSF) 40. For the EU and Europe at large, which is famously experimental with government deployments of open source tech, behind initiatives to promote open licensing, and whose governments promote equal opportunity for FOSS vendors in public tendering, it’s a crunch point.…