
⌥ Engineering Consent

By: Nick Heer
30 July 2024 at 02:02

Anthony Ha, of TechCrunch, interviewed Jean-Paul Schmetz, CEO of Ghostery, and I will draw your attention to this exchange:

AH I want to talk about both of those categories, Big Tech and regulation. You mentioned that with GDPR, there was a fork where there’s a little bit of a decrease in tracking, and then it went up again. Is that because companies realized they can just make people say yes and consent to tracking?

J-PS What happened is that in the U.S., it continued to grow, and in Europe, it went down massively. But then the companies started to get these consent layers done. And as they figured it out, the tracking went back up. Is there more tracking in the U.S. than there is in Europe? For sure.

AH So it had an impact, but it didn’t necessarily change the trajectory?

J-PS It had an impact, but it’s not sufficient. Because these consent layers are basically meant to trick you into saying yes. And then once you say yes, they never ask again, whereas if you say no, they keep asking. But luckily, if you say yes, and you have Ghostery installed, well, it doesn’t matter, because we block it anyway. And then Big Tech has a huge advantage because they always get consent, right? If you cannot search for something in Google unless you click on the blue button, you’re going to give them access to all of your data, and you will need to rely on people like us to be able to clean that up.

The TechCrunch headline summarizes this by saying “regulation won’t save us from ad trackers”, but I do not think that is a fair representation of this argument. What it sounds like, to me, is that regulations should be designed more effectively.

The E.U.’s ePrivacy Directive and GDPR have produced some results: tracking is somewhat less pervasive, people have a right to data access and portability, and businesses must give users a choice. That last requirement is, as Schmetz points out, also its flaw, and one it shares with something like App Tracking Transparency on iOS. Apps affected by the latter are not permitted to keep asking if tracking is denied, but they similarly rely on the assumption that a user can meaningfully consent to a cascading system of trackers.

In fact, the similarities and differences between cookie banner laws and App Tracking Transparency are considerable. Both require some form of consent mechanism immediately upon accessing a website or an app, assuming a user can provide that choice. Neither can promise tracking will not occur should a user deny the request. Both are interruptive.

But cookie consent laws typically offer users more information: many European websites, for example, enumerate all their third-party trackers, while App Tracking Transparency gives users no visibility into which trackers will be allowed. An App Tracking Transparency choice is remembered forever unless a user removes and reinstalls the app, while websites can ask for cookie consent on every visit. Those repeated prompts may sometimes be a consequence of using Safari; it is hard to know.

App Tracking Transparency also has a system-wide switch to opt out of all third-party tracking. There used to be something similar in web browsers, but compliance was entirely optional. Its successor effort, Global Privacy Control, is sadly not as widely supported as it ought to be, but it appears to have legal teeth.

Both of these systems have another important thing in common: neither is sufficiently protective of users’ privacy, because both burden individuals with the responsibility of assessing something they cannot reasonably comprehend. It is patently ridiculous to put the responsibility on individuals to mitigate a systemic problem like invasive tracking schemes.

There should be a next step to regulations like these, because user tracking is not limited to browsers, where Ghostery can help (if you know about it). A purely technological response is frustrating, and it is unclear to me how effective it is on its own. This is clearly not a problem regulation alone can solve, but neither can browser extensions. We need both.

Screen Time is Buggy

By: Nick Heer
5 June 2024 at 21:39

Joanna Stern, Wall Street Journal:

Porn, violent images, illicit drugs. I could see it all by typing a special string of characters into the Safari browser’s address bar. The parental controls I had set via Apple’s Screen Time? Useless.

Security researchers reported this particular software bug to Apple multiple times over the past three years with no luck. After I contacted Apple about the problem, the company said it would release a fix in the next software update. The bug is a bad one, allowing users to easily circumvent web restrictions, although it doesn’t appear to have been well-known or widely exploited.

It seems lots of parents are frustrated by Screen Time. It is not reliable software but, for privacy reasons, it is hard for third parties to differentiate themselves, as they rely on the same framework.

Stern:

  • Screen usage chart. Want to see your child’s screen usage for the day? The chart is often inaccurate or just blank.

I find this chart is always wildly disconnected from actual usage figures for my own devices. My iMac recently reported a week straight of 24-hour screen-on time per day, including through a weekend when I was out of town, because of a web browser tab I left open in the background.

One could reasonably argue nobody should depend entirely on software to govern how they or their children use devices, but I do not think many people realistically do; it is one part of a combination of factors. Screen Time should perform the baseline functions it promises. It sucks that common problems are basically ignored until Stern writes about them.

⌥ Permalink

Viewing and resetting the BIOS passwords on the RedmiBook 16

17 January 2021 at 23:00

I recently lost the BIOS password for my Xiaomi RedmiBook 16. Luckily, viewing and even resetting the password from inside a Linux session turned out to be incredibly easy.

As it turns out, both the user and the system ("supervisor") passwords are not hashed in any way and are stored as plaintext inside EFI variables. Viewing these EFI variables is trivial on a Linux system where efivarfs is enabled, even from a regular user account and with Secure Boot enabled:

$ uname -a
Linux book 5.10.7.a-1-hardened #1 SMP PREEMPT Tue, 12 Jan 2021 20:46:33 +0000 x86_64 GNU/Linux
$ whoami
xx
$ sudo dmesg | grep "Secure boot"
[    0.010717] Secure boot enabled

Reading the variables:

$ hexdump -C /sys/firmware/efi/efivars/SystemSupervisorPw*
00000000  07 00 00 00 0a 70 61 73 73 77 6f 72 64 31 32 20  |.....password12 |

$ hexdump -C /sys/firmware/efi/efivars/SystemUserPw*
00000000  07 00 00 00 0a 70 61 73 73 77 6f 72 64 31 31 21  |.....password11!|
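Given the layout above, recovering the password programmatically is a one-liner. This is only a sketch based on the dumps shown here: the first four bytes of every efivarfs file are the variable's attribute bits (here 0x00000007), and I am assuming the single 0x0a byte that follows is a vendor-specific prefix, so the payload starts at byte six.

```shell
# Sketch: strip the efivarfs header and print the stored password.
# Bytes 1-4: efivarfs attribute field (0x00000007 in the dumps above).
# Byte 5:    assumed vendor-specific prefix byte (0x0a).
# Byte 6+:   the plaintext password.
var=/sys/firmware/efi/efivars/SystemUserPw-*
tail -c +6 $var        # start output at byte 6, skipping the 5 header bytes
```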

If you have a root shell, removing the passwords entirely is also possible:

# chattr -i /sys/firmware/efi/efivars/SystemUserPw* /sys/firmware/efi/efivars/SystemSupervisorPw*

# rm /sys/firmware/efi/efivars/SystemUserPw* /sys/firmware/efi/efivars/SystemSupervisorPw*

Reboot, and the BIOS no longer asks for a password to enter setup, change secure boot settings, etc.
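The removal steps can also be wrapped in one small script. Again, a sketch under the assumptions above: the variable names are the ones observed on this RedmiBook, and other vendors will use different names (or none at all).

```shell
#!/bin/sh
# Sketch: remove both BIOS password variables in one go (run as root).
# Variable names are the ones observed on this RedmiBook; other vendors
# will differ. efivarfs marks variables immutable, so clear that first.
set -e
for v in /sys/firmware/efi/efivars/SystemUserPw-* \
         /sys/firmware/efi/efivars/SystemSupervisorPw-*; do
    [ -e "$v" ] || continue   # glob did not match: nothing to do
    chattr -i "$v"            # drop the immutable attribute
    rm "$v"                   # delete the variable
done
```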
