So I did some hunting in the spec (which I should've done when I first heard this concern) so that I can be specific. I turned up this from https://w3c.github.io/webauthn/#attestation-object:

> An important component of the attestation object is the attestation statement. This is a specific type of signed data object, containing statements about a public key credential itself and the authenticator that created it. It contains an attestation signature created using the key of the attesting authority (except for the case of self attestation, when it is created using the credential private key).

The concern was about how creatively parties with a market interest in providing authentication services (1Password, Okta, Apple, Google) can use this field in service of goals that the user doesn't share, such as preventing competition. It's already the case that if you don't have a phone number, you're in some ways a non-person, because you can't 2FA with many services that require it. The same dynamic could be used to guarantee that everybody has a relationship with one of a small handful of providers, so that those providers don't have to care whether we consent to whatever new requirements they dream up.

Maybe. I'll have to think about it a bit more. For instance, could this object one day contain an attestation that the user has a credit score above a certain threshold? That's the sort of thing that's new compared to passwords.
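A toy illustration of what that attestation object looks like on the wire: per the spec it's a CBOR map with `fmt`, `attStmt`, and `authData` keys, and a relying party that doesn't want vendor attestation can request (and check for) the `"none"` format. The hand-rolled decoder below is a minimal sketch, not a real CBOR library, and the `authData` payload is left empty for brevity (a real one is 37+ bytes).

```python
# Minimal sketch: decode just enough CBOR to inspect a WebAuthn
# attestation object. Handles only small (< 24 byte) lengths --
# a real relying party would use a proper CBOR library.

def decode(buf, i=0):
    major, info = buf[i] >> 5, buf[i] & 0x1F
    assert info < 24, "toy decoder: small lengths only"
    if major == 2:                       # byte string
        return bytes(buf[i + 1:i + 1 + info]), i + 1 + info
    if major == 3:                       # text string
        return buf[i + 1:i + 1 + info].decode(), i + 1 + info
    if major == 5:                       # map
        i, m = i + 1, {}
        for _ in range(info):
            k, i = decode(buf, i)
            m[k], i = decode(buf, i)
        return m, i
    raise ValueError(f"unhandled CBOR major type {major}")

# {"fmt": "none", "attStmt": {}, "authData": h''} -- the shape a
# browser returns when the RP asked for attestation: "none".
ATT_OBJ = bytes.fromhex(
    "a363666d74646e6f6e656761747453746d74a068617574684461746140")

obj, _ = decode(ATT_OBJ)
print(obj["fmt"])   # "none" means no attesting-authority signature
```

The point of requesting `attestation: "none"` at credential creation is exactly the one above: the relying party never learns which vendor made the authenticator, so the field can't be used as a gate.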
---
A lot of companies seem to pay to keep data they are not using. Some because they forgot, some because they have not figured out how to monetize it yet.
---
Do you mean 3-4 decades ago? I've been grocery shopping for myself since the 00s and have never seen a physical punch card being used in a grocery store.
---
Disclosure liability insurance, with low premiums if there was nothing to leak. Of course, that assumes a different world, where companies actually pay for screwing up in the first place.
---
> Our laws shouldn't punish people for honestly doing the best they know how to

Sure, but holding on to data you do not strictly need is not doing the best.
---
That assumes users fully understood what they were consenting to. Agreements and Privacy Policies are written to exhaust and bamboozle the users so they just hit “accept.”
---
I sympathize a lot with the headline statement; it boggles my mind how much effort goes into data residency/integrity/confidentiality measures around massive data silos (and the infra teams companies bring to bear to manage and scale them, and then inevitably publish gospel articles on the web about), when companies could just opt… NOT to collect that data.

I really like the model of “It stays on your device, we never see it. At most we get bare-minimum location statistics.” Although I question the assertion that their metrics system won’t be turned against them; it seems obvious that anything programmed can be reprogrammed or updated, especially in the modern update-focused age. I don’t think they addressed that beyond a general statement that they took pains to assure that their users won’t ever be spied on. Would be interested in a technical article on that.

Side note: we at Sentinel Devices are taking exactly this “we don’t hold your data” approach for industrial machinery. Think automated AI pipelines that are air-gapped. And we’re hiring! If you’re interested, reach out to [email protected]
---
While I’m sure there are some, I can’t think of a single B2B SaaS product that sells its users’ data for targeted advertising. That’s more of a consumer problem.
---
The even better term is Datenenthaltsamkeit - data abstinence. Not just storing less, but really only storing something if there is no other option.
---
Are they willing to contractually commit to not holding the user's data, with penalties? If not, they're not serious. Remember "Facebook - It's free and always will be."
---
This is why I wish projects like TBL's Solid [https://en.wikipedia.org/wiki/Solid_(web_decentralization_pr...] would take off some more. I happily pay providers for an application (or at least the service/utility that an app provides), because that is often the value they bring to bear for my benefit... but the data, ah, the data is something I don't want anyone to control but me. Projects like Solid pave a possible path forward that *could* enable an ecosystem where we can still legitimately pay a provider for the value they provide, a la an app or service, while we as users exert maximum control over our data sovereignty. I hope this author and others continue to think of data in this way.
---
From their privacy policy (https://matter.xyz/privacy):

> If we make changes to this privacy policy, we will update it here and update the effective date at the top. (We can’t email you about changes because we don’t collect everyone’s email addresses.) Changes to this policy will not apply retroactively.

Effectively, they can change it any time and you probably won't know. If they violate it, what power do you have to enforce it? Pay an attorney six figures? For what damages, under what law?

Also, I'm not sure what 'retroactively' means here, legally: they have my data and change the policy; can they tomorrow use my data according to the new policy? (Not that it matters much, because I won't know about the changes anyway.)
---
> You can't leak users' data if you don't hold it

False. You can't leak a user's data if you never have it to begin with. If you process it, you are at risk of leaking it.
---
It's a very sensitive topic for health data! Too many apps sending data left and right, the 23andMe scandal... Very few apps (e.g. Carrot Care on iOS) adopt such a great philosophy.
---
Good principle, but the godforsaken states of America will never allow it: KYC and AML laws force financial service providers to keep pictures of your ID for eternity.
---
Sure; but it’s probably true for most of us most of the time. I’m on HN right now instead of playing piano or going for a walk. I think I need the reminder sometimes.
---
I can do anecdata too. Photoshop added content-aware fill, and then generative fill. These have been useful additions for me, saving time that was previously tedious stamp-tool work.
---
Hey, no need to cast aspersions on the infosec practices of Belarusian hackers, I bet they store their stolen credentials in an encrypted SQLite database as per industry best practice.
---
This isn't just a theoretical point. Chrome extensions are the canonical example of products which start off with the best intentions, get acquired, and then ...
---
Not sure about GP, but I did read the post. If they get acquired, I don't see anything stopping the acquirer from pushing an update that decrypts stuff and sends the plaintext to the servers.
---
I don't track management changes for the apps I use because who has the time? How can we protect ourselves against a malicious company changing their tech behind the scenes?
---
This. And if users actually value the app's data, how likely is it that an acquirer would be willing to wipe all of it from their phones before taking over?