Sibling comments point out (and I believe, corrections are welcome) that all that theater is still no protection against Apple themselves, should they want to subvert the system in an organized way. They’re still fully in control. There is, for example, as far as I understand it, still plenty of attack surface for them to run different software than they say they do.

What they are doing here is of course making any kind of subversion a hell of a lot harder, and I welcome that. It serves as a strong signal that they want to protect my data. To me this definitely makes them the most trusted AI vendor at the moment, by far.

As soon as you start going down the rabbit hole of state sponsored supply chain alteration, you might as well just stop the conversation. There's literally NOTHING you can do to stop that specific attack vector.

History has shown, at least to date, that Apple has been a good steward. They're as good a vendor to trust as anyone. Given that a huge portion of their brand has been built on "we don't spy on you", the second they do, they lose all credibility; so they have a financial incentive to keep protecting your data.

Apple have name/address/credit-card/IMEI/IMSI tuples stored for every single Apple device. iMessage and FaceTime leak numbers, so they know who you talk to. They have real-time location data. They get constant pings when you do anything on your device. Their applications bypass firewalls and VPNs. If you don't opt out, they have full unencrypted device backups, chat logs, photos and files. They made a big fuss about protecting you from Facebook and Google, then built their own targeted ad network. Opting out of all tracking doesn't really do that. And even if you trust them despite all of this, they've repeatedly failed to protect users even from external threats. The endless parade of iMessage zero-click exploits was ridiculous and preventable, CKV only shipped this year and isn't even on by default, and so on.

Apple have never been punished by the market for any of these things. The idea that they will "lose credibility" if they livestream your AI interactions to the NSA is ridiculous.

> They made a big fuss about protecting you from Facebook and Google, then built their own targeted ad network.

What kind of targeted advertising am I getting from Apple as a user of their products? Genuinely curious. I’ll wait.

The rest of your comment may be factually accurate, but it isn’t relevant for “normal” users, only those hyper-aware of their privacy. Don’t get me wrong, I appreciate knowing this detail, but you should also realize that there are degrees to privacy.

> If you don't opt out, they have full unencrypted device backups, chat logs, photos and files.

Also, full disk encryption is opt-in on macOS. But the answer isn't that Apple wants you to be insecure; more likely they want to make it easier for users to recover their data if they forget a login password or backup password they set years ago.

> real-time location data

Locations are end to end encrypted.

It's disingenuous to compare Apple's advertising to Facebook and Google.

Apple does first party advertising for two relatively minuscule apps.

Facebook and Google power the majority of the world's online advertising, maintain multiple data-sharing agreements, deploy tracking pixels widely, allow browser fingerprinting, and are deeply integrated into almost all ecommerce platforms and sites.

They have not been punished because they have not abused their access to that data.
Didn’t Edward Snowden reveal that Apple provides the NSA direct access for mass surveillance?

> allows officials to collect material including search history, the content of emails, file transfers and live chats

> The program facilitates extensive, in-depth surveillance on live communications and stored information. The law allows for the targeting of any customers of participating firms who live outside the US, or those Americans whose communications include people outside the US.

> It was followed by Yahoo in 2008; Google, Facebook and PalTalk in 2009; YouTube in 2010; Skype and AOL in 2011; and finally Apple, which joined the program in 2012. The program is continuing to expand, with other providers due to come online.

https://www.theguardian.com/world/2013/jun/06/us-tech-giants...

> There's literally NOTHING you can do to stop that specific attack vector.

E2E encryption. It may not be applicable to remote execution of AI payloads, but it applies to most everything else, from messaging to storage.

Even if the client hardware and/or software is also an actor in your threat model, that can be eliminated, or at least mitigated, with one verifiably trusted piece of equipment. Open hardware is one alternative, and some states build their entire hardware stack to eliminate such threats. With even a single trusted piece of equipment, mitigations become possible (e.g. an external network filter).
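The core property is easy to demonstrate. Below is a hypothetical, stdlib-only Python sketch of why E2E blunts an untrusted relay: the two endpoints derive a shared key via Diffie-Hellman, and the relay only ever sees public values and ciphertext. All parameters and names here are illustrative toys (a tiny prime, an HMAC keystream); real protocols use large groups, authentication, and proper AEAD ciphers.

```python
import hashlib
import hmac
import secrets

P = 2**61 - 1   # toy Mersenne prime modulus; far too small for real use
G = 2           # toy generator

def keypair():
    """Generate a Diffie-Hellman private/public pair."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv: int, their_pub: int) -> bytes:
    """Both endpoints compute the same secret; the relay cannot."""
    s = pow(their_pub, my_priv, P)
    return hashlib.sha256(s.to_bytes(8, "big")).digest()

def xor_stream(key: bytes, data: bytes) -> bytes:
    """HMAC-based keystream XOR; encrypting and decrypting are the same op."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# Alice and Bob derive the same key from each other's public values.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
k_alice = shared_key(a_priv, b_pub)
k_bob = shared_key(b_priv, a_pub)
assert k_alice == k_bob

ciphertext = xor_stream(k_alice, b"meet at noon")   # all the relay ever stores
print(xor_stream(k_bob, ciphertext))                # -> b'meet at noon'
```

The point of the sketch: a subverted relay can drop or replay messages, but without a key it cannot read them, which is exactly the property that survives a hostile operator.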

Strictly speaking, there's homomorphic encryption. It's still horribly slow and expensive, but it literally lets you run computation on untrusted hardware in a way that's mathematically provable.
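As a concrete illustration, here is a toy sketch of the Paillier cryptosystem, which is additively homomorphic: multiplying ciphertexts adds the underlying plaintexts, so an untrusted server can compute a sum without ever decrypting. This is a hypothetical demo with tiny hard-coded primes, not the lattice-based schemes used for general-purpose FHE, and nowhere near secure parameters.

```python
import math
import secrets

p, q = 293, 433   # toy primes; real deployments use primes over a thousand bits
n = p * q         # public modulus
n2 = n * n
g = n + 1         # standard choice of generator
lam = math.lcm(p - 1, q - 1)   # private key lambda
mu = pow(lam, -1, n)           # precomputed decryption factor (works for g = n + 1)

def encrypt(m: int) -> int:
    """E(m) = g^m * r^n mod n^2, with fresh randomness r coprime to n."""
    while True:
        r = secrets.randbelow(n)
        if r > 0 and math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """D(c) = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) // n."""
    x = pow(c, lam, n2)
    return ((x - 1) // n * mu) % n

def add(c1: int, c2: int) -> int:
    """Homomorphic addition: multiplying ciphertexts adds plaintexts."""
    return (c1 * c2) % n2

c = add(encrypt(3), encrypt(4))
print(decrypt(c))   # -> 7: the server summed values it could never read
```

Paillier only gives you addition; fully homomorphic schemes extend this to arbitrary circuits, which is where the "horribly slow" part comes in.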
As to the trust loss, we seem to already be past that. It seems to me they are now in the stage of faking it.
...in certain places: https://support.apple.com/en-us/111754

Just make absolutely sure you trust your government when using an iDevice.

> History has shown, at least to date, Apple has been a good steward.

*cough* HW backdoor in iPhone *cough*

> that all that theater is still no protection against Apple themselves

There is such a thing as threat modeling. The fact that your model only stops some threats, and not all threats, doesn't mean that it's theater.

> They’re still fully in control. There is, for example, as far as I understand it, still plenty of attack surface for them to run different software than they say they do.

But any such software must be publicly verifiable, otherwise it cannot be deemed secure. That's why they publish each version in a transparency log, which is verified by the client and, more loosely, by a public brain trust.

This is also just a tired take. The same thing could be said about passcodes on their mobile products or full disk encryption keys for the Mac line. There'd be a massive loss of goodwill, and legal liability, if they subverted the technologies they claim make their devices secure.
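For intuition, a transparency log of this sort can be modeled as a simple hash chain. This is a hypothetical minimal sketch, not Apple's actual implementation (which is closer to a Certificate Transparency-style Merkle tree): the idea is that once a client pins a signed log head, any retroactive substitution of a published release changes the replayed head and becomes detectable by auditors.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

class TransparencyLog:
    """Append-only log whose head commits to every prior entry."""
    def __init__(self):
        self.entries = []
        self.head = h(b"empty-log")

    def append(self, release: bytes) -> bytes:
        self.entries.append(release)
        self.head = h(self.head + h(release))   # chain the new entry in
        return self.head

def replay_head(entries) -> bytes:
    """An auditor recomputes the head from the full published entry list."""
    head = h(b"empty-log")
    for e in entries:
        head = h(head + h(e))
    return head

log = TransparencyLog()
log.append(b"os-image-v1.0")
signed_head = log.append(b"os-image-v1.1")   # the head clients trust

# Auditing the honest history reproduces the signed head...
assert replay_head(log.entries) == signed_head
# ...while a quietly swapped release does not.
assert replay_head([b"os-image-v1.0", b"backdoored-v1.1"]) != signed_head
```

This doesn't stop the operator from publishing something malicious in the first place; it only forces them to publish it where everyone can see, which is exactly the "subversion without a watcher knowing" bar being discussed.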

The "we've given this code to a third party to host and run" part can be a 100% effective stop to any Apple-internal shenanigans. It depends entirely on what the third party is legally obligated to do for them. (Or more specifically, what they're legally obligated to not do for them.)

A simple example of the sort of legal agreement I'm talking about, is a trust. A trust isn't just a legal entity that takes custody of some assets and doles them out to you on a set schedule; it's more specifically a legal entity established by legal contract, and executed by some particular law firm acting as its custodian, that obligates that law firm as executor to provide only a certain "API" for the contract's subjects/beneficiaries to interact with/manage those assets — a more restrictive one than they would have otherwise had a legal right to.

With trusts, this is done because that restrictive API (the "you can't withdraw the assets all at once" part especially) is what makes the trust a trust, legally; and therefore what makes the legal (mostly tax-related) benefits of trusts apply, instead of the trust just being a regular holding company.

But you don't need any particular legal impetus in order to create this kind of "hold onto it and don't listen to me if I ask for it back" contract. You can just... write a contract that has terms like that; and then ask a law firm to execute that contract for you.

Insofar as Apple have engaged with some law firm to in turn engage with a hosting company; where the hosting company has obligations to the law firm to provide a secure environment for the law firm to deploy software images, and to report accurate trusted-compute metrics to the law firm; and where the law firm is legally obligated to get any image-updates Apple hands over to them independently audited, and only accept "justifiable" changes (per some predefined contractual definition of "justifiable") — then I would say that this is a trustworthy arrangement. Just like a trust is a trust-worthy arrangement.

Yep. If you don't trust Apple with your data, don't buy a device that runs Apple's operating system.
Exactly. You can only trust yourself [1] and should self host.

[1] https://www.youtube.com/watch?v=g_JyDvBbZ6Q

"Sibling comments point out (and I believe, corrections are welcome) that all that theater is still no protection against Apple themselves, should they want to subvert the system in an organized way. They're still fully in control."

It stands to reason that that control is a prerequisite for "security".

Apple does not delegate its own "security" to someone else, a "steward". Hmmm.

Yet it expects computer users to delegate control to Apple.

Apple is not alone in this regard. It's common for "Big Tech", "security researchers" and HN commenters to advocate for the computer user to delegate control to someone else.

It's not that they couldn't; it's that they couldn't without a watcher knowing. And frankly, this tradeoff is not new, nor is it unacceptable in anything other than a "Muh Apple" framing.
Indeed, the attestation process, as described in the article, is geared more towards preventing unauthorized exfiltration of information or injection of malicious code. However, "authorized" activities are fully supported, where that means signed by Apple. So, ultimately, users need to trust that Apple is doing the right thing, just like with any other company. And yes, it means they can be forced (by law) not to do the right thing.
You're getting taken in by a misdirection.

>for them to run different software than they say they do.

They don't even need to do that. They don't need to do anything different than they say.

They already are saying only that the data is kept private from <insert very limited subset of relevant people here>.

That opens the door wide for them to share the data with anyone outside of that very limited subset. You just have to read what they say, and also read between the lines. They aren't going to say who they share with, apparently, but they are going to carefully craft what they say so that some people get misdirected.
