If Apple controls the root of trust - i.e. the private keys in the CPU or security processor used to attest the enclave (similar to what AMD and Intel do with SEV-SNP and TDX) - then technically it's still a "trust us" situation, isn't it, since they presumably use their own ARM silicon for that?
Harder to attack, sure, but no outside validation. Apple's not saying "we can't access your data," just "we're making it way harder for bad guys (and rogue employees) to get at it."
I don't think they do. Your phone cryptographically verifies that the software running on the servers is what it claims to be, and the keys can't be extracted from the Secure Enclave. They also had independent auditors go over the whole thing and publish a report. If the chip is disconnected from the system, it zeroizes its keys, effectively erasing all data.
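Roughly, that client-side check works like remote attestation: the server presents a measurement of the software it booted, signed by a key that chains back to the hardware root of trust, and the phone refuses to send data unless that measurement matches a published, audited release. A minimal sketch in Swift (the types and flow here are illustrative assumptions, not Apple's actual API):

    import CryptoKit
    import Foundation

    // Hypothetical attestation bundle: a hash of the booted software stack,
    // signed by a key rooted in the server's secure hardware.
    struct AttestationBundle {
        let measurement: Data                  // hash of the booted software
        let signature: Data                    // signature over the measurement
        let publicKey: P256.Signing.PublicKey  // assumed already validated against the root of trust
    }

    func shouldReleaseData(_ bundle: AttestationBundle,
                           knownGoodMeasurements: Set<Data>) -> Bool {
        // 1. The signature proves the measurement came from the secure hardware.
        guard let sig = try? P256.Signing.ECDSASignature(rawRepresentation: bundle.signature),
              bundle.publicKey.isValidSignature(sig, for: bundle.measurement) else {
            return false
        }
        // 2. The measurement must match a release that auditors can inspect.
        return knownGoodMeasurements.contains(bundle.measurement)
    }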
It was always "trust us". They make the silicon, and you have no hope of meaningfully reverse engineering it. Plus, iOS and macOS have silent software update mechanisms, and no update transparency.
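"Update transparency" here would mean something like Certificate Transparency for binaries: every shipped build gets appended to a public, append-only log, and clients verify an inclusion proof before installing, so a silently targeted update would at least be detectable. A sketch of the core Merkle inclusion check (illustrative only, not any shipping Apple mechanism):

    import CryptoKit
    import Foundation

    // RFC 6962-style hashing: distinct prefixes for leaves and interior nodes.
    func leafHash(_ data: Data) -> Data {
        Data(SHA256.hash(data: Data([0x00]) + data))
    }

    func nodeHash(_ left: Data, _ right: Data) -> Data {
        Data(SHA256.hash(data: Data([0x01]) + left + right))
    }

    // Walk from the leaf up to the root using the sibling hashes in the proof.
    func verifyInclusion(leaf: Data,
                         proof: [(sibling: Data, siblingIsLeft: Bool)],
                         expectedRoot: Data) -> Bool {
        var hash = leafHash(leaf)
        for step in proof {
            hash = step.siblingIsLeft ? nodeHash(step.sibling, hash)
                                      : nodeHash(hash, step.sibling)
        }
        return hash == expectedRoot
    }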
Hey, can you help me understand what you mean? There's an entry about "Hardware Root of Trust" in that document, but I don't see how that implies Apple is avoiding the statement "we can't access your data" - the doc says the key is not exportable.
"Explain it like I'm a lowly web dev"
Every entity you hand data to, other than yourself, is a "trust us" situation.
+1 on your comment.
I think having a description of Apple's threat model would help.
I was thinking that open source would help with their verifiable privacy promise. Then again, as you've said, if Apple controls the root of trust, they control everything.