Hacker News | fsflover's comments

Do you consider F-Droid a "vetted app store"?

Yes, I do, but I don't want to help my parents install it.

You should, as it is much safer than the one from Google.

Yeah, but it doesn't include the apps that lay-people want to use, such as Facebook, Venmo, and Google Maps. I like open source software as much as the next guy, but most of it seems very off-brand to your average joe. Setting people up with Firefox is one thing, trying to get them to use AntennaPod instead of Spotify is a much taller order.

> We need a financial way to reward the resistance

Here you go: https://eff.org, https://edri.org, https://noyb.eu


Those links would be more effective if you were to spend a couple of lines explaining what these organizations are and what they do...

These are NGOs fighting for our digital rights like privacy, security and device ownership. Wikipedia can explain much better than me.

See also: https://news.ycombinator.com/from?site=eff.org

https://news.ycombinator.com/from?site=edri.org

https://news.ycombinator.com/from?site=noyb.eu


How did original engineering certification prevent dangerous constructions? Did it force everyone to use a Big Company?

You can use TPM with Heads and a hardware key to ensure Windows can't infect the other partition.

Does switching to Qubes OS count?

No, but contributing to it does, or even more so contributing to the packages it depends on, which are more cross-cutting.

Doesn't this mean that no matter how securely your phone is locked, Apple (and probably the three-letter agencies) can always unlock it by installing an appropriate update?

Not necessarily. If the secret is protected in the secure element against something only you can provide (physical presence of RFID, password, biometric etc) then it is ok.

BUT you must trust the entire Apple trusted chain to protect you.

That is a rather big BUT.


> If the secret is protected in the secure element against something only you can provide (physical presence of RFID, password, biometric etc) then it is ok.

But we already established that unlocking is not possible, so following that argument, a side channel is implied. Nothing but a secret in your brain is something only you can (willingly) provide. Certainly not biometric data, which you distribute freely at every moment. RFID can be relayed; see relay attacks on car keys.

If you can side-step the password, to potentially install malware/backdoor, that's inherently compromising security.


If the data you care about is encrypted with a token locked behind your passcode input, and it's not theoretically brute-forceable by being a 4-character numeric-only thing, then not easily, no.
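To make the brute-force point concrete, here is a back-of-the-envelope keyspace calculation. The 10-guesses-per-second rate is a made-up illustrative figure; real devices throttle attempts in the secure element and can wipe the key after repeated failures:

```python
# Keyspace sizes for different passcode policies (pure arithmetic).
numeric_4 = 10 ** 4   # 4-digit PIN: 10,000 combinations
numeric_6 = 10 ** 6   # 6-digit PIN: 1,000,000 combinations
alnum_8 = 62 ** 8     # 8-char alphanumeric: ~2.2e14 combinations

# Hypothetical sustained guess rate; real secure elements enforce
# escalating delays, so this is wildly optimistic for an attacker.
guesses_per_second = 10

def worst_case_days(keyspace: int, rate: float) -> float:
    """Time to exhaust the whole keyspace at the given rate, in days."""
    return keyspace / rate / 86_400

for label, space in [("4-digit", numeric_4),
                     ("6-digit", numeric_6),
                     ("8 alnum", alnum_8)]:
    print(f"{label}: {worst_case_days(space, guesses_per_second):.4g} days")
```

A 4-digit PIN falls in under a day even at this throttled rate; an 8-character alphanumeric passcode pushes the worst case past hundreds of millions of days, which is why the "4 character numeric only thing" is the weak point.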

Could they produce a bespoke update that stops encrypting the next time you unlock, push it to your phone before seizing it, wait for some phone-home to tell them it worked, and then grab it?

Perhaps, but the barrier to making Apple do that is much higher than "give us the key you already have", and only works if it's a long planned thing, not a "we got this random phone, unlock it for us".

(It's also something of a mutually-assured destruction scenario - if you ever compel Apple to do that, and it's used in a scenario where it's visibly the case that 'the iPhone was backdoored' is the only way you could have gotten that data, it's game over for people trusting Apple devices to not do that, including in your own organization, even if you somehow found a legal way to compel them to not be permitted to do it for any other organization.)


> Perhaps, but the barrier to making Apple do that is much higher than "give us the key you already have", and only works if it's a long planned thing, not a "we got this random phone, unlock it for us".

The attack situation would be e.g. at the airport security check, where you have to part with your device for a moment. That's a common way for law enforcement and intelligence to get a backdoor onto a device. Happens all the time. You wouldn't be able to attribute it to Apple collaborating with agencies or them using some zero-day exploit. For starters, you likely wouldn't be aware of the attack at all. If you came home to a shut-down phone, would you send your $1000 device to some security researcher thinking it's conceivably compromised, or just connect it to a charger?

If you can manually install anything on a locked phone, that increases the attack surface significantly. You wouldn't have to get around the individual key to unlock the device; you could instead mess with the code-verification process. The latter is an attractive target, since any exploit or leaked/stolen/shared key will potentially be usable on many devices.


Sure, but that'd be a waste.

Part of the reason e.g. Cellebrite is obsessive about not telling people many specifics about their product capabilities outside of NDA is that Apple is quite serious about trying to fix these things, and "we can crack every iPhone before the 14" probably tells them a fair bit about what might have a flaw.

Tools like that lose a lot of value if anyone paying enough attention can infer they exist, even indirectly, like if all the TSA agents you know suddenly switch to Android phones, or some of them tell you not to bring iPhones through security and won't tell you why, or a thousand other vectors for rumors to start.

All it takes is enough rumors for people to decide they can't trust it any more, and suddenly you've lost a lot of the value of a secret information source.

So if you have a tool like that, where most people don't think it's readily available, the way you probably use it is very sparingly, to keep it that way.


There is a difference between targeted software supply-chain attacks and weakening encryption for everyone by introducing a master key. Apple would be required by US law to cooperate, and it might never become public either. But as I said, Apple doesn't have to know, or "know". This feature inherently compromises security. Contrary to device encryption, OS update security depends on a single key held by Apple (or rather by several devops people...), which could be stolen, leaked or shared.
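The trust structure being criticized can be sketched in a few lines. Python's standard library has no asymmetric signature primitive, so this sketch uses an HMAC tag as a stand-in for the vendor's real (asymmetric) update signature; all names here are hypothetical. The point it illustrates is only structural: every device verifies against the same single vendor key, so whoever holds that key can sign an update that any device will accept:

```python
import hashlib
import hmac

# Hypothetical single vendor signing key. Real update signing is
# asymmetric (the private key never leaves the vendor); HMAC is used
# here only as a stdlib stand-in to show the single-key trust model.
VENDOR_KEY = b"hypothetical-vendor-signing-key"

def sign_update(image: bytes, key: bytes = VENDOR_KEY) -> bytes:
    """Vendor side: produce an authentication tag over the update image."""
    return hmac.new(key, image, hashlib.sha256).digest()

def device_accepts(image: bytes, tag: bytes, key: bytes = VENDOR_KEY) -> bool:
    """Device side: every device checks against the same vendor key,
    so anyone holding that one key can sign updates for all devices."""
    expected = hmac.new(key, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

update = b"os-update-image"
tag = sign_update(update)
print(device_accepts(update, tag))        # authentic update is accepted
print(device_accepts(b"tampered", tag))   # modified image is rejected
```

Contrast this with device encryption, where the decryption secret is derived per-device from the user's passcode: compromising one phone's secret gains nothing elsewhere, whereas compromising the one signing key above compromises every device at once.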

Would you bet that the NSA can't sign iOS updates?

> So if you have a tool like that, where most people don't think it's readily available, the way you probably use it is very sparingly, to keep it that way.

Of course. This is reserved for targeted attacks against journalists and other enemies of the state.

> All it takes is enough rumors for people to say it's enough to not trust any more, and suddenly you've lost a lot of the value of a secret information source.

As if Apple users would care...

https://www.apple.com/legal/transparency/us.html

https://gizmodo.com/apple-iphone-privacy-analytics-class-act...

https://thenextweb.com/news/apple-apps-on-big-sur-bypass-fir...

https://www.theguardian.com/us-news/2025/oct/23/trump-white-...

https://www.404media.co/iceblock-owner-after-apple-removes-a...

https://www.404media.co/apple-gave-governments-data-on-thous...

https://www.404media.co/fbi-extracts-suspects-deleted-signal...


None of those articles are inconsistent with the claim that Apple cares about security, though?

"We can be legally compelled to give up data we have" and "we thought letting people have custom kernel modules was a bigger threat" are not particularly incompatible with "we design things so we don't have keys to your data we can be compelled to give up" and valuing people's security. (I am not a fan of the latter, to be clear, but there are reasonable reasons you could argue for it.)

But yes, I would probably, at the moment, bet that if the NSA can sign a custom iOS build on consumer hardware, Apple doesn't know about how, both because that's a very hard secret to keep, and because you'd see a massive uptick in people avoiding Apple devices in governments that might be of interest to US intelligence if even a rumor of that got out.


GNU/Linux exists on mobile, too. Sent from my Librem 5.

Tell me more about this


This is why orgs like https://eff.org exist.

But eff isn’t going to come to my aid if it isn’t a big story, like wireguard. We’re all just arguing circularly around the fact that companies with massive footprints can and do operate in a manner where it’s assumed that zero access is the industry standard for “normal users”

I would still ask them, and even if they can't help, they fight for such rights for everyone.

Qubes OS is a niche security-oriented operating system that runs everything in VMs, with about 70k users [0]. Even fewer users contribute amazing things like running dom0 in RAM [1] or using very minimal Debian and Fedora VMs [2] to minimize the attack surface and bloat.

[0] https://doc.qubes-os.org/en/latest/introduction/statistics.h...

[1] https://forum.qubes-os.org/t/qubes-os-live-mode-dom0-in-ram-...

[2] https://forum.qubes-os.org/t/how-i-learned-to-love-liteqube-...


The solution is to use AGPLv3.

I’m maybe daft, but AGPLv3 doesn’t prevent $Evilcorp from using it; they just need to share any modifications or forks they made?

And at this point, it appears running code through an LLM to translate it eliminates copyright (and thus the licence), so $Anycorp can use it.

Our stuff is AGPLv3-licensed, and if this present trend continues we might just switch to MIT so at least the little guys can take advantage of it the way the big guys can.


I think it’s still unproven whether whitewashing code through LLMs actually works for a reasonably complex project, and it’s also still kind of a legal Wild West. I think no one knows for sure how it will work out.

There are piles of examples of it working for complex projects and libraries now, especially if they have good test suites your clone can pass.

Also they are even getting quite good at reverse engineering binaries.

Anything not released as FOSS will have a FOSS copy made.

There is no moat and the reign of restrictive licenses on software is effectively over.


Can you share any of these examples? I haven’t been able to find any…

Only if they distribute the software or provide it as a service. Otherwise I suspect it's good enough if the modifications or forks are shared only internally, when the software is used only internally; but then again, I'm not a lawyer.

> if software is used only internally

Internal users are still users tho. They are entitled to see the source code, and the license allows them to share it with the rest of the world.


Employers might argue that such internal use and distribution would fall under the “exclusively on your behalf” clause in the GPLv3, which is inherited by the AGPLv3.

Oh, I guess it would. Ignore me.

In reality most $Evilcorps have policies against AGPLv3, which is why projects can make money selling a less-restricted enterprise license for the same code.

I often hear this but I don’t really understand it. Not saying you need to explain it to me but what is the issue with AGPLv3 that turns those corporations away?

To my non-lawyer eyes it looks like MIT or Apache2 but modifications need to be made public as well.

If you don’t make any modifications then it should be fine? Or do most $Evilcorp aim to make modifications? Or is AGPLv3 something like garlic against vampires (doesn’t make sense but seems to work)?


AGPLv3 extends “distribution” to essentially include communicating with the service over the network, as opposed to the GPL concept of, say, sending a shrink-wrapped binary that someone downloads and runs themselves.

So basically they are worried that they have no way of avoiding one or more of their tens of thousands of engineers “distributing” it to customers by including it in some sort of publicly accessible service. AFAIK there’s no settled case regarding what level of network communication qualifies - like if I run a CRUD app on Postgres and Postgres was AGPL, am I distributing Postgres?

Now the second part is that you only have to give out your changes to the AGPL software to those that it was “distributed” to. Most people aren’t changing it! If anything they’re just running a control plane in front of it…

but it goes back to the corporate legal perspective of “better safe than sorry”: we can’t guarantee that one of our engineers isn’t changing it in some way that would expose company internals, triggering a condition where they have to distribute those private changes publicly.


Oh I see that makes sense, thanks for the explanation!

This is the point. They can use and modify it, but they also have to share their modifications, i.e., help its development. Yet most megacorps never even touch this license.
