In this case you set the anti-phishing code in your account settings (arbitrary string). Then they include it in all email comms (in the top right of the email body). So if you get an email from what looks like "Crypto.com", but with a different anti-phishing code - then you can be certain that it's phishing.
Oh, it's just for email. That makes sense. Seems pretty weak since it relies on the user noticing the absence of a security feature.
I'd probably prefer the emails have no links at all and train people never to click links in an email. Make them log in the way they usually do to take any actions.
Correct me if I'm wrong, but I believe your comment is misguided.
The PIN is a security option that prevents a SIM-swapping attacker from registering a new device under your phone number unless they know the PIN. You can opt out of it (and it might be opt-in to begin with). You can also easily opt out of PIN reminders. Both of these options are in Settings -> Account.
As for server state - my understanding is that Signal attempts to be zero-knowledge overall, but they definitely store some state on the server. I believe it's encrypted using your private key that's not backed up to the server. Setting the PIN does not change that.
Server state comment aside, it seems your main complaint is about a pop-up PIN entry UI that can be opted out of? I get that it might seem annoying, but it feels like a fairly weak criticism of a messaging platform, certainly not one that should warrant an impression that Signal is "on the way out"?
My complaint with them is the whole thing with MobileCoin. They hid that integration for a year by not publishing server updates, and when the news broke, they promised to do an AMA explaining it all. It's been months since then, and the AMA never happened.
In short, Signal wanted to store what had been purely client-side information (contact lists, for example) on their server, but - in principle at least - in a form Signal could not access.
The PIN in question is used to provide access to that information.
> Server state comment aside, it seems your main complaint is about a pop-up PIN entry UI that can be opted out of?
The dialog to force the user to set the server-side PIN disabled the app. You either had to do it, or stop using Signal. There was no opt-out.
I had a look at the app just now and found the settings you mentioned. It's not clear to me from what I see there whether this is an app-locking PIN, a SIM-protection PIN, a server-side state PIN, or all three rolled into one.
In any event, at the time it happened, the dialog was full-screen and could not be dismissed. Even if there had been options to disable this (and there were not before the full-screen dialog appeared; I had looked, in an effort to get rid of the earlier persistent partial-screen prompt), you could not reach them: the full-screen dialog blocked access to the app, and therefore to its settings.
The only option was to stop using Signal or provide a PIN so your client-side state could be stored server-side.
Fair. And I think I know what you're referring to.
Yes, they do upload your contact list, but I believe there's a prompt at setup time that allows you to opt out? It might even be an OS-level prompt to the tune of "Signal would like to access your Contacts". Not 100% sure on that one as I haven't set up a brand new Signal installation in years.
It's done to help their user acquisition. It uploads your contacts to match against other contact lists and let you know who's on Signal. I recall a blog post explaining how they do it in a fully encrypted way, possibly using a secure enclave (though I'd think the 2021 version of that would involve ZK proofs or homomorphic encryption of some kind, and I hope they've put time into that).
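To make the matching idea concrete, here's a toy sketch of naive hash-based contact discovery. This is an illustration only, not Signal's production protocol (which relies on secure enclaves precisely because phone numbers are low-entropy and plain hashes can be brute-forced); the numbers and function names are made up.

```python
import hashlib

def h(number: str) -> str:
    """Hash a phone number so the raw number is never compared directly."""
    return hashlib.sha256(number.encode()).hexdigest()

# Server side: hashes of numbers belonging to registered users (made-up data).
registered_users = {h(n) for n in ["+15550001111", "+15550002222"]}

# Client side: hash your contacts and intersect with the server's set.
my_contacts = ["+15550002222", "+15550009999"]
matches = [n for n in my_contacts if h(n) in registered_users]
print(matches)  # ['+15550002222']
```

The weakness is that the phone-number space is small enough to enumerate, so a curious server could reverse the hashes; that's the problem the enclave-based approach is meant to address.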
I don't recall ever having to set a PIN specifically for that. And besides, a 4-6 digit PIN would be a terribly insecure way to "encrypt" anything server-side :) But yes, that would be a shame if it were the case.
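To illustrate why a short PIN is weak on its own: even if you run it through a proper key-derivation function, a 4-digit PIN gives an attacker only 10,000 candidates to try. A toy sketch (not Signal's actual scheme; salt, PIN, and iteration count are made up, and the iteration count is deliberately low so the demo runs fast):

```python
import hashlib

SALT = b"public-salt"   # assume the salt is stored alongside the ciphertext
ITERATIONS = 1_000      # real KDFs use far more work per guess

def derive_key(pin: str) -> bytes:
    """Stretch a PIN into key material with PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, ITERATIONS)

target = derive_key("4831")  # key derived from the user's secret PIN

# An attacker holding the derived key material simply tries all 10,000 PINs.
recovered = next(pin for pin in (f"{i:04d}" for i in range(10_000))
                 if derive_key(pin) == target)
print(recovered)  # 4831
```

Higher iteration counts slow each guess down but don't change the tiny search space, which is why server-side schemes built on PINs need extra machinery (rate limiting, secure hardware) to be defensible.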
> It's done to help their user acquisition. It uploads your contacts to match against other contact lists and let you know who's on Signal.
I may be wrong, but I think this functionality existed prior to the server-side state effort. I recall when people in my contact list joined Signal, I was notified.
However, these days I do not keep contacts in the phone contact list. It's too big and juicy a target.
> And besides, a 4-6 digit PIN would be a terribly insecure way to "encrypt" anything server-side :)
A few more usecases that I've added to my workflow since discovering container tabs:
* Work/personal separation
* Multiple AWS accounts
Also, I am very impressed with how well they're integrated into Firefox. For example, opening a link in a new tab will preserve the container. CMD+Shift+T will restore a recently closed tab and remember its original container. I really like the color coding too.
Use a different Firefox profile for work/personal.
Sign in to the personal profile (while at work) to sync bookmarks, if the firewall allows; depending on how much you trust your employer, you could sync further personal settings.
Don't sign in to the work profile, and no work bookmarks, settings, or logins get shared.
Don't get me wrong: multi-account containers are a blessing for those rare days when you absolutely have to log in to Facebook.
Containers are actually still integrated into the standard installation of Firefox and can be used even without an add-on. You just need to turn them on in about:config:
privacy.userContext.enabled to true
privacy.userContext.ui.enabled to true
privacy.userContext.longPressBehavior to 2
I love using containers with privacy.firstparty.isolate set to true too for extra protection.
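The same prefs can also be set persistently in a user.js file in your Firefox profile directory (applied at every startup), assuming you'd rather not flip them by hand in about:config:

```javascript
// user.js in the Firefox profile directory -- applied at startup
user_pref("privacy.userContext.enabled", true);
user_pref("privacy.userContext.ui.enabled", true);
user_pref("privacy.userContext.longPressBehavior", 2);
user_pref("privacy.firstparty.isolate", true);
```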
This add-on is a GUI for the underlying mechanism. They even say you can choose to use the non-GUI, lower-level hooks; they just think this GUI is a nicer user experience.
One can only dream. It was so far ahead of its time, and I still consider it superior to any smartwatch out on the market. I have mine from 5 years ago that still works great. We need more products like this.
The Apple Watch is better for most people, IMO: the health features, notifications, apps, and other integrations, at least. Battery life isn't as good, but charging once a day isn't so bad in exchange for much better data.
Pebble didn't really get a chance to do health properly. They were making massive leaps in progress with each release and had all of these features before anyone else did.
I'd keep in mind that cloud providers have well-known IP blocks that can sometimes be rate-limited by various internet sites/services, primarily to combat botting. You might inadvertently get caught in the IP range that's being actively rate limited by e.g. Instagram. YMMV.
This kind of comment indicates how effective the marketing/media kool-aid has been. Having paid very close attention to how both systems work (and having used both iOS and Android extensively in recent months), I don't think I'd necessarily declare either of these platforms a clear privacy champion.
If you’re trusting third-party apps with important data and not effectively restricting their permissions, then the OS doesn’t really matter. I’m glad that Apple is doing more to make permissions granular. If you don’t install anything outside of what your phone came with, iOS would certainly be the clear privacy champion.
On iOS, your adversaries are third-party app developers. On Android, your adversaries are third-party app developers and the OS vendor, whose entire business model is hoovering up your data. iOS is, in mathematical jargon, strictly better on that basis alone.
How exactly is the OS vendor the adversary in Android's case? Especially considering it was Apple that lied about not sharing Siri conversations while handing them out to third parties, with no ability to revoke consent or even choose?
Also, Apple (as opposed to Google) is the one actively cooperating with the Chinese government and giving it decryption keys for iCloud.
If you look beyond the marketing spiel, it's kind of ridiculous to rank one of those corporations as more trustworthy than the other.
2.) Apple admitted it was giving Siri recordings to contractors, with no ability to opt out of the functionality. This included false activations not meant for Siri. Seemingly, what happened near iPhones did not stay on iPhones:
3.) In 2016 Apple moved their iCloud servers to China to comply with Chinese regulations and give the government access to content on request. It seems, again, that what happens on iPhone does not stay on iPhone: https://www.latimes.com/business/technology/la-fi-apple-chin...
But of course, this whole topic on HN shows that the money spent on marketing was well spent. Lesson for Google and Facebook: up the marketing spending and keep repeating how privacy-conscious you are. ;)