> Notice that Signal doesn't do anything that much different; Signal does the key exchange and unless you verify each user's key offline, you have to trust it.
Unfortunately, verification of reproducible builds is not baked into the OS (Android/iOS), so it's still possible to target someone with a malicious update. When the vast majority of people don't verify the build, such an attack could go unnoticed.
> so it's still possible to target someone with malicious update
I'm no Android or iOS dev, so I might be wrong, but to my knowledge there is no feature to push an app update specifically to a narrow set of devices?
So at the very least, third parties (Apple/Google) would have to be involved in such an attack. This removes some entities from the list that could create an attack.
Also, Apple/Google have a big reason not to play such games. Their app stores are popular partly because they, as companies, are trusted. Apple/Google would only do this if they were legally required to. IF they were involved, even against their will, it would mean tremendous risk to trust in these companies, meaning risk to the stock. And for a publicly traded company, there is no bigger motivator. Apple/Google would bring out all the lobbying power they have, trying to fight off whatever coercion tool the US government used against them to make them comply.
Even if there were no opposition from Apple or Google, people outside would notice sooner or later that they'd received malicious updates. If it's used once or twice, it might go undetected, but if governments or other entities start using this as a vector repeatedly, it will become public.
This doesn't mean that I think that these issues aren't important. Reproducible builds, binary transparency, gossip protocols, all these things are very important areas to invest research in, but right now they aren't a vector that is being abused on observable scales.
> I'm no Android or iOS dev, so I might be wrong, but to my knowledge there is no feature to push an app update specifically to a narrow set of devices?
Yes, it's possible to target a "narrow set of devices" by using the Device Catalog. An excerpt from the ToS:
> Google Play Console Device Catalog Terms of Service
> By using the device catalog and device exclusion tools in the Play Console (“Device Catalog”), You consent to be bound by these terms, in addition to the Google Play Developer Distribution Agreement (“DDA”). If there is a conflict between these terms and the DDA, these terms govern Your use of the Device Catalog. Capitalized terms used below, but not defined below, have the meaning ascribed to them under the DDA.
> 1. The Device Catalog allows You to review a catalog of the Devices supported by Your app and search the Devices by their hardware attributes. It also allows You to exclude specific Devices that are technically incompatible with Your app.
Yes, Signal is better than FB or SMS. But the phone number requirement puts a nail in it on my end.
So Signal can learn who talks with whom via requests going through their LDAP-like server. They can get an idea of how long calls are, and whether a call was video or audio. They know the times of communication.
You know, they can see the metadata. When's the last time we had problems with metadata? The POTS network? Yep.
And you're indeed right that the client has reproducible builds. But the server side certainly doesn't, and we have no way to verify that.
> You know, they can see the metadata. When's the last time we had problems with metadata? The POTS network? Yep.
Yes, metadata is a problem, particularly with calls. However, Signal recently added the sealed sender feature (https://signal.org/blog/sealed-sender/), which makes the server blind to who the sender of a message is.
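To make the idea concrete, here's a toy sketch of the envelope structure behind sealed sender. This is an assumption-laden simplification, not Signal's actual protocol (which uses sender certificates and real public-key encryption); the point is only that the sender's identity travels *inside* the encrypted payload, so the relay server never sees a "from" field:

```python
import json

# Toy model of a "sealed sender" envelope. NOT the real protocol:
# hex-encoding stands in for encryption to the recipient's public key.
def seal(sender_id: str, plaintext: str, recipient_id: str) -> dict:
    # Sender identity goes inside the (stand-in) encrypted payload.
    inner = json.dumps({"from": sender_id, "body": plaintext})
    ciphertext = inner.encode().hex()  # stand-in for encryption
    # Outer envelope carries only the recipient address.
    return {"to": recipient_id, "payload": ciphertext}

def server_view(envelope: dict) -> dict:
    # All the relay server can observe: destination and payload size.
    return {"to": envelope["to"], "size": len(envelope["payload"])}

def open_envelope(envelope: dict) -> dict:
    # Stand-in for decryption with the recipient's private key.
    return json.loads(bytes.fromhex(envelope["payload"]).decode())

env = seal("alice", "hi", "bob")
print(server_view(env))              # no "from" field visible to the server
print(open_envelope(env)["from"])    # only the recipient learns the sender
```

Note that even with this design the server still sees who *receives* messages and when, which is why sealed sender reduces, rather than eliminates, the metadata problem.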
> And you're indeed right the client has reproducible builds. But the server side certainly doesn't.
That's true, but the server side is much less important when it comes to cryptographic assurances.
Signal is definitely not a panacea, but by many counts it's better than anything else that currently exists and has any semblance to something a typical user can use.
For what it's worth, they don't retain any of that metadata. This has been tested in court:
> We’ve designed the Signal service to minimize the data we retain about Signal users, so the only information we can produce in response to a request like this is the date and time a user registered with Signal and the last date of a user’s connectivity to the Signal service.
Every time Signal is brought up, someone just has to chime in saying "we must abandon Signal at all costs because metadata". The metadata limitation is well known; if metadata interception is a problem for your threat model, there are steps to obscure your identity, or you should use a different tool. For the 99% of other cases, where I just don't want anyone snooping on my conversation with friends and family but don't care that people know I'm obviously conversing with my friends and family, Signal is great. Let's not throw Signal out just because the metadata is still there.
Briar is good if metadata is a prime concern, but even Matrix, XMPP, and email have very similar metadata problems to Signal, plus contact discovery problems, since you can't casually discover that a friend or relative is on the platform (phone numbers mostly solve this).
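For illustration, here's what naive phone-number contact discovery looks like. This is a hypothetical sketch, not any platform's actual service: the client hashes each contact's number and checks it against the set of registered users. It also shows why this is weak in metadata terms: the phone-number space is small enough that such hashes can be brute-forced by whoever runs the lookup.

```python
import hashlib

# Hypothetical contact discovery by hashed phone number (illustration only).
def discover(my_contacts: list[str], registered_hashes: set[str]) -> list[str]:
    hits = []
    for number in my_contacts:
        # Hashing hides the number from casual observers, but the input
        # space (~10 digits) is tiny, so the server could brute-force it.
        h = hashlib.sha256(number.encode()).hexdigest()
        if h in registered_hashes:
            hits.append(number)
    return hits

# Server-side set of registered users (hypothetical numbers).
registered = {hashlib.sha256(b"+15551234567").hexdigest()}

print(discover(["+15551234567", "+15550000000"], registered))
# → ['+15551234567']
```

The upside, as noted above, is usability: discovery "just works" against an address book, which is exactly what Matrix, XMPP, and email lack.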
If metadata is good enough to drone-strike weddings, it's probably good enough to throw you in a concentration camp too. And since data never dies, it might be enough to throw your grandkids in concentration camps.
Now, protecting everyone's metadata is hard (probably impossible), and I don't mean to be defeatist, but "it's just metadata" doesn't sit well in a post-Snowden world. We know all large intelligence agencies hoover up this stuff.
And we also know that agencies are made up of people, and some people abuse their access.
I certainly don’t mean to discount the importance of metadata. I specifically mentioned ensuring Signal fits your threat model.
To suggest that metadata of communication over Signal between my spouse and me will be used against my grandkids one day is a bit absurd, though. Of course there's tons of metadata connecting my spouse and me. It would be more suspicious if there wasn't.
Spouse, "family" and friends are different goalposts. Mapping friends and family is AFAIK a key part of who gets bombed by the cia. Sure, if your spouse is found to be an "enemy of the state" under a new totalitarian government - your immediate family will have problems.
If a friend turns out to be a union organizer, you might be banned from jobs, if the government decides to collude with employers (again).
Let's not forget Signal is FOSS and has reproducible builds (https://signal.org/blog/reproducible-android/). This makes it far easier to trust its verification code.
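The core check behind a reproducible build can be sketched in a few lines: build the app from source, pull the distributed artifact off a device, and compare digests. This is a minimal illustration with made-up byte strings standing in for APK files; Signal's actual tooling is more involved (it normalizes the APK contents to account for the store signature before comparing).

```python
import hashlib

# Minimal sketch of reproducible-build verification: two byte-identical
# artifacts must produce the same digest. The byte strings below are
# placeholders for a locally built APK and one pulled from a device.
def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

local_build = b"apk bytes built from source"
store_apk = b"apk bytes built from source"  # same bytes => build reproduces

print("match" if digest(local_build) == digest(store_apk) else "MISMATCH")
# → match
```

The thread's point about OS integration stands, though: nothing in Android or iOS runs this comparison for you, so a user who never performs it gains no protection from the builds being reproducible.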