
duh374 t1_j9x1a2w wrote

The difference is that Signal is open source and WhatsApp is not. With Signal you can verify the encryption yourself; with WhatsApp you just have to trust it blindly.

32

einmaldrin_alleshin t1_j9xkeqd wrote

You can actually read the public key out of WhatsApp and use it to verify the encryption scheme.

But that would be of little use if they could extract private keys or plaintext messages from the device.

2

Prestigious_Push_947 t1_j9zp767 wrote

WhatsApp doesn't encrypt metadata, though, which is a massive privacy weakness. There's absolutely no excuse to be using WhatsApp.

1

carlosvega t1_j9xnebx wrote

But where is the proof that the app's code is the same as the GitHub code? 🤔 Do they provide some hash or something?

−1

SirCB85 t1_j9xso2m wrote

They allow you to compile your own executable of the app from the code visible on GitHub (for systems that allow sideloading, sorry Apple fans).

7

carlosvega t1_j9y2aau wrote

Yeah, that I know, but I was wondering whether they publish the MD5 (or another hash) of the APK or compiled app so you can check it afterwards, or whether it's possible to check the hash of apps downloaded from the store. I'm not sure why I'm being downvoted; I think it's a legitimate question.

Bad actors could fork the app, add some malicious changes, and publish it in third-party stores.

https://symantec-enterprise-blogs.security.com/blogs/threat-intelligence/open-source-apps-google-play

Something similar to this: https://www.infosecurity-magazine.com/news/malicious-python-libraries-found/

And I'm not the first one to ask this question:

https://opensource.stackexchange.com/questions/11098/what-guarantees-that-the-published-app-matches-the-published-open-source-code

Edit: a colleague just shared this with me! https://signal.org/blog/reproducible-android/
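The comparison that reproducible builds make possible boils down to hashing two files. A minimal sketch (file names here are placeholders, and real store APKs usually differ in signing/build metadata, which is exactly what Signal's reproducible-build process normalizes):

```shell
# Placeholder files standing in for the two artifacts you'd actually compare:
# the APK pulled from the store/device vs. a build you compiled from GitHub.
printf 'apk-bytes' > downloaded.apk      # hypothetical: store-downloaded APK
printf 'apk-bytes' > locally-built.apk   # hypothetical: your own build from source

# Prefer sha256 over md5, which is collision-prone.
# The digests match only if the files are byte-for-byte identical.
sha256sum downloaded.apk locally-built.apk
```

If the digests differ, that alone doesn't prove tampering; it may just be signing metadata, which is why a documented reproducible-build procedure matters.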

3

hodor137 t1_j9x20xo wrote

Oh yeah, it's absolutely more trustworthy than WhatsApp.

−2

drawkbox t1_j9x9cnn wrote

Being open source doesn't make it secure; it only means you can view the code. There are tons of other attack vectors beyond that: CI/build pipelines, dependencies, ghost users, surveillance masquerading as moderation/spam checking, and so on.

Open source libraries have been owned right in front of everyone. OpenSSL shipped the Heartbleed hole for two years while everyone relied on it. Log4j's Log4Shell left practically every Java deployment exploitable, and the bug had been sitting in the code since 2013...

−9

kcabnazil t1_j9xf5jy wrote

I hope no one is downvoting this because they think it is inaccurate.

It is, however, missing the point.

Being open source means security can be demonstrated objectively, not through obscurity. It means others can not only analyze the code for weaknesses but also contribute fixes for those weaknesses.

Whether or not that open source code is what's really used to build an application... is another matter. I wonder if that can be objectively proved for Signal. It definitely can't be for others ;)

13

drawkbox t1_j9xu5tw wrote

Agreed, security through obscurity is always a bad idea. Zero trust is the only way, and fewer third parties means fewer attack vectors to worry about.

My comment here addresses some of these points

While OSS has code that can be reviewed openly, which is good for trust at the company level, it is also a potential weak spot: people over-trust it and let in a bad dependency that not even the company knows has been compromised. It also lets attackers target the dependencies the code uses without ever needing to steal the code itself. You can trust that a company which open-sources its code will make sure it looks good and has fewer holes, but not always.

This has happened in OSS for decades now, to the largest toolkits with the most eyes and the broadest use, because going through the devs, sadly the weak link, is one of the best ways into systems. As a dev, I am blown away by how unaware many devs are of these issues.
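One concrete mitigation for the dependency angle is hash-pinning, so a tampered or typosquatted copy on the registry fails the install loudly. A sketch using pip's hash-checking mode (the package name and digest below are placeholders, not a real release):

```shell
# Hypothetical requirements file: "somelib" and its sha256 are placeholders.
# Every pinned package must carry a hash when --require-hashes is used.
cat > requirements.txt <<'EOF'
somelib==1.2.3 \
    --hash=sha256:0000000000000000000000000000000000000000000000000000000000000000
EOF

# With hash-checking mode, pip aborts if any downloaded artifact's digest
# doesn't match the pinned value (not run here, since the hash is fake):
#   pip install --require-hashes -r requirements.txt
```

This doesn't stop a malicious release that you pin on purpose, but it does stop the package being silently swapped out underneath you.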

3