drawkbox

drawkbox t1_jdlyskx wrote

It isn't often that one person, or a group like the "Traitorous Eight", goes on to create entire industries and new platforms. They did it, though, and that group included Gordon Moore and Robert Noyce. Moore and Noyce later split off and founded NM Electronics, which became Intel.

This was back when engineers/product people ran things and competition was driven by skill, not just funding. Imagine a new company today fully controlled by the engineers/creatives/product people; it happens, but not as often. We need to get back to that.

Moore's Law is an interesting case study in creating a term/law that outlives you and serves your self-interest, but also the interest of the industry and of innovation. The root of Moore's Law was making more products, more cheaply, allowing more people to use computing.

> Prior to establishing Intel, Moore and Noyce participated in the founding of Fairchild Semiconductor, where they played central roles in the first commercial production of diffused silicon transistors and later the world’s first commercially viable integrated circuits. The two had previously worked together under William Shockley, the co-inventor of the transistor and founder of Shockley Semiconductor, which was the first semiconductor company established in what would become Silicon Valley. Upon striking out on their own, Moore and Noyce hired future Intel CEO Andy Grove as the third employee, and the three of them built Intel into one of the world’s great companies. Together they became known as the “Intel Trinity,” and their legacy continues today.

> In addition to Moore’s seminal role in founding two of the world’s pioneering technology companies, he famously forecast in 1965 that the number of transistors on an integrated circuit would double every year – a prediction that came to be known as Moore’s Law.

> "All I was trying to do was get that message across, that by putting more and more stuff on a chip we were going to make all electronics cheaper," Moore said in a 2008 interview.

> With his 1965 prediction proven correct, in 1975 Moore revised his estimate to the doubling of transistors on an integrated circuit every two years for the next 10 years. Regardless, the idea of chip technology growing at an exponential rate, continually making electronics faster, smaller and cheaper, became the driving force behind the semiconductor industry and paved the way for the ubiquitous use of chips in millions of everyday products.
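The arithmetic behind that revised two-year doubling compounds fast. A quick sketch (function and numbers are mine, using the roughly 2,300-transistor Intel 4004 of 1971 as a starting point):

```python
def projected_transistors(start_count, start_year, end_year, doubling_period_years=2):
    """Project a transistor count forward under a fixed doubling period."""
    doublings = (end_year - start_year) / doubling_period_years
    return start_count * 2 ** doublings

# 15 doublings over 30 years turns ~2,300 transistors into ~75 million.
print(projected_transistors(2300, 1971, 2001))  # prints 75366400.0
```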

When he did become successful, he also gave back.

Moore gave us more. Then, when he made it, he gave even more.

> During his lifetime, Moore also dedicated his focus and energy to philanthropy, particularly environmental conservation, science and patient care improvements. Along with his wife of 72 years, he established the Gordon and Betty Moore Foundation, which has donated more than $5.1 billion to charitable causes since its founding in 2000.

3

drawkbox t1_jddii1o wrote

ChatGPT is basically built on AI tech created at Google Brain: transformers. These were used to build ClosedGPT. For that reason it is NopeGPT. ChatGPT is really just datasets, which no one knows; they could swap them at any time, run some misinformation, then swap back the next day. This is data blackboxing and gaslighting at the utmost level. Not only that, it is largely funded by authoritarian, or originally authoritarian, money...

Google Brain and other tech is already far more open than "Open"AI.

ChatGPT/OpenAI just front-ran the commercial side, but long term they aren't really innovating on this the way Google is. They look like a leader because of the marketing/pump, but they are a follower.

5

drawkbox t1_jdage34 wrote

Transformers, the T in GPT, were invented at Google by the Google Brain team. They made this round of progress possible.

> Transformers were introduced in 2017 by a team at Google Brain[1] and are increasingly the model of choice for NLP problems,[3] replacing RNN models such as long short-term memory (LSTM). The additional training parallelization allows training on larger datasets. This led to the development of pretrained systems such as BERT (Bidirectional Encoder Representations from Transformers) and GPT (Generative Pre-trained Transformer), which were trained with large language datasets, such as the Wikipedia Corpus and Common Crawl, and can be fine-tuned for specific tasks.
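For context, the scaled dot-product attention those transformer models are built on is small enough to sketch in plain Python. This is a toy illustration of the math, not any production implementation; names are mine:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention on plain lists of vectors:
    each query attends over all keys, producing a weighted mix of values."""
    d_k = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in keys]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out
```

Unlike an RNN, nothing here is sequential per token, which is the training parallelization the quote mentions.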

Bard will most likely win long term, though I wish it were just called Google Brain. Bard is a horrible name.


8

drawkbox t1_jdafxt0 wrote

On the flip side, AI's black boxes and swappable datasets, which take massive wealth to build, will be used for misinformation more than social media has been.

Even the OpenAI CEO Sam Altman admits this.

> But a consistent issue with AI language models like ChatGPT, according to Altman, is misinformation: The program can give users factually inaccurate information.

8

drawkbox t1_jcivo4f wrote

Definitely a front. There has been some consolidation among porn sites, and MindGeek's owner has faced all sorts of attacks; they even burned his house down. It was organized-crime stuff. This ECP is on the sketch list for sure. It sounds like Ethos Capital, the private equity firm that tried to buy the .ORG TLD.

Montreal mansion of Pornhub owner destroyed in criminal fire

Investigations should start on ECP right away.

23

drawkbox t1_jcg8ch7 wrote

They definitely had some schemes going, which is also why they wanted their own store: to pull more of this.

> In a complaint announced in December as part of a settlement package with Epic, the FTC said that Epic deployed a variety of design tricks known as dark patterns aimed at getting consumers of all ages to make unintended in-game purchases. Fortnite’s counterintuitive, inconsistent, and confusing button configuration led players to incur unwanted charges based on the press of a single button. The company also made it easy for children to make purchases while playing Fortnite without requiring any parental consent. According to the FTC’s complaint, Epic also locked the accounts of customers who disputed unauthorized charges with their credit card companies.

The most egregious part:

> According to the FTC’s complaint, Epic also locked the accounts of customers who disputed unauthorized charges with their credit card companies.

I love Unreal Engine; Epic Games needs to stop messing around before they ruin it.

14

drawkbox t1_ja27hqu wrote

The content is accurate; the opinion is based on the findings and additional inputs.

The fewer third parties you have, the better the opsec.

Messaging apps also get cam/mic/location/contacts permissions; Signal is no different: one more entity with your face/voice/place/friends.

You are free to keep using Signal, but trusting a third-party messenger is really, really risky.

Agree to disagree.

1

drawkbox t1_j9y8ajk wrote

> To scrape my Signal messages, you need access to my physical device, and you need my passcode.

Same with getting access to your iCloud/Apple account.

> To get access to iMessage messages, all you need to do is get my latest backup, or the backup of the person I talked to, off of Apple’s servers, for example, with a legal request, which completely bypasses the need for any user level auth/encryption.

As I said, anything stored in a cloud will have some oversight. Anyone who thinks storing something in a cloud is secure from oversight is dim.

If you think anyone from law enforcement can just get into your iCloud/Apple account, that would also mean they are able to access your device and everything on it, including your "end-to-end encrypted" Signal messages, which are plaintext on your client.

Messaging apps also get cam/mic/location/contacts permissions, Signal is no different, one more entity with your face/voice/place/friends.

You can trust Signal, a third party; rock on. Acton is involved in both: he came from WhatsApp, went to Facebook, and then, when people stopped trusting Facebook, he made Signal to catch those leaving. The story is that he didn't trust Facebook (no one should), but can you trust Signal/Acton, or is it a front? You decide. The problem is people trust them so much they've got you. I mean, Elon Musk and Edward Snowden recommend it... it must be safe /s. Signal is maybe safe, maybe even safe from some of the Five Eyes, but not from all the eyes outside the five. Even then, there are always ways in via dependencies, and devs (mainly devops) are sadly the weak link today.

Agree to disagree, good discussion though.

0

drawkbox t1_j9y5gri wrote

You dismiss user-level auth/encryption like it is nothing. If anyone had access to user-level auth, why go to the backup file? Just go to the device, load up Signal, and scrape the messages.

All other messaging apps have the "ghost user" problem confirmed.

Signal has a shim for spam that is very unclear; I'll just say that. There are other things around Signal that make it sus, in my opinion.

> The source code for spam detection is not public.

So there is a plausible-deniability reason to hide some code... you have to trussssst. Here's the kicker: you can't check for spam if you aren't seeing the message.

Even on the self installed versions...

> Signal's servers are partially open source, but the server software's anti-spam component is proprietary and closed source due to security concerns

Signal does handle users being added better, but this could just be theater as well.

> The real problem with the GCHQ proposal is that it targets a weakness in messaging/calling systems that’s already well-known to providers, and moreover, a weakness that providers have been working to close — perhaps because they’re worried that someone just like GCHQ (or probably, much worse) will try to exploit it. By making this proposal, the folks at GCHQ have virtually guaranteed that those providers will move much, much faster on this.

> And they have quite a few options at their disposal. Over the past several years researchers have proposed several designs that offer transparency to users regarding which keys they’re obtaining from a provider’s identity service. These systems operate by having the identity service commit to the keys that are associated with individual users, such that it’s very hard for the provider to change a user’s keys (or to add a device) without everyone in the world noticing.

> As mentioned above, advanced messengers like Signal have “submerged” the group chat management into the encrypted communications flow, so that the server cannot add new users without the digitally authenticated approval of one of the existing participants. This design, if ported to more popular services like WhatsApp, would seem to kill the GCHQ proposal dead.

I personally don't trust Signal for a few reasons beyond these items, but if you trust them then rock on.

−2

drawkbox t1_j9y44d1 wrote

> end to end encrypted chat service

End-to-end means nothing, though, when a client isn't in your control and may have another user attached. Again, this is common in many messaging apps.

iMessage is end-to-end encrypted from iMessage to iMessage, just like Signal to Signal. If you use SMS, it is not.

> We designed iMessage to use end-to-end encryption, so there's no way for Apple to decrypt the content of your conversations when they are in transit between devices. Attachments you send over iMessage (such as photos or videos) are encrypted so that no one but the sender and receiver(s) can access them

Apple can't magically make SMS secure; it is not secure by default, since it was really built for telephone diagnostics and repurposed for messaging. So when you message Android, it falls back to SMS. SMS came out of SS7 and was really only for diagnostics or test messages to customers. MMS is better, but still not great. Apple should bring iMessage to Android and do better on messengers; the gap is leading many people to third parties that open up opsec issues.

> With Signal, the whole idea is you shouldn’t have to trust Signal with your messages, because they don’t have the ability to read them, even if they wanted to.

That is a bold statement. Yes, on the surface. Again: ghost users, compromised clients, and endpoints can all have problems. Also, Signal does have a proprietary shim for monitoring, spam checking, and other things, which could easily be used to surveil or sift. There are probably a dozen ways or more to get around it currently, at the dependency level or by breaking their encryption, as it is custom.

The best opsec is always fewer third parties.

−1

drawkbox t1_j9y2ig6 wrote

Signal definitely seems the best of them if you are into using a third-party messenger, for now.

I would still trust OS-level messaging on mobile over third parties because of the scale, future funding, incentives, and trust. The OS already has access to your info. Other people getting access to your data is probably always easier on third-party systems; even if the third party is trustable, not every person or dependency is.

iMessage is secure; if you are going straight SMS, yes, that is more open. I also know what Apple wants and their goals fully: a secure platform that isn't just messaging.

The fact is, though, every system has holes and security issues, so the best opsec is fewer third parties, big or small, open or closed...

Just ask Jeff Bezos after he got hacked, via a temporary WhatsApp hole, by something sent to him by freaking MBS.

1

drawkbox t1_j9xzevz wrote

> If a cloud service is truly end to end encrypted, and designed well, nobody but the end user should be able to access the data.

I would agree in principle, but this is just not the case, with so many holes and side channels out there. The cloud is good for securing content from others; oversight will always find a way. Anyone who thinks otherwise is a sucker.

> Or if you have access to the files on Apple’s server, then no user auth is required.

User auth is still required, but yeah, you could hack Apple, I suppose, and get it. Good luck, though.

> There are many commercial and open source tools that are able to read the backup file for you. Elcomsoft, iMazing and the Citizen Lab Mobile Verification Toolkit are some examples.

If those apps are getting the user context, then sure. If not, then no.

Take Elcomsoft, for instance, with LastPass vs. other password managers. That is why you don't install clients or extensions like LastPass.

Read this closely:

>> Windows Data Protection API Not Used

>> One may argue that extracting passwords stored by the Google Chrome browser is similarly a one-click affair with third-party tools (e.g. Elcomsoft Internet Password Breaker). The difference between Chrome and LastPass password storage is that Chrome makes use of Microsoft’s Data Protection API, while LastPass does not.

>> Google Chrome does, indeed, store user’s passwords. Similar to third-party password managers, the Windows edition of the Chrome browser encrypts passwords when stored. By default, the encrypted database is not protected with a master password; instead, Chrome employs the Data Protection API (DPAPI) introduced way back in Windows 2000. DPAPI uses AES-256 to encrypt the password data. In order to access passwords, one must sign in with the user’s Windows credentials (authenticating with a login and password, PIN code, or Windows Hello). As a result, Google Chrome password storage has the same level of protection as the user’s Windows login.

>> This, effectively, enables someone who knows the user’s login and password or hijacks the current session to access the stored passwords. This is exactly what we implemented in Elcomsoft Internet Password Breaker.

>> However, in order to extract passwords from Web browsers such as Chrome or Microsoft Edge, one must possess the user’s Windows login and password or hijack an authenticated session. Analyzing a ‘cold’ disk image without knowing the user’s password will not provide access to Chrome or Edge cached passwords.

>> This is not the case for the LastPass Chrome extension (the desktop app is seemingly not affected). For the LastPass database, the attacker will not need the user’s Windows login credentials or macOS account password. All that’s actually required is the file containing the encrypted password database, which can be easily obtained from the forensic disk image. Neither Windows credentials nor master password are required.

>> macOS has a built-in secure storage, the so-called keychain. The Mac version of Chrome does not use the native keychain to store the user’s passwords; neither does the iOS version. However, Chrome does store the master password in the corresponding macOS or iOS keychain, effectively providing the same level of protection as the system keychain. Elcomsoft Password Digger can decrypt the macOS keychain provided that the user’s logon credentials (or the separate keychain password) are known.

Elcomsoft notes the OS-level protections on these.
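The distinction Elcomsoft draws comes down to whether the encryption key is derived from the user's logon secret. A simplified sketch of that idea using PBKDF2 (this is not DPAPI's actual internals; names and parameters are illustrative):

```python
import hashlib

def derive_key(user_password: str, salt: bytes) -> bytes:
    """Derive a data-protection key from the user's logon secret
    (PBKDF2 here as a stand-in for what DPAPI does internally)."""
    return hashlib.pbkdf2_hmac("sha256", user_password.encode(), salt, 200_000)

# The salt sits on disk next to the data; alone it is useless to an attacker.
salt = b"per-user-salt"
key = derive_key("correct horse battery staple", salt)

# A cold disk image hands the attacker salt + ciphertext but no logon
# password, so the Chrome-style store stays sealed. A blob decryptable
# without the logon secret (the LastPass extension case above) does not.
```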

> It wouldn’t be the first time someone’s iCloud account was hacked into.

If someone gets into iCloud, they can most likely get into the device, and again, the point of a "secure" messenger or cloud falls apart because they have access to the user. Yes, people should be careful with their user account; it opens up everything.

> not even the service provider hosting the cloud service can access your data.

If you believe this, then you believe in magic. Even if a provider tried to do this, software has holes; see OpenSSL/Log4j/Log4Shell and on and on. The fact that you trusted it because they said they don't look... it was probably a lie, but even if it wasn't, they can still get in.

1

drawkbox t1_j9xxebn wrote

I mean, pretty much anything in a cloud should be considered secure from everything but law enforcement.

The point is you still need the user context/auth. These files only work with the OS to access them. Like an iOS app or Windows Store app's data/settings, they are specifically signed/encrypted to your user; outside of that context they are useless. Third-party ones are usually not tied to the OS/browser/app, for a reason.

I think most people are worried about hackers/ransomware/criminals more than law enforcement, so that is why people who use the cloud are willing to make the tradeoff. The most insecure place is most likely the local system; it is very easy to compromise a user compared to Apple/Google/Microsoft. It is possible, but way more difficult; you almost have to be rogue-state-level funded for that.

> Well.. then just don’t add participants you don’t trust to your group chats…

Sure, but there is a 'ghost' user ability. In many messengers it is used to look for spam, for moderation, or for other potentially nefarious reasons. Any chat system that can connect more than two people has the potential for a user you cannot see. This is the most common use, as in honeypot apps.

Encrypted messaging app used by criminals was actually an FBI honeypot

>> The encrypted messaging app in question was called ANOM, and was installed on special smartphones that couldn’t make calls or send emails. ANOM purported to be end-to-end encrypted, meaning only the sender and receiver could view messages. In reality, every single message was passed to police, who used them to make the arrests.

Ghost users are a major problem in "secure" messaging apps. There are plausible-deniability reasons for them, spam detection, moderation, etc.; that is the rub.
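The membership problem can be sketched abstractly: in a per-recipient fan-out design, whoever controls the member list controls who can decrypt. A toy model of mine, not any real protocol:

```python
def fanout(message, members, encrypt_for):
    """Encrypt one message once per member of the conversation."""
    return {m: encrypt_for(m, message) for m in members}

members = ["alice", "bob"]
members.append("ghost")  # silently injected by whoever runs the server

ciphertexts = fanout("hi", members, lambda m, msg: f"enc[{m}]({msg})")
# "ghost" now gets its own validly encrypted copy; the end-to-end
# encryption holds for every recipient, including the one you never saw.
```

The Signal-style mitigation quoted earlier (authenticated group membership) attacks exactly this: the server alone can no longer grow `members`.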

> if there’s an easily available plaintext version too?

It is only plaintext in the context of the user... Taken out of that context, it is no longer plaintext. People make this claim about browser password managers, but everything is tied at the system level to the user. Sure, if someone gets the user context, then they can get the files unencrypted; that is how it works. That would also mean everything is compromised, even your "secure" third-party messenger like Signal.

> Signal is open source and you can run your own server if you want.

Yes. That still doesn't mean a third party that relies only on messaging is trustable.

Apple/Google/Microsoft have a vested interest in securing all your content; you don't have to worry about them stealing or siphoning messages.

All of them will most likely be open to law enforcement, because there are so many attack vectors in these systems, especially at third parties that don't have the same cyber-security sophistication, simply due to cost.

1

drawkbox t1_j9xvw64 wrote

Good info. I wish someone had a good version of the leaked screenshot; it is so small.

The point with iCloud is that it is always under the user's security context, which is encrypted. The backup files themselves weren't, but across the board, OS- and cloud-level access requires the user context; if you took those files outside the system, you'd still need the auth context.

For law enforcement that data is more accessible on iCloud, but for others, like cyber criminals or ransomware, it is more difficult.

Phone backups also don't have to go to iCloud. It is wise to, so you don't lose content, but you can still back up to a desktop or elsewhere.

The point is, they aren't a third party, they don't make money only from messaging, and they have a very vested interest in making sure their system is secure from third parties. If you add a third party into the mix, like for messaging, you'd better trust it, because your OS/device can already see that content AND now so can the third party. Adding more attack vectors is really just security by obscurity, with more obscurity.

> Signal by default doesn’t keep its data in device backups. You’d need to build a custom client to get it to do that. There’s no way to get Signal to not end-to-end encrypt it’s chats, it’s on by default and can’t be turned off.

This all falls apart when a participant is added (ghost or actual) that gets the entire convo. This is very common in messaging apps and a known issue on WhatsApp, Telegram, and many others; Signal also has the ability to attach users to convos. The moment you have another participant, all the end-to-end encryption is moot.

Focusing on just encrypted backups is probably what third parties want people to focus on, because they are third parties and want you to use them. Even if you trust them, how long can they be trusted? When they are bought out by private equity later, that can get bad; then they might sift everything. It is a lot like those VPNs that claim "we retain no logs" but divert them to a third party: when they are audited the logs surely aren't there, but they are still out there.

2

drawkbox t1_j9xu5tw wrote

Agreed, security through obscurity is always a bad idea. Zero trust is the only way, and fewer third parties helps you minimize the attack vectors.

My comment here addresses some of these points.

While OSS has code that can be reviewed openly, which is good for company-level trust, that is also a potential weak area where people will overly trust it and let in a bad dependency that not even the company knows got compromised. It also lets attackers target dependencies the code uses without even needing to steal the code. You can trust that a company that open-sources will make sure its code looks good and has fewer holes, possibly, but not always.

It has happened in OSS for decades now, to the largest toolkits with the most eyes and the broadest use, because that is the best way to get into systems now: via the devs, who are sadly the weak link. As a dev, I am blown away by the lack of awareness among devs of these issues.

3

drawkbox t1_j9xtvb4 wrote

I am more zero trust, but if you are going to trust, trust fewer third parties, even if they are trustable. Third parties get sold. Third parties need to make money from that product alone, not in other ways (Apple/Google, for instance, don't need you using messaging to survive).

If you are already in a browser, a password store/generator without a third party involved is safer. The OS, browser, and company already have you; why involve a third party?

Same with messaging... Trusting WhatsApp/Signal/Telegram is not only another third-party layer, it is your most private content... why trust a funded/private-equity/questionable-source system if you don't have to?

Signal does appear to be the best of them; however, being open is not always safer.

The new trick is dependency/build attacks, so good that sometimes the main company doesn't even know it is happening (see SolarWinds, which was hacked via its TeamCity CI: the bad bits were inserted into the dependencies at build time, while the code itself was fully independently verified). The problem is blanket trust. It is what led to the OpenSSL Heartbleed hole and the Log4j/Log4Shell hole, and pretty much every big hole in the last year was part of open source.

When a company gets its source code stolen (LastPass, for instance), the point is to find dependencies the attacker can manipulate, not the code itself. Almost all closed code uses dependencies that are open or known and have known holes; the key is exploiting those once you know the code flow. Open source actually makes that part easier: no need to steal the source code.

I am a big OSS fan, but I hate how devs are the weak link today. Devs today are so willing to trust a third party because they heard about it or it saves a day. Those are the MOST targeted dependencies...
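One concrete defense against the build-time tampering described above is pinning every dependency to a known hash and verifying it before use. A minimal sketch (function name mine):

```python
import hashlib

def verify_artifact(path: str, expected_sha256: str) -> bool:
    """Compare a downloaded dependency against a pinned hash, so a
    compromised build server can't silently swap the bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256
```

This is the same check pip's `--require-hashes` mode and the hash fields in npm/cargo lockfiles perform, though none of that helps if the pinned hash itself came from a compromised source.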

3

drawkbox t1_j9xtbji wrote

The iMessage/iCloud backups are linked to a user, though, and everything is keyed on that. Now they can additionally encrypt, but it was always encrypted under the user.

I see this same complaint about browser password managers built into the browser (not an extension): they do encrypt now, but they used to protect the data just by the user context. You'd have to log in as the user to be able to decrypt or access anything. Things like Signal, Telegram, LastPass, Bitwarden, and other third-party-style systems do not encrypt by user; the data is encrypted, but it can be attacked outside the context of the user, which is not possible with backups, iMessage, or Chrome/Safari/Edge passwords.

> Importantly, as a sender, you have no idea if your recipient is taking the proper precautions, and no way to enforce it.

By default, Signal/Telegram both use your number, and if one participant in the chat (even a 'ghost' user) isn't taking the proper precautions, or even if they are, all that data is wide open. Telegram has encryption off by default. If one of your recipients is set up that way, well, you are wide open.

1

drawkbox t1_j9x9o8o wrote

Being open source does not mean it is secure. If anything, it means people will overly trust it.

Open-source libraries have been owned right in front of everyone. OpenSSL had the Heartbleed hole for years; everyone was owned. Log4j/Log4Shell owned every device with Java on it, including all Android phones, for over a decade...

Opening up private messages to a third party isn't a good idea. If you are on Apple, use iMessage; Apple can already get your info. Same on Google. Using an additional third-party client, as well as a desktop client, opens you up to all sorts of attack vectors; even if you trust the company, they can be hacked. Trust leads to intrusions.

1

drawkbox t1_j9x9cnn wrote

Being open source doesn't make it secure; it just means you can view the code. There are tons of other attack vectors past that: CI/build, dependencies, ghost users, surveillance masquerading as moderation/spam checking, and so on.

Open-source libraries have been owned right in front of everyone. OpenSSL had the Heartbleed hole for years; everyone was owned. Log4j/Log4Shell owned every device with Java on it, including all Android phones, for over a decade...

−9