The US federal government issues its employees smart cards (Common Access Cards) that contain digital certs. Government employees can use these to send and receive S/MIME encrypted emails. That's a couple million users!
Our small company has been encrypting all emails by default with S/MIME for 15-20 years. A company can generate its own certs for free from a company root cert, use a provider like Sectigo for $20/year, or get US Government ECA certs for about $100/year.
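For anyone curious, generating your own certs is roughly this much work with OpenSSL (a sketch only; the names, key sizes, and validity periods below are placeholders, not our actual setup):

    # 1. Company root CA
    openssl req -x509 -newkey rsa:4096 -keyout root-ca.key -out root-ca.crt \
        -days 3650 -subj "/O=Example Corp/CN=Example Corp Root CA"

    # 2. Per-user key and CSR, with the email address in the subject
    openssl req -newkey rsa:3072 -keyout alice.key -out alice.csr \
        -subj "/O=Example Corp/CN=Alice Example/emailAddress=alice@example.com"

    # 3. Sign the CSR with S/MIME-appropriate extensions
    cat > smime.ext <<'EOF'
    keyUsage = digitalSignature, keyEncipherment
    extendedKeyUsage = emailProtection
    subjectAltName = email:alice@example.com
    EOF
    openssl x509 -req -in alice.csr -CA root-ca.crt -CAkey root-ca.key \
        -CAcreateserial -days 365 -out alice.crt -extfile smime.ext

    # 4. Bundle key + cert as PKCS#12 for import into a mail client
    #    (and into whatever key backup/escrow you keep)
    openssl pkcs12 -export -inkey alice.key -in alice.crt \
        -certfile root-ca.crt -out alice.p12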
You can read encrypted emails on company-managed mobile devices that have Knox chips to secure access to the certificate. We're careful to back up all our old keys so we can always read old emails.
Some drawbacks are:
- Email "search" features only see the subjects, not the contents, of encrypted emails.
- You can't read encrypted emails via web email.
- Few others have S/MIME certs. Most major government contractors seem confused when we ask about encrypting emails with them...
Johnny may not encrypt, but every business really can.
Someone needs to design a super dumb and robust system where I can safely store all my keys on every device where I use an account. The fact that WhatsApp, Signal, and other platforms tend to have a primary device for keys is bonkers to me. A primary device that can randomly die, get stolen, or fall in a lake.
I have lost chat histories more times than I can remember, and I have to be extra diligent about this these days.
I don’t even want to think about pgp when I have to manually take care of this problem. Not because of my own skills, but because I could never make it reliable for my family and friends on their side.
> I have lost chat histories more times than I can remember, and I have to be extra diligent about this these days.
As per Signal’s diehard proponents, losing chat history is a feature, not a bug (I’m not being facetious when saying this, and you can see comments of this kind in Signal related threads here).
Edited to add: I don’t agree with that premise and have long disliked losing chat history.
I know you are not being facetious. My problem is that the random Joe on the street sees it as a bug. He really does care more about actually being able to talk with his wife than about Signal’s mathematically correct principles. He needs it to be reliable first, secure second.
GP here. I agree. I should’ve stated that I don’t like losing chat history and have seen that as a problem with Signal.
I have edited my previous comment to reflect that I don’t like losing chat history.
> He needs it to be reliable first, secure second.
Then he should use something else. I need Signal to be secure first, second, and third, and reliable in edge cases like this a distant fourth.
You've been downvoted, but I think that's a fair take. There will always be tension between security and usability; it's difficult (impossible?) to do the absolute best in both metrics.
Signal's development team can decide that they prioritize security over usability to whatever degree they like, and that's their prerogative. That may result in fewer users, and a less than stellar reputation in the usability space, but that's up to them. And if we (the unpaying user base) don't like it, we are free to use something else that better meets our needs.
Maybe an answer is to have a per-message control that you can set to plain text, encrypted with a cloud-backed-up key, or encrypted with a key that exists only on this device. Then you could message "hi mum, running late" without complications while being able to hard-encrypt when you want?
Signal is already complication-free (at least until your phone falls in a lake), making the control useless.
(And you probably don't need to worry about losing the 'running late' message in the lake... The need for good encryption and reliable backup on any given message is likely somewhat correlated.)
Perhaps it’s a marketing problem, then. Signal is marketed as a secure and full-featured alternative to things like WhatsApp and iMessage. Most people start reading that sentence after the word “secure”, and then are surprised and disappointed when a device replacement loses all their history.
I think it would be better if Signal more loudly communicated the drawbacks of its encryption approach up-front, warning away casual users before they get a nasty surprise after storing a lot of important data in Signal.
I’ve heard Signal lovers say the opposite—that getting burned with data loss is somehow educational for or deserved by casual users—and I think that’s asinine and misguided. It’s the equivalent of someone saying “ha! See? You were trading away privacy for convenience and relying on service-provider-readable message history as a record all along, don’t you feel dumb?”, to which most users will respond “no, now that you’ve explained the tradeoffs…that is exactly how I want it to work; you can use Signal, but I want iMessage”.
It shouldn’t take data loss to make that understood.
Yeah, but if I use Proton for everything else and Signal only for my secret world domination plans, traffic analysis will be so much easier…
Congrats on not being one of the people concerned about being targeted by their government, now or in the future.
Hundreds of millions are not so lucky.
(I am a security person who prioritizes security over usability, but) you missed the point a bit. If a privacy program is used only by people who have something to hide, it turns into a smoking gun. If you care about being targeted by a government, you should really hope regular people use Signal a lot, because the government absolutely has (or can procure) a list of people who use Signal.
Signal has a backup service in beta that you can use right now.
My company recently really cut back on Slack retention. At first I was frustrated, but we all quickly got over it; work carried on at the same pace as before, and nothing really got impacted the way many of us imagined it might.
That bears little resemblance to the Signal concerns. The reason people are worried about losing their personal messages is not lost productivity.
It's also not even really the same situation. A more apt analogy would be if switching work laptops sometimes meant you could no longer read any Slack history.
I'd hate this. Slack is an extension of my memory, and it being long-lived and searchable can be a superpower: you don't have to remember all the details of everything, just enough of the who, what, and when to find the rest.
It's fine until you need evidence someone agreed to something months ago but all records have been deleted.
Yeah, mail is the primary source of this.
Once communication with my customers moved to Teams, I've had a very hard time finding historical agreements and decisions.
I try very hard to maintain a robust system for ADR logging now, and not just for system architecture, but for all decisions and agreements in my projects and across changes.
I expect that some types of people (in middle management, especially) may see the lack of this as a good thing.
Methinks the better solution here is to get better friends?
Well I don't think most people choose who they work with. Even if you like your team a lot, you might have a discussion with someone from another team or division, and that's where it's useful to have a good chat history haha.
A certain type of person sees this as a feature, not a bug.
This is a difference in the threat model.
Signal's threat model is that everything around you is hostile to you, except the parties you interact with. You are an undercover rebel in a totalitarian sect which would sacrifice you to Cthulhu if they see your chat history. Losing it is much better than disclosing it.
Your threat model is likely random black-hat hackers who would try to get into your communication channels and dig up some dirt to blackmail you, or to impersonate you and scam your grandmother out of several thousand dollars. Signal protects quite well against that. But the chance of this happening even over an unencrypted channel is low enough that you don't mind a somewhat weaker security posture if it preserves the possibility of restoring your chat history when your secure device is lost or destroyed.
I suppose the problem could be solved by an encrypted backup with a long key which you keep on a piece of paper in your wallet, and / or in a bank in a safe deposit box. Ideally it would be in the format that the `age` utility supports.
But there is no way around that paper with the long code. If the code is stored on your device and can be copied, it will be copied by some exploit. No matter how inconspicuous a backdoor you make, somebody will find it and sneak in. Should that happen in a publicized case, public opinion will be "XYZ is insecure, run away from it!".
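As a sketch of what I mean, assuming the app could export the history to a file first (filenames are placeholders; the passphrase is the thing you write on paper):

    # Encrypt an exported chat history with a passphrase; if you leave the
    # prompt empty, age autogenerates a strong one for you to write down
    age --passphrase --output chats.tar.age chats.tar

    # Years later, on a new device: retype the passphrase from the paper copy
    age --decrypt --output chats.tar chats.tar.age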
> If this code is stored on your device, and can be copied, it will be copied by some exploit.
Yeah... We really need some key-management hardware where the secrets can be copied by some channel that is not the primary one. This used to be more common, before the IT companies started pushing everything into the cloud.
I have recently started to see computer boards with write protection for the UEFI data, which is a related thing that also went away, mostly because of Microsoft. So maybe things are changing back.
So, the requirement is a system that stores all your keys and can be duplicated as many times as you wish. That sounds like a local password manager, say KeePass. I use it and have copies of the encrypted DB on every device of mine, plus the client to access the passwords. I don't know if it qualifies for dumbness, but it feels pretty robust. It survived the fall-into-the-lake test (a river, in my case).
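Keeping the copies in sync is the only moving part, and even that can be a dumb one-liner (hostnames and paths below are placeholders; the file itself stays encrypted the whole time):

    # Push the encrypted KeePass database to every device after a change
    for host in laptop tablet phone-backup; do
        rsync -av ~/vault/passwords.kdbx "$host":keepass/
    done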
But I see every customer of mine using web-based password managers, because they want to share and update passwords with their whole team. Of course those password managers can use E2E encryption, and many do, but my instinct is that if you are using somebody else's service for your data, you can be locked out of your data.
Anyway, it's the concept of having many passwords and having to manage them that's not dumb enough. The most that most people do is let the browser store and autofill passwords. And the password can be the same 1234pass on every single site.
Web-based password manager user here! It's worth noting that Bitwarden and 1Password (probably all the others too) let you export all of your data into an encrypted archive, so anyone who does this periodically won't be "locked out".
(Naturally, this requires extra effort on the users' part, so who knows how many are actually using this ability.)
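It can even be scripted; something like this with the Bitwarden CLI (flags from memory, so double-check against `bw export --help` on your version):

    # Periodic encrypted vault export (assumes an already-unlocked bw session)
    bw export --format encrypted_json --output ~/backups/vault-$(date +%F).json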
Maybe I'm old but I never expect chat history to be a permanent thing. It's like talking to someone, it should be ephemeral.
If you need a record, use email. Recording and archiving every conversation with someone is just weird.
Thanks for listening, now you dang kids can get off my lawn
I set up automatic backups of WhatsApp to my self-hosted Nextcloud once. Since you need 'tested backups', I tried to decrypt these WhatsApp backups independently of my phone, but it was not possible. You need the original device. There are some hacks online, but they are always out of date.
I am now leaning toward running the Mautrix WhatsApp bridge and backing up my data through that.
Ask yourself: if you want things to be encrypted by default in the world, would a florist be able to self-host Nextcloud?
Agreed. I am still unhappy, but perhaps this is entirely my problem.
Apple/Google passkeys.
Two problems: Apple. And Google.
Indeed, passkeys would seem to represent a step forward from single-device to single-account.
Passkeys are often stored/locked per device?
My proposed device is like a YubiKey, but instead of the YubiKey's USB-stick form factor,
it's in the form of a ring or bracelet: small enough to be carried with you everywhere, all the time.
It uses NFC-like technology, so it works without a battery, is fast, and is "secure enough" for 99% of people.
What if the device is stolen? We can add authorization such as biometrics (fingerprint, etc.) while touching the device, so it can be sure the real owner is "giving" the auth.
The problem is not a personal hardware security module; as you noted, we have them. The problem is that people want redundancy that undermines the point. If you can easily make a copy of your ring just in case, how do you know who else has gone through that process, and who is watching you all the time? Biometrics sound like a solution, yet they are implemented as a cosmetic security layer, and that is pointless to fix since we leave our biometrics everywhere we go.
If people want to copy, then let them copy.
"How is that secure????" We would let only one device be active at a time.
If you think a secure enclave with biometric security is "weak", then no one is secure.
If you think a combination of (fingerprint, DNA, blood variance, retina, star time + position, mental memory, etc.) is not enough, then nothing is enough.
(We are assuming a future where we can access all this technology) << this is the important point here.
And if even that is not enough, pfft (I don't want to go here): a Neuralink device that lives under your skin.
> Proton is a notable exception.
Proton doesn't provide public APIs for retrieving the public GPG keys associated with their users' accounts, nor do they provide a way to send encrypted mail to their users' accounts without using their official apps.
Ergo, Proton is not really working to further the state of cryptography for email, they're only working to compel users to use their proprietary software (and ultimately their paid services).
If services that do automated sending of emails to their subscribers/users have no way to encrypt those emails for users who are on Proton Mail, I don't understand how Proton can claim to care about encryption.
You can fetch a user's PGP public key via their HKPS endpoint, for example https://mail-api.proton.me/pks/lookup?op=get&search=username.... The one who apparently doesn't support PGP at all is Tuta.
Ideally, you'd be able to provide the service your key directly (you can do it in Sourcehut for example, IIRC), and they use that key without relying on a third-party server. Maybe using something like WebFinger could be a solution too, for automatic key discovery from a "trusted" party (the recipient's email server).
> ...nor do they provide a way to send encrypted mail to their users' accounts without using their official apps.
I'm confused by this complaint. Sending encrypted mail is the job of the sender. You can PGP encrypt your mail and send it to a Proton user just like any other recipient. I've done this at work when I need to send myself paystubs.
Uhm you can curl https://api.protonmail.ch/pks/lookup?op=get&search=$email_ad... for any valid $email_address and get the public key.
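So from any normal GPG setup it's roughly this (recipient address and filename are placeholders; the lookup endpoint is the one above):

    # Fetch the recipient's public key from Proton's lookup endpoint
    gpg --fetch-keys "https://api.protonmail.ch/pks/lookup?op=get&search=alice@proton.me"

    # Encrypt and sign a message for that recipient; produces message.txt.asc
    gpg --armor --sign --encrypt -r alice@proton.me message.txt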
I have used this to send signed/encrypted mail to a ProtonMail recipient. It worked, until he responded inline without encrypting it to my private key, thereby completely defeating the point.
(Later I informed him of how to automatically sign and encrypt outgoing mails to my account, as that is possible too, but not obvious at all.)
PM should make this more obvious, but in principle the interoperability is there and works.
Proton still appears to suffer from Lavabit's pathologies in several ways: it ultimately stores GPG private keys, hasn't had its "zero-access encryption" audited by an independent third party, hosts servers in privacy-hostile jurisdictions where they can be seized, and has already handed user data to authorities over 30k times. [0] Proton Mail is a simulacrum of privacy as a service that lies to its customers.
At present, the best way to assure privacy is to lease (using cryptocurrency) VPS instances in a neutral, privacy-respecting country and self-host a webmail stack yourself. There isn't really a practical way around this, because powerful nation states are able to demand access to customer data from almost every cloud/VPS provider in their jurisdiction.
0. https://proton.me/legal/transparency
If you're at the point where you're hosting your own mail, you may as well GPG encrypt your own messages (with your locally stored private key), which makes the jurisdiction irrelevant.
Of course, this still assumes your correspondents will be capable of doing the same.
Encrypt with your public key, surely?
Fair; encrypt with your recipient's public key, sign with your private key.
Assume your correspondents can do the same, as in: encrypt with your public key and sign with their private key.
> in a neutral, privacy-respecting country
Is there such a thing ?
I’ve got hundreds of emails from the early 2010s between a couple of coworkers and myself that I can no longer read because they were S/MIME encrypted and I’ve got no idea what happened to my keys or even if my current client supports it anymore.
I wish the client stored it decrypted once received.
> I wish the client stored it decrypted once received.
Me too. My systems already have full-disk encryption; I need the communication itself to be end-to-end encrypted in transit.
Email clients (like Thunderbird) keeping emails stored encrypted just makes it harder for those tools to search, label, and automate stuff around the content.
I'm sorry for your loss, but this sounds like an antipattern. Hundreds of emails between co-workers and it was all contemporaneously related to work in progress or cat pictures of your own cats, didn't contain PII or proprietary information of your employer or unaware third parties? And you want it back? From far enough away (that I might as well be in orbit) this seems preferable to an unencrypted drive ending up in somebody's hands for "refurbishment" (cough printers with hard drives).
No one is innocent. I refuse to use LE and operate my own CA instead, and as a consequence of scareware browser warnings I publish http: links instead of https: (if anyone cares, you know to add the "s" don't you?). I run my own mailserver which opportunistically encrypts, and at least when it gets to me it's on hardware which I own and somebody needs a search warrant to access.. as opposed to y'all and your gmail accounts. I do have a PGP key, but I don't include it on the first email with every new correspondent because too many times it's been flagged as a "virus" or "malicious".
Clearly we live in a world only barely removed from crystals and ouija boards.
> Hundreds of emails between co-workers and it was all contemporaneously related to work in progress or cat pictures of your own cats, didn't contain PII or proprietary information of your employer or unaware third parties?
You're merely defining away the problem. You have no idea what was in those emails.
Whatever was in those emails wasn't important enough for them to decrypt them in a durable fashion, or to put the keys in a safe with the gold bars.
We call this the "scream test" in BOFH land.
Who knew I’d need to do this? I’d never needed to do this with my emails in the decades prior.
You’ve also got no idea what was in those emails. Could be some valuable knowledge or logs about some crazy rare bug or scenario, and would be useful to review today.
We just turned on S/MIME by default, to “be secure”, whatever that means. There was no warning in the email client about losing access to the email if you lost your keys.
Citing BOFH is all well and good inside certain circles. In the real world, people don’t like spending time or effort on poorly thought out and implemented solutions.
It wasn't important enough at the time to the BOFH.
It's weird. Almost all web traffic is now https - even though very little of it is sensitive. Email, on the other hand, is quite often sensitive, and yet...no one cares.
Why?
Nearly all email is encrypted in transit. All major MTA systems send encrypted and accept encrypted as the default.
This article is about encrypting the body of the email, which is easy* but has no widely implemented standard.
* Stupid easy for two nerds to email securely.
* Stupid hard to work with multiple people and non-nerds.
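You can check the in-transit claim for any recipient domain yourself (the hostnames below are placeholders for the domain and its MX):

    # Find the recipient domain's MX, then see whether it offers STARTTLS on port 25
    dig +short mx example.com
    openssl s_client -starttls smtp -connect mx.example.com:25 -brief </dev/null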
It seems like the bigger day-to-day issue is the possibility of downgrades from STARTTLS or a server that doesn’t support TLS. GPG-style encryption isn’t necessary and might even be unwanted (a company may need records of all its emails).
So there are mechanisms to put encrypted things in workplace emails and then have some mechanism for a receiver in a different organization to decrypt them. I have seen a mechanism that comes down to magic links, which I found ironic (though yes, interception is less of a threat than sending the data unencrypted).
I feel like supporting an option to not send an email unless STARTTLS happens is the way to go. There are probably a lot of practical problems for, say, online Outlook or Gmail supporting that option when sending an email. But I feel like that’s the easiest solution.
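For a self-hosted MTA that option already exists; a minimal sketch for Postfix, assuming you'd rather have delivery fail than fall back to plaintext:

    # /etc/postfix/main.cf
    # Require TLS on all outbound SMTP connections; delivery fails if the
    # receiving server does not offer STARTTLS.
    smtp_tls_security_level = encrypt

    # Keep inbound TLS opportunistic; making it mandatory would bounce mail
    # from legacy senders.
    smtpd_tls_security_level = may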
might age fit the bill?
Unfortunately, those are two different problems. It’s easy to have servers store encryption keys to make HTTPS work. You only need to encrypt traffic between you and a server for 5 seconds at a time.
It’s hard for personal communications. The server shouldn’t know the keys, and they need to survive for decades.
HTTPS is pervasive because Google encouraged it. Gmail could force S/MIME but they don't care.
No, they couldn't. What they could do--and what they did do--was push for the move to TLS connections for the MX-to-MX hop of email; I don't have the stats off the top of my head for how prevalent that is, but I think 80-90% of email is now delivered that way.
But end-to-end encrypted email? It breaks everything. You need to get all the MUAs to support it (very few do either S/MIME or PGP). You'll break webmail--the most popular way to use email--without lots of major investment. And encrypted email breaks things like spam filtering or server-side filters catastrophically. Key discovery is also unsolved.
There was a time when I was on the everybody-should-use-encrypted-email train. But I've since grown up and realized that encrypted email fundamentally breaks email in ways that people are unprepared for, and people have already figured out how to route around the insecurity of email via other mechanisms.
I think mandatory S/MIME without user-friendly key management would either be reverted pretty soon or it would kill Gmail.
Google would have to build some kind of Let's Encrypt for S/MIME before they turned on the encouragement.
Why did Google want it?
Google makes money off search, which requires that users want to visit websites. All websites using HTTP are not secure. Insecure websites are uninteresting to most users, but most users don't have the know-how to distinguish which sites are using HTTPS and which aren't. So the simplest solution is to get all websites to switch to HTTPS before it becomes a problem.
Another possibility is that Google is in an industry that makes money by collecting information about users, and by supporting universal HTTPS, it gained a competitive advantage over ISPs and others regarding user data for Google searches and other services.
Yeah, at some point people are going to work out that the problem isn't Johnny, it's email. Email is distinctively hostile to secure messaging. No matter what software Johnny uses, "secure" email will always be inferior to alternative options.
https://www.latacora.com/blog/2020/02/19/stop-using-encrypte...
"The most popular modern secure messaging tool is Signal"
As Mike Waltz found out. And Snowden used GPG, and I haven't heard of a single message of his having been decrypted.
Snowden also endorsed Signal, fwiw: https://x.com/Snowden/status/661313394906161152
Snowden also used Cryptocat.
Both PGP and Signal will leak if you use them incorrectly, so that comparison doesn't really hold up.
I say this as someone who uses both.
PGP email doesn't match Signal security:
* PGP doesn't encrypt email metadata, so the attacker gets a record of every sender, receiver, time, date, and subject line for free, with PGP working at its best.
* Email usually isn't usable without storing it server-side (for multi-client access), and without being able to search it. That requires your email to be in clear text on the server. That's solved with an on-prem mail server, but not many people have that - very few end users can operate one.
* Email endpoints generally aren't secure, so even if you somehow secure your personal mail store, possibly nothing is secure except your draft messages. Every email is sent to or received from other people, so your messages are subject to their security practices.
One key difference is that Signal intentionally makes design choices to make it harder to use incorrectly, and PGP is comically easy to use incorrectly.
Mike Waltz is just about dumb enough to poke out his own eyes with his thumbs. At which point we will be regaled with the danger of thumbs forevermore.
It's email. 90% of the emails I get are marketing spam or GitHub notifications. Nobody I know uses email to chat with friends
Of course not, but unlike chat services, everyone has an email address or phone number, so if I need to reach out to them for something other than a casual chat, e.g. an invitation, a birthday felicitation, or a document that they should review later, email is a method through which I can reach all of them.
Similarly, reaching out to companies for support also often happens over email.
If you want encrypted communication over email, there's DeltaChat.
DeltaChat are moving away from "classic email" in favour of the ChatMail protocol.
I wish someone would fork DeltaChat so I can keep using it as a client for "classic email".
True. I found out the hard way a couple of months ago, as I seem to have lost the ability to recover my previous profile setup from Google Workspace email.
Issue 1: Establishing lots of reasons why people should encrypt
Issue 2: Making it easy to encrypt
Issue 3: Popularizing encryption or getting more people to do it
Issue 3… most/many governments are taking active steps to discourage this practice or, better still (for them), to stamp it out completely.
Maybe Johnny doesn't have a need to encrypt. The postcard in India was just a card with the message written on both sides, fully visible in plain text. It was very common for a postman to read the letter out to recipients when delivering it. Privacy is not a universal need.
Poor are those people who are forced to hide their message in encrypted formats.
Nobody expects privacy when they send a postcard.
Most people keep their emails behind a password for a reason...
The point is, why not let people have the freedom of not having to encrypt? And why is such freedom considered poor? This is like forcing everyone to have a smartphone, car, passport, zillions of IDs, and internet profiles, and calling their shackled life rich.
The other day someone was shocked to see that I don't have FB and Instagram accounts. When did people lose the freedom not to have social media accounts?
Because if the default is unencrypted, you'll accidentally send secrets in plaintext one day. And if the default is encrypted and works well - why would you ever take time to explicitly disable that? What's the situation where you want to say "just in case someone intercepts this message, I want them to be able to read it"?
Encrypted communication has lots of practical drawbacks.
For me email is just fine the way it is. Deliverability could be better and Google/Microsoft duopoly is a problem but that's it.
Stop reinventing the wheel.
I consider e-mails to be digital versions of postcards. Both are obsolete but have some usage scenarios. There is no need for private communication in obsolete postcard-type messaging, so there is no need for encryption. For private communications there are other, better (easier) means, which people use.
> Poor Johnny still won't encrypt
As long as Google, Apple, or Microsoft controls your device, all bets are off. You can "encrypt" mails in Outlook, but Microsoft also has your key.
>Auditors obsess over encryption at rest—from laptop FDE to databases’ security theaterish at-rest encryption—and over encryption in transit, usually meaning TLS.
Very hard-to-parse sentence. The monospace font means the em-dash isn't emmy enough, so I couldn't tell it apart from the hyphen on the first, second, and third attempt. I wish people would put spaces around it, and to hell with what the style guide says.
I thought this title was a reference to this David Bowie/NIN song: https://www.youtube.com/watch?v=LT3cERVRoQo
>In 2025, it’s pretty much the same. In some respects, it’s worse:
Well, not quite: if you use mutt, it is easy to encrypt emails with GPG. The setup could be a bit hard for new people, but if they have good reading comprehension it is easy.
Thunderbird has its own GPG-like internal encryption. I really do not like it; I wish they had built it on GnuPG like the old plugin did.
All you need to do is get your key to the people you want to send encrypted email to, and get theirs. There are key servers, or you can mail the public key to them.
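For reference, the relevant mutt settings are roughly these (a sketch; the key ID is a placeholder, and the variable names should be checked against your mutt/neomutt version's docs):

    # ~/.muttrc
    set crypt_use_gpgme = yes              # let GnuPG handle crypto via GPGME
    set pgp_default_key = 0x0123456789ABCDEF   # placeholder key ID
    set crypt_autosign = yes               # sign outgoing mail by default
    set crypt_opportunistic_encrypt = yes  # encrypt whenever keys exist for all recipients
    set crypt_replyencrypt = yes           # encrypt replies to encrypted messages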
To me, on cell phones, all bets are off. I would never use email on a cell phone.
There is also Mailvelope, a browser plugin that simplifies PGP encryption across webmail clients.