IMO the v3 Hidden Service address should be put on the front page (it used to be displayed on top of every page too, what happened to that?). Considering the upgraded encryption algorithms (v3 hidden services replaced SHA1/DH/RSA1024 with SHA3/ed25519/curve25519) and new security features that distribute a hash of the .onion to HSDirs instead of its plaintext form, I don't even see the point of having a legacy hidden service in the first place.
Also, making people click past big warnings just to use an additional layer of encryption is really bad practice, as Tor already has good encryption (see above), so I think just the http version would be adequate.
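For anyone curious why the v3 address is self-authenticating: per rend-spec-v3 it's basically just base32(pubkey || checksum || version), where pubkey is the service's 32-byte ed25519 public key. Quick sketch (function name is mine, dummy key, not any real service):

```python
import base64
import hashlib

def v3_onion_address(pubkey: bytes) -> str:
    """Derive a v3 .onion address from a 32-byte ed25519 public key,
    following rend-spec-v3: base32(pubkey || checksum || version)."""
    assert len(pubkey) == 32
    version = b"\x03"
    # checksum = first 2 bytes of SHA3-256(".onion checksum" || pubkey || version)
    checksum = hashlib.sha3_256(b".onion checksum" + pubkey + version).digest()[:2]
    label = base64.b32encode(pubkey + checksum + version).decode().lower()
    return label + ".onion"

# dummy all-zero key just to show the 56-character label format
print(v3_onion_address(bytes(32)))
```

So the address IS the key, which is why nothing like a CA is needed to authenticate the server.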
>>2856 >making people click past big warnings just to use an additional layer of encryption is a really bad practice
Why? Tor's encryption could be compromised by bioluminescents. Furthermore, you're not supposed to just click past the security warning, you're supposed to verify the fingerprint and the browser will warn you again if the fingerprint changes.
>>2857 Tor Project's page says that it's a non-NIST curve and that's a positive score in my book. If it's adopted so rapidly by so many projects then it should have many eyes on it, especially from academia, ambitious to write a juicy paper on how they cracked it. If it was insecure we would know by now.
>>2859 I checked this multiple times with separate TBB instances. I can't permanently approve the TLS certificate (that option is greyed out, if you are wondering); it has to be approved every time TBB restarts.
It's gonna be fixed soon:
>V2 service may be redirected or disabled in the future. Update your bookmarks.
>>2860 >it's a non-NIST curve and that's a positive score in my book
Yeah I've heard that a lot, but someone calling out a NIST curve as being insecure doesn't automatically exclude his own curves from being insecure/backdoored.
>>2865 I've been looking into it for a bit because I want to create my own application which heavily relies on public key crypto, but to be on the safe side I like to hide the public keys as much as possible and rely on symmetric encryption (shared secrets) whenever possible.
The thing that bothers me about Daniel J. Bernstein is that he calls out NIST curves for being (potentially) insecure without really saying why he thinks so. He's also incredibly effective at marketing his own cryptography, which is adopted and trusted by many people without question.
Maybe I'll go for Curve448, because its curve parameters are chosen less arbitrarily than Curve25519's and it's still performant enough, but there aren't a lot of implementations out there.
Though most of this is based on gut feeling with barely any understanding of the internals of cryptography, and I can barely do math, haha :)
You can do the secret key exchange with public keys and then switch to symmetric encryption from there, like how TLS does it. Well this is assuming you are making a network application of course.
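Rough sketch of that shape with a toy Diffie-Hellman group — the group here and the XOR "cipher" are stand-ins for illustration only; real code would use X25519 and a proper AEAD:

```python
import hashlib
import secrets

# Toy DH over a Mersenne prime -- NOT a secure group, purely to show
# the public-exchange-then-symmetric pattern TLS uses.
P = 2**127 - 1
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1   # secret exponent
    return priv, pow(G, priv, P)          # (private, public)

a_priv, a_pub = keypair()   # Alice publishes only a_pub
b_priv, b_pub = keypair()   # Bob publishes only b_pub

# Each side combines its private key with the other's PUBLIC key
# and lands on the same shared secret.
shared_a = pow(b_pub, a_priv, P)
shared_b = pow(a_pub, b_priv, P)
assert shared_a == shared_b

# Hash the shared secret into a symmetric key and switch to
# symmetric crypto from here (XOR stands in for a real cipher).
key = hashlib.sha256(shared_a.to_bytes(16, "big")).digest()
ciphertext = bytes(m ^ k for m, k in zip(b"hello anon", key))
```

Note the public keys only ever appear during the handshake; everything after that is symmetric.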
>The thing that bothers me with Daniel J. Bernstein is that he calls out NIST curves for being (potentially) insecure without really saying why he thinks so, also he's incredibly effective at marketing his own cryptography and is adopted and trusted by many people without question.
Yes, that caught my attention too. A small number of people happen to be behind most of the encryption algorithms with widespread adoption. Daniel J. Bernstein (Ed25519, Curve25519, ChaCha20, Poly1305) and Joan Daemen (AES, SHA-3) seem to be the favorite figures.
>Nearly a decade later Edward Snowden's disclosure of mass surveillance by the National Security Agency and the discovery of a backdoor in their Dual_EC_DRBG, raised suspicions of the elliptic curve parameters proposed by NSA and standardized by NIST. Many researchers feared that the NSA had chosen curves that gave them a cryptanalytic advantage. Since then, Curve25519 and EdDSA have attracted much greater attention.
Though it looks like Bernstein's crypto got attention only after the Dual_EC_DRBG incident.
>>2867 >secret key exchange with public keys and then switch to symmetric encryption from there
Yeah, that's how all of them are used; you don't encrypt stuff with public keys, you just derive a shared secret from them
>Though it looks like Bernstein's crypto got attention only after the Dual_EC_DRBG incident.
Right, and another thing that bothers me is the likelihood of Edward Snowden being a limited hangout/controlled opposition, maybe in order to push people towards backdoored/less secure alternatives out of fear of the well-vetted(?) standards
>>2868 >the likelihood of Edward Snowden being a limited hangout/controlled opposition in order to maybe push for backdoored/less secure alternative out of fear for the well-vetted(?) standards
It's only a matter of time before he makes some serious mistakes, but at least I don't think he will play into (((their))) hands intentionally.
>>2868
The Snowden story is a fairy tale. There is no possible way he walked out the door with that amount of information. The NSA is highly compartmentalized and Snowden would have had limited access at best.
>>2871 Even if it isn't a fairy tale, (((Greenwald))) rendered his information useless.
But it was a pretty successful psy-op; it lured people into honeypots, e.g. Tails, and normalized constant 24/7/365 surveillance.
>>2873 This is the issue with people screaming about honeypots. They fail to provide concrete evidence of something being a honeypot, except maybe a very vague connection with NSA.
>>2871 Computers can render such compartmentalization meaningless, especially in any organization too large to manage effectively. The NSA has probably not found secrets to managing people and information that a Fortune 500 company has not, and yet the latter are seen to fuck up spectacularly.
>>2877 >So just add an exception?
Yes, security is already provided by the Hidden Service (HS) itself, the address being (part of) the public key of the HS; the TLS on here just adds an extra layer of security, for what it's worth.
>>2877 you get a warning because the certificate cannot be validated against a certificate authority (as normally happens with https). given who the certificate authorities are, this behavior is probably not desirable anyway.
either verify manually or skip the procedure by just using the V3 address sans TLS (plain http).
>>2866 >The thing that bothers me with Daniel J. Bernstein is that he calls out NIST curves for being (potentially) insecure without really saying why he thinks so
that is weird, because anyone working in infosec (and especially cryptography) would know exactly the reasons why NIST is untrustworthy. glossing over or omitting certain things is suspicious.
>>2880 Maybe he thinks he doesn't need to re-state obvious facts, idk.
The Dual_EC_DRBG incident has been the main reason for distrusting NIST right? I'm actually not too familiar with NIST myself.
>>2881 >The Dual_EC_DRBG incident has been the main reason for distrusting NIST right?
in the context of this discussion, yes. their 9/11 investigation was also laughable. it's a government organization filled with useful idiots.
>>2907 Disable saving of history and cookies, and also disable private browsing mode. That still doesn't save undesired information, but it allows configuration info such as TLS cert exceptions to persist.
>>2908 onion addresses adhere to Zooko's triangle, they are secure and not issued by a central authority. So it is not desirable to hinder user experience by introducing self-signed TLS certificates as onion addresses already take care of server authentication and end-to-end encryption:
4.4.2. Link authentication type 3: Ed25519-SHA256-RFC5705.
If AuthType is 3, meaning "Ed25519-SHA256-RFC5705", the
Authentication field of the AuthType cell is as below:
Modified values and new fields below are marked with asterisks.
TYPE: The characters "AUTH0003" [8 octets]
CID: A SHA256 hash of the initiator's RSA1024 identity key [32 octets]
SID: A SHA256 hash of the responder's RSA1024 identity key [32 octets]
CID_ED: The initiator's Ed25519 identity key [32 octets]
SID_ED: The responder's Ed25519 identity key, or all-zero. [32 octets]
SLOG: A SHA256 hash of all bytes sent from the responder to the
initiator as part of the negotiation up to and including the
AUTH_CHALLENGE cell; that is, the VERSIONS cell, the CERTS cell,
the AUTH_CHALLENGE cell, and any padding cells. [32 octets]
CLOG: A SHA256 hash of all bytes sent from the initiator to the
responder as part of the negotiation so far; that is, the
VERSIONS cell and the CERTS cell and any padding cells. [32
octets]
SCERT: A SHA256 hash of the responder's TLS link certificate. [32
octets]
TLSSECRETS: The output of an RFC5705 Exporter function on the
TLS session, using as its inputs:
- The label string "EXPORTER FOR TOR TLS CLIENT BINDING AUTH0003"
- The context value equal to the initiator's Ed25519 identity key.
- The length 32.
[32 octets]
RAND: A 24 byte value, randomly chosen by the initiator. [24 octets]
SIG: A signature of all previous fields using the initiator's
Ed25519 authentication key (as in the cert with CertType 6).
[variable length]
To check the AUTHENTICATE cell, a responder checks that all fields
from TYPE through TLSSECRETS contain their unique
correct values as described above, and then verifies the signature.
The server MUST ignore any extra bytes in the signed data after
the RAND field.
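If it helps, here's how I'd sketch assembling the unsigned TYPE..RAND body from the field list above — the Ed25519 SIG step is left out, and the helper name is mine, not from any Tor codebase:

```python
import hashlib
import os

FIELD = 32  # every hash/key field in the AUTH0003 layout is 32 octets

def auth0003_unsigned_body(cid, sid, cid_ed, sid_ed,
                           slog, clog, scert, tlssecrets):
    """Concatenate TYPE..RAND in the exact order the spec lists them.
    The Ed25519 signature (SIG) over this body is omitted here."""
    fields = (cid, sid, cid_ed, sid_ed, slog, clog, scert, tlssecrets)
    assert all(len(f) == FIELD for f in fields)
    rand = os.urandom(24)  # RAND: 24 bytes chosen by the initiator
    return b"AUTH0003" + b"".join(fields) + rand

# dummy 32-byte placeholders, just to show the byte layout
body = auth0003_unsigned_body(
    *[hashlib.sha256(bytes([i])).digest() for i in range(8)])
assert len(body) == 8 + 8 * FIELD + 24  # 288 octets before the signature
```

The responder recomputes every field from its own view of the handshake, so a man in the middle of the TLS layer can't replay a valid AUTHENTICATE cell.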
>>2910 >durr hurr it's not necessary
1. Nobody's forcing you to use it.
2. If Tor project is compromised in the future, this provides additional security. Isn't this kind of obvious?
>>2907 That just disables the placebo incognito mode; you can easily re-enable the private browsing features you want while leaving disabled the ones you don't. Older Tor Browser versions used to save the cert configs, but I guess they realized that you can deduce the sites someone visits from this information
>>2909 >implying I don't already memorize the first 4 characters of the fingerprint already
>>2917 >a few characters
A few characters can be bruteforced. Tell me, which pair of these onion addresses and TLS fingerprints truly belong to nanochan?
>>2919 You may be right, but show me the actual private keys which correspond to the fake address/fingerprint pair. That's right, you can't.
I personally just remember the nanochancsvnej4vxiidu and pub63xadeet parts, combined with the BC:0E part. OK, so it's not perfect, but even bruteforcing that much would be pretty hard, plus I have the real address and fingerprint saved in a file on my computer. If nanochan were life-critical to me for some reason, I would take the time to memorize the entire address and TLS fingerprint.
>>2921 The mere existence of https://facebookcorewwwi.onion simply proves that remembering a portion of the onion address is no longer enough (and also that v2 hidden services can be broken by non-governmental agencies). nanochan or other hidden services might not be life-critical, but it's always good practice to bookmark or write down every onion address.
>>2919 >>2921 You can bruteforce the first characters, but first and last? That's next to impossible. Say an attacker created a v3 onion address starting with nanochancsvnej; it could be done, but to also include ttavqd at the end would require an insane amount of power. Similar to a hash.
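Back-of-envelope version: every base32 character you fix multiplies the expected keysearch by 32, and prefix plus suffix just add up. The keys-per-second figure below is a made-up generous guess, not a benchmark:

```python
# Each fixed base32 character narrows the search by a factor of 32
ALPHABET = 32

def expected_attempts(n_fixed_chars: int) -> int:
    """Expected keypair generations to match n fixed address characters."""
    return ALPHABET ** n_fixed_chars

prefix_only = expected_attempts(len("nanochancsvnej"))                    # 32**14
prefix_and_suffix = expected_attempts(len("nanochancsvnej") + len("ttavqd"))  # 32**20

# Assume a (very generous) 10**9 candidate keys per second:
SECONDS_PER_YEAR = 3600 * 24 * 365
years_prefix = prefix_only / 1e9 / SECONDS_PER_YEAR
years_both = prefix_and_suffix / 1e9 / SECONDS_PER_YEAR
```

32**14 is about 2**70, already tens of thousands of years at that rate; adding the 6-character suffix tacks on another factor of 32**6 (about a billion), which is why prefix+suffix is effectively out of reach.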
>>2918 >but I guess they realized that you can deduce the sites someone visits with this information
If verifying hash to remain secure discourages you from visiting sites then I have good news for you - 4cuck and pigchan don't require that.
>>2923 What I meant was that by storing the fingerprints firefox / the tor browser couldn't be 100% amnesic anymore, since an attacker with access to the computer is able to identify the websites one has visited using the stored list of fingerprints. Not necessarily needed when you're already keeping other aspects of your computing clean, but for normalfags not wanting to be caught watching porn it might still be useful
>If verifying hash to remain secure discourages you from visiting sites
Not sure how you got that from my post