TLS Training London – day 2

Written on 8 September 2018, 02:07pm


Again, some notes about the second day of the excellent TLS Training delivered by Scott Helme. 

  • symmetric encryption is fast: AES easily handles large amounts of encrypted data (e.g. streaming)
  • asymmetric encryption is slow, so it is only used for authentication, at the beginning of the secured session (see the timing sketch after this list)
  • the RSA algorithm was actually invented years before it was publicly described: the acronym RSA is made of the initials of Ron Rivest, Adi Shamir and Leonard Adleman, who first publicly described the algorithm in 1978. Clifford Cocks, an English mathematician working for the British intelligence agency GCHQ (Government Communications Headquarters), had developed an equivalent system in 1973, but it was not declassified until 1997.
  • Hashing: SHA-256 (a member of the SHA-2 family) is considered strong enough. Alternatives for the future are SHA-384 and SHA-512 (longer digests), but if SHA-2 is fundamentally broken, the SHA-3 family (Keccak) comes to the rescue. It's a never-ending cat-and-mouse game between cryptographers and cryptanalysts (see the hashing sketch after this list).
  • The CAs store their private keys in HSMs and rarely rotate them (a lifetime of a few decades is not uncommon)
  • There is a good analogy between digital certificates and passports
  • X.509 is the standard describing the structure of digital certificates. It is currently at version 3, which introduced extensions (arbitrary metadata as key/value pairs). An example of an extension is the SAN (Subject Alternative Name), where a number of domains can be listed on top of the common name (CN). In fact, Google Chrome only looks at the SAN when parsing a certificate (see the SAN sketch after this list).
  • The certificate chain is typically composed of the root CA certificate, then the intermediate CA certificate(s) and finally the end-entity certificate (the leaf). The last intermediate certificate has the ‘path length’ constraint set to 0 (its children can only be leaves).
  • The root CA certificates are provided by the client (stored in the browser or the OS), while the intermediate CA and end-entity certificates are provided by the server (the server ships the intermediate CA certs along with the leaf, for performance reasons)
  • It takes on average 5-6 years to become a root CA. And if you want this, you must work with the following five relying parties, each carrying a set of root keys in their trust store: Apple, Google, Java, Mozilla, Microsoft. Let's Encrypt started in 2016 and is not yet a root CA; they currently use another root CA (IdenTrust) to cross-sign their certificates.
  • The Web PKI is governed by the CA/Browser Forum (CAB Forum) – a body where the Certificate Authorities and the major browsers are represented.
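
First, the timing sketch promised above: a rough comparison in Python, assuming the third-party cryptography package is installed (exact numbers vary per machine, but the gap is always dramatic):

import os, time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

data = os.urandom(1_000_000)  # 1 MB payload

# Symmetric: AES-128-GCM handles the whole megabyte in one call
aes = AESGCM(AESGCM.generate_key(bit_length=128))
nonce = os.urandom(12)
t0 = time.perf_counter()
aes.encrypt(nonce, data, None)
print(f"AES-GCM:  {time.perf_counter() - t0:.3f}s")

# Asymmetric: RSA-2048 with OAEP encrypts at most ~190 bytes per operation,
# so the same megabyte needs thousands of much slower operations.
# (TLS never does this: asymmetric crypto is only used during the handshake.)
pub = rsa.generate_private_key(public_exponent=65537, key_size=2048).public_key()
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
t0 = time.perf_counter()
for i in range(0, len(data), 190):
    pub.encrypt(data[i:i + 190], oaep)
print(f"RSA-OAEP: {time.perf_counter() - t0:.3f}s")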
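
Next, the hashing sketch: the digest families side by side, using only Python's standard library (the message is arbitrary):

import hashlib

msg = b"hello TLS"

# SHA-2 family: the longer digests buy an extra security margin
print("SHA-256 :", hashlib.sha256(msg).hexdigest())
print("SHA-384 :", hashlib.sha384(msg).hexdigest())
print("SHA-512 :", hashlib.sha512(msg).hexdigest())

# SHA-3 (Keccak) is built on a completely different construction,
# so it remains usable even if SHA-2 is fundamentally broken
print("SHA3-256:", hashlib.sha3_256(msg).hexdigest())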
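
And the SAN sketch: fetch a live certificate and print its SAN entries next to the CN. It uses the standard library plus the cryptography package; the hostname is just an arbitrary example:

import ssl
from cryptography import x509
from cryptography.x509.oid import NameOID

# Grab the server's certificate in PEM form and parse it
pem = ssl.get_server_certificate(("www.google.com", 443))
cert = x509.load_pem_x509_certificate(pem.encode())

cn = cert.subject.get_attributes_for_oid(NameOID.COMMON_NAME)
print("CN :", [attr.value for attr in cn])

# The Subject Alternative Name extension is what Chrome actually checks
san = cert.extensions.get_extension_for_class(x509.SubjectAlternativeName)
print("SAN:", san.value.get_values_for_type(x509.DNSName))
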
To be continued…

TLS Training London – day 1

Written on 6 September 2018, 08:55pm


Some notes after the first day of the TLS training session with Scott Helme.

——

  • the core protocols powering the Internet were not designed with security in mind
  • you pwn the cookie, you pwn the user
  • the server should not encrypt the cookie contents, because there is nothing to hide from the browser
  • the submarine cable map is amazing, but the landing sites are possible points of failure when it comes to your privacy
  • we’ve reached the HTTPS tipping point – more than 50% of Internet traffic is now encrypted – but around 90% of sites are still on plain old HTTP
  • the goal of encryption: to keep the data protected for as long as it needs to stay secret
  • when checking into a hotel, we would rather go without running water than without wi-fi 🙂
  • SSL was initially Netscape’s baby, but it was renamed to TLS under pressure from Microsoft
  • TLS 1.3 was officially adopted as a standard, and it comes with major performance improvements as well as mandatory forward secrecy. But it will take a couple of years before it is implemented at large scale by hardware manufacturers
  • TLS 1.3 would really have been named TLS 2.0 if it were not for some poorly coded but widely deployed hardware that chokes on unknown version numbers
  • the significant impact of the Snowden revelations on how people look at their privacy and web security is becoming more and more clear (example: Lavabit and forward secrecy)
  • the recommended lifespan of a certificate is about 12 months
  • common domain validation methods: an email challenge, a DNS TXT record or a file served on a random HTTP path (see the validation sketch after this list)
  • client clock skew: you can change your device’s time to cheat at Candy Crush, but this can make perfectly valid HTTPS certificates appear invalid – on your device only
  • if you are a big organisation, you’d better have a backup CA (or at least one that is ready to issue a new certificate in a matter of minutes, not days)
  • cipher suite format: TLS_KX_AUTH_CIPHER_HASH (see the negotiation sketch after this list):
    • Key Exchange (KX): just use ECDHE or, if not supported, DHE. But never use RSA key exchange, because it lacks forward secrecy
    • Authentication: RSA is still good enough
    • Symmetric key encryption (used because it’s faster than asymmetric): AES-128 is good enough; AES-256 is stronger but slower
  • sometimes, good security practices are adopted not for the security advantages but for the performance improvements: ChaCha20, for instance, is fast even on CPUs without AES hardware acceleration
  • don’t create a system that relies on the human factor for security (e.g. don’t ask the regular user to type https:// in their browser)
  • good: HTTPS; better: HTTPS + HSTS; best: HTTPS + HSTS + preload (the exact header is shown after this list). But having all the browsers load a static list of websites is not a scalable solution
  • BTW – seeing my own domain in the source code of all the modern browsers used by billions of people is cool: transport_security_state_static.json (warning – 6 MB file!)
  • HSTS is a one-way street: you can’t easily go back from HTTPS to HTTP
  • people are terrified of changing the cookie standards / specifications
  • it looks like attackers can overwrite your cookies even when you serve secure cookies over HTTPS. Cookie prefixes are a dirty but effective solution: you just need to add __Secure- to your cookie name:
Set-Cookie: __Secure-ID=123; Secure; Domain=example.com


“The __Secure- prefix makes a cookie accessible from HTTPS sites only. An HTTP site cannot read or update a cookie if the name starts with __Secure-.”
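
The validation sketch promised above: for the random-HTTP-path method, ACME (the protocol behind Let’s Encrypt) has the CA look for a token under /.well-known/acme-challenge/. A minimal Python sketch of that check, with a made-up domain and token:

import urllib.error
import urllib.request

# Hypothetical values, just to show the shape of the check
domain, token = "example.com", "some-random-token"
url = f"http://{domain}/.well-known/acme-challenge/{token}"

try:
    body = urllib.request.urlopen(url, timeout=5).read()
    print("challenge served:", body[:60])
except urllib.error.URLError as err:
    print("challenge not served:", err)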
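
The negotiation sketch: Python’s standard library can show which cipher suite a server actually picked (the hostname is again just an example):

import socket
import ssl

ctx = ssl.create_default_context()
with socket.create_connection(("www.google.com", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="www.google.com") as tls:
        # cipher() returns e.g. ('TLS_AES_128_GCM_SHA256', 'TLSv1.3', 128)
        print(tls.version(), tls.cipher())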
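
And the HSTS policy from the good/better/best bullet is a single response header; max-age is in seconds (one year here), and preload marks the site as a candidate for the browsers’ static list:

Strict-Transport-Security: max-age=31536000; includeSubDomains; preload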

TV connectivity round-up

Written on 2 December 2017, 09:33pm


A short reminder about the ins and outs (pun intended) of today’s TV sets and their connected peripherals.

HDMI 1.4 vs 2.0

Official specifications of the HDMI 2.0 standard:

* Enables transmission of High Dynamic Range (HDR) video
* Bandwidth up to 18Gbps
* 4K@50/60 (2160p)
HDMI 2.0b does not define new cables or new connectors. Current High Speed cables (Category 2 cables) are capable of carrying the increased bandwidth.
The newer HDMI 2.1 specifications add support for a range of higher video resolutions and refresh rates including 8K60 and 4K120, and resolutions up to 10K. Dynamic HDR formats are also supported, and bandwidth capability is increased up to 48Gbps.
https://www.hdmi.org/manufacturer/hdmi_2_0/

Two notes:
1. HDMI 2.0 is a hardware update: both ends must have an HDMI 2.0-compatible chipset
2. In order to enjoy the benefits of HDMI 2.0, the HDMI cable must be able to sustain the 18Gbps bandwidth
See more in the troubleshooting section below.

What do ARC and MHL mean?

On the back of your TV set, next to the HDMI ports, you will see these two labels:
ARC – Audio Return Channel – enables the TV to send audio data back to the receiver. All HDMI cables support ARC by default; for TV and receiver compatibility, look for the port label and/or the user manual. Even if all the TV’s HDMI ports are ARC-compatible, only one of them will be used at a time.
MHL – Mobile High-Definition Link – allows you to connect and mirror smartphones and tablets (both Android and iOS) on the TV.
On some TV sets you might also see HDCP (High-bandwidth Digital Content Protection) which implements a form of digital copy protection.

HDR standards: HDR10 vs Dolby Vision

You know how virtually all the TV producers brag about their newest models being HDR? Well, there are several competing standards: HDR10, HDR10+, Dolby Vision, Hybrid Log-Gamma, SL-HDR1, etc.
For instance, Apple TV 4K supports the following standards:

4K Standard Dynamic Range (SDR): Used for 4K televisions that don’t support HDR10 or Dolby Vision.
4K High Dynamic Range (HDR, aka HDR10): Used for 4K televisions that support HDR to display video with a broader range of colors and luminance.
4K Dolby Vision: Used for 4K televisions that support Dolby Vision HDR to display video with a broader range of colors and luminance optimized for your television.
https://support.apple.com/en-us/HT208074

To see which HDR standard your TV supports, look at the fine print in the user manual (it should also indicate which HDMI ports support these HDR standards).

4:2:2? 4:2:0?

Here things become a bit more complex. If you’re not interested in the details, just remember that the higher these three numbers, the better. As Apple puts it in the Apple TV 4K menu: “4:2:0 provides high-quality picture that is compatible with most TVs and HDMI cables. 4:2:2 improves quality, but requires high-speed cables”.

There’s a tradeoff between video quality and bandwidth. With 4K resolutions, 60Hz refresh rates, full 36-bit color depth, HDR capabilities and 32 audio channels, the bandwidth can reach incredible numbers. And as we saw above, HDMI 2.0 is limited to 18Gbps (48Gbps in HDMI 2.1). The solution is to compress the video signal exploiting the limitations of the human eye.
To make the explanation simpler, meet Chroma and Luma:

Chroma is the signal used in video systems to convey the color information
Luma represents the brightness in an image
Digital signals are often compressed. Since the human visual system is much more sensitive to variations in brightness than color, a video system can be optimized by devoting more bandwidth to the luma component (Y’, brightness), than to the color (Cb, Cr).
https://en.wikipedia.org/wiki/Chroma_subsampling

Below you can see how the original image is decomposed into a Luma component (black and white – brightness only) and a Chroma component (colour). The Luma is unaltered, but the Chroma is compressed (except for 4:4:4). Depending on the compression type, you get 4:2:2, 4:2:0 or other subsampling schemes (no 4-4-2 scheme though 🙂 ):

The bandwidth savings are impressive: the 4:2:2 sampling can reduce the necessary bandwidth to 12Gbps, while 4:2:0 further drops the requirement to 9Gbps.
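
Those two figures are easy to sanity-check, counting active pixels only (so ignoring HDMI blanking and protocol overhead) at 4K@60Hz with 12 bits per colour component: 4:4:4 needs 36 bits per pixel, 4:2:2 averages 24 and 4:2:0 averages 18. A few lines of Python confirm the numbers:

pixels_per_second = 3840 * 2160 * 60  # 4K resolution at 60Hz

# Average bits per pixel at 36-bit colour depth (12 bits per component)
for name, bits_per_pixel in [("4:4:4", 36), ("4:2:2", 24), ("4:2:0", 18)]:
    gbps = pixels_per_second * bits_per_pixel / 1e9
    print(f"{name}: {gbps:4.1f} Gbps")  # -> 17.9, 11.9, 9.0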

Troubleshooting guide

Knowing all this, if you still have trouble getting the most out of your peripherals and TV, here is a quick troubleshooting guide:
– can’t select 4:2:2 chroma: HDMI 2.0 supports 4:2:0 natively, but in order to benefit from 4:2:2 you may have to upgrade your HDMI cable. Here is a decent one: Belkin Ultra High Speed
– can’t select 4K HDR @60Hz: check your HDMI connectors: some manufacturers only accept HDR on the HDMI1 and HDMI2 ports.
– my TV says it’s HDMI 2.0, but it doesn’t behave like it: it might be that only the HDMI1 port supports HDMI 2.0, while the others only support HDMI 1.4. That’s sad, but it can happen with older TV sets; just read the manual
– my TV won’t turn on/off when I turn on/off my peripherals: make sure you enable CEC in your TV menu. It can be named differently depending on the TV manufacturer: Bravia Sync for Sony, Anynet+ for Samsung, VIERA Link for Panasonic, EasyLink for Philips, SimpLink for LG.
– I can’t control the sound of my TV from my Apple TV remote: make your Apple TV ‘learn’ the TV remote.
– …but I have a Sonos Playbar linked to my TV: then go to your Sonos settings and pair your Sonos system with the Apple TV remote. However, be aware that only a single remote can be paired with the Sonos system at a time.


Image: https://www.dolby.com/us/en/brands/dolby-vision.html