Browser indicators: we are currently in a transition phase. A few years ago, browsers only indicated secure behavior. In the near future, they will only indicate insecure behavior (ex: Chrome 75 shows HTTP sites as ‘Not secure’, but it also still shows the green address bar for HTTPS sites with EV certificates)
HTTPS usage: 78% overall according to Firefox telemetry, but only 58% among the top 1M websites. However, we are still a long way from browsers defaulting to the HTTPS scheme when loading a website.
Fiddler is really powerful (ex. replay requests, intercept mobile traffic, etc), but Havij (SQL injection) is close to magic when it comes to penetration testing
A few tools: SuperLogout (maybe try this in an incognito window; it will log you out of all the popular websites), ZoomIt (screen zoom and annotation tool), Windows key + . (just try it if you’re on Windows)
The expectation of privacy is different on a tech website compared to an online dating one
Trust, but verify: you should trust the CDNs and rely on them for the massive performance improvements, but you must verify them using SRI. Tip: you don’t need to SRI your own assets.
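To generate the integrity value for a third-party asset, one option is OpenSSL. A minimal sketch – vendor.js is a stand-in file created just for the demo:

```shell
# vendor.js stands in for a local copy of the CDN asset you want to pin.
printf 'console.log("hi");' > vendor.js

# SRI uses the base64 of the raw (binary) digest, prefixed with the algorithm name:
HASH=$(openssl dgst -sha384 -binary vendor.js | openssl base64 -A)
echo "integrity=\"sha384-${HASH}\""
```

The resulting value goes into the `integrity` attribute of the script or link tag that loads the asset.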
The main value proposition of the Content Security Policy is mitigating XSS attacks. A strategy to get started: use a non-production environment, report-only mode, default-src ‘none’, watch the console and build your CSP by cleaning the console errors one by one.
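In Apache terms, that starting point could look roughly like this (a sketch; the report-uri endpoint is illustrative):

```apache
# Report-Only mode: violations are reported but nothing is blocked yet.
<IfModule mod_headers.c>
  Header always set Content-Security-Policy-Report-Only "default-src 'none'; report-uri https://example.report-uri.com/r/d/csp/reportOnly"
</IfModule>
```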
Again, some notes about the second day of the excellent TLS Training delivered by Scott Helme.
symmetric encryption is fast. AES is fast enough for transferring large amounts of encrypted data (ex. streaming)
asymmetric encryption is slow, therefore it’s only used for authentication, at the beginning of the secure session
An equivalent of the RSA algorithm was actually invented 4 years earlier: the acronym RSA is made of the initial letters of the surnames of Ron Rivest, Adi Shamir, and Leonard Adleman, who first publicly described the algorithm in 1978. Clifford Cocks, an English mathematician working for the British intelligence agency Government Communications Headquarters (GCHQ), had developed an equivalent system in 1973, but it was not declassified until 1997.
Hashing: SHA-256 (a member of the SHA-2 family) is considered strong enough. Alternatives for the future are SHA-384 and SHA-512 (longer digests), but if SHA-2 is fundamentally broken, then the SHA-3 family (Keccak) comes to the rescue. It’s like a never-ending cat-and-mouse game between cryptographers and cryptanalysts.
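A quick way to see the digest sizes side by side, using the coreutils hashing tools:

```shell
# Longer SHA-2 variants produce longer digests:
printf 'hello' | sha256sum | cut -d' ' -f1 | awk '{ print length }'   # 64 hex chars (256 bits)
printf 'hello' | sha384sum | cut -d' ' -f1 | awk '{ print length }'   # 96 hex chars (384 bits)
printf 'hello' | sha512sum | cut -d' ' -f1 | awk '{ print length }'   # 128 hex chars (512 bits)
```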
The CAs store their private keys in HSMs and rarely rotate them (a lifetime of a few decades is not uncommon)
There is a good analogy between digital certificates and passports
X.509 is the standard describing the structure of digital certificates. Currently at version 3, it introduced extensions (arbitrary metadata as key/value pairs). Example of an extension: the SAN (Subject Alternative Name) – where a number of domains can be given on top of the common name (CN). In fact, Google Chrome only looks at the SAN when parsing a certificate.
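The SAN extension is easy to inspect with OpenSSL. A sketch that generates a throwaway self-signed certificate with two SAN entries (assumes OpenSSL 1.1.1+ for -addext; the file names and domains are just examples):

```shell
# Generate a throwaway self-signed certificate with a SAN extension:
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout key.pem -out cert.pem \
  -subj "/CN=example.com" \
  -addext "subjectAltName=DNS:example.com,DNS:www.example.com"

# Chrome validates the requested host against this extension, not the CN:
openssl x509 -in cert.pem -noout -ext subjectAltName
```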
The certificate chain is typically composed of the Root CA certificate, then the Intermediate CA certificate(s) and finally, the end-entity certificate (the leaf). The last intermediate certificate has the ‘path length’ parameter set to 0 (its children can only be leaves).
The Root CA certificates are provided by the client (stored in the browser or OS), while the intermediate CA and end-entity certificates are provided by the server (the intermediate CA cert – for performance reasons)
It takes on average 5-6 years to become a Root CA. And if you want this, you must work with the following 5 relying parties carrying a set of root keys in their trust store: Apple, Google, Java, Mozilla, Microsoft. Let’s Encrypt started in 2016 and it’s not yet a Root CA; they are currently using another root CA to cross-sign their certificates (IdenTrust).
The Web PKI is governed by the CAB Forum – an entity where the Certificate Authorities and the major browsers are represented.
In the previous post I briefly described the CSP concept, along with other nice security features like SRI or CORS.
I am trying to implement the concept here, and I am describing the steps I take along the way. I am using securityheaders.io to scan the website and validate the results; and this website is hosted on Cloudflare over HTTPS.
The first impression is – WordPress doesn’t make it easy to have a proper Content Security Policy. But let’s dig into it!
Session 1
The quick wins
– Edit the .htaccess file to add the quick and dirty security headers: X-Frame-Options, X-XSS-Protection, X-Content-Type-Options, Referrer-Policy
– Strict-Transport-Security was already enabled via CloudFlare since October 2017.
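As a sketch, the quick-win lines in .htaccess could have this shape (it assumes mod_headers is enabled; the Referrer-Policy value is just one reasonable choice):

```apache
# Quick-win security headers (requires mod_headers)
<IfModule mod_headers.c>
  Header always set X-Frame-Options "SAMEORIGIN"
  Header always set X-XSS-Protection "1; mode=block"
  Header always set X-Content-Type-Options "nosniff"
  Header always set Referrer-Policy "no-referrer-when-downgrade"
</IfModule>
```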
Just do it!
– add the CSP directive in the .htaccess file. Start with default-src, whitelist all the usual suspects (Twitter, Google Analytics, Cloudflare, Flickr, etc)
– at this stage, some problems are evident, but at least securityheaders.io already reports an A+ score
– the basic functionality is still there, so progressive enhancement FTW
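The starter CSP line in .htaccess looked roughly like this (a sketch; the whitelisted hosts are illustrative, not my exact list):

```apache
# Everything funneled through default-src for now; refined per-directive later.
<IfModule mod_headers.c>
  Header always set Content-Security-Policy "default-src 'self' https://platform.twitter.com https://www.google-analytics.com https://cdnjs.cloudflare.com https://*.staticflickr.com"
</IfModule>
```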
Session 2
Now refine and iterate
– check the results on report-uri.com, or, even easier, using DevTools
– disable the syntax highlighter and the image highlighter plugins; I will find something CSP-friendly later
– whitelist even more in CSP (youtube.com, among others)
– disable the report-uri; the errors are building up quickly
– finding the main problem: inline content
Keep calm and avoid inline content
– I don’t want to whitelist ‘data:’, ‘unsafe-inline’ and ‘unsafe-eval’, which would defeat half of the purpose, so keep iterating
– as I said, WordPress makes it really difficult by inlining several things (WP emoji being one of them)
– “Unless you really know your web site inside and out, I would caution against using CSP together with WordPress at the moment.” (https://walterebert.com/blog/using-csp-wordpress/) Turns out, I really want to know
– removing inline content (https://www.denisbouquet.com/remove-wordpress-emoji-code/), or trying to move it inside files
– Akismet, you too? 🙁
– OK, it’s not only WordPress embedding inline content, it’s also Twitter (well, technically it was me, because I wanted my Twitter feed on the blog)
– hey, the WordPress Admin is impacted too, cannot upload images!!
– {angry+emoji} (I disabled the WP emoji above, remember?)
Session 3
– remember WP-Admin and CSP? Yeah, forget about that {sad-emoji}
– Twitter timeline and follow button embed without inline scripts – done https://publish.twitter.com/
– Akismet, really? why would you inline that?
– Nevermind, it was not Akismet. Just WordPress…
– it looks like the admin-ajax.php inline call was triggered by my WordPress theme for the ‘like’ system. I removed it and now the red heart is beating. Not functional, but way cooler than before B-)
– Who needs a syntax highlighter WordPress plugin when you have prismjs.com? Doesn’t support ColdFusion language, but I can always contribute the syntax file myself. I have to replace the code in all the previous posts, but I can automate that and the result will be semantically correct.
– Who needs to click images to make them bigger? HighSlide was nice, but it did not bring too much value.
– Ok, so in 98% of cases I don’t need the images to be clickable. I can live with the remaining 2%
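The [code]-shortcode replacement can be automated. A minimal sketch with GNU sed on an exported post body (the file name and content are made up for the demo; the real job would go through the WordPress database or wp-cli):

```shell
# post.html stands in for an exported post body containing the old shortcodes.
printf '<p>Demo:</p>[code]echo "hi";[/code]' > post.html

# Swap the custom shortcode for semantic markup:
sed -i -E 's|\[code\]|<pre><code>|g; s|\[/code\]|</code></pre>|g' post.html
cat post.html
```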
End of session 4. CSP enabled, securityheaders.io still indicates A+, and I load no inline or eval content. I still allow ‘data:’ in the CSP, which is not ideal.
Next steps:
– some posts to be updated for the syntax highlighting and clickable images.
– add SRI to some of the JS libraries
– understand why CloudFlare automatically loads Google Analytics and try to get around their data: embedding
Session 5
– I will always write semantic code in WordPress.
– I will always write semantic code in WordPress.
– I will always write semantic code in WordPress.
– There. That’s 2 hours spent replacing custom [code] tags with <code> tags.
– Most of the images should also be OK; if a small minority of them is still clickable, they will simply open the full-size image in the same tab.
– Why does CloudFlare load Google Analytics?
– D’uh, turns out I had enabled it a few years ago. Along with two other CloudFlare apps (Earth Hour and Net Neutrality). And that was the reason why CloudFlare was inlining scripts. The explanation was in front of my eyes all the time:
If you use certain Cloudflare features, you will need to allow inline scripts in your policy. We include scripts on your domain and add some inline code when you enable Rocket Loader, Cloudflare Apps, or ScrapeShield.
If you do use any of these features, you will need to add the following to your Content Security Policy: script-src 'self' 'unsafe-inline'
January 31, 2018: What is Content Security Policy (CSP), and how can I use it with Cloudflare?
– Next stop: some images inlined as data: elements:
Refused to load the image 'data:image/svg+xml;charset=utf-8,%3Csvg%20xmlns%3D%22http%3A%2F%2Fwww.w3.org%2F2000%2Fsvg%22%20width....'
because it violates the following Content Security Policy directive ...
– This time it was Twitter. So either I accept ‘data:’ in the CSP for images, or I turn off the Twitter timeline… {thinking_emoji} I asked @TwitterDev if there is any workaround
– Remaining steps: rewrite the CSP to make it more restrictive (I started by putting everything inside default-src) and use SRI for a few remote scripts (jQuery maybe?)
A quick way to test the SRI is to alter the hash in the integrity attribute. As soon as you do this, the browser will report the error:
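For illustration, a script tag with a deliberately broken integrity value might look like this (the URL and hash are just examples):

```html
<!-- The browser computes the real digest of the downloaded file, sees
     the mismatch with the integrity attribute and refuses to run it. -->
<script src="https://code.jquery.com/jquery-3.3.1.min.js"
        integrity="sha384-THIS-HASH-IS-WRONG-ON-PURPOSE"
        crossorigin="anonymous"></script>
```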
To test the CSP, I connected to the database and manually added some JavaScript in a comment:
With the CSP allowing script-src 'unsafe-inline', the code executed:
With the CSP not allowing script-src 'unsafe-inline', the code did not execute and the browser reported the problem:
Next step is to keep monitoring report-uri.com for the CSP violations.
The takeaways
– use the DevTools to validate your CSP rules
– start with report-only to ‘test in production’
– avoid inline content at all costs
– think about all the plugins that you use: are they really needed?
– make sure you create WordPress content by writing semantically correct code
That’s it! 6 different sessions and about 10 hours allocated to this little project, but I’m happy with the results 🙂