Clear-Site-Data HTTP Header

In an effort to improve security on the client side, modern browsers have introduced a means for web applications to request that a client remove persisted data. It is, of course, not supported in any version of MSIE or Safari, but all other modern browsers support it: Chrome 61+, Edge 79+, Firefox 63+.

This approach can be useful at logoff or session invalidation to remove data from the client-side, particularly in cases of persistent or reflected XSS.

Clear-Site-Data: "cookies"
Clear-Site-Data: "cache", "cookies", "storage", "executionContexts"
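As a rough sketch of how this could be wired into a logout flow (assuming a Java Servlet application; the /logout mapping and class name below are hypothetical), the header can be set on the same response that invalidates the session:

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

/**
 * Hypothetical logout servlet illustrating the Clear-Site-Data header.
 */
@WebServlet("/logout")
public class LogoutServlet extends HttpServlet {
    @Override
    protected void doPost(final HttpServletRequest request, final HttpServletResponse response)
            throws ServletException, IOException {
        // Ask the browser to remove everything it has persisted for this origin.
        response.setHeader("Clear-Site-Data",
                "\"cache\", \"cookies\", \"storage\", \"executionContexts\"");
        // Invalidate the server-side session as usual.
        final HttpSession session = request.getSession(false);
        if (session != null) {
            session.invalidate();
        }
        response.sendRedirect(request.getContextPath() + "/");
    }
}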

REFERENCES:

Global Privacy Control (GPC) – Sec-GPC HTTP Header – /.well-known/gpc.json file

GPC is the latest attempt at allowing customers to specify how their browsing data may be shared online; the previous attempt, known as DNT (Do Not Track), was a relative failure.

As with DNT, once the user specifies their preference, the browser adds an additional HTTP request header:
Sec-GPC: 1

This can be checked with JavaScript on the page with the following:
const gpcValue = navigator.globalPrivacyControl
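On the server side, the same preference arrives as the Sec-GPC request header shown above; a minimal Java helper for honoring it might look like this (the class and method names are hypothetical):

import javax.servlet.http.HttpServletRequest;

public final class GpcUtil {

    private GpcUtil() {
    }

    /**
     * Returns true when the client has asserted Global Privacy Control (Sec-GPC: 1).
     */
    public static boolean hasGlobalPrivacyControl(final HttpServletRequest request) {
        return "1".equals(request.getHeader("Sec-GPC"));
    }
}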

Additionally, websites can declare that they respect the GPC request by publishing a file at /.well-known/gpc.json:
Content-Type: application/json

{
  "gpc": true,
  "version": 1,
  "lastUpdate": "2021-11-01"
}

Examples:

GPC is currently implemented by default in:
Brave = https://spreadprivacy.com/global-privacy-control-enabled-by-default/

In Firefox, you currently have to enable it manually via about:config, by creating a boolean preference named globalprivacycontrol and setting it to true.

REFERENCES:

HSTS preload

If you have already started using HSTS to force users to the HTTPS version of your website, ‘preload’ is another simple addition, as it only requires adding the keyword to the header.

Once done, you can either wait for your site to be identified (which can take a long time, or may never happen for less popular websites) or, ideally, submit your hostname to be added to the lists preloaded in many modern browsers. The advantage here is that your users will never make a single request to your HTTP website and will automatically be directed to HTTPS.

An HTTP Header example:

Strict-Transport-Security: max-age=63072000; includeSubDomains; preload

Apache2 configuration example:

Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"

REFERENCES:

“Referrer-Policy” HTTP Header

A relatively new HTTP header that is supported by most modern browsers (except MSIE) is the “Referrer-Policy” header. There have been previous attempts to implement similar protections through use of the ‘rel’ (or ‘rev’) attributes on links to external websites. This newer approach prevents leaking of internal URLs, and in some cases parameters, to external websites. This is important from a security perspective, as you might maintain sensitive information in your page URLs that would otherwise be inadvertently shared with an external website.

Clearly, you’ll need to determine your own level of strictness based upon your needs. For example, ‘no-referrer’ is the most strict and prevents the browser from sending the ‘Referer’ (sic) header even to pages on your own website.

Example header values:

Referrer-Policy: no-referrer
Referrer-Policy: no-referrer-when-downgrade
Referrer-Policy: origin
Referrer-Policy: origin-when-cross-origin
Referrer-Policy: same-origin
Referrer-Policy: strict-origin
Referrer-Policy: strict-origin-when-cross-origin
Referrer-Policy: unsafe-url

Implementation can be accomplished in many ways, the simplest being an addition to your HTTP server configuration similar to the one shown below for Apache 2.x:

Header always set Referrer-Policy strict-origin

REFERENCES:

Content-Security-Policy: block-all-mixed-content

If you are running a secure website, it’s a good idea to prevent non-secure (HTTP) assets from being included in your pages. This can often happen through the use of a content management system, or even through website vulnerabilities. A simple change in HTTP headers will help browsers defend against such mixed content.


Content-Security-Policy: block-all-mixed-content

Most modern browsers, except MSIE, currently support this approach (Firefox 48+, for example).

REFERENCES

Content-Security-Policy: upgrade-insecure-requests;

As the web has been shifting to HTTPS for security and performance reasons, there are many methods to migrate users. One simple method is via the Content-Security-Policy header.


Content-Security-Policy: upgrade-insecure-requests;

Most modern browsers, except MSIE, currently support this approach (Chrome 43+, for example).
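If you cannot change the web server configuration, either of these Content-Security-Policy directives can also be added from application code. A minimal servlet Filter sketch (assuming the Java Servlet API on Servlet 4.0+; the class name is hypothetical):

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.annotation.WebFilter;
import javax.servlet.http.HttpServletResponse;

/**
 * Hypothetical filter adding a Content-Security-Policy header to every response.
 * Assumes Servlet 4.0+, where Filter.init() and destroy() have default implementations.
 */
@WebFilter("/*")
public class CspFilter implements Filter {
    @Override
    public void doFilter(final ServletRequest req, final ServletResponse res, final FilterChain chain)
            throws IOException, ServletException {
        // Swap in "block-all-mixed-content" if you prefer blocking to upgrading.
        ((HttpServletResponse) res).setHeader("Content-Security-Policy", "upgrade-insecure-requests");
        chain.doFilter(req, res);
    }
}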

REFERENCES

SameSite cookies

Recently, while reading through the 2017 OWASP Top Ten RC1 documentation (the previous version dates from 2013), I noticed a recommendation in section A8 to use cookies with the “SameSite=strict” value set to reduce CSRF exposure.

Consider using the “SameSite=strict” flag on all cookies, which is increasingly supported in browsers.

Similar to the way that HttpOnly and Secure attributes have been added, SameSite allows for additional control.

Per the documentation, as of April 2017 the SameSite attribute is implemented in Chrome 51 and Opera 39. Firefox has an open defect, but I would expect it to be added soon to follow Chrome.


Set-Cookie: CookieName=CookieValue; SameSite=Lax;
Set-Cookie: CookieName=CookieValue; SameSite=Strict;

According to the specification you can issue the SameSite flag without a value and Strict will be assumed:


Set-Cookie: CookieName=CookieValue; SameSite

As many programming languages and server runtime environments do not yet support this for session cookies, you can use your Apache (tier 1) web server configuration to append the attribute.


Header edit Set-Cookie ^(JSESSIONID.*)$ $1;SameSite=Strict
Header edit Set-Cookie ^(PHPSESSID.*)$ $1;SameSite=Strict
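If you would rather handle this in application code than at the Apache tier, a hedged Java sketch follows; since the Servlet Cookie API does not yet expose SameSite, the Set-Cookie header is written directly (the helper name is hypothetical):

import javax.servlet.http.HttpServletResponse;

public final class SameSiteCookieUtil {

    private SameSiteCookieUtil() {
    }

    /**
     * Adds a Set-Cookie header with the SameSite attribute appended.
     * Illustrative only; the name and value are assumed to be already validated/encoded.
     */
    public static void addSameSiteCookie(final HttpServletResponse response,
            final String name, final String value) {
        response.addHeader("Set-Cookie",
                name + "=" + value + "; Secure; HttpOnly; SameSite=Strict");
    }
}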

It looks like PHP.INI might support the following attribute in a future release, but it’s not there yet!

session.cookie_samesite

REFERENCES:

Brotli Compression

If you look at HTTP headers as often as I do, you’ve likely noticed something different in Firefox 44 and Chrome 49. In addition to the usual ‘gzip’, ‘deflate’ and ‘sdch’, a new value ‘br’ has started to appear for HTTPS connections.

Request:

Accept-Encoding: br

Response:

Content-Encoding: br

Compared to gzip, Brotli claims to have significantly better (26% smaller) compression density with comparable decompression speed.

The smaller compressed size allows for better space utilization and faster page loads. We hope that this format will be supported by major browsers in the near future, as the smaller compressed size would give additional benefits to mobile users, such as lower data transfer fees and reduced battery use.

Advantages:

  • Brotli outperforms gzip for typical web assets (e.g. css, html, js) by 17–25%.
  • Brotli -11 density compared to gzip -9:
      • html (multi-language corpus): 25% savings
      • js (alexa top 10k): 17% savings
      • minified js (alexa top 10k): 17% savings
      • css (alexa top 10k): 20% savings


NOTE: Brotli is not currently supported by the Apache HTTPd server (as of 2016feb10), but will likely be added in an upcoming release.

http://mail-archives.apache.org/mod_mbox/httpd-users/201601.mbox/%[email protected]%3E

Until there is native support, you can pre-compress files by following instructions here…
https://lyncd.com/2015/11/brotli-support-apache/

REFERENCES:

Google and Facebook bypassing P3P User Privacy Settings

I wrote about P3P a very long time ago, and have implemented it on several websites. Some history: the W3C crafted the P3P specification, and Microsoft introduced P3P support in IE6 (in 2001); it remains implemented in all current versions of the browser. The primary intended use is to block third-party cookies within the browser on behalf of the user.

Interestingly enough, Microsoft has had a bit of a struggle with Google and Facebook, which send the following HTTP response headers.

Google’s Response:

P3P: CP="This is not a P3P policy! See http://www.google.com/support/accounts/bin/answer.py?hl=en&answer=151657 for more info."

Facebook’s response:

P3P: CP="Facebook does not have a P3P policy. Learn why here: http://fb.me/p3p"

REFERENCES:

Java User-Agent detector and caching

It’s often important for a server-side application to understand the client platform. There are two common methods used for this:

  1. On the client itself, "capabilities" can be tested.
  2. On the server, where capabilities cannot easily be tested, the HTTP header information, notably "User-Agent", must usually be relied upon.

An example User-Agent for a common desktop browser might typically look like the following; developers can usually determine the platform without a lot of work.

"Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E; InfoPath.3)"

Determining robots and mobile platforms, unfortunately, is a lot more difficult due to the many variations. Libraries such as the one described below simplify this work by exposing the commonly expected attributes as standard Java objects.

With Maven, the dependencies are all resolved with the following POM addition:

<dependency>
    <groupId>net.sf.uadetector</groupId>
    <artifactId>uadetector-resources</artifactId>
    <version>2014.10</version>
</dependency>


/* Get an UserAgentStringParser and analyze the requesting client */
final UserAgentStringParser parser = UADetectorServiceFactory.getResourceModuleParser();
final ReadableUserAgent agent = parser.parse(request.getHeader("User-Agent"));

out.append("You're a '");
out.append(agent.getName());
out.append("' on '");
out.append(agent.getOperatingSystem().getName());
out.append("'.");

As indicated in the website documentation, running this parsing for each request uses valuable server resources; it’s best to cache the responses to minimize the impact!

http://uadetector.sourceforge.net/usage.html#improve_performance

NOTE: the website’s caching example is hard to copy and paste, so here’s a cleaner copy.


/*
* COPYRIGHT.
*/
package com.example.cache;

import java.util.concurrent.TimeUnit;
import net.sf.uadetector.ReadableUserAgent;
import net.sf.uadetector.UserAgentStringParser;
import net.sf.uadetector.service.UADetectorServiceFactory;
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

/**
* Caching User Agent parser
* @see http://uadetector.sourceforge.net/usage.html#improve_performance
* @author Scott Fredrickson [skotfred]
* @since 2015jan28
* @version 2015jan28
*/
public final class CachedUserAgentStringParser implements UserAgentStringParser {

    private final UserAgentStringParser parser = UADetectorServiceFactory.getCachingAndUpdatingParser();
    private static final int CACHE_MAX_SIZE = 100;
    private static final int CACHE_MAX_HOURS = 2;

    /**
     * Limited to 100 elements for 2 hours!
     */
    private final Cache<String, ReadableUserAgent> cache = CacheBuilder.newBuilder()
            .maximumSize(CACHE_MAX_SIZE)
            .expireAfterWrite(CACHE_MAX_HOURS, TimeUnit.HOURS)
            .build();

    /**
     * @return {@code String}
     */
    @Override
    public String getDataVersion() {
        return parser.getDataVersion();
    }

    /**
     * @param userAgentString {@code String}
     * @return {@link ReadableUserAgent}
     */
    @Override
    public ReadableUserAgent parse(final String userAgentString) {
        ReadableUserAgent result = cache.getIfPresent(userAgentString);
        if (result == null) {
            result = parser.parse(userAgentString);
            cache.put(userAgentString, result);
        }
        return result;
    }

    /**
     *
     */
    @Override
    public void shutdown() {
        parser.shutdown();
    }
}
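Usage then mirrors the earlier snippet; for example (illustrative only, assuming the same request object as above):

final UserAgentStringParser cachedParser = new CachedUserAgentStringParser();
final ReadableUserAgent agent = cachedParser.parse(request.getHeader("User-Agent"));
// Repeated requests with the same User-Agent string are now served from the in-memory cache.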

REFERENCES: