Selenium HtmlUnit driver separated in 2.53.0

I’ve been using Selenium for testing for several years, and I noticed that some classes related to the HtmlUnit WebDriver were missing after upgrading from 2.52.0 to 2.53.0. After some research, I discovered that it is now a separate dependency, allowing for a separate release cycle. Additionally, if you don’t use this (relatively generic) WebDriver, you no longer need to carry it in your binaries.

Here’s all you need to do to add it to your Maven projects for testing.

In your pom.xml file:
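
A minimal sketch of the new dependency, assuming you only need it for tests (2.20 was the first standalone release; check Maven Central for the latest version):

<dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>htmlunit-driver</artifactId>
    <version>2.20</version>
    <scope>test</scope>
</dependency>

The driver class itself is still org.openqa.selenium.htmlunit.HtmlUnitDriver, so existing test code should compile unchanged once the dependency is in place.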



HTML5 Link Prefetching

Link prefetching is used to identify a resource that might be required by the next navigation, and that the user agent SHOULD fetch, such that the user agent can deliver a faster response once the resource is requested in the future.

<link rel="prefetch" href="" />

<link rel="prefetch" href="/images/sprite.png" />

Supported in:

  • MSIE 11+/Edge
  • Firefox 3.5+ (for HTTPS)
  • Chrome
  • Opera


HTML5 preconnect

In addition to dns-prefetch, you can take browser performance one step further by actually creating a new connection to a resource.

By initiating an early connection, which includes the DNS lookup, TCP handshake, and optional TLS negotiation, you allow the user agent to mask the high latency costs of establishing a connection.

Supported in:

  • Firefox 39+ (Firefox 41 for crossorigin)
  • Chrome 46+
  • Opera

<link rel="preconnect" href="//" />
<link rel="preconnect" href="//" crossorigin />


HTML5 DNS prefetch

I often get into some fringe areas of website performance micro-optimization; DNS prefetching is another one of those topics.

To understand how this can help, you must first understand the underlying steps a browser takes to retrieve the resources that build your web page.

The first of these is a “DNS lookup”, where a domain name (e.g. example.com) is converted into a numerical address: the IP address of the server that hosts the file(s).

On many websites, content is included from other domains for performance or security purposes.

When those domain names are known in advance, this approach can save time on the connection, as the lookup can be performed in advance, before it is required on the page to retrieve assets.

This can be particularly useful for users with slow connections, such as those on mobile browsers.

<link rel="dns-prefetch" href="//" />

Supported in:

  • MSIE 9+ (as prefetch; MSIE 10+ as dns-prefetch)/Edge
  • Firefox
  • Chrome
  • Safari
  • Opera


Brotli Compression

If you look at HTTP headers as often as I do, you’ve likely noticed something different in Firefox 44 and Chrome 49. In addition to the usual ‘gzip’, ‘deflate’ and ‘sdch’, a new value ‘br’ has started to appear for HTTPS connections.
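
For example, the browser advertises Brotli support in its request header, and the server confirms the encoding in its response (exact values vary by browser and server):

Accept-Encoding: gzip, deflate, sdch, br

Content-Encoding: br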

Compared to gzip, Brotli claims significantly better (26% smaller) compression density with comparable decompression speed.

The smaller compressed size allows for better space utilization and faster page loads, and gives mobile users additional benefits such as lower data transfer fees and reduced battery use.


  • Brotli outperforms gzip for typical web assets (e.g. css, html, js) by 17–25%.
  • Brotli -11 density compared to gzip -9:
      • html (multi-language corpus): 25% savings
      • js (alexa top 10k): 17% savings
      • minified js (alexa top 10k): 17% savings
      • css (alexa top 10k): 20% savings

NOTE: Brotli is not currently supported by the Apache HTTPd server (as of 2016feb10), but will likely be added in an upcoming release.

Until there is native support, you can pre-compress files by following instructions here…


Firefox 41+ extension signing

In the never-ending quest for browser security, Firefox has started implementing safeguards to only allow signed extensions. I found this out after upgrading to Firefox 41 as my installed version of “Deque FireEyes” stopped working. Thankfully, there is a workaround in Firefox 41, but it goes away in Firefox 42.

  • Firefox 40: warning only!
  • Firefox 41: workaround available via about:config:

    xpinstall.signatures.required = false
  • Firefox 42: BLOCKED! unless signed


Website testing with SortSite

SortSite is a popular desktop application for testing web applications for broken links, browser compatibility, accessibility, and common spelling errors. It is also available as a web application known as “OnDemand”.

You can generate a free sample test of your website at:


Deque FireEyes accessibility testing plugin

I’ve done a lot of accessibility testing and development work over my career. One of the many free tools that I use in that role is FireEyes. Deque also has some commercial packages for developer use.

FireEyes adds a new tab to the Firebug tab bar, with the ability to analyze a web site for WCAG 2.0 Level A and AA and Section 508 accessibility violations. The stand-alone version of FireEyes is a browser plugin for the Firefox browser. It requires that the Firebug plugin already be installed.


  • Firefox 31-41

    As of 2015aug21, the current version of the extension is NOT signed and will not execute on later versions. [See my later post on this topic]

  • FireBug 2.x – Do NOT install Firebug v3 alpha as the tab will not show.

NOTE: it should appear as a Firebug tab labeled “Worldspace Fireyes”, but does not seem to be available in Firebug 3.

NOTE: if you try to download it in MSIE, you must rename the .zip to .xpi, and then open it with Firefox.


HTML4 script defer

This HTML4 attribute was intended to defer execution of specific JavaScript code until after the page is rendered. In theory, this makes the website “appear” faster, as the functions relevant to the user interface can be executed before other “background” processes that would otherwise block the screen from displaying.

<script defer="defer" src="example.js"></script>

NOTE: Do not use defer for external scripts that might depend on each other if you need to support MSIE9 and earlier.
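
For example, with two external scripts where the second depends on the first (file names here are hypothetical), modern browsers execute deferred scripts in document order, but MSIE 9 and earlier had a bug that could run them out of order:

<script defer="defer" src="library.js"></script>
<script defer="defer" src="uses-library.js"></script>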


Java User-Agent detector and caching

It’s often important for a server side application to understand the client platform. There are two common methods used for this.

  1. On the client itself, “capabilities” can be tested directly.
  2. On the server, such tests are not easily possible, so the application must usually rely upon the HTTP header information, notably “User-Agent”.

A typical User-Agent value for a common desktop browser might look like the following; developers can usually determine the platform from it without a lot of work.

"Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E; InfoPath.3)"

Determining robots and mobile platforms, unfortunately, is a lot more difficult due to the variations. Libraries such as those described below simplify this work by exposing the commonly expected attributes as standard Java objects.

With Maven, the dependencies are all resolved with the following POM addition:
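
A sketch of the addition, using the UADetector resources module (the version shown is from the time of writing; check for a newer release):

<dependency>
    <groupId>net.sf.uadetector</groupId>
    <artifactId>uadetector-resources</artifactId>
    <version>2014.10</version>
</dependency>

Note that the caching example further below also requires Google Guava (com.google.guava:guava) on the classpath for its Cache implementation.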


/* Get a UserAgentStringParser and analyze the requesting client */
final UserAgentStringParser parser = UADetectorServiceFactory.getResourceModuleParser();
final ReadableUserAgent agent = parser.parse(request.getHeader("User-Agent"));

out.append("You're a '");
out.append(agent.getName());
out.append("' on '");
out.append(agent.getOperatingSystem().getName());
out.append("'.");

As indicated in the website documentation, running this query for each request uses valuable server resources; it’s best to cache the responses to minimize the impact!

NOTE: the website’s caching example is hard to copy-paste, so here’s a cleaner copy:

package com.example.cache;

import java.util.concurrent.TimeUnit;

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

import net.sf.uadetector.ReadableUserAgent;
import net.sf.uadetector.UserAgentStringParser;
import net.sf.uadetector.service.UADetectorServiceFactory;

/**
 * Caching User-Agent parser.
 * @see
 * @author Scott Fredrickson [skotfred]
 * @since 2015jan28
 * @version 2015jan28
 */
public final class CachedUserAgentStringParser implements UserAgentStringParser {

    private final UserAgentStringParser parser = UADetectorServiceFactory.getCachingAndUpdatingParser();
    private static final int CACHE_MAX_SIZE = 100;
    private static final int CACHE_MAX_HOURS = 2;
    /**
     * Limited to 100 elements for 2 hours!
     */
    private final Cache<String, ReadableUserAgent> cache = CacheBuilder.newBuilder()
            .maximumSize(CACHE_MAX_SIZE)
            .expireAfterWrite(CACHE_MAX_HOURS, TimeUnit.HOURS)
            .build();

    /**
     * @return the current data version {@link String}
     */
    @Override
    public String getDataVersion() {
        return parser.getDataVersion();
    }

    /**
     * @param userAgentString the raw {@link String} from the User-Agent header
     * @return the parsed {@link ReadableUserAgent}
     */
    @Override
    public ReadableUserAgent parse(final String userAgentString) {
        ReadableUserAgent result = cache.getIfPresent(userAgentString);
        if (result == null) {
            result = parser.parse(userAgentString);
            cache.put(userAgentString, result);
        }
        return result;
    }

    @Override
    public void shutdown() {
        parser.shutdown();
    }
}
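
A quick usage sketch, mirroring the earlier snippet (the request and out objects are assumed to come from surrounding servlet code):

/* Hold one shared parser instance; the Guava cache inside it is safe for concurrent use */
private static final UserAgentStringParser PARSER = new CachedUserAgentStringParser();

/* Later, per request: repeated User-Agent strings are served from the cache */
final ReadableUserAgent agent = PARSER.parse(request.getHeader("User-Agent"));
out.append(agent.getName());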