HSTS preload

If you have already started using HSTS to force users onto your HTTPS website, ‘preload’ is another simple addition, as it only requires appending the keyword to the existing header.

Once done, you can either wait for your site to be identified (which can take a long time, or forever for less popular websites) or, ideally, submit your hostname to be added to the preload lists shipped with many modern browsers. The advantage here is that your users will never make a single request to your HTTP website; they will automatically be directed to HTTPS.

An HTTP Header example:

Strict-Transport-Security: max-age=63072000; includeSubDomains; preload

Apache2 configuration example:

Header always set Strict-Transport-Security "max-age=63072000; includeSubDomains; preload"
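
If you serve the site through nginx instead of Apache, a roughly equivalent directive (a sketch, not from the original post; set max-age to match your policy) would be:

add_header Strict-Transport-Security "max-age=63072000; includeSubDomains; preload" always;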


Clear Linux bash history

While having command history available with a simple press of the up arrow is a convenience common to most Linux distributions, it can come with some risk. One such risk is inadvertently typing a password instead of a command, or having to pipe credentials into a command.

Thankfully, you can clear the entire history in a variety of ways; the most common are shown below.


history -c && history -w


cat /dev/null > ~/.bash_history && history -c && exit
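
If you only need to remove a single entry, such as a line where you typed a password, the history builtin can also delete by entry number; the number below is just a placeholder:

history                 # list entries and note the number of the offending line
history -d 1234         # delete that entry from the in-memory history (placeholder number)
history -w              # write the trimmed history back to ~/.bash_history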


“Referrer-Policy” HTTP Header

A relatively new HTTP header supported by most modern browsers (except MSIE) is the “Referrer-Policy” header. There have been previous attempts to implement similar protections through the ‘rel’ (or ‘rev’) attributes on links to external websites. This header takes a different approach and prevents leaking of internal URLs, and in some cases parameters, to external websites. This is important from a security perspective, as your page URLs might contain sensitive information that would otherwise be inadvertently shared with an external website.

Clearly, you’ll need to determine the appropriate level of restriction for your needs. For example, ‘no-referrer’ is the strictest value and prevents the browser from sending the ‘Referer’ (sic) header even to your own website’s pages.

Example header values:

Referrer-Policy: no-referrer
Referrer-Policy: no-referrer-when-downgrade
Referrer-Policy: origin
Referrer-Policy: origin-when-cross-origin
Referrer-Policy: same-origin
Referrer-Policy: strict-origin
Referrer-Policy: strict-origin-when-cross-origin
Referrer-Policy: unsafe-url

Implementation can be accomplished in many ways, the simplest being an addition to your HTTP server configuration similar to the one shown below for Apache 2.x:

Header always set Referrer-Policy strict-origin
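
If you use nginx instead of Apache, a roughly equivalent sketch would be:

add_header Referrer-Policy "strict-origin" always;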


Remove Guest Account in Ubuntu

While the Guest session can be useful for some people, I’ve generally considered it to be a security vulnerability, as anyone with physical access can use it to reach areas of your system that are not secured as well as they “should” be.

Additionally, the default behavior of storing and listing usernames on the login screen is less than ideal.

Here we remove both!

  1. Create the config folder:
    sudo mkdir -p /etc/lightdm/lightdm.conf.d
  2. Create a new config file:
    sudo vi /etc/lightdm/lightdm.conf.d/10-ubuntu.conf
  3. Add the following:

    [SeatDefaults]
    user-session=ubuntu
    greeter-show-manual-login=true
    greeter-hide-users=true
    allow-guest=false
  4. Reboot (or restart LightDM as shown below)
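
If you prefer not to reboot, restarting the display manager should also apply the change; note that this will end any active graphical session:

sudo service lightdm restart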


Content-Security-Policy: block-all-mixed-content

If you are running a secure website, it’s a good idea to prevent non-secure assets from being included on your page. This can often happen through the use of a content management system, or even through website vulnerabilities. A simple change in HTTP headers will help browsers defend against such mixed-content requests.


Content-Security-Policy: block-all-mixed-content

Most modern browsers, except MSIE, currently support this approach (e.g. Firefox 48+).
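
Assuming Apache 2.x as in the other examples in this post, a minimal configuration sketch would be (if you already send a Content-Security-Policy header, fold this directive into your existing policy instead):

Header always set Content-Security-Policy "block-all-mixed-content"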


Content-Security-Policy: upgrade-insecure-requests;

As the web has been shifting to HTTPS for security and performance reasons, there are many methods to migrate users. One simple method is the Content-Security-Policy header with the upgrade-insecure-requests directive.


Content-Security-Policy: upgrade-insecure-requests;

Most modern browsers, except MSIE, currently support this approach (e.g. Chrome 43+).
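
Again assuming Apache 2.x, a minimal configuration sketch (fold the directive into any existing Content-Security-Policy value rather than overwriting it):

Header always set Content-Security-Policy "upgrade-insecure-requests"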


Clear Ubuntu ‘bash’ history

After a lot of use, your history file can fill up with old commands; once in a while, it can be useful (and safer) to clean them up.

NOTE: this can be especially important if you have ever used a password as a command-line parameter, as it is stored unencrypted in a plain text file.

Preferred:

cat /dev/null > ~/.bash_history && history -c && exit

Also useful:

history -c
history -w


Install Fail2Ban on Ubuntu to protect services

Many common administrative services such as VPN and SSH are exposed on well-known port numbers; unfortunately, this makes it easy for attackers to use automated tools to attempt to access the systems. Countermeasures such as Fail2Ban can block them after a few failed attempts.

Installation Steps:

  1. sudo apt-get install fail2ban
  2. sudo cp /etc/fail2ban/jail.conf /etc/fail2ban/jail.local
  3. sudo vi /etc/fail2ban/jail.local
  4. Update:
    destemail & sender (see the example jail.local sketch after these steps)
  5. OPTIONAL:
    Splunk:
    sudo /opt/splunkforwarder/bin/splunk add monitor /var/log/fail2ban.log -index main -sourcetype Fail2Ban

    Splunk (manual):
    sudo vi /opt/splunkforwarder/etc/apps/search/local/inputs.conf

    [monitor:///var/log/fail2ban.log]
    disabled = false
    index = main
    sourcetype = Fail2Ban

  6. sudo service fail2ban restart
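
For reference, a minimal sketch of the kind of settings you might change in jail.local; the addresses and limits below are illustrative assumptions rather than values from this post, and on older releases the SSH jail may be named [ssh] instead of [sshd]:

[DEFAULT]
destemail = you@example.com
sender = fail2ban@example.com

[sshd]
enabled = true
maxretry = 5
bantime = 3600

Once the service has been restarted, you can confirm the jails are active with:

sudo fail2ban-client status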


Squid3 Proxy on Ubuntu

Using a personal proxy server can be helpful for a variety of reasons, such as:

  • Performance – network speed and bandwidth
  • Security – filtering and monitoring
  • Debugging – to trace activity

Here are some simple steps to get you started; obviously you will need to further “harden” security to make it production-ready!


sudo apt-get install squid3


cd /etc/squid3/
sudo mv squid.conf squid.orig
sudo vi squid.conf

NOTE: the following configuration works, but will likely need to be adapted for your specific usage.


http_port 3128
visible_hostname proxy.EXAMPLE.com
auth_param digest program /usr/lib/squid3/digest_file_auth -c /etc/squid3/passwords
#auth_param digest program /usr/lib/squid3/digest_pw_auth -c /etc/squid3/passwords
auth_param digest realm proxy
auth_param basic credentialsttl 4 hours
acl authenticated proxy_auth REQUIRED
acl localnet src 10.0.0.0/8 # RFC 1918 possible internal network
acl localnet src 172.16.0.0/12 # RFC 1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC 1918 possible internal network
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines
#acl SSL_ports port 443
#http_access deny to_localhost
#http_access deny CONNECT !SSL_ports
http_access allow localnet
http_access allow localhost
http_access allow authenticated
via on
forwarded_for transparent

Create the users and passwords:

sudo apt-get install apache2-utils (required for htdigest)
sudo htdigest -c /etc/squid3/passwords proxy user1
sudo htdigest /etc/squid3/passwords proxy user2

Open up firewall port (if enabled):

sudo ufw allow 3128

Restart the server and tail the logs:

sudo service squid3 restart
sudo tail -f /var/log/squid3/access.log
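
To verify the proxy and digest authentication from a client machine, a quick test with curl might look like the following; the hostname and user are the example values from above, and PASSWORD is a placeholder:

curl --proxy-digest -U user1:PASSWORD -x http://proxy.EXAMPLE.com:3128 -I https://www.example.com/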

OTHER FILE LOCATIONS:

/var/spool/squid3
/etc/squid3

MONITORING with Splunk…

sudo /opt/splunkforwarder/bin/splunk add monitor /var/log/squid3/access.log -index main -sourcetype Squid3
sudo /opt/splunkforwarder/bin/splunk add monitor /var/log/squid3/cache.log -index main -sourcetype Squid3


Java Dependency Vulnerability scanning with Maven victims-enforcer

One of the OWASP guidelines for secure applications is to not use components with known vulnerabilities. Unfortunately, it can be a very difficult and time-consuming task to keep up with these manually; automation can save you countless hours!

See https://www.owasp.org/index.php/Top_10_2013-A9-Using_Components_with_Known_Vulnerabilities.

NOTE: victims-enforcer can be used in conjunction with the OWASP dependency scanner. I have only found it to be problematic in ‘tycho’ builds.


<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>1.4.1</version>
  <dependencies>
    <dependency>
      <groupId>com.redhat.victims</groupId>
      <artifactId>enforce-victims-rule</artifactId>
      <version>1.3.4</version>
      <type>jar</type>
    </dependency>
  </dependencies>
  <executions>
    <execution>
      <id>enforce-victims-rule</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <rule implementation="com.redhat.victims.VictimsRule">
            <!--
            Check the project's dependencies against the database using
            name and version. The default mode for this is 'warning'.

            Valid options are:

            disabled: Rule is still run but only INFO level messages and no errors.
            warning : Rule will spit out a warning message but doesn't result in a failure.
            fatal   : Rule will spit out an error message and fail the build.
            -->
            <metadata>warning</metadata>

            <!--
            Check the project's dependencies against the database using
            the SHA-512 checksum of the artifact. The default is fatal.

            Valid options are:

            disabled: Rule is still run but only INFO level messages and no errors.
            warning : Rule will spit out a warning message but doesn't result in a failure.
            fatal   : Rule will spit out an error message and fail the build.
            -->
            <fingerprint>fatal</fingerprint>

            <!--
            Disables the synchronization mechanism. By default the rule will
            attempt to update the database for each build.

            Valid options are:

            auto   : Automatically update the database entries on each build.
            daily  : Update the database entries once per day.
            weekly : Update the database entries once per week.
            offline: Disable the synchronization mechanism.
            -->
            <updates>daily</updates><!-- was: auto -->

          </rule>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>

The vulnerability database is sourced from https://victi.ms, with backing from Red Hat.
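
With the plugin configured as above, the rule runs as part of a normal build (the enforce goal binds to an early lifecycle phase by default), so a standard build should trigger the check:

mvn clean verify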
