Private Caching/Relay DNS Server

Since I’ve run a few small websites (like this one) out of my home for years, I’ve found it useful to run a DNS server inside of my firewall. Not only does this make it easier to maintain the websites, but it allows me to lock down security and increase performance of many of my applications.

I run the following services that use DNS:

  • Apache JAMES – mail server that does lookups to send email and filter inbound SPAM.
  • Analog – web server log analysis.
  • Apache HTTPD – web server, used to host public websites as well as private domains used for internal purposes.

To make things more efficient, I currently have my DNS server set up to forward all requests to OpenDNS, which provides ‘adult’ website filtering and analysis of DNS activity.
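The forwarding arrangement above can be sketched in a BIND named.conf fragment (a hedged example; the OpenDNS resolver addresses and are public, but the rest of your options block will differ):

```conf
// named.conf (fragment) - forward all queries to OpenDNS
options {
    forward only;
    forwarders {;; };
};
```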

Some open-source/free DNS servers that I recommend:


Unreal Tournament 2004

I’m not much of a game player, but a group of guys from the office decided that it would be fun to blow off some steam by fragging each other once in a while. I’ve got a dedicated server set up, but for obvious reasons it’s password protected. If you are a regular reader of this blog and are interested in joining us on occasion, please drop me an email or reply to this post. Even better… see me IRL.

Happy fragging!

Installing Perl CGI on Apache (for Windows)

Installing Perl on a Win32 installation of Apache is trivial. Just a few short years ago (roughly the year 2000) most commercial websites still ran large amounts of Perl code. Several open-source projects, like Bugzilla, still rely on this powerful scripting language.

Here are a few simple steps and some advice to consider when the need comes to add this feature to your installation.

  1. Download Perl for Win32 – ActiveState Perl is the standard distribution to use, and installation is a snap.

    a) Get the MSI file version, as it’s executable (the AS version is a ZIP file for manual installs).

    b) The default path it chooses is “C:\Perl”; I advise that you use “C:\usr” instead, as it makes it easier to port programs to and from UNIX/Linux.

    c) The MSI installer takes care of the PATH settings, so you should have no other work for installation.

  2. Modify the Apache httpd.conf file to enable CGI handling (uncomment or add the following lines):

    AddHandler cgi-script .cgi
    AddHandler cgi-script .pl

  3. Restart Windows to ensure that the new configuration is available to the operating system.
  4. Test your install…

    a) Create a new file on the server named /cgi-bin/ with the following content:

    #!c:/usr/bin/perl
    # shebang path must match your Perl install (see step 1b)
    print "Content-type: text/html\n\n";
    print "hello world";

    b) Start (or restart) the Apache service.

    c) Access the file in the browser, example:

    URL = http://localhost/cgi-bin/

    d) If everything works, you should see the words “hello world”; otherwise, if you see the source code or a ‘500 Server Error’, the config has a problem.
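For reference, the handler lines from step 2 usually sit alongside a ScriptAlias for the cgi-bin directory. A hedged httpd.conf sketch (the directory paths are examples and depend on your layout):

```apache
# httpd.conf (fragment) - map /cgi-bin/ URLs to a script directory
ScriptAlias /cgi-bin/ "C:/Apache2/cgi-bin/"
<Directory "C:/Apache2/cgi-bin">
    Options +ExecCGI
    AddHandler cgi-script .cgi .pl
</Directory>
```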

Happy Scripting.


I’ve used EveryDNS (a free service) for years to host my DNS services. Recently I found that they now offer a public DNS lookup service as OpenDNS. While I still run my own private DNS server for caching and various private addresses, I now do a simple forward lookup to their servers to gain the extra services they provide… notably phishing and typo protection.

Setup is very simple for most users, and even a non-technical person should have no problems following their installation instructions for a single computer/device or an entire network.
Happy networking!!!

WAMP Servers

I often find myself administering WAMP (Windows, Apache, PHP/Perl/Python, MySQL) servers… usually this occurs because it is a better ‘supported’ (or perhaps ‘tolerated’) configuration in corporate environments than the more common LAMP (Linux… etc.) variety. This gives you the benefit of a centrally controlled operating system while maintaining a mostly open-source server environment. Given Microsoft’s poor security record, though, you’ll be patching it a LOT!

Many common distributions exist… here’s some helpful resources with downloads:

If you are a Java shop, you might also consider the following…

Configuration of each of these is a topic in its own right. If you need a shortcut to development, you may want to check out this!

Good luck!!!

Proxy Auto-config

Many organizations (and individuals) find the need to establish proxy servers on their network, usually for reasons of security or network topology. While the use of proxy servers simplifies some aspects of networking, it comes at the cost of maintaining the proxy configuration of every network device (usually browsers). Netscape provided a mechanism to automate much of this by allowing the browser to retrieve the proxy configuration from a centrally managed server.

The proxy auto-config file is written in JavaScript. It should be a separate file, served from a webserver with the proper filename extension and MIME type.

The file must define the function:

function FindProxyForURL(url, host)




Apache HTTPD config.

Add the following to the httpd.conf file:

Redirect permanent /wpad.dat {yourdomain}/proxy.pac
AddType application/x-ns-proxy-autoconfig .pac


/* 'proxy.pac' - This is the main function called by any browser */
function FindProxyForURL(url, host) {

    if (isPlainHostName(host) ||           // No proxy for non-FQDN names
        shExpMatch(host, "*.localnet") ||  // No proxy for internal network
        shExpMatch(host, "") ||            // No proxy for LocalHost
        shExpMatch(host, "localhost") ||   // No proxy for LocalHost
        shExpMatch(host, "mailhost") ||    // No proxy for MailHost
        dnsDomainIs(host, "")) {           // No proxy

        return "DIRECT";

    } else {

        // {yourproxy} is a placeholder for your proxy host
        return "PROXY {yourproxy}:8080; DIRECT";

    } // End else

} // End function FindProxyForURL

NOTE: Also see my ‘WPAD’ blog entry.

PHP on Apache 2.2 (Win32)

This came as a shock to me a while back, when I started evaluating an upgrade to Apache 2.2 from Apache 2.0.58. It seems that PHP doesn’t ship with a handler for Apache 2.2; as such, after a huge headache and a little bit of searching, I found this article and downloads available at

It should also be added that other great binary assets are available at these sites.
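For reference, wiring a third-party PHP handler into Apache 2.2 usually looks something like this (a hedged sketch; the DLL name and paths depend on which build you download and where you install it):

```apache
# httpd.conf (fragment) - load a PHP 5 handler built for Apache 2.2
LoadModule php5_module "c:/php/php5apache2_2.dll"
AddType application/x-httpd-php .php
PHPIniDir "c:/php"
```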

Enabling A Secure Apache Server w/SSL Certificates

If you’ve taken some time to wander around my site, you may have noticed that I also have SSL enabled (with https:// URLs). Here are the steps you can take on your own site/server – provided you have proper access.

Download and install Apache-OpenSSL and OpenSSL – I’ve found to be a reliable source for precompiled binaries for Win32 platforms.

Install OpenSSL, and add the following environment variable:
OPENSSL_CONF=[apache_root]/bin/openssl.cnf

Generate a private key:
openssl genrsa -des3 -out filename.key 1024

Create CSR Request…
openssl req -new -key filename.key -out filename.csr

This step will ask for several pieces of information, here’s my example:

Country Name (2 letter code) [AU]:US
State or Province Name (full name) [Some-State]:Illinois
Locality Name (eg, city) []:Carol Stream
Organization Name (eg, company) [Internet Widgits Pty Ltd]:Dean Scott Fredrickson
Organizational Unit Name (eg, section) []:Giant Geek Communications
Common Name (eg, YOUR name) []
Email Address []:[email protected]
Please enter the following 'extra' attributes
to be sent with your certificate request
A challenge password []:xxxxxxxxxxxxxx
An optional company name []:Giant Geek Communications
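The same request can be generated non-interactively by passing the fields with -subj, which is handy for scripting; it is also worth inspecting the CSR before sending it off. A hedged sketch (the passphrase and subject values are examples):

```shell
# Generate a passphrase-protected key and a CSR without interactive prompts
openssl genrsa -des3 -passout pass:example -out filename.key 1024
openssl req -new -key filename.key -passin pass:example -out filename.csr \
  -subj "/C=US/ST=Illinois/L=Carol Stream/O=Giant Geek Communications/CN=www.example.com"

# Inspect the request before sending it to the CA
openssl req -noout -text -in filename.csr
```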

You can now send this CSR to a valid Certifying Authority…
I currently use

It’s very likely that the CA will need to verify your identity; typically this requires you to fax a copy of your ID card/passport or business papers. A D-U-N-S Number (from Dun and Bradstreet) will make this easier for businesses.

If you don’t plan on having lots of users, you can create a Self-signed certificate…
openssl x509 -req -days 30 -in filename.csr -signkey filename.key -out filename.crt

You’ll need to install the files received from the CA, but it’s pretty trivial so I’ll leave it for later.
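Once the certificate files are in place, the Apache side is a short config block. A hedged httpd.conf sketch (the file locations are examples):

```apache
# httpd.conf / httpd-ssl.conf (fragment) - example SSL virtual host
Listen 443
<VirtualHost _default_:443>
    SSLEngine on
    SSLCertificateFile "conf/ssl/filename.crt"
    SSLCertificateKeyFile "conf/ssl/filename.key"
</VirtualHost>
```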

P3P 1.0 Implementation guide

Standards documentation is available from W3C at:


  1. Version P3P 1.1 is currently in the works.
  2. Throughout the specifications you’ll see references to “Well-Known Location”; this refers to the default path and naming of these files in the /w3c/ folder.
  3. In my examples below, I have left MOST data empty; the “xxx” indicates a field that must match between these files.

HTML <link> tag:

<link type="text/xml" rel="P3Pv1" href="/w3c/p3p.xml" />

HTTP Header:

p3p: policyref="/w3c/p3p.xml", CP="TST"


Policy reference file (/w3c/p3p.xml):

<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<META xmlns="">
<POLICY-REF about="/w3c/privacy.xml#xxx">
<COOKIE-INCLUDE name="*" value="*" domain="*" path="*" />
</POLICY-REF>
</META>

Policy file (/w3c/privacy.xml):

<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<POLICIES xmlns="">
<POLICY name="xxx" discuri="/index.html" xml:lang="en">
<DATA ref=""></DATA>
<DATA ref="#business.department"></DATA>
<DATA ref=""></DATA>
<DATA ref=""></DATA>
<DATA ref=""></DATA>
<DATA ref=""></DATA>
<DATA ref=""></DATA>
<DATA ref=""></DATA>
<DATA ref=""></DATA>
<DATA ref=""></DATA>
<DATA ref=""></DATA>
<DATA ref=""></DATA>
<DATA ref=""></DATA>
<DISPUTES resolution-type="service" service="/index.html" short-description="Customer Service">
<CONSEQUENCE>We record some information in order to serve your request and to secure and improve our Web site.</CONSEQUENCE>
<DATA ref="#dynamic.clickstream"/>
<DATA ref="#dynamic.http"/>
</DISPUTES>
</POLICY>
</POLICIES>




ROBOTS.TXT

I’ve been asked about this file in many projects I’ve worked on. It resides in the root of the website and has no external references to it; however, there are usually a lot of requests for it in the server logs (or “404 Not Found” errors if it doesn’t exist).

Additionally, automated security audit software will often indicate that this file itself is a possible security problem as it can expose hidden areas of your website (more on this later).

Here’s what it’s all about….

ROBOTS.TXT is used by spiders and robots, primarily so that they can index your website for search engines (which is usually a good thing). However… there are times when you don’t want this to occur. Some spiders/robots can be too aggressive on your servers, consuming precious bandwidth and CPU, or they can dig too deep into your content. As such, you might want to control their access.

The Robots Exclusion Protocol sets out several ways to accomplish this goal; of course, the spider must comply with the convention.
1. ROBOTS.TXT can be used to limit access:

Example that limits only the images, javascript and css folders:

#robots.txt - for info see
User-agent: *
Disallow: /images/
Disallow: /js/
Disallow: /css/

2. A <meta> tag on each webpage indicating spider actions to take.

<meta name="robots" content="INDEX, FOLLOW, ALL" />

Values (there are a few other attributes, but these are the most common):
INDEX – index this page
NOINDEX – do not index this page
FOLLOW – follow links from this page
NOFOLLOW – do not follow links from this page

In most cases, a spider/robot will first request the ROBOTS.TXT file, and then start indexing the site. You can exclude all or specific spiders from individual files or directories.
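As a sketch, here is how you might block one overly aggressive crawler from the whole site while leaving everyone else alone (“BadBot” is a hypothetical user-agent name):

```conf
# robots.txt - block one specific robot from the entire site
User-agent: BadBot
Disallow: /

# All other robots: no restrictions
User-agent: *
Disallow:
```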

As for the earlier bit on security: since this file is available to anyone, you should never indicate sensitive areas of your website in it, as that would be an easy way for others to find those areas.