Occasionally, there’s an idea so simple and powerful that you have to drop whatever you’re doing and implement it immediately.

Yesterday, I read the Jon Udell article that’s making the rounds (via Mefi and Flutterby). I didn’t immediately grok it, but seeing it in action (1, 2, 3) did the trick.

Visually, I was inspired by Mark Pilgrim’s concise display, but I didn’t want to periodically parse my Apache logs. I wanted real-time results without limiting myself to one particular web server log format. So I wrote a Perl script, now included on every entry page via SSI, that uses flat files to store the data.

As a result, there may be some issues with scalability on heavily trafficked sites, but I’d think most weblogs wouldn’t have a problem. Anyway, if you want to try it, all it requires is Perl, server-side includes, and a world-writable directory to store the files in. Download Waxy Backlinks now. Installation info inside.
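For a sense of how it works under the hood, the flat-file approach can be sketched in a few lines of Perl. This is a simplified illustration, not the actual script — in a real CGI the page and referrer would come from `$ENV{DOCUMENT_URI}` and `$ENV{HTTP_REFERER}`, and the function names here are my own:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Append the visitor's referrer to a per-page cache file, one URL per line.
sub log_referrer {
    my ($cachedir, $page, $referrer) = @_;
    return unless defined $referrer and $referrer =~ m{^https?://}i;
    $page =~ s{[^\w.-]}{_}g;    # crude filename sanitizing
    open my $fh, '>>', "$cachedir/$page" or return;
    print $fh "$referrer\n";
    close $fh;
}

# Read the flat file back and count visits per referring URL.
sub read_backlinks {
    my ($cachedir, $page) = @_;
    $page =~ s{[^\w.-]}{_}g;
    my %count;
    open my $fh, '<', "$cachedir/$page" or return %count;
    while (my $line = <$fh>) {
        chomp $line;
        $count{$line}++;
    }
    close $fh;
    return %count;
}
```

Because every page view appends one line to a plain text file, there's no database to set up — which is also why heavy traffic could eventually be a concern.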

Installation is pretty easy, so long as you know your way around a Unix shell.

1. Rename the file from backlink.txt to backlink.cgi.

2. Save the file to a directory readable by your web server and make it executable (e.g. 'chmod 755 backlink.cgi')

3. Create a directory to store the cache files in and make it world-writable (e.g. 'chmod 777 backlink_dir/')

4. Edit line 16 of backlink.cgi, changing the '$backlinkdir' directory to point to your own cache directory.

5. Add the following server-side include to your .shtml file(s), where you want the backlinks to be displayed:

<!--#exec cgi="/cgi/backlink.cgi" -->

That’s it! If you’d like, you can optionally customize the display by changing the header, footer, and backlink HTML in the script. If you get stuck, I might be able to help.
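If you’re curious what that customization amounts to, here’s a hypothetical sketch of the display step — the actual variable names in backlink.cgi may differ, but the idea is to wrap each stored link in your own header, per-link, and footer HTML:

```perl
use strict;
use warnings;

# Render the counted backlinks as HTML, most-visited first.
# $header and $footer are whatever markup you want around the list.
sub render_backlinks {
    my ($header, $footer, %count) = @_;
    my $html = $header;
    for my $url (sort { $count{$b} <=> $count{$a} or $a cmp $b } keys %count) {
        $html .= qq{<li><a href="$url">$url</a> ($count{$url})</li>\n};
    }
    return $html . $footer;
}
```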


    Hmm, I think I need to add a couple features to the script. The ability to set a maximum number of links to display, a minimum threshold of visitors to display, and the option to group recent links by top-level domain. As you can see, it doesn’t take long before the list of links gets unruly.
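The proposed "group by domain" feature might look something like this sketch (not part of the released script): collapse each referring URL to its hostname and sum the per-URL visit counts.

```perl
use strict;
use warnings;

# Given a hash of referrer URL => visit count, return a hash of
# hostname => total visits across all URLs on that host.
sub group_by_host {
    my (%count) = @_;
    my %by_host;
    for my $url (keys %count) {
        my ($host) = $url =~ m{^https?://([^/:]+)}i or next;
        $by_host{lc $host} += $count{$url};
    }
    return %by_host;
}
```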

    This is super cool: I may clone it to track user agents the same way (keep an eye on robots vs real readers). I seem to be showing up in my own backlinks, even though I added myself to the “blacklist.”
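The user-agent idea in the comment above could be sketched like this — illustrative only, and the substring list here is my own assumption, not anything shipped with the script:

```perl
use strict;
use warnings;

# A crude robot-vs-reader check: case-insensitive substring match
# against a short list of common crawler markers.
my @robot_patterns = ('googlebot', 'slurp', 'crawler', 'spider', 'bot');

sub is_robot {
    my ($ua) = @_;
    return 0 unless defined $ua;
    for my $pat (@robot_patterns) {
        return 1 if index(lc $ua, $pat) >= 0;
    }
    return 0;
}
```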

    Are you correctly adding your hostname to the @blacklist array? Try changing it to read something like:

    my @blacklist = ($ENV{'HTTP_HOST'});
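For context, a blacklist check along these lines presumably filters each referrer by its hostname — this is a sketch, and the actual logic in backlink.cgi may differ:

```perl
use strict;
use warnings;

# Return 1 if the referrer's hostname appears in @blacklist, else 0.
sub is_blacklisted {
    my ($referrer, @blacklist) = @_;
    my ($host) = $referrer =~ m{^https?://([^/:]+)}i
        or return 0;
    return (grep { lc $_ eq lc $host } @blacklist) ? 1 : 0;
}
```

So a blacklist entry needs to match the hostname exactly (minus case) — a `www.` prefix or a port number in the referrer would slip past it.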

    Hi. Found this in a Mefi post, which I found through Google. Is it possible to use this Perl script on a PHP site? My pages are *.php, so I don’t think I can run SSI on a PHP page. Thanks!

    Would you please help me figure out why the backlinks don’t work?

    I did try SSI, and it works.

    The path also seems correct.

    my $backlinkdir = '/home/virtual/site1/fst/var/www/cgi-bin/mt/backlink';

    The code was:

    I got an error, so I changed it to:

    I don’t know what else I should check.
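When the script silently shows nothing, permissions on the cache directory are a common culprit. A quick diagnostic along these lines might help — the function is just an example, with your own $backlinkdir substituted in:

```perl
use strict;
use warnings;

# Confirm the cache directory exists and is writable by the user
# the web server runs as (run this from a CGI, not your login shell,
# since the web server's user is usually different).
sub check_cachedir {
    my ($dir) = @_;
    return "missing: $dir"      unless -d $dir;
    return "not writable: $dir" unless -w $dir;
    return "ok";
}
```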


    I don’t have any idea … it just doesn’t work.

    On my index page it looks like:


    echo $display;


    backlink.cgi looks like:

    my $backlinkdir = '/usr/local/httpd/htdocs/kunden/web121/html/public/backlink';

    php = 0

Comments are closed.