ClarkConnect Notes
Christopher Rath
2002
As I wrote about in my Hacking The 3Com Multi-Purpose Internet Server notes, I am using the ClarkConnect firewall Linux distribution for our home network’s firewall and file server. This page captures my thoughts and experiences, along with the software I’ve written for our ClarkConnect server.
One of the features of the ClarkConnect distribution is that squidGuard, a web-filtering package, comes installed and running. By default, however, squidGuard is configured to block only drug-related websites. Anyone who wants to make full use of squidGuard Blacklists must take two steps: configure squidGuard to use a wider range of Blacklists, and set the server up to automatically download Blacklist updates. This section of my ClarkConnect notes covers these two tasks.
The squidGuard website contains full details about how squidGuard Blacklists are configured; here, however, is my squidGuard.conf file, which shows multiple lists in use. I have enabled blocking of drug, gambling, and pornographic websites, plus allowed for local lists of sites to be blocked or allowed (note that the ‘redirect’ line in the ‘acl’ section is not actually split across two lines; it is shown that way to avoid an overly wide display):
logdir /var/log/squid
dbhome /etc/squidGuard/db

dest allow_local {
    domainlist local/allow_domains
    urllist local/allow_urls
}

dest deny_local {
    domainlist local/deny_domains
    urllist local/deny_urls
}

dest drugs {
    domainlist drugs/domains
    urllist drugs/urls
}

dest gambling {
    domainlist gambling/domains
    urllist gambling/urls
}

dest porn {
    domainlist porn/domains
    urllist porn/urls
}

acl {
    default {
        pass allow_local !deny_local !gambling !porn !drugs all
        redirect http://192.168.1.1/cgi-bin/squidGuard.cgi?clientaddr=%a
            &clientname=%n&clientuser=%i&clientgroup=%s&targetgroup=%t&url=%u
    }
}
It is important to note that each of the files referred to in the squidGuard.conf file must exist. If you use my .conf file, you must manually create the local directory and the files my .conf file assumes are there. The following commands may be used to create the directory and empty files (run them as root):
cd /etc/squidGuard/db
mkdir local
touch local/allow_domains local/allow_urls local/deny_domains local/deny_urls
chown -R squid:suvlet local
What this squidGuard.conf file does is allow domains/urls matched by the ‘allow_local’ rule to be accessed, while denying access to anything matched by ‘deny_local’, ‘drugs’, ‘gambling’, or ‘porn’. If a domain/url still matches nothing after those rules have been applied, then access is allowed (this is what the final ‘all’ means).
The databases in the local directory (referenced by the ‘allow_local’ and ‘deny_local’ rules) are files maintained by the local system administrator. These local files are not updated or maintained by the squidGuard robot, so they will not be overwritten when a new Blacklist is downloaded. Thus, as you find specific sites you wish to allow or deny at your site, you edit the appropriate file, rebuild the squidGuard databases, and restart squidGuard.
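For example, to locally block one advertising site and allow one specific page, you would append entries like these and then run the rebuild/restart commands shown below (the domain and URL are made up, and the formats assume squidGuard’s usual conventions: one domain per line in domain lists, and scheme-less host/path entries in url lists):
# Block a whole domain locally (one domain per line).
echo "ads.example.com" >> /etc/squidGuard/db/local/deny_domains
# Allow one specific page (host/path form, without the ‘http://’).
echo "example.com/good/page.html" >> /etc/squidGuard/db/local/allow_urls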
Once you have edited your squidGuard.conf file, you will need to rebuild the Blacklist databases and restart Squid. This can be done by running the following commands as root (if things don’t appear to be working properly, check the squidGuard log in the /var/log/squid directory):
/usr/sbin/squidGuard -C all
/etc/rc.d/init.d/squid restart
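You can also exercise squidGuard directly from the command line: in debug mode it reads redirector-style request lines (URL, client address, ident, and method) on standard input, printing the redirect URL for a blocked request and an empty line for an allowed one. A quick check along these lines (the casino domain and client address are only illustrations):
echo "http://www.example-casino.com/ 192.168.1.50/- - GET" | \
    /usr/sbin/squidGuard -c /etc/squidGuard/squidGuard.conf -d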
If things still aren't working, check the ownership of the files in the squidGuard Blacklist database. All the files in /etc/squidGuard/db must be owned by userid ‘squid’ and groupid ‘suvlet’. The following command can be used to reset the file ownership (run it as root, and remember to restart Squid afterwards):
chown -R squid:suvlet /etc/squidGuard/db/*
ClarkConnect version 1.0 does not contain any method for automatically downloading new squidGuard Blacklists. The squidGuard website itself does not offer a script to perform this task, and so it was necessary for me to write one.
I started by posting a query to one of the ClarkConnect user discussion forums, asking whether anyone else had already written such a script. A sysadmin who identified himself only as Mike posted his script anonymously; I took it and rewrote it into something a little more robust. I also created a small PHP webconfig module and packaged everything into an RPM to simplify installation.
The RPM can be downloaded here:
Note that my refreshSG script depends on wget, so wget must be installed before my refreshSG RPM is installed.
For the convenience of those who may not want the whole RPM, the refreshSG script itself and the accompanying man page can be downloaded here:
Here are some basic installation instructions:
ftp ftp.redhat.com
cd /pub/redhat/linux/7.2/en/os/i386/RedHat/RPMS/
bin
get wget-1.7-3.i386.rpm
rpm -Uvh wget-1.7-3.i386.rpm
wget http://www.rath.ca/Misc/3Com-3C19504/refreshSG-1.1-1.i386.rpm
rpm -Uvh refreshSG-1.1-1.i386.rpm
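To confirm that both packages installed cleanly, rpm can query them by name (package names inferred from the RPM filenames above):
rpm -q wget refreshSG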
The refresh script will not run until you explicitly turn it on via its configuration page. The configuration page also allows you to change the URL the refreshSG script pulls the Blacklist from. When the script does run, it will write a log of its session to /var/log/refreshSG, and that log file is displayed in the lower half of the configuration page.
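The job at the heart of such a refresh is small: fetch the Blacklist archive, unpack it over the database directory (the local lists are not in the archive, so they survive), fix ownership, rebuild, and restart Squid. Here is a minimal sketch of that cycle; it is not the refreshSG script itself, and the archive URL and its internal layout are assumptions:
#!/bin/sh
# Minimal sketch of a Blacklist refresh cycle (illustrative only; the real
# refreshSG script adds error handling, logging detail, and webconfig hooks).
URL="http://example.org/blacklists.tar.gz"   # assumed archive location
DB=/etc/squidGuard/db
{
    date
    # Fetch the archive into /tmp.
    wget -q -O /tmp/blacklists.tar.gz "$URL" || exit 1
    # Unpack it; the archive is assumed to hold drugs/, gambling/, porn/,
    # etc. at its top level, leaving the local/ directory untouched.
    ( cd "$DB" && tar -xzf /tmp/blacklists.tar.gz )
    # Restore the ownership squidGuard requires, rebuild, and restart Squid.
    chown -R squid:suvlet "$DB"
    /usr/sbin/squidGuard -C all
    /etc/rc.d/init.d/squid restart
} >> /var/log/refreshSG 2>&1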
The refreshSG log file will grow over time; Linux manages this growth with a program called logrotate, which cron runs each day. To have logrotate manage the refreshSG log file, create a file called /etc/logrotate.d/refreshSG and place the following lines in it:
/var/log/refreshSG {
monthly
compress
notifempty
missingok
}
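logrotate can dry-run a single configuration file, which is an easy way to catch syntax mistakes before cron’s daily run picks the file up:
logrotate -d /etc/logrotate.d/refreshSG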
v1.0 [2002-07-16] — First public release (via the ClarkConnect General Forum). Released into the public domain.
v1.1 [2002-07-22] — Second public release. Added extra help to the ClarkConnect webconfig page, added a man page, and exposed two additional variables in the refreshSG.conf file.
v1.2 [2002-07-23] — Third public release. Fixed another typo in the webconfig screen and changed the way the /etc/crontab entry is made.