CSU Libraries
Web Server Migration Plan
Plan | Reasons | Servers | To Do | Features and Questions
Plan
- LTS plans to get rid of both Manta and Clam some time this summer.
- Most e-mail has already been moved from Manta to Grouper.
- LTS plans to move the Web site from the current Web server, Clam, an IBM RS/6000 machine running AIX, to Snook, a Dell server running Red Hat Enterprise Linux (RHEL).
- Estimated server downtime: less than a minute.
- Estimated time where Web authors should not make changes: one to two hours.
Reasons
- The IBM machines are several years old. The Dell server is faster and has more memory.
- The IBM machines are both expensive to keep running. RHEL is much less expensive.
- Manta and Clam are relatively difficult to maintain, and we have had a number of software problems, e.g. with MySQL and Samba.
- AIX is not as widely used and documented as Linux, so new staff members have more to learn. We have several LTS staff members who are skilled with Linux.
- Linux is a free, open-source distribution and comes with a lot of built-in software and tools for Web hosting, including Apache, Samba, PHP, MySQL, the Bash shell, SSH, man pages, and text editors.
- RHEL can be updated automatically (via up2date) to keep the system secure.
- We can back up the server using the same backup server as several other machines, reducing the work of maintaining backup tapes.
Servers
- Clam: existing web server (AIX)
- Snook: new web server (RHEL 4)
- Gregvogl: test machine (RHEL 3)
To Do
Before the Move: get existing features to work
- Notify Web authors (the Web team, Web scripters, LTSadm, maybe all Libraries staff) of coming changes (done)
- Notify users of scheduled downtime? (no - should not affect users, just a minute to switch the alias)
- Add Snook to the Arkeia backup system (Brian). (done)
- Use up2date to automatically download updates and security patches. (done)
- Reinstall and resize partitions and logical volumes to make better use of space. (done)
- Configure Apache. (done)
- Configure CGI, PHP and MySQL. (done)
- Install MySQLAdmin. (done)
- Install mysql-administrator from mysql.com. (done)
- Install mnoGoSearch. (done)
- Install webmin for web-based system administration. (done)
- Get existing forms, scripts and search engines to work. (done)
- Move my site to lib.colostate.edu/lts/gvogl. (done)
- Create user accounts and groups for all web authors. (done)
- Configure Samba to share directories (www, html, logs, etc.). (done)
- Create backups of Clam and Snook (done)
- Remove old copies of files on Snook (done)
During the Move: test after each step!
- Remind Web authors not to change anything while files are being copied. (done)
- Deny UNIX and Samba write access permissions to Clam HTML files and test. (5 minutes) (done)
- Copy relevant files from Clam to Snook and test. (30 minutes) (done)
- Change permissions globally, fine-tune (for forms and shared directories) and test. (10 minutes) (done)
- Search and replace web pages to point to new script locations (e.g. /var/www/cgi-bin) and test (10 minutes). (done)
- Point the alias lib.colostate.edu to Snook.library.colostate.edu instead of Clam and test. (5 minutes) (Brian) (done)
- Redirect web browser access to Clam and test. (5 minutes) (done)
After the Move: fix problems and add additional features
- Notify Web authors when the new server is up and ready for testing and use. (done)
- Add user accounts for known web authors and others on demand. (done)
- Ask Web authors who want FTP or command-line login to Snook to come in and create a password. (done)
- Assure that LinkScan works. (Ryan) (done)
- Get Web logs to work for WebTrends. (Ryan) (done)
- Replace all swish-e searches with Google or mnoGoSearch. (done)
- Configure more direct access to Staff Intranet directories (using httpd.conf, smb.conf, directory permissions).
- Create an archive of all data on Clam (e.g. on CDs or DVDs).
- Install MySQL Query Browser.
- Check for security (firewall, user accounts, directories, Samba, Apache, PHP, MySQL, CGI, etc.)
- Buy SSL certificate. (check how it was done on Clam)
- Install and convert all existing web forms to Phorm Jr. (big task)
- Get user web directories of the form ~username to work.
- Fix boot setup so the machine can be rebooted remotely and Linux restarts without delay.
- Move some or all interactive pages on Vulture to Snook. (Dennis?)
Date, time and logistics of the move
- How soon will we be ready to move?
- the intersession (second week of August) is best, since it will not disrupt classes and I will be here
- Will other projects and factors affect the move date?
- server room expansion is done
- squid problems are solved?
- Arkeia backup of Snook
- Libraries home page redesign
- reorganization of Web team
- How long will the switch take?
- reserve a whole day, but probably it will be just two hours
- Should the move be on a weekend or at night when there is less traffic?
- no because I may need technical help from LTS staff
- Be ready to revert to Clam or rename the new server to Clam if problems are detected
- unlikely to be an issue; easy to do if necessary
Features and Questions
- Disk space
- Clam has 67 GB; Snook has 46 GB
- Disk space will not be a problem any time soon
- Clam is over half full, but most of the used space on Clam is old mail in /libhome
- /mantaweb only takes 2 GB; only 6 GB used on Snook
- By contrast, T: drive is nearly full (37 GB used, <3 GB free)
- Partitions and logical volumes
- See RHEL4 online documentation for recommended partition sizes.
- sda1 0.5 GB /boot - 0.1 GB recommended but more may be needed for numerous kernel upgrades
- sda2 6 GB /swap - memory is 4 GB, 6 GB recommended (memory+2)
- volume group vg0 to hold all other file systems (for easier resizing)
- lv0 5 GB /
- lv3 2 GB /tmp
- lv1 5 GB /usr
- lv2 3 GB /home - may go unused, but some space is good as a precaution
- lv4 5 GB /var
- lv5 7.7 GB /var/www
- 1.8 GB unused
- unused - 14.5 GB - save some space for creating temporary partitions and volume groups
- Disk Druid can expand file systems; also check the tools disk (#5) in the RHEL package.
- Is it easier to install from scratch than to back up and restore partitions or change existing partitions?
- Shrink filesystems using what tool? (ext2resize, resize2fs and e2fsadm are not available on RHEL4). A possible ordering sketch follows this list.
- Mount and unmount filesystems using umount /; mount -o remount,rw /
- Convert filesystems between ext3 and ext2 using tune2fs -O ^has_journal device; tune2fs -j device
- Check filesystems using e2fsck
- Shrink logical volumes using lvreduce -L -2G LogVol00
- Shrink partitions using parted device resize partition start end
- Update memory copy of partitions using partprobe
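- A possible ordering of the commands above for shrinking an ext3 logical volume (untested sketch; the device name /dev/vg0/lv5 and the sizes are illustrative, and the filesystem-shrinking step still needs a tool, per the open question above):
  umount /var/www
  e2fsck -f /dev/vg0/lv5                 # always check the filesystem first
  tune2fs -O ^has_journal /dev/vg0/lv5   # temporarily convert ext3 to ext2 if the shrink tool only handles ext2
  # shrink the filesystem here to less than the new volume size (tool to be determined)
  lvreduce -L -2G /dev/vg0/lv5           # then reduce the logical volume
  tune2fs -j /dev/vg0/lv5                # add the journal back (ext2 -> ext3)
  e2fsck -f /dev/vg0/lv5
  mount /var/www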
- Hardware configuration and booting
- All 4 disks are used in a RAID 5 configuration. I had to use the Dell OpenManage(?) CD to set this up.
- The machine does not warm reboot simply by restarting from Linux.
- I had to cold boot, and then press Enter to ignore error messages about SCSI devices.
- It may be necessary to turn on the tape drive, or press Ctrl-A during boot and get into the BIOS to change SCSI settings.
- Eric helped with a boot problem. Apparently there is some problem recognizing SCSI devices. There are three controllers, and it was looking at the wrong one. There are possibly 3 SCSI devices:
- a Dell PERC card to handle the RAID array, which may have 2 channels
- an Adaptec card which drives the tape library system
- another PERC card? Check Dell's Web site for the service tag number.
- How to restart servers automatically upon reboot?
- System Settings | Server Settings | Services has options for several programs, including:
httpd, mysql, sendmail, smb
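- These services can likely also be enabled from the command line with chkconfig (service names taken from the GUI list above; the MySQL init script is mysqld on RHEL):
  chkconfig httpd on
  chkconfig mysqld on
  chkconfig sendmail on
  chkconfig smb on
  chkconfig --list httpd    # verify which run levels it starts in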
- HTML files
- OK to move DocumentRoot from /mantaweb/public_html to /var/www/html?
- + /var/www/html is a more standard file location, making life easier for people who know Apache on Linux.
- - Existing users may expect and look for the existing name.
- - Should we instead choose a root folder name that is easy to type and remember, like /web?
- - Need to update web forms and other files referencing mantaweb using:
  find /var/www/html -name "*.html" -exec grep -l "mantaweb/public_html" '{}' >> mantaweb.files ';'
  find /var/www/cgi-bin -exec grep -l "mantaweb/public_html" '{}' >> mantaweb.files ';'
  more mantaweb.files
  perl -pi -e 's/mantaweb\/public_html/var\/www\/html/g' `cat mantaweb.files`
- - Make a mantaweb dir and add a symbolic link from /mantaweb/public_html to /var/www/html just in case (?)
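- A minimal sketch of the symbolic-link fallback above (makes the old path resolve to the new DocumentRoot):
  mkdir /mantaweb
  ln -s /var/www/html /mantaweb/public_html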
- Is this path used other than in web forms and cgi scripts in the scripts account?
- Use tools to archive and compress all of /mantaweb/public_html before transferring.
- Copy large number of files quickly.
- Preserve all file attributes if possible (modification dates, permissions; maybe owner, group).
- Could make all files the same and then fix individual staff directories.
- Which files need to be hidden from users? These files should be removed from the server.
- chown -R root.web * to make all files owned by root, in the web group
- chmod -R 764 * to make files editable by web group, readable by all
- Do not fill any Clam filesystems. (/mantaweb only has 2 GB free; /libhome has 33 GB free)
- tar -c -f website.tar /mantaweb/public_html; gzip website.tar fails with a directory checksum error after only archiving about a third of the files. Also, Clam does not have a tar -z option.
- find /mantaweb/public_html | xargs tar rvf website.tar fails because 330 filenames contain a space. (We could replace spaces with underscores.)
- find /mantaweb/public_html -exec xargs tar rvf website.tar '{}' ';' doesn't work.
- find /mantaweb/public_html | tar -T- -cf website.tar doesn't work on Clam because AIX tar doesn't have a -T flag, though it works on Linux.
- scp -r -p /mantaweb/public_html root@snook:/var/www/html is very slow and does not preserve owner and group.
- Copying through Samba also does not preserve owner and group, but it does preserve filenames with spaces.
- Copying through Samba duplicates symlink files. find . -type l shows only two symlinks.
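- Another untested option: a tar pipe over ssh, which handles filenames with spaces and (when extracted as root) preserves permissions and timestamps. Owner and group only map correctly if UIDs/GIDs match on both machines, and the directory checksum error seen above may recur with AIX tar:
  cd /mantaweb/public_html
  tar cf - . | ssh root@snook 'cd /var/www/html && tar xpf -'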
- Anything useful in /libhome folders to save? just contains old mail files?
- Only need to move /mantaweb/public_html files from server? What other files?
- User accounts
- No easy way to automatically copy accounts from Manta like Clam does
- could create a script to get usernames and passwords from /etc/passwd and /etc/shadow (see the sketch at the end of this section)
- No need for everyone that has an e-mail account to have an account on the web server
- Web authors only need an account for read-only access of public html files through Samba.
- Even write access through Samba doesn't require logging in, just an account.
- Only create accounts with passwords upon request (e.g. if someone wants to log in or use FTP).
- Make a list of current users that definitely need accounts:
- web group: LTSadm (Jennifer, Greg, Brian, Ryan, Dennis, Don), web team (Lori, Sari, Lisa, Cathy), Kevin, Sierra
- staff group: all departmental users (start with Donnice, Dave?)
- contract group: LTS, Donnice, Dave
- acqstats group: LTS, Alea
- chgrp web /var/www/html; chmod g+s /var/www/html
- chgrp contract /var/www/html/staff/acq/contracts; chmod g+s /var/www/html/staff/acq/contracts
- Staff do not need home directories for personal pages of the form http://lib.colostate.edu/~gvogl
- Staff should use Lamar or commercial/free accounts for personal pages
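- A rough sketch of the account-copying script mentioned above (userlist.txt is a hypothetical file with one username per line; run as root on Manta, and note that on AIX the hashes may live in /etc/security/passwd rather than /etc/shadow, and hash compatibility with Linux should be verified):
  for u in `cat userlist.txt`; do
    grep "^$u:" /etc/passwd >> passwd.export
    grep "^$u:" /etc/shadow >> shadow.export
  done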
- Host name and IP
- Any preferences for new host name and IP?
- Otherwise, we can keep the old host name (Snook.library.colostate.edu) and IP number (129.82.28.131).
- Snook2 (129.82.28.138) is an emergency host name and IP number.
- /etc/hosts should have these lines:
  127.0.0.1 localhost.localdomain localhost
  129.82.28.131 snook.library.colostate.edu snook
- service network restart rereads the network configuration files and restarts networking.
- Firewall
- System Settings | Security Level: enable firewall; trust http, ssh, mail; 443:tcp (https), 445:tcp, 139:tcp (samba), 617:tcp (arkeia)
- service iptables stop or start disables or enables the firewall from the command line.
- Apache configuration
- ServerRoot is /usr/local/apache on Clam, /etc/httpd on Linux
- Config file is /usr/local/apache/conf/httpd.conf on Clam, /etc/httpd/conf/httpd.conf on Linux
- Keep httpd.conf as generic as possible.
- Specifically document all non-generic parts with a comment (reason, username and date).
- DocumentRoot "/var/www/html"
- Directory permissions on /var/www/html
- /staff directory: limit to Library IPs (long list)
- /campus directory: limit to campus IPs (129.82)
- AllowOverride AuthConfig for some directories with .htpassword?
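- A minimal httpd.conf sketch for the IP restrictions above (Apache 2.0 syntax; the Library IP list is left out):
  <Directory "/var/www/html/campus">
      Order deny,allow
      Deny from all
      Allow from 129.82
  </Directory>
  <Directory "/var/www/html/staff">
      Order deny,allow
      Deny from all
      # Allow from (list of Library IP ranges)
  </Directory>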
- restart with apachectl restart.
- Custom configuration is needed for Web logs so that webmon can process them.
- CustomLog is stored on /mantaweb/web-logs/access_log on Clam, logs/access_log on Snook.
- ServerTokens is Full on Clam, OS on Snook.
- User/Group are nobody/staff on Clam, apache/apache on Linux.
- ServerAdmin is gilbert@library.colostate.edu on Clam, root@localhost on Linux (added .forward).
- UseCanonicalName is off so that lib.colostate.edu/folder redirects to lib.colostate.edu/folder/ and not snook.library.colostate.edu/folder/.
- .htaccess files
- existing .htaccess files should work without change on new server?
- .htaccess files are used for redirect, IP restriction, and .htpassword restriction
- More efficient and secure to centrally control all access restrictions in httpd.conf?
- Redirect http://clam.library.colostate.edu/ to new web server using one of these:
  - .htaccess: RedirectPermanent / http://lib.colostate.edu/ (preferred)
  - index.html: <meta http-equiv="Refresh" content="4;url=http://lib.colostate.edu/"> (use index.html from manta?)
  - httpd.conf: deny from all
- Disallowed files and folders in robots.txt are still indexed by Google, but an excerpt of their text contents is not shown with the link.
- To hide a page so that it does not appear in search results, add the following tag to the head section of the page. <meta name="robots" content="noindex">
- To instruct the spider not to follow the links on the page, you can add the "nofollow" value: <meta name="robots" content="noindex,nofollow">
- Samba configuration
- Edit smb.conf to be like Vulture (/etc/samba/smb.conf) and Clam (/usr/local/samba2.0.6/lib/smb.conf)
- hosts allow=129.82. allows all CSU users; may be better to only allow Libraries users
- other options: public=yes or no, writable=yes or no, read only=yes, write list=@web, valid users=@staff
- Assure that existing access to folders still works
- www (/var/www) instead of MantaWeb (/mantaweb)
- valid users = @web
- html (/var/www/html) instead of webpublic (/mantaweb/public_html)
- public=yes, writable=no
- logs (/var/log/httpd) instead of weblogs (/mantaweb/web-logs)
- public=no, writable=no, valid users = @lts @web webmon webmon2
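- A minimal smb.conf sketch for the html share, assuming the options listed above:
  [html]
      path = /var/www/html
      public = yes
      writable = no
      write list = @web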
- Add Samba access to folders for individual departments (start with Donnice and Dave)
- Or just have a single staff group that has write permissions on the staff intranet??
- Check status with smbclient -L netbiosname -U root%password or smbclient -L localhost -U%
- Run files in /usr/local/samba2.0.6/bin/ on Clam.
- Restart with /etc/rc.d/init.d/smb restart on Linux.
- On Clam, smb.conf is reread periodically? smbstatus rereads smb.conf?
- SSIs
- How to configure so both .shtml and .html files process SSIs? lines in httpd.conf:
- AddType text/html .shtml
- AddHandler server-parsed .shtml
- AddOutputFilter INCLUDES .shtml
- AddType text/x-server-parsed-html .shtml
- AddHandler server-parsed .html
- Options +Include
- XBitHack On (parses files for SSIs if user execute bit is set)
- find /var/www/html -name '*.html' -exec chmod u+x '{}' ';' (sets execute bit on all html files)
- How can this bit be on by default for all created files? Part of smb.conf? create mask and directory mask 0775?
- e-mail forms
- Existing CGI/Perl FormHandler.cgi
- + most forms authors already know how to use this tool
- + no need to fix the approximately 26 existing web forms (grep mantaweb to find them)
- - customization is difficult
- - CGI is slow and a resource hog
- - hidden fields for e-mail addresses are harvestable by spammers
- - Is it better to change #!/usr/local/bin/perl to #!/usr/bin/perl or add a symlink?
- - sendmail.pl on Snook failed to send messages for FormHandler.cgi (Connection refused). Why?
- Sending mail from the command line works, as does sending from Phorm Jr., so it is not a firewall problem.
- Adding $SMTP_SERVER = 'manta.library.colostate.edu'; to FormHandler.cgi works.
- The problem was in sendmail.cf (O DaemonPortOptions=Port=smtp,Addr=0.0.0.0, Name=MTA instead of O DaemonPortOptions=Port=smtp,Addr=127.0.0.1, Name=MTA)
- Also forms need a sender e-mail address (hidden field named email) or FormHandler.cgi should default to an existing e-mail address.
- To restart sendmail after changing configuration, do service sendmail restart.
- Phorm Jr. seems to be a viable alternative solution.
- + e-mail addresses are not harvestable
- + relatively few form changes are needed; existing page formatting can remain
- + can send to multiple addresses in To: line (joined together with commas)
- + generic or custom templates for general configuration, e-mail message and acknowledgement
- + can cc: the sender (modified phormjr.php adds a Cc: line in the header)
- + can recreate the comments form
- use a PHP script to convert ?name= or ?alias= to a hidden variable
- phormjr.php combines the dotname@server and sends a copy to the sender
- + can include Libraries header and footer in sent acknowledgment page (using a hack)
- + can link back to the referring page in the acknowledgment page, e.g. web site comments
- + can use a default page for an error message that some required fields are empty (no need to create a custom error page for every form)
- + with some programming, a single script is used for all forms
- - some extra work is needed to create so many templates for each form
- - how to make it work with the config files in different directories? or the web team can use a single directory?
- - how to prevent crawler programs from filling out the forms with spam? generate an image from a random character string that a user must read and type in to confirm?
- - how to send different messages to sender and receiver? Is it necessary?
- PHP form mail script
- + e-mail addresses are not harvestable
- message template is a text file which can be in a protected directory
- + relatively few form changes are needed; existing page formatting can remain
- - can't send to multiple addresses?
- need to cc: the sender and send to a list of Libraries staff
- From: and CC: fields seem to be ignored; no more than one address per domain?
- phpFormGenerator
- + can create forms without any coding
- - hard to customize the resulting form or modify existing forms
- - only sent to one e-mail address
- PHP FormHandler
- - hidden fields for e-mail addresses are harvestable by spammers
- - can't get it to work
- Others from sourceforge.net, freshmeat.net, hotscripts.com?
- - need to search and replace using perl -pi -e 's/\/mantaweb\/public_html/\/var\/www\/html/g' `cat temp/gvogl/webforms.txt`
- CGIs
- Is CGI even necessary? Yes.
- CGI is needed by mnoGoSearch.
- E-mail forms will be replaced, but get the existing e-mail forms to work first, as a backup.
- Some scripts on Vulture are CGI.
- Any other uses?
- Is cgi-wrap needed? Yes.
- Currently in use only by forms? But it will probably be used in the future.
- The scripts user and directory are used and need to be copied over.
- AddHandler cgi-wrapper .cgi
  Action cgi-wrapper /cgi-bin/cgiwrap
- AddHandler cgi-script .cgi
  Action cgi-script /cgi-bin/cgiwrap
- PHP
- PHP is needed for e-mail forms.
- Is PHP safe to allow for other uses?
- What security and admin tools are needed?
- What other PHP-based software might be needed?
- phpBB is simple bulletin-board software, but it does not support attachments. Would a wiki or portal groups be better?
- MySQL
- Install using mysql_install_db --user=mysql; mysqld_safe --user=mysql &
- Restart using /etc/rc.d/init.d/mysqld restart
- Set password using /usr/bin/mysqladmin -u root password 'pwd'; /usr/bin/mysqladmin -u root -h snook.library.colostate.edu password 'pwd'
- MySQL might be useful for small applications? Or is it better to put all databases on Vulture?
- If needed, also install MySQLAdmin and Query Browser? Other tools?
- What usernames, passwords and databases are needed?
- mnoGoSearch needs a mnogosearch database with username and password.
- Interactive pages
- Should we move some or all web applications and related pages from Vulture to the web server?
- Do we have a list of all web applications that reside on Vulture?
- Database of databases, staff directory, LTS and Web request forms, searches, digital collections
- Which other software packages would have to be installed?
- mnoGoSearch, CONTENTdm, php scripts, others?
- + If everything is in one place, user administration and file management is simplified.
- + It should be relatively easy to port applications, since both are running RedHat.
- - It would take time and effort to port. (How much?)
- - It would take time and effort to change thousands of links from vulture to lib. (How much? See the count sketch below.)
- - It would take more disk space. (How much? 1.6 GB in /var/www and 2GB in /usr/local/Content?)
- - It may impact server load, but Vulture does not have any huge databases.
- - It may impact server stability if some web applications crash. How often has this been a problem on Vulture?
- - Some space is needed for development? Or development should be on a non-production server?
- - It may require lots of help from LTS staff who know more than I do about Linux system, network and security administration.
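- One quick way to size the link-changing question above (counts pages that mention the Vulture host, whose full name is assumed here to be vulture.library.colostate.edu):
  grep -rli "vulture.library.colostate.edu" /var/www/html | wc -l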
- Cron jobs
- Just a few non-AIX jobs in /var/spool/cron/crontabs/root; none need to be put on the new server?
- Log is in /var/adm/cron/log
- # File System Backup Job
  30 3 * * 1-5 /usr/local/sbin/fullrmt0
  # System Reboot
  10 1 * * 0 /etc/shutdown -Fr
  # Dennis added 11-02-00 to backup access logs
  35 23 * * 6 /usr/local/sbin/accesslog_bkup
  # Dennis Added 12-18-01 for Michelle Mach
  16 5 * * * /libhome/REFERENCE/mmach/scripts/swishcm.sh >> /dev/null 2>&1
- Is a nightly reboot needed on new server?
- /var/spool/cron/root is the root crontab; edit with crontab -e
- # re-index mnogosearch files nightly at 1:23 am:
- 23 1 * * * /usr/local/mnogosearch/sbin/indexer.cron
- # reboot Saturday at 10:30 pm?? for some reason restart requires a cold boot.
- # 30 22 * * 6 /sbin/shutdown -r now
- Reboot Procedures
- # reboot immediately
- /sbin/shutdown -r now
- # reboot after 5 minutes
- /sbin/shutdown -r +5
- # after reboot, check that daemons (httpd, sendmail, smbd, mysqld, arkeiad) are running, e.g.:
- ps -ef | grep mysqld
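- A small loop to check all of the daemons listed above at once (assumes pgrep, from the procps package, is installed):
  for d in httpd sendmail smbd mysqld arkeiad; do
    pgrep -x $d > /dev/null && echo "$d is running" || echo "$d is NOT running"
  done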
- Search engines
- replace or upgrade? remove swish-e? try new versions of swish++ and mnoGoSearch? use Google on all Google-indexed content?
- subsites that could use searches:
- Archives and Special Collections and its subsites; Digital Collections
- Research Subject Guides
- News
- Instruction, How To, Tutorials
- existing search scripts are found in the /libhome/REFERENCE/scripts/public_html/cgi-bin user directory
- SWISH-E searches in /cgi-bin/cgiwrap/scripts/ (to replace with Google or mnoGoSearch):
- search - collection management manual (/staff/cm/manual/)
- searchsage - comments and suggestions (/generalinfo/comments/oldrespons.html)
- searchstaff - staff intranet (/staff/)
- agnicsearch - agnic already uses mnoGoSearch instead
- miscellaneous scripts:
- liaisons - gathers liaisons from staff directory, for reference/purchase/index.html - Kevin
- awareplayer - for those without Authorware Web Player to use Data Game - Kevin (old)
- subjects* - gets info from database of databases on vulture - author? (old?)
- Link scanning
- How does it work? Is server-side access needed at all?
- Are any changes needed? (Ryan?)
- Web server statistics
- How are statistics gathered? from http access logs? webmon?
- Are any changes needed? (Ryan?)
- Log files (access_log* and error_log*) are in /usr/local/apache/logs/ on Clam, /var/log/httpd/ on Snook
- Administrative tools
- Install webmin for a simpler, GUI-based, web-based admin? Or use existing GUI tools?
- www.webmin.com
- rpm -i webmin*.rpm
- Configure RedHat to automatically download updates and security patches.
- /usr/bin/up2date --register
- log in to rhn.redhat.com
- Backup
- Which backup option should we choose? (cost, time, ease of use, security, reliability)
- Add Snook to the Arkeia backup system. (Brian)
- + easier backup and restore
- + no tapes
- Use the existing tape backup system.
- + full system backups
- + no license needed
- Files and directories to back up before reinstall
- /var/www/cgi-bin/
- /var/www/html/index.zhtml
- /var/www/html/website-gv.html
- /var/www/html/website-phorm.html
- /var/www/html/comments.php
- /var/www/html/favicon.ico
- /var/www/html/mantaweb.files
- /var/www/html/php/
- /var/www/html/access/emailform-phorm.html
- /var/www/html/phormjr
- /etc/httpd/conf/httpd.conf
- /etc/samba/smb.conf
- /etc/resolv.conf
- /usr/local/mnogosearch/etc/*.conf
- /var/log/
- /var/spool/mail/
- /etc/: passwd, shadow, group, gshadow, mtab
- Software to reinstall, reconfigure and check
- chgrp web /var/www/html; chmod g+s /var/www/html
- change groups for acqstats and contracts (see instructions for direct access to web server)
- chown -R root.web /var/www/html/* to make all files owned by root, in the web group
- chmod -R 775 /var/www/html/* to make files editable by the web group, readable and executable by all
- user accounts: root, gvogl, mysql
- mysql user accounts
- mysqladmin
- mnoGoSearch, xpdf (for pdftotext) and catdoc (for catdoc, catppt, xls2csv)
- symlink /usr/local/bin/perl to /usr/bin/perl
- search and replace /mantaweb/public_html with /var/www/html (see above)
- Aliases
- Change lib.colostate.edu to point to Snook instead of Clam. (Brian) (done)
- Make sure no scripts point specifically to Clam or its IP address, e.g. on Vulture.
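- A quick check for the item above (searches web pages and scripts for references to the old host; add Clam's IP address to the pattern if known):
  grep -rli "clam.library.colostate.edu" /var/www/html /var/www/cgi-bin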
Contact: Greg Vogl
Last updated: November 10, 2005