rm-old-files - (log) directory maintenance (archive, move, compress)
rm-old-files [[--config] config-file1] [[--config] config-file2]...
Saves, renames, compresses, and removes files (via find) based on control information specified in a configuration file.
A bane of systems administration work is removing, moving, and compressing old log files. Many systems administrators put some sort of ad hoc cleanup command in a crontab. rm-old-files attempts to simplify things by using a configuration file to specify which files to chuck, move, or compress.
This program is similar to Erik Troan's tmpwatch. In contrast, rm-old-files is written in Perl and can probably do more with less code. Also, rm-old-files is not Red Hat-centric and may be more easily ported to other Unixes, and possibly even to non-Unix platforms.
gzip
    Program to use when files are to be compressed by gzip. If you never compress via gzip, this option is meaningless.

bzip2
    Program to use when files are to be compressed by bzip2. If you never compress via bzip2, this option is meaningless.

compress
    Program to use when files are to be compressed by compress. If you never compress via compress, this option is meaningless.

zip
    Program to use when files are to be compressed by zip. If you never compress via zip, this option is meaningless.
The order in which configuration files are processed is the order specified on the command line; however, files not prefaced with --config will appear after those prefaced with --config.
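For example, a nightly crontab entry might pass two configuration files, one with and one without the --config preface (all paths here are purely illustrative):

```
15 3 * * *  rm-old-files --config /etc/rm-old-files/logs.conf /etc/rm-old-files/extra.conf
```

With this invocation, logs.conf (prefaced with --config) is processed before extra.conf.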
The format of a configuration file is a series of file names, each enclosed in square brackets and followed by a number of parameter names, an ``equals'' sign, and a value. That is:
    # This is a comment line
    ; So is this.
    [filename1]
    parameter1 = value1
    parameter2 = value2

    [filename2]
    parameter1 = value3
    parameter2 = value4
Comments start with # or ; and take effect to the end of the line.
This should be familiar to those who have worked with text-readable Microsoft-style .INI files.
Note that filenames (filename1 and filename2 above) must be unique. However, there are times when you may want to refer to the same file more than once. To do so, you can use relative path specifiers: for example, filename1 and ./filename1 refer to the same file even though they appear to be different.
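For instance, to both compress and later remove files in the same directory, two sections can name that directory in equivalent but textually distinct ways (a sketch; the directory name and parameter values are illustrative):

```
[/var/log/myapp]
files  = *.log
method = mtime
days   = +2
action = gzip

[/var/log/myapp/.]
files  = *.log.gz
method = mtime
days   = 31
action = rm
```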
As quoted directly from the Config::IniFiles manual page:
Multiline or multivalued fields may also be defined ala UNIX ``here document'' syntax:
    Parameter=<<EOT
    value/line 1
    value/line 2
    EOT
You may use any string you want in place of ``EOT''. Note that what follows the ``<<'' and what appears at the end of the text must match exactly, including any trailing whitespace.
There is a special section or ``filename'' called
[global]. This is
where you can set default values for parameters.
For example, since the same kind of compression extension is likely to be used in several sections, it can be specified here rather than duplicated in every section that uses compression.
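A minimal sketch of a [global] section (the parameter values here are illustrative, not defaults):

```
[global]
method = mtime
trash  = /var/tmp/trash
mode   = 0750
```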
Likewise, the group parameter, which designates the group of newly created directories in the trash directory, can be set once here.
The following is a sample config file:
# Comments start with # or ; and go to the end of the line.
# The format for each entry is in Microsoft .INI form:
# [directory]
# parameter1 = value1
# parameter2 = value2
# Parameter names are:
#   action
#   days
#   dryrun
#   files
#   group
#   ignore
#   method
#   mode
#   owner
#   prolog
#   restart
#   trash
# 'action' is either 'rm', 'mv' or 'mv-short'. 'rm' means remove; if
# 'files' is given, only matching files are removed. 'mv' means move,
# and 'mv-short' means move but strip off any directory specifiers.
# If 'mv' or 'mv-short' is chosen, the 'trash' parameter should be
# given (default is $trashdir_default).
# 'days' is an age in days for files or directories which have not
# been accessed or modified. The number can be prefixed with + or -:
# '+' means older than and '-' means younger than.
# 'method' is either 'atime' or 'mtime'. 'atime' means access time;
# 'mtime' means modification time.
# 'owner' and 'group', if given, designate the ownership of newly
# created directories in the trash directory.
# 'mode' sets the access mask of newly created directories in the
# trash directory.
# 'files' is a list of space- or semicolon-separated shell globbing
# patterns. Only matching files will be handled; all other files will
# be ignored. If 'files' is missing, all files and directories will
# be handled. A subset of entries passing the 'files' test can still
# be ignored using the 'ignore' pattern described below.
# 'ignore' is a list of shell globbing patterns separated by spaces
# or semicolons. Matching files will be ignored even if the 'files'
# pattern would select them.
# The above two options obviously don't apply when recursively
# deleting whole file trees - 'files' and 'ignore' patterns are only
# applied to the root of the deleted tree.
# 'restart' and 'prolog' name commands to invoke after and before
# the action, respectively.
# 'trash' is a temporary directory into which old files are moved.
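The parameters above can be combined; the following is a sketch of an entry that moves old files into a trash directory (every path, pattern, and command name here is illustrative, not taken from a real site):

```
[/var/log/myapp]
files   = *.log
ignore  = current.log
method  = atime
days    = +14
action  = mv
trash   = /var/tmp/trash
owner   = root
group   = adm
mode    = 0750
prolog  = /usr/local/sbin/pre-clean.sh
restart = /usr/local/sbin/restart-myapp.sh
```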
# Remove log files which are more than 30 days old...
[/local/homes/mirror/sun/logs]
files = *
method = mtime
days = 30
action = rm
# Zip up log files which are more than 2 days old...
[/var/spool/post.office/log]
files = *.log
method = mtime
days = +2
action = gzip
# Remove old gzip files (created probably as a result of the
# above) that are more than 31 days old.
# Note the hack in which this directory name is slightly different
# from the one above. The program needs to have unique directory
# names...
[/var/spool/post.office/log/.]
files = *.log.gz
method = mtime
days = 31
action = rm
Config::IniFiles requires that each item named in a section (a log file here) be unique. If you want to do several things to a particular log file, you need to be clever about specifying the same file so that it looks distinct, such as by using relative paths. (Symbolic links would work too.) The fact that multiple configuration files can be specified on the command line complicates figuring out whether the log file names are unique. Alas, this program, which relies on Config::IniFiles, doesn't warn when such a condition occurs.
Any daemon such as this one which is sufficiently flexible is a security risk. The configuration file allows arbitrary commands to be run. In particular, if this daemon is run as root and the configuration file is not protected against modification, a bad person could have their programs run as root.
So, as with all daemons, take the usual security precautions that a careful sysadmin/maintainer of a computer would. If you can run a daemon as an unprivileged user (or with no privileges), do it! If not, set restrictive permissions on the configuration file and the directory it lives in. Commands that need to be run as root can be run via sudo. On Solaris, I often run process accounting, which tracks all commands run. Tripwire may be useful for tracking changed configuration files.
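The configuration-file precaution can be scripted; here is a minimal sketch, using a temporary file in place of your real configuration file, of locking a config file down to its owner:

```shell
# Create a stand-in for the real configuration file (path is illustrative).
cfg=$(mktemp)

# Owner may read and write; group and others get no access at all,
# so only the owner (ideally root) can alter what the daemon will run.
chmod 600 "$cfg"

# Show the resulting permission bits.
ls -l "$cfg"
```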
The Perl module Config::IniFiles(3) is used to parse configuration files. See also tmpwatch(8) and recycle-logs.
A related program is recycle-logs (http://recycle-logs.sourceforge.net), which rotates logs based on a configuration file.
The current version is maintained (or not) by Rocky Bernstein.
Copyright (C) 1997-2002, 2004 Rocky Bernstein, email: email@example.com. This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
$Id: rm-old-files.in.in,v 1.16 2004/10/29 11:52:19 rockyb Exp $