SARG(1)				     SARG			       SARG(1)

NAME
       sarg - Squid Analysis Report Generator

SYNOPSIS
       sarg [options] [logfile...]

DESCRIPTION
       sarg is a log file parser and analyzer for the Squid Web Proxy
       Cache[1]. It allows you to see where your users are going on the
       Internet.

       sarg generates reports in HTML with fields such as: users, IP
       Addresses, bytes, sites, and times. These HTML files can appear in your
       web server's directory for browsing by users or administrators. You may
       also have sarg email the reports to the Squid Cache administrator.

       sarg can read squid or Microsoft ISA access logs. Optionally, it can
       complement the reports with the log of a Squid filter/redirector such
       as squidGuard[2].

OPTIONS
       A summary of options is included below.

       -h --help
	   Show summary of options.

       -a hostname|ip address
	   Limits the report to records containing the specified hostname/ip
	   address.

       -b filename
	   Enables UserAgent log and writes it to filename.

	       Warning
	       This option is currently unused.

       -c filename
	   Read filename for a list of the web hosts to exclude from the
	   report. See the section called “HOST EXCLUSION FILE”.

       --convert
	   Convert a squid log file date/time field to a human-readable
	   format. All the log files are read and written as one continuous
	   text to the standard output.

	   If the input log file name is -, the input log file is read from
	   standard input.
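
	   For example, assuming the log is at the usual Squid location
	   (adjust the path for your system), the converted log can be
	   written to a file of your choice:

	       sarg --convert /var/log/squid/access.log > access-readable.log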

       --css
	   Output, on the standard output, the internal css that sarg inlines
	   in the reports. You can redirect the output to a file of your
	   choice and edit it. Then you can override the internal css with
	   external_css_file in sarg.conf.

	   Using an external css can reduce the size of the report file. If
	   you are short on disk space, you may consider exporting the css as
	   explained above.
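
	   For instance, the internal css can be exported to a file named
	   sarg.css (the name is arbitrary), edited, and then referenced from
	   external_css_file in sarg.conf:

	       sarg --css > sarg.css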

       -d date
	   Use date to restrict the report to some date range during log file
	   processing. Format for date is dd/mm/yyyy-dd/mm/yyyy or a single
	   date dd/mm/yyyy. Date ranges can also be specified as day-n,
	   week-n, or month-n where n is the number of days, weeks or months
	   to jump backward. Note that there are no spaces around the hyphen.
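
	   For example, the following restrict the report to an explicit
	   range and to a relative range counted backward from today (the
	   dates are only illustrative):

	       sarg -d 01/05/2012-31/05/2012
	       sarg -d week-1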

       -e email
	   Sends the report to the given email address (use stdout to write
	   it to the console).

       -f filename
	   Reads configuration from filename.

       -g e|u
	   Sets date format in generated reports.
	       e = Europe -> dd/mm/yy
	       u = USA	  -> mm/dd/yy

       -i
	   Generates reports by user and ip address.

	       Note
	       This requires the report_type option in config file to contain
	       "users_sites".

       --keeplogs
	   Don't delete any old report. It is equivalent to setting --lastlog
	   0 but is provided for convenience.

       -l filename
	   Uses filename as the input log. This option can be repeated up to
	   255 times to read multiple files. If the files end with the
	   extension .gz, .bz2 or .Z they are decompressed. If the file name
	   is just -, the log file is read from standard input. In that case,
	   it cannot be compressed.

	   This option is kept for compatibility with older versions of sarg
	   but, starting with sarg 2.3, the log files may be named on the
	   command line without the -l option. It allows the use of wildcards
	   on the command line. Make sure you don't exceed the limit of 255
	   files.
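
	   For example, both invocations below read the current and the
	   rotated, compressed log (the paths are typical Squid defaults and
	   may differ on your system); the second relies on the shell
	   expanding the wildcard:

	       sarg -l /var/log/squid/access.log -l /var/log/squid/access.log.1.gz
	       sarg /var/log/squid/access.log*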

       --lastlog n
	   Limit the number of logs kept in the output directory to n. Any
	   report in excess of that number is deleted, starting with the
	   oldest. The value of n must be positive or zero. A value of zero
	   means no report should be deleted.

       -L filename
	   Reads a proxy redirector log file such as one created by squidGuard
	   or Rejik. If you use this option, you may want to configure
	   redirector_log_format in sarg.conf to match the output format of
	   your web content filtering program. This option can be repeated up
	   to 64 times to read multiple files.
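
	   For example, assuming squidGuard writes its log to the path below
	   (adjust for your installation), the redirector entries can be
	   merged into the report with:

	       sarg -l /var/log/squid/access.log -L /var/log/squidGuard/squidGuard.log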

       -m
	   Enable advanced processing debug messages. This option produces an
	   enormous amount of output.

       -n
	   Enables ip address resolution.

       -o dir
	   Writes report in dir.

       -p
	   Generates reports using ip address instead of userid.

       -P prefix --splitprefix prefix
	   This option must be used with --split. If it is provided, the input
	   log is split among several files each containing one day. The name
	   of the output files is made of the prefix and the date formatted as
	   -YYYY-MM-DD.

	   The output files are written in the output directory specified with
	   -o or in the current directory.

       -r
	   Output the realtime report on the standard output and exit.

       -s string
	   Limits the report to the site specified by string (e.g.
	   www.debian.org).

       --split
	   Split the squid log file and output it as text on the standard
	   output, omitting the records whose date falls outside of the range
	   specified by the -d parameter. If it is combined with --convert,
	   the dates are also converted to a human-readable format.

	   If the input log file name is -, the input log file is read from
	   standard input.

	   Combined with -P, the log is written in several files each
	   containing one day's worth of the original log.
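
	   For example, the command below extracts one week from the log and,
	   thanks to -P, writes one file per day named access-2012-05-01
	   through access-2012-05-07 in /tmp/split (all paths and dates are
	   only illustrative):

	       sarg --split -d 01/05/2012-07/05/2012 -P access -o /tmp/split /var/log/squid/access.log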

       --statistics
	   Writes some statistics about the execution time. The statistics
	   include the total execution time; the number of records read in the
	   input log files and the time it took to read them; the number of
	   records and users processed and the time it took to process them.

       -t string
	   Limits the records included in the report based on time-of-day.
	   Format for string is HH:MM or HH:MM-HH:MM. The former reports only
	   the requested time. The latter reports any entry falling within the
	   requested range. This limit complements the limit imposed by
	   option -d.
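
	   For example, to report only office-hour traffic on a single day
	   (the date and times are illustrative):

	       sarg -d 01/05/2012 -t 08:00-18:00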

       -u user
	   Limits reports to user activities.

       -v
	   Write sarg version and exit.

       -w dir
	   Store temporary files in dir. In fact, sarg stores its temporary
	   files in the sarg subdirectory of dir. Be sure to set the HTML
	   output directory to a place outside of the temporary directory or
	   sarg may fail or delete the report when it completes its task.
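
	   For example, temporary files can be kept on a scratch file system
	   while the report goes to the web server tree (both paths are only
	   illustrative); sarg will actually work in /tmp/sarg here:

	       sarg -w /tmp -o /var/www/html/squid-reports /var/log/squid/access.log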

       -x
	   Writes debug messages to stdout.

       -z
	   Writes process messages to stdout.

HOST EXCLUSION FILE
       Sarg can be told to exclude visited hosts from the report by providing
       it with a file containing one host to exclude per line. The "host" may
       be one of the following:

       ·   a full host name,

       ·   a host name starting with a wildcard (*) to match any prefix,

       ·   a single ip address,

       ·   a subnet written as a.b.c.d/e.

       Example 1. Example of a hosts exclusion file
	   *.google.com
	   10.0.0.0/8

       Sarg cannot exclude IPv6 addresses at the moment.
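
       For example, if the lines above are saved in a file named
       exclude-hosts.txt (the name is arbitrary), the report can be generated
       without those hosts with:

	   sarg -c exclude-hosts.txt -o /var/www/html/squid-reports /var/log/squid/access.log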

SEE ALSO
       squid(8)

AUTHORS
       This manual page was written by Luigi Gangitano gangitano@lugroma3.org,
       for the Debian GNU/Linux system (but may be used by others). Revised by
       Billy Newsom.

       Currently maintained by Frédéric Marchal
       fmarchal@users.sourceforge.net.

AUTHORS
       Frédéric Marchal <fmarchal@users.sourceforge.net>
	   Docbook version of the manual page

       Billy Newsom
	   Revision of the manual page

       Luigi Gangitano <gangitano@lugroma3.org>
	   Author of the first manual page

COPYRIGHT
       Copyright © 2012 Frédéric Marchal

NOTES
	1. Squid Web Proxy Cache
	   http://www.squid-cache.org/

	2. squidGuard
	   http://www.squidguard.org/

sarg				  27 May 2012			       SARG(1)