Searches through Greenplum Database log files for specified entries.
gplogfilter [<timestamp_options>] [<pattern_options>]
[<output_options>] [<input_options>] [<input_file>]
gplogfilter --help
gplogfilter --version
The gplogfilter utility can be used to search through a Greenplum Database log file for entries matching the specified criteria. If an input file is not supplied, then gplogfilter will use the $COORDINATOR_DATA_DIRECTORY environment variable to locate the Greenplum coordinator log file in the standard logging location. To read from standard input, use a dash (-) as the input file name. Input files may be compressed using gzip. In an input file, a log entry is identified by its timestamp in YYYY-MM-DD [hh:mm[:ss]] format.
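The entry-delimiting rule above can be sketched in Python: a new entry begins at any line that starts with a timestamp, and other lines continue the previous entry. `ENTRY_START` and `split_entries` are hypothetical names for illustration, not part of gplogfilter:

```python
import re

# A log entry starts at any line that begins with a timestamp in
# "YYYY-MM-DD [hh:mm[:ss]]" format; other lines continue the previous
# entry. This mirrors the delimiting rule described above.
ENTRY_START = re.compile(r"^\d{4}-\d{2}-\d{2}( \d{2}:\d{2}(:\d{2})?)?\b")

def split_entries(lines):
    """Group raw log lines into (possibly multi-line) entries."""
    entry = []
    for line in lines:
        if ENTRY_START.match(line) and entry:
            yield entry
            entry = []
        entry.append(line)
    if entry:
        yield entry

log = [
    "2013-05-23 14:33:01 ERROR: something failed",
    "DETAIL: a continuation line of the same entry",
    "2013-05-23 14:33:02 LOG: a normal message",
]
entries = list(split_entries(log))  # two entries; the first spans two lines
```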
You can also use gplogfilter to search through all segment log files at once by running it through the gpssh utility. For example, to display the last three lines of each segment log file:
gpssh -f seg_host_file
=> source /usr/local/greenplum-db/greenplum_path.sh
=> gplogfilter -n 3 /gpdata/*/log/gpdb*.csv
By default, the output of gplogfilter is sent to standard output. Use the -o option to send the output to a file or a directory. If you supply an output file name ending in .gz, the output file will be compressed by default using maximum compression. If the output destination is a directory, the output file is given the same name as the input file.
Timestamp Options
-b <datetime> | --begin=<datetime>
Specifies a starting date and time to begin searching for log entries, in the format YYYY-MM-DD [hh:mm[:ss]].
If a time is specified, the date and time must be enclosed in either single or double quotes. This example encloses the date and time in single quotes:
gplogfilter -b '2013-05-23 14:33'
-e <datetime> | --end=<datetime>
Specifies an ending date and time to stop searching for log entries, in the format YYYY-MM-DD [hh:mm[:ss]].
If a time is specified, the date and time must be enclosed in either single or double quotes. This example encloses the date and time in single quotes:
gplogfilter -e '2013-05-23 14:33'
-d <time> | --duration=<time>
Specifies a time duration to search for log entries, in the format [hh][:mm[:ss]]. If used without either the -b or -e option, will use the current time as a basis.
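The [hh][:mm[:ss]] duration format can be parsed as in this sketch (`parse_duration` is a hypothetical helper, not gplogfilter's own code); `:10`, as used in the Examples section, denotes ten minutes:

```python
from datetime import timedelta

def parse_duration(spec):
    """Parse a [hh][:mm[:ss]] duration: '1' is one hour, ':10' is ten
    minutes, ':10:30' is ten minutes thirty seconds."""
    parts = spec.split(":")
    hours = int(parts[0]) if parts[0] else 0
    minutes = int(parts[1]) if len(parts) > 1 and parts[1] else 0
    seconds = int(parts[2]) if len(parts) > 2 and parts[2] else 0
    return timedelta(hours=hours, minutes=minutes, seconds=seconds)

# gplogfilter -d :10  ->  search entries from the last ten minutes
assert parse_duration(":10") == timedelta(minutes=10)
```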
Pattern Matching Options
Matching of alphabetic characters is case-sensitive by default unless preceded by the --case=ignore option.
-t | --trouble
Finds only the log entries that have ERROR:, FATAL:, or PANIC: in the first line.
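This severity filter can be sketched as a simple first-line test (`is_trouble` and `TROUBLE_MARKERS` are hypothetical names for illustration, not gplogfilter's own code):

```python
# Severity markers that qualify an entry as "trouble", per the text above.
TROUBLE_MARKERS = ("ERROR:", "FATAL:", "PANIC:")

def is_trouble(first_line):
    """True when the first line of a log entry carries one of the
    trouble severity markers."""
    return any(marker in first_line for marker in TROUBLE_MARKERS)

assert is_trouble("2013-05-23 14:33:01 ERROR: disk full")
assert not is_trouble("2013-05-23 14:33:02 LOG: checkpoint complete")
```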
Output Options
-s <offset> [<limit>] | --slice=<offset> [<limit>]
From the list of qualifying log entries, prints the <limit> number of entries starting at the <offset> entry number, where an <offset> of zero (0) denotes the first entry in the result set and an <offset> of any number greater than zero counts back from the end of the result set.
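Those offset semantics can be illustrated with a small sketch (`slice_entries` is a hypothetical helper; the interpretation of <offset> and <limit> follows the description above):

```python
def slice_entries(entries, offset, limit):
    """Return `limit` entries starting at `offset`: offset 0 starts at
    the first qualifying entry, while an offset greater than zero
    counts back from the end of the result set."""
    start = 0 if offset == 0 else len(entries) - offset
    return entries[start:start + limit]

entries = ["e1", "e2", "e3", "e4", "e5"]
assert slice_entries(entries, 0, 2) == ["e1", "e2"]  # from the front
assert slice_entries(entries, 3, 2) == ["e3", "e4"]  # 3 back from the end
```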
-o <output_file> | --out=<output_file>
Writes output to the specified file or directory location instead of STDOUT.
-z <0..9> | --zip=<0..9>
Compresses the output file to the specified gzip compression level, where 0 is no compression and 9 is maximum compression. If you supply an output file name ending in .gz, the output file will be compressed by default using maximum compression.
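The .gz naming convention can be demonstrated with Python's standard gzip module (a sketch of the behavior described above, not gplogfilter itself):

```python
import gzip
import os
import tempfile

data = b"2013-05-23 14:33:01 LOG: a log entry\n" * 500

# An output name ending in .gz implies maximum compression (level 9).
path = os.path.join(tempfile.mkdtemp(), "filtered.gz")
level = 9 if path.endswith(".gz") else 0
with gzip.open(path, "wb", compresslevel=level) as f:
    f.write(data)

assert os.path.getsize(path) < len(data)   # compressed smaller than input
with gzip.open(path, "rb") as f:
    assert f.read() == data                # and round-trips losslessly
```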
Input Options
<input_file>
If an input file is not supplied, gplogfilter will use the $COORDINATOR_DATA_DIRECTORY environment variable to locate the Greenplum Database coordinator log file. To read from standard input, use a dash (-) as the input file name.
-u | --unzip
Uncompresses the input file using gunzip. If the input file name ends in .gz, it will be uncompressed by default.
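Reading a .gz input transparently can be sketched with the standard library (`open_log` is a hypothetical helper, not part of gplogfilter):

```python
import gzip
import io
import os
import tempfile

def open_log(name):
    """Open a log file for text reading, uncompressing it when the
    name ends in .gz (the default behavior described above)."""
    if name.endswith(".gz"):
        return io.TextIOWrapper(gzip.open(name, "rb"))
    return open(name)

# Write a small compressed log, then read it back transparently.
path = os.path.join(tempfile.mkdtemp(), "gpdb-2013-05-23.csv.gz")
with gzip.open(path, "wb") as f:
    f.write(b"2013-05-23 14:33:01 LOG: hello\n")

with open_log(path) as f:
    first = f.readline()
```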
Display the last three error messages in the coordinator log file:
gplogfilter -t -n 3
Display all log messages in the coordinator log file timestamped in the last 10 minutes:
gplogfilter -d :10
Display log messages in the coordinator log file containing the string |con6 cmd11|:
gplogfilter -f '|con6 cmd11|'
Using gpssh, run gplogfilter on the segment hosts, search for log messages in the segment log files containing the string con6, and save the output to a file:
gpssh -f seg_hosts_file -e 'source /usr/local/greenplum-db/greenplum_path.sh ; gplogfilter -f con6 /gpdata/*/log/gpdb*.csv' > seglog.out