archiver job from logsearch/207.0.0
GitHub source: 70a733ee or master branch
Properties
archiver
cron_schedule: Schedule for pausing the archiver to upload log files
- Default
'*/30 * * * *'
data_dir: Directory for dumping log files
- Default
/var/vcap/store/archiver
method: Select the method for archiving
- Default
s3
s3
access_key_id: S3 Access Key ID
- Default
""
bucket: S3 Bucket
- Default
""
endpoint: S3 Endpoint
- Default
s3.amazonaws.com
prefix: S3 Prefix
- Default
""
secret_access_key: S3 Secret Access Key
- Default
""
scp
destination: Destination directory for the transferred log files
- Default
""
host: Host to transfer the log files to
- Default
""
port: Port of the remote host
- Default
22
ssh_key: Private SSH key in PEM format (file contents, not a path)
- Default
""
username: Remote username, if it differs from the default (root)
- Default
root
workers: The number of worker threads that logstash should use (default: auto = one per CPU)
- Default
auto
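The equivalent sketch for the SCP method, under the same assumptions about property nesting (host, destination, and key contents are placeholders):

```yaml
properties:
  archiver:
    method: scp                 # use the SCP upload path
  scp:
    host: logs.example.com      # hypothetical remote host
    port: 22
    username: root
    destination: /srv/archive   # hypothetical remote directory
    ssh_key: |                  # the key itself, not a file path
      -----BEGIN RSA PRIVATE KEY-----
      ...
      -----END RSA PRIVATE KEY-----
```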
redis
host: Redis host of the queue
key: Name of the queue to pull messages from
- Default
logstash
port: Redis port of the queue
- Default
6379
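Note that redis.host is the only property in this job with no listed default, so it must be set explicitly. A minimal sketch (the address is a placeholder):

```yaml
properties:
  redis:
    host: 10.0.0.5   # hypothetical Redis address; no default, required
    port: 6379
    key: logstash    # queue name logstash pulls messages from
```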
Templates
Templates are rendered and placed onto corresponding
instances during the deployment process. This job's templates
will be placed into the /var/vcap/jobs/archiver/ directory.
- bin/logstash.process (from bin/logstash.process.erb)
- bin/s3upload.cron (from bin/s3upload.cron.erb)
- bin/scpupload.cron (from bin/scpupload.cron.erb)
- config/logstash.conf (from config/logstash.conf.erb)
- config/scpupload.pem (from config/scpupload.pem.erb)
- logsearch/metric-collector/files/collector (from logsearch/metric-collector/files/collector)
Packages
Packages are compiled and placed onto corresponding
instances during the deployment process. Packages will be
placed into the /var/vcap/packages/ directory.