Fix static.o.o log compression cron job.
* modules/openstack_project/manifests/static.pp: The gziplogs cron job was erroneously configured to run ONCE EVERY MINUTE in any hour divisible by 6, because the minute parameter was left unspecified and Puppet was defaulting it to "*". This was not apparent until the cleanup and resizing work on the server sped the job up to the point where it no longer took more than an hour to complete. This change sets minute to "0" instead, so the job runs at the top of any hour divisible by 6.

Also exclude /srv/static/logs/robots.txt while we're at it, since it was causing a lot of cronspam and shouldn't have been compressed anyway.

Change-Id: I7713625dbd2654b8a42b61bd69c3080c77f613c2
Reviewed-on: https://review.openstack.org/21521
Approved: Clark Boylan <clark.boylan@gmail.com>
Reviewed-by: Clark Boylan <clark.boylan@gmail.com>
Tested-by: Jenkins
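To put the bug in perspective, here is an illustrative back-of-the-envelope calculation (not part of the change itself): with hour => '*/6' the job's hour field matches 4 hours a day, so an unspecified minute ("*") fires it 240 times a day, while minute => '0' fires it 4 times a day.

```shell
# Illustrative arithmetic only: daily firings of a cron entry with
# hour field */6 (matches hours 0, 6, 12, 18) under each minute setting.
runs_per_day_old=$(( 60 * (24 / 6) ))  # minute '*': every minute of 4 hours
runs_per_day_new=$((  1 * (24 / 6) ))  # minute '0': once per matching hour
echo "old: $runs_per_day_old runs/day, new: $runs_per_day_new runs/day"
```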
parent f42290f68c
commit ba9ea03ade
@@ -122,8 +122,9 @@ class openstack_project::static (
   cron { 'gziplogs':
     user        => 'root',
+    minute      => '0',
     hour        => '*/6',
-    command     => 'sleep $((RANDOM\%600)) && flock -n /var/run/gziplogs.lock find /srv/static/logs/ \( -name \*.txt -or -name \*.html \) -exec gzip \{\} \;',
+    command     => 'sleep $((RANDOM\%600)) && flock -n /var/run/gziplogs.lock find /srv/static/logs/ -type f -not -name robots.txt \( -name \*.txt -or -name \*.html \) -exec gzip \{\} \;',
     environment => 'PATH=/var/lib/gems/1.8/bin:/usr/bin:/bin:/usr/sbin:/sbin',
   }
 }
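The exclusion logic of the new find invocation can be sanity-checked in isolation. The sketch below uses a throwaway temporary directory and made-up file names rather than /srv/static/logs/, and omits the sleep/flock wrapper the real cron job uses:

```shell
# Verify that *.txt and *.html are gzipped while robots.txt and
# non-matching files are left alone. Scratch directory, assumed names.
logdir=$(mktemp -d)
touch "$logdir/console.txt" "$logdir/index.html" "$logdir/robots.txt" "$logdir/image.png"
find "$logdir" -type f -not -name robots.txt \( -name \*.txt -or -name \*.html \) -exec gzip {} \;
ls "$logdir"
```

Note that gzip replaces each matched file with a .gz-suffixed version, which is why running the job over the same tree twice only touches newly written logs.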