Add robots.txt for logs and docs-draft.

Indexing build logs isn't particularly useful, but it is a lot of data.
And indexing draft documentation is actually counterproductive
(since it may be wrong).  So adopt a robots.txt for both vhosts.

Change-Id: I8a61a0e7dacb96b26217d31a89a6ae94d9c5f62e
Reviewed-on: https://review.openstack.org/18265
Reviewed-by: Clark Boylan <clark.boylan@gmail.com>
Approved: Jeremy Stanley <fungi@yuggoth.org>
Reviewed-by: Jeremy Stanley <fungi@yuggoth.org>
Tested-by: Jenkins
Author: James E. Blair  2012-12-17 13:14:12 -08:00 (committed by Jenkins)
parent 5528850aad
commit 1875a89e4b
2 changed files with 20 additions and 0 deletions


@@ -0,0 +1,2 @@
+User-agent: *
+Disallow: /


@@ -61,10 +61,28 @@ class openstack_project::static (
     ensure => directory,
   }
+  file { '/srv/static/logs/robots.txt':
+    ensure  => present,
+    owner   => 'root',
+    group   => 'root',
+    mode    => '0444',
+    source  => 'puppet:///modules/openstack_project/disallow_robots.txt',
+    require => File['/srv/static/logs'],
+  }
   file { '/srv/static/docs-draft':
     ensure => directory,
   }
+  file { '/srv/static/docs-draft/robots.txt':
+    ensure  => present,
+    owner   => 'root',
+    group   => 'root',
+    mode    => '0444',
+    source  => 'puppet:///modules/openstack_project/disallow_robots.txt',
+    require => File['/srv/static/docs-draft'],
+  }
   cron { 'gziplogs':
     user => 'root',
     hour => '*/6',
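
Both vhosts get the same shared source file via two identical resource bodies. If
that duplication ever grows beyond two docroots, a small defined type could absorb
it. The sketch below is not part of this change, just an illustration of the
pattern; the type name openstack_project::disallow_robots is hypothetical.

# Hypothetical helper (not in this change): stamp the shared robots.txt
# into any docroot passed as the resource title.
define openstack_project::disallow_robots {
  file { "${name}/robots.txt":
    ensure  => present,
    owner   => 'root',
    group   => 'root',
    mode    => '0444',
    source  => 'puppet:///modules/openstack_project/disallow_robots.txt',
    require => File[$name],
  }
}

# Usage: one declaration covers both docroots, assuming the directories are
# already managed as File resources (as they are in the manifest above).
openstack_project::disallow_robots { ['/srv/static/logs', '/srv/static/docs-draft']: }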