Freshen up with latest from RackerLabs (and include tox.ini)
Added instance hours report: initial version of a report that calculates unit hours used for nova instances, with a breakdown by flavor, flavor class, account/billing types and by tenant.
Moved license so script has shebang as the first line.
Add tenant info cache; refactor instance hours report. Added cache table for basic tenant info for reports and refactored the instance_hours report to use that table.
Improve performance of tenant info update: use bulk SQL operations to speed up the tenant info update, as it was taking ~40s per 1000 tenants on a decent machine.
Fix some tests broken by rebase.
Fix unit tests broken by rebase; also renumber migration due to collision.
Add Apache license header to new files.
Fixed bug with fetching deployment information in reconciler: reverted old method for fetching the current usage's deployment and added a new method to fetch the latest deployment information for a request_id.
Made the field mismatch error message more readable.
Refactored nova and glance verifier tests; the exists rows are updated with a 201 send_status as part of the stacktach-down repair mechanism.
Revert "Fixed bug with fetching deployment information in"
Revert "Adding host and deployment info to missing exists entries in the nova usage audit"
Revert "Added column headers for host and deployment in json reports"
Only log ERROR on last retry.
Fixed the wrong status name for the sent_failed variable in the audit report.
Fixing documentation for URLs that are not available for glance.
Deprecating stacky URLs (usage, deletes, exists) that are not used anymore.
Revert "Revert "Added column headers for host and deployment in json reports""
Revert "Revert "Adding host and deployment info to missing exists entries in the nova usage audit""
Revert "Revert "Fixed bug with fetching deployment information in""
Cell and compute info added for verification failures as well. If that is not present (request_id is not populated for an InstanceUsage entry), the cells display '-'.
Add tox support for move to stackforge.

Change-Id: Id94c2a7f1f9061e972e90c3f54e39c9dec11943b
parent 6325c1ab5f
commit 8a0f06ac79

docs/api.rst (107 lines changed)
@@ -181,7 +181,7 @@ stacky/timings/uuid/
Retrieve all timings for a given instance. Timings are the time
deltas between related .start and .end notifications. For example,
the time difference between ``compute.instance.run_instance.start``
and ``compute.instance.run_instance.end``.
and ``compute.instance.run_instance.end``. This url works only for nova.

The first column of the response will be

@@ -217,7 +217,7 @@ stacky/timings/uuid/
]

:query uuid: UUID of desired instance.
:query service: ``nova`` or ``glance``. default="nova"


stacky/summary
==============
@@ -226,7 +226,7 @@ stacky/summary

Returns timing summary information for each event type
collected. Only notifications with ``.start``/``.end`` pairs
are considered.
are considered. This url works only for nova.

This includes: ::

@@ -261,7 +261,6 @@ stacky/summary
]

:query uuid: UUID of desired instance.
:query service: ``nova`` or ``glance``. default="nova"
:query limit: the number of timings to return.
:query offset: offset into query result set to start from.

@@ -275,7 +274,7 @@ stacky/request

The ``?`` column will be ``E`` if the event came from the ``.error``
queue. ``State`` and ``State'`` are the current state and the previous
state, respectively.
state, respectively. This url works only for nova.

**Example request**:

@@ -708,101 +707,3 @@ stacky/search
:query value: notification values to find.
:query when_min: unixtime to start search
:query when_max: unixtime to end search

stacky/usage/launches
=====================

.. http:get:: http://example.com/stacky/launches/

Return a list of all instance launches.

**Example request**:

.. sourcecode:: http

GET /stacky/usages/launches/ HTTP/1.1
Host: example.com
Accept: application/json

**Example response**:

.. sourcecode:: http

HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json

[
["UUID", "Launched At", "Instance Type Id", "Instance Flavor Id"],
[
... usage launch records ...
]
]

:query instance: desired instance UUID (optional)

stacky/usage/deletes
====================

.. http:get:: http://example.com/stacky/deletes/

Return a list of all instance deletes.

**Example request**:

.. sourcecode:: http

GET /stacky/usages/deletes/ HTTP/1.1
Host: example.com
Accept: application/json

**Example response**:

.. sourcecode:: http

HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json

[
["UUID", "Launched At", "Deleted At"]
[
... usage deleted records ...
]
]

:query instance: desired instance UUID (optional)


stacky/usage/exists
===================

.. http:get:: http://example.com/stacky/exists/

Return a list of all instance exists notifications.

**Example request**:

.. sourcecode:: http

GET /stacky/usages/exists/ HTTP/1.1
Host: example.com
Accept: application/json

**Example response**:

.. sourcecode:: http

HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json

[
["UUID", "Launched At", "Deleted At", "Instance Type Id",
"Instance Flavor Id", "Message ID", "Status"]
[
... usage exists records ...
]
]

:query instance: desired instance UUID (optional)
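The ``service`` parameter documented above is what separates the nova-only stacky URLs from the ones that also serve glance. As an illustrative request in the same style as the examples in this file (the host and UUID are placeholders, not taken from the patch), fetching nova timings explicitly would look like:

.. sourcecode:: http

    GET /stacky/timings/uuid/?uuid=77e0b09b-3f9e-4c5c-84a6-bd7b0e40c6f1&service=nova HTTP/1.1
    Host: example.com
    Accept: application/json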
@@ -1,4 +1,5 @@
nose
coverage
hacking
mox
nose
nose-exclude
reports/instance_hours.py (new file, 269 lines)
@@ -0,0 +1,269 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import argparse
import datetime
import json
import math
import sys
import operator
import os

sys.path.append(os.environ.get('STACKTACH_INSTALL_DIR', '/stacktach'))

import usage_audit

from stacktach import datetime_to_decimal as dt
from stacktach import models
from stacktach import stacklog


class TenantManager(object):
    def __init__(self):
        self._types = None

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        return False

    @property
    def type_names(self):
        if self._types is None:
            self._types = set()
            for t in models.TenantType.objects.all():
                self._types.add(t.name)
        return self._types

    def get_tenant_info(self, tenant_id):
        try:
            tenant = models.TenantInfo.objects\
                .get(tenant=tenant_id)
            tenant_info = dict(
                tenant=tenant_id,
                account_name=tenant.name)
            ttypes = dict()
            for t in tenant.types.all():
                ttypes[t.name] = t.value
        except models.TenantInfo.DoesNotExist:
            tenant_info = dict(
                tenant=tenant_id,
                account_name='unknown account')
            ttypes = dict()
            for t in self.type_names:
                ttypes[t] = 'unknown'
        tenant_info['types'] = ttypes
        return tenant_info

class InstanceHoursReport(object):

    FLAVOR_CLASS_WEIGHTS = dict(standard=1.0)

    def __init__(self, tenant_manager, time=None, period_length='day'):
        if time is None:
            time = datetime.datetime.utcnow()
        self.start, self.end = usage_audit.get_previous_period(time, period_length)
        self.tenant_manager = tenant_manager
        self.flavor_cache = dict()
        self.clear()

    def clear(self):
        self.count = 0
        self.unit_hours = 0.0
        self.by_flavor = dict()
        self.by_flavor_class = dict()
        self.by_tenant = dict()
        self.by_type = dict()
        for name in self.tenant_manager.type_names:
            self.by_tenant[name] = dict()
            self.by_type[name] = dict()

    def _get_verified_exists(self):
        start = dt.dt_to_decimal(self.start)
        end = dt.dt_to_decimal(self.end)
        return models.InstanceExists.objects.filter(
            status=models.InstanceExists.VERIFIED,
            audit_period_beginning__gte=start,
            audit_period_beginning__lte=end,
            audit_period_ending__gte=start,
            audit_period_ending__lte=end)

    def _get_instance_hours(self, exist):
        if (exist.deleted_at is None) or (exist.deleted_at > exist.audit_period_ending):
            end = exist.audit_period_ending
        else:
            end = exist.deleted_at
        if exist.launched_at > exist.audit_period_beginning:
            start = exist.launched_at
        else:
            start = exist.audit_period_beginning
        return math.ceil((end - start)/3600)

    def _get_flavor_info(self, exist):
        flavor = exist.instance_flavor_id
        if flavor not in self.flavor_cache:
            if '-' in flavor:
                flavor_class, n = flavor.split('-', 1)
            else:
                flavor_class = 'standard'
            try:
                payload = json.loads(exist.raw.json)[1]['payload']
            except Exception:
                print "Error loading raw notification data for %s" % exist.id
                raise
            flavor_name = payload['instance_type']
            flavor_size = payload['memory_mb']
            weight = self.FLAVOR_CLASS_WEIGHTS.get(flavor_class, 1.0)
            flavor_units = (flavor_size/256.0) * weight
            self.flavor_cache[flavor] = (flavor, flavor_name, flavor_class, flavor_units)
        return self.flavor_cache[flavor]

    def add_type_hours(self, type_name, type_value, unit_hours):
        if type_value not in self.by_type[type_name]:
            self.by_type[type_name][type_value] = dict(count=0, unit_hours=0.0)
        cts = self.by_type[type_name][type_value]
        cts['count'] += 1
        cts['unit_hours'] += unit_hours
        cts['percent_count'] = (float(cts['count'])/self.count) * 100
        cts['percent_unit_hours'] = (cts['unit_hours']/self.unit_hours) * 100

    def add_flavor_class_hours(self, flavor_class, unit_hours):
        if flavor_class not in self.by_flavor_class:
            self.by_flavor_class[flavor_class] = dict(count=0, unit_hours=0.0)
        cts = self.by_flavor_class[flavor_class]
        cts['count'] += 1
        cts['unit_hours'] += unit_hours
        cts['percent_count'] = (float(cts['count'])/self.count) * 100
        cts['percent_unit_hours'] = (cts['unit_hours']/self.unit_hours) * 100

    def add_flavor_hours(self, flavor, flavor_name, unit_hours):
        if flavor not in self.by_flavor:
            self.by_flavor[flavor] = dict(count=0, unit_hours=0.0)
        cts = self.by_flavor[flavor]
        cts['count'] += 1
        cts['unit_hours'] += unit_hours
        cts['percent_count'] = (float(cts['count'])/self.count) * 100
        cts['percent_unit_hours'] = (cts['unit_hours']/self.unit_hours) * 100
        cts['flavor_name'] = flavor_name

    def add_tenant_hours(self, tenant_info, unit_hours):
        tenant = tenant_info['tenant']
        cts = dict(count=0, unit_hours=0.0)
        for tname, tvalue in tenant_info['types'].items():
            if tvalue not in self.by_tenant[tname]:
                self.by_tenant[tname][tvalue] = dict()
            if tenant not in self.by_tenant[tname][tvalue]:
                self.by_tenant[tname][tvalue][tenant] = cts
            cts = self.by_tenant[tname][tvalue][tenant]
            cts[tname] = tvalue
            cts['count'] += 1
            cts['unit_hours'] += unit_hours
            cts['percent_count'] = (float(cts['count'])/self.count) * 100
            cts['percent_unit_hours'] = (cts['unit_hours']/self.unit_hours) * 100
            cts['tenant'] = tenant
            cts['account_name'] = tenant_info['account_name']

    def compile_hours(self):
        exists = self._get_verified_exists()
        self.count = exists.count()
        with self.tenant_manager as tenant_manager:
            for exist in exists:
                hours = self._get_instance_hours(exist)
                flavor, flavor_name, flavor_class, flavor_units = self._get_flavor_info(exist)
                tenant_info = tenant_manager.get_tenant_info(exist.tenant)
                unit_hours = hours * flavor_units
                self.unit_hours += unit_hours
                self.add_flavor_hours(flavor, flavor_name, unit_hours)
                self.add_flavor_class_hours(flavor_class, unit_hours)
                for tname, tvalue in tenant_info['types'].items():
                    self.add_type_hours(tname, tvalue, unit_hours)
                self.add_tenant_hours(tenant_info, unit_hours)

    def top_hundred(self, key):
        def th(d):
            top = dict()
            for t, customers in d.iteritems():
                top[t] = sorted(customers.values(), key=operator.itemgetter(key), reverse=True)[:100]
            return top
        top_hundred = dict()
        for type_name, tenants in self.by_tenant.iteritems():
            top_hundred[type_name] = th(tenants)
        return top_hundred

    def generate_json(self):
        report = dict(total_instance_count=self.count,
                      total_unit_hours=self.unit_hours,
                      flavor=self.by_flavor,
                      flavor_class=self.by_flavor_class,
                      top_hundred_by_count=self.top_hundred('count'),
                      top_hundred_by_unit_hours=self.top_hundred('unit_hours'))
        for ttype, stats in self.by_type.iteritems():
            report[ttype] = stats
        return json.dumps(report)

    def store(self, json_report):
        report = models.JsonReport(
            json=json_report,
            created=dt.dt_to_decimal(datetime.datetime.utcnow()),
            period_start=self.start,
            period_end=self.end,
            version=1,
            name='instance hours')
        report.save()


def valid_datetime(d):
    try:
        t = datetime.datetime.strptime(d, "%Y-%m-%d %H:%M:%S")
        return t
    except Exception, e:
        raise argparse.ArgumentTypeError(
            "'%s' is not in YYYY-MM-DD HH:MM:SS format." % d)


if __name__ == '__main__':
    parser = argparse.ArgumentParser('StackTach Instance Hours Report')
    parser.add_argument('--period_length',
                        choices=['hour', 'day'], default='day')
    parser.add_argument('--utcdatetime',
                        help="Override the end time used to generate report.",
                        type=valid_datetime, default=None)
    parser.add_argument('--store',
                        help="If set to true, report will be stored. "
                             "Otherwise, it will just be printed",
                        default=False, action="store_true")
    args = parser.parse_args()

    stacklog.set_default_logger_name('instance_hours')
    parent_logger = stacklog.get_logger('instance_hours', is_parent=True)
    log_listener = stacklog.LogListener(parent_logger)
    log_listener.start()

    tenant_manager = TenantManager()
    report = InstanceHoursReport(
        tenant_manager,
        time=args.utcdatetime,
        period_length=args.period_length)

    report.compile_hours()
    json = report.generate_json()

    if not args.store:
        print json
    else:
        report.store(json)
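For orientation, the report above is run as a standalone script driven by the argparse flags in its __main__ block. A typical invocation, assuming an illustrative install path for STACKTACH_INSTALL_DIR, would be:

    STACKTACH_INSTALL_DIR=/opt/stacktach python reports/instance_hours.py --period_length day --store

Without --store the script prints the JSON report to stdout instead of saving a JsonReport row named 'instance hours'.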
@@ -93,6 +93,14 @@ def _get_exists(beginning, ending):
    return models.InstanceExists.objects.filter(**filters)


def cell_and_compute(instance, launched_at):
    usage = InstanceUsage.find(instance, launched_at)[0]
    deployment = usage.latest_deployment_for_request_id()
    cell = (deployment and deployment.name) or '-'
    compute = usage.host() or '-'
    return cell, compute


def _audit_launches_to_exists(launches, exists, beginning):
    fails = []
    for (instance, launches) in launches.items():
@@ -111,14 +119,12 @@ def _audit_launches_to_exists(launches, exists, beginning):
                if reconciler:
                    args = (expected['id'], beginning)
                    rec = reconciler.missing_exists_for_instance(*args)
                msg = "Couldn't find exists for launch (%s, %s)"
                msg = msg % (instance, expected['launched_at'])
                launched_at = dt.dt_from_decimal(expected['launched_at'])
                usage = InstanceUsage.find(instance, launched_at)[0]
                host = usage.host()
                deployment = usage.deployment()
                msg = "Couldn't find exists for launch (%s, %s)"
                msg = msg % (instance, launched_at)
                cell, compute = cell_and_compute(instance, launched_at)
                fails.append(['Launch', expected['id'], msg,
                              'Y' if rec else 'N', host, deployment])
                              'Y' if rec else 'N', cell, compute])
        else:
            rec = False
            if reconciler:
@@ -126,11 +132,9 @@ def _audit_launches_to_exists(launches, exists, beginning):
                rec = reconciler.missing_exists_for_instance(*args)
            msg = "No exists for instance (%s)" % instance
            launched_at = dt.dt_from_decimal(launches[0]['launched_at'])
            usage = InstanceUsage.find(instance, launched_at)[0]
            host = usage.host()
            deployment = usage.deployment()
            fails.append(['Launch', '-', msg, 'Y' if rec else 'N', host,
                          deployment])
            cell, compute = cell_and_compute(instance, launched_at)
            fails.append(['-', msg, 'Y' if rec else 'N',
                          cell, compute])
    return fails


@@ -249,11 +253,15 @@ def store_results(start, end, summary, details):


def make_json_report(summary, details):
    report = [{'summary': summary},
              ['Object', 'ID', 'Error Description', 'Reconciled?', 'Cell',
               'Deployment']]
    report.extend(details['exist_fails'])
    report.extend(details['launch_fails'])
    report = {
        'summary': summary,
        'exist_fail_headers': ['Exists Row ID', 'Error Description', 'Cell',
                               'Compute'],
        'exist_fails': details['exist_fails'],
        'launch_fail_headers': ['Launch Row ID', 'Error Description',
                                'Reconciled?', 'Cell', 'Compute'],
        'launch_fails': details['launch_fails']
    }
    return json.dumps(report)

@@ -27,7 +27,7 @@ def _status_queries(exists_query):
    pending = exists_query.filter(status=models.InstanceExists.PENDING)
    verifying = exists_query.filter(status=models.InstanceExists.VERIFYING)
    sent_unverified = exists_query.filter(status=models.InstanceExists.SENT_UNVERIFIED)
    sent_failed = exists_query.filter(status=models.InstanceExists.VERIFYING)
    sent_failed = exists_query.filter(status=models.InstanceExists.SENT_FAILED)
    sent_verifying = exists_query.filter(status=models.InstanceExists.SENT_VERIFYING)
    return verified, reconciled, fail, pending, verifying, sent_unverified, \
        sent_failed, sent_verifying
@@ -101,7 +101,13 @@ def _verified_audit_base(base_query, exists_model):

    failed_query = Q(status=exists_model.FAILED)
    failed = exists_model.objects.filter(base_query & failed_query)
    detail = [['Exist', e.id, e.fail_reason] for e in failed]
    detail = []
    for e in failed:
        try:
            detail.append([e.id, e.fail_reason, e.raw.deployment.name,
                           e.raw.host])
        except Exception:
            detail.append([e.id, e.fail_reason, "-", "-"])
    return summary, detail

@@ -1,2 +0,0 @@
#!/bin/bash
nosetests tests --exclude-dir=stacktach --with-coverage --cover-package=stacktach,worker,verifier --cover-erase
@@ -1,7 +0,0 @@
#!/bin/sh
virtualenv .venv
. .venv/bin/activate
pip install -r etc/pip-requires.txt
pip install -r etc/test-requires.txt
nosetests tests --exclude-dir=stacktach --with-coverage --cover-package=stacktach,worker,verifier --cover-erase
@@ -15,6 +15,7 @@
# specific language governing permissions and limitations
# under the License.
import decimal
import datetime
import functools
import json
from datetime import datetime
@@ -514,3 +515,102 @@ def repair_stacktach_down(request):
                   content_type="application/json")
    return response


def _update_tenant_info_cache(tenant_info):
    tenant_id = tenant_info['tenant']
    try:
        tenant = models.TenantInfo.objects\
            .select_for_update()\
            .get(tenant=tenant_id)
    except models.TenantInfo.DoesNotExist:
        tenant = models.TenantInfo(tenant=tenant_id)
    tenant.name = tenant_info['name']
    tenant.last_updated = datetime.utcnow()
    tenant.save()

    types = set()
    for type_name, type_value in tenant_info['types'].items():
        try:
            tenant_type = models.TenantType.objects\
                .get(name=type_name,
                     value=type_value)
        except models.TenantType.DoesNotExist:
            tenant_type = models.TenantType(name=type_name,
                                            value=type_value)
            tenant_type.save()
        types.add(tenant_type)
    tenant.types = list(types)
    tenant.save()

def _batch_update_tenant_info(info_list):
    tenant_info = dict((str(info['tenant']), info) for info in info_list)
    tenant_ids = set(tenant_info)
    old_tenants = set(t['tenant'] for t in
                      models.TenantInfo.objects
                      .filter(tenant__in=list(tenant_ids))
                      .values('tenant'))
    new_tenants = []
    now = datetime.utcnow()
    for tenant in (tenant_ids - old_tenants):
        new_tenants.append(models.TenantInfo(tenant=tenant,
                                             name=tenant_info[tenant]['name'],
                                             last_updated=now))
    if new_tenants:
        models.TenantInfo.objects.bulk_create(new_tenants)
    tenants = models.TenantInfo.objects.filter(tenant__in=list(tenant_ids))
    tenants.update(last_updated=now)

    types = dict(((tt.name,tt.value),tt) for tt in models.TenantType.objects.all())
    TypeXref = models.TenantInfo.types.through

    changed_tenant_dbids = []
    new_type_xrefs = []
    for tenant in tenants:
        info = tenant_info[tenant.tenant]
        new_types = set()
        for type_name, type_value in info['types'].items():
            ttype = types.get((type_name, type_value))
            if ttype is None:
                ttype = models.TenantType(name=type_name,
                                          value=type_value)
                ttype.save()
                types[(type_name,type_value)] = ttype
            new_types.add(ttype)
        cur_types = set(tenant.types.all())
        if new_types != cur_types:
            if cur_types:
                changed_tenant_dbids.append(tenant.id)
            for ttype in new_types:
                new_type_xrefs.append(TypeXref(tenantinfo_id=tenant.id, tenanttype_id=ttype.id))
    TypeXref.objects.filter(tenantinfo_id__in=changed_tenant_dbids).delete()
    TypeXref.objects.bulk_create(new_type_xrefs)


@api_call
def batch_update_tenant_info(request):
    if request.method not in ['PUT', 'POST']:
        raise BadRequestException(message="Invalid method")

    if request.body is None or request.body == '':
        raise BadRequestException(message="Request body required")

    body = json.loads(request.body)
    if body.get('tenants') is not None:
        tenants = body['tenants']
        _batch_update_tenant_info(tenants)
    else:
        msg = "'tenants' missing from request body"
        raise BadRequestException(message=msg)


@api_call
def update_tenant_info(request, tenant_id):
    if request.method not in ['PUT', 'POST']:
        raise BadRequestException(message="Invalid method")

    if request.body is None or request.body == '':
        raise BadRequestException(message="Request body required")

    body = json.loads(request.body)
    if body['tenant'] != tenant_id:
        raise BadRequestException(message="Invalid tenant: %s != %s" % (body['tenant'], tenant_id))
    _update_tenant_info_cache(body)

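For reference, batch_update_tenant_info above expects a JSON body with a 'tenants' list, where each entry carries the fields read by _batch_update_tenant_info: 'tenant', 'name', and a 'types' mapping. An illustrative payload (the tenant IDs and the 'account_type' key are made up, not part of the patch):

    {"tenants": [
        {"tenant": "123456", "name": "Example Account", "types": {"account_type": "cloud"}},
        {"tenant": "654321", "name": "Another Account", "types": {"account_type": "managed"}}
    ]}

The per-tenant endpoint (db/tenant/info/<tenant_id>/) takes a single entry of the same shape, with the tenant ID repeated in the URL.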
@@ -0,0 +1,287 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

# -*- coding: utf-8 -*-
import datetime
from south.db import db
from south.v2 import SchemaMigration
from django.db import models


class Migration(SchemaMigration):

    def forwards(self, orm):
        # Adding model 'TenantInfo'
        db.create_table(u'stacktach_tenantinfo', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('tenant', self.gf('django.db.models.fields.CharField')(unique=True, max_length=50, db_index=True)),
            ('name', self.gf('django.db.models.fields.CharField')(db_index=True, max_length=100, null=True, blank=True)),
            ('last_updated', self.gf('django.db.models.fields.DateTimeField')(db_index=True)),
        ))
        db.send_create_signal(u'stacktach', ['TenantInfo'])

        # Adding M2M table for field types on 'TenantInfo'
        m2m_table_name = db.shorten_name(u'stacktach_tenantinfo_types')
        db.create_table(m2m_table_name, (
            ('id', models.AutoField(verbose_name='ID', primary_key=True, auto_created=True)),
            ('tenantinfo', models.ForeignKey(orm[u'stacktach.tenantinfo'], null=False)),
            ('tenanttype', models.ForeignKey(orm[u'stacktach.tenanttype'], null=False))
        ))
        db.create_unique(m2m_table_name, ['tenantinfo_id', 'tenanttype_id'])

        # Adding model 'TenantType'
        db.create_table(u'stacktach_tenanttype', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('name', self.gf('django.db.models.fields.CharField')(max_length=50, db_index=True)),
            ('value', self.gf('django.db.models.fields.CharField')(max_length=50, db_index=True)),
        ))
        db.send_create_signal(u'stacktach', ['TenantType'])

    def backwards(self, orm):
        # Deleting model 'TenantInfo'
        db.delete_table(u'stacktach_tenantinfo')

        # Removing M2M table for field types on 'TenantInfo'
        db.delete_table(db.shorten_name(u'stacktach_tenantinfo_types'))

        # Deleting model 'TenantType'
        db.delete_table(u'stacktach_tenanttype')

models = {
|
||||
u'stacktach.deployment': {
|
||||
'Meta': {'object_name': 'Deployment'},
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'name': ('django.db.models.fields.CharField', [], {'max_length': '50'})
|
||||
},
|
||||
u'stacktach.genericrawdata': {
|
||||
'Meta': {'object_name': 'GenericRawData'},
|
||||
'deployment': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['stacktach.Deployment']"}),
|
||||
'event': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'host': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '100', 'null': 'True', 'blank': 'True'}),
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'instance': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'json': ('django.db.models.fields.TextField', [], {}),
|
||||
'message_id': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'publisher': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '100', 'null': 'True', 'blank': 'True'}),
|
||||
'request_id': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'routing_key': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'service': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'tenant': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'when': ('django.db.models.fields.DecimalField', [], {'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'})
|
||||
},
|
||||
u'stacktach.glancerawdata': {
|
||||
'Meta': {'object_name': 'GlanceRawData'},
|
||||
'deployment': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['stacktach.Deployment']"}),
|
||||
'event': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'host': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '100', 'null': 'True', 'blank': 'True'}),
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'image_type': ('django.db.models.fields.IntegerField', [], {'default': '0', 'null': 'True', 'db_index': 'True'}),
|
||||
'instance': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'json': ('django.db.models.fields.TextField', [], {}),
|
||||
'owner': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '255', 'null': 'True', 'blank': 'True'}),
|
||||
'publisher': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '100', 'null': 'True', 'blank': 'True'}),
|
||||
'request_id': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'routing_key': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'service': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'status': ('django.db.models.fields.CharField', [], {'max_length': '30', 'null': 'True', 'db_index': 'True'}),
|
||||
'uuid': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '36', 'null': 'True', 'blank': 'True'}),
|
||||
'when': ('django.db.models.fields.DecimalField', [], {'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'})
|
||||
},
|
||||
u'stacktach.imagedeletes': {
|
||||
'Meta': {'object_name': 'ImageDeletes'},
|
||||
'deleted_at': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'raw': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['stacktach.GlanceRawData']", 'null': 'True'}),
|
||||
'uuid': ('django.db.models.fields.CharField', [], {'max_length': '50', 'db_index': 'True'})
|
||||
},
|
||||
u'stacktach.imageexists': {
|
||||
'Meta': {'object_name': 'ImageExists'},
|
||||
'audit_period_beginning': ('django.db.models.fields.DecimalField', [], {'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
'audit_period_ending': ('django.db.models.fields.DecimalField', [], {'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
'created_at': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
'delete': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'+'", 'null': 'True', 'to': u"orm['stacktach.ImageDeletes']"}),
|
||||
'deleted_at': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
'fail_reason': ('django.db.models.fields.CharField', [], {'max_length': '300', 'null': 'True'}),
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'message_id': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'owner': ('django.db.models.fields.CharField', [], {'max_length': '255', 'null': 'True', 'db_index': 'True'}),
|
||||
'raw': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'+'", 'to': u"orm['stacktach.GlanceRawData']"}),
|
||||
'send_status': ('django.db.models.fields.IntegerField', [], {'default': '0', 'db_index': 'True'}),
|
||||
'size': ('django.db.models.fields.BigIntegerField', [], {'max_length': '20'}),
|
||||
'status': ('django.db.models.fields.CharField', [], {'default': "'pending'", 'max_length': '50', 'db_index': 'True'}),
|
||||
'usage': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'+'", 'null': 'True', 'to': u"orm['stacktach.ImageUsage']"}),
|
||||
'uuid': ('django.db.models.fields.CharField', [], {'max_length': '50', 'null': 'True', 'db_index': 'True'})
|
||||
},
|
||||
u'stacktach.imageusage': {
|
||||
'Meta': {'object_name': 'ImageUsage'},
|
||||
'created_at': ('django.db.models.fields.DecimalField', [], {'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'last_raw': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['stacktach.GlanceRawData']", 'null': 'True'}),
|
||||
'owner': ('django.db.models.fields.CharField', [], {'max_length': '50', 'null': 'True', 'db_index': 'True'}),
|
||||
'size': ('django.db.models.fields.BigIntegerField', [], {'max_length': '20'}),
|
||||
'uuid': ('django.db.models.fields.CharField', [], {'max_length': '50', 'db_index': 'True'})
|
||||
},
|
||||
u'stacktach.instancedeletes': {
|
||||
'Meta': {'object_name': 'InstanceDeletes'},
|
||||
'deleted_at': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'instance': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'launched_at': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
'raw': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['stacktach.RawData']", 'null': 'True'})
|
||||
},
|
||||
u'stacktach.instanceexists': {
|
||||
'Meta': {'object_name': 'InstanceExists'},
|
||||
'audit_period_beginning': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
'audit_period_ending': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
'bandwidth_public_out': ('django.db.models.fields.BigIntegerField', [], {'default': '0'}),
|
||||
'delete': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'+'", 'null': 'True', 'to': u"orm['stacktach.InstanceDeletes']"}),
|
||||
'deleted_at': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
'fail_reason': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '300', 'null': 'True', 'blank': 'True'}),
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'instance': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'instance_flavor_id': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '100', 'null': 'True', 'blank': 'True'}),
|
||||
'instance_type_id': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'launched_at': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
'message_id': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'os_architecture': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'os_distro': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'os_version': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'raw': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'+'", 'null': 'True', 'to': u"orm['stacktach.RawData']"}),
|
||||
'rax_options': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'send_status': ('django.db.models.fields.IntegerField', [], {'default': '0', 'null': 'True', 'db_index': 'True'}),
|
||||
'status': ('django.db.models.fields.CharField', [], {'default': "'pending'", 'max_length': '50', 'db_index': 'True'}),
|
||||
'tenant': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'usage': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'+'", 'null': 'True', 'to': u"orm['stacktach.InstanceUsage']"})
|
||||
},
|
||||
u'stacktach.instancereconcile': {
|
||||
'Meta': {'object_name': 'InstanceReconcile'},
|
||||
'deleted_at': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'instance': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'instance_flavor_id': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '100', 'null': 'True', 'blank': 'True'}),
|
||||
'instance_type_id': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'launched_at': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
'os_architecture': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'os_distro': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'os_version': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'rax_options': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'row_created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
|
||||
'row_updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
|
||||
'source': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '150', 'null': 'True', 'blank': 'True'}),
|
||||
'tenant': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'})
|
||||
},
|
||||
u'stacktach.instanceusage': {
|
||||
'Meta': {'object_name': 'InstanceUsage'},
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'instance': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'instance_flavor_id': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '100', 'null': 'True', 'blank': 'True'}),
|
||||
'instance_type_id': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'launched_at': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
'os_architecture': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'os_distro': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'os_version': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'rax_options': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'request_id': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'tenant': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'})
|
||||
},
|
||||
u'stacktach.jsonreport': {
|
||||
'Meta': {'object_name': 'JsonReport'},
|
||||
'created': ('django.db.models.fields.DecimalField', [], {'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'json': ('django.db.models.fields.TextField', [], {}),
|
||||
'name': ('django.db.models.fields.CharField', [], {'max_length': '50', 'db_index': 'True'}),
|
||||
'period_end': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True'}),
|
||||
'period_start': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True'}),
|
||||
'version': ('django.db.models.fields.IntegerField', [], {'default': '1'})
|
||||
},
|
||||
u'stacktach.lifecycle': {
|
||||
'Meta': {'object_name': 'Lifecycle'},
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'instance': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'last_raw': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['stacktach.RawData']", 'null': 'True'}),
|
||||
'last_state': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'last_task_state': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'})
|
||||
},
|
||||
u'stacktach.rawdata': {
|
||||
'Meta': {'object_name': 'RawData'},
|
||||
'deployment': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['stacktach.Deployment']"}),
|
||||
'event': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'host': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '100', 'null': 'True', 'blank': 'True'}),
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'image_type': ('django.db.models.fields.IntegerField', [], {'default': '0', 'null': 'True', 'db_index': 'True'}),
|
||||
'instance': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'json': ('django.db.models.fields.TextField', [], {}),
|
||||
'old_state': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '20', 'null': 'True', 'blank': 'True'}),
|
||||
'old_task': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '30', 'null': 'True', 'blank': 'True'}),
|
||||
'publisher': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '100', 'null': 'True', 'blank': 'True'}),
|
||||
'request_id': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'routing_key': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'service': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'state': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '20', 'null': 'True', 'blank': 'True'}),
|
||||
'task': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '30', 'null': 'True', 'blank': 'True'}),
|
||||
'tenant': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '50', 'null': 'True', 'blank': 'True'}),
|
||||
'when': ('django.db.models.fields.DecimalField', [], {'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'})
|
||||
},
|
||||
u'stacktach.rawdataimagemeta': {
|
||||
'Meta': {'object_name': 'RawDataImageMeta'},
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'os_architecture': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'os_distro': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'os_version': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
|
||||
'raw': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['stacktach.RawData']"}),
|
||||
'rax_options': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'})
|
||||
},
|
||||
u'stacktach.requesttracker': {
|
||||
'Meta': {'object_name': 'RequestTracker'},
|
||||
'completed': ('django.db.models.fields.BooleanField', [], {'default': 'False', 'db_index': 'True'}),
|
||||
'duration': ('django.db.models.fields.DecimalField', [], {'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'last_timing': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['stacktach.Timing']", 'null': 'True'}),
|
||||
'lifecycle': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['stacktach.Lifecycle']"}),
|
||||
'request_id': ('django.db.models.fields.CharField', [], {'max_length': '50', 'db_index': 'True'}),
|
||||
'start': ('django.db.models.fields.DecimalField', [], {'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'})
|
||||
},
|
||||
u'stacktach.tenantinfo': {
|
||||
'Meta': {'object_name': 'TenantInfo'},
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'last_updated': ('django.db.models.fields.DateTimeField', [], {'db_index': 'True'}),
|
||||
'name': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '100', 'null': 'True', 'blank': 'True'}),
|
||||
'tenant': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '50', 'db_index': 'True'}),
|
||||
'types': ('django.db.models.fields.related.ManyToManyField', [], {'to': u"orm['stacktach.TenantType']", 'symmetrical': 'False'})
|
||||
},
|
||||
u'stacktach.tenanttype': {
|
||||
'Meta': {'object_name': 'TenantType'},
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'name': ('django.db.models.fields.CharField', [], {'max_length': '50', 'db_index': 'True'}),
|
||||
'value': ('django.db.models.fields.CharField', [], {'max_length': '50', 'db_index': 'True'})
|
||||
},
|
||||
u'stacktach.timing': {
|
||||
'Meta': {'object_name': 'Timing'},
|
||||
'diff': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6', 'db_index': 'True'}),
|
||||
'end_raw': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'+'", 'null': 'True', 'to': u"orm['stacktach.RawData']"}),
|
||||
'end_when': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6'}),
|
||||
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
|
||||
'lifecycle': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['stacktach.Lifecycle']"}),
|
||||
'name': ('django.db.models.fields.CharField', [], {'max_length': '50', 'db_index': 'True'}),
|
||||
'start_raw': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'+'", 'null': 'True', 'to': u"orm['stacktach.RawData']"}),
|
||||
'start_when': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '20', 'decimal_places': '6'})
|
||||
}
|
||||
}
|
||||
|
||||
complete_apps = ['stacktach']
|
@@ -177,14 +177,20 @@ class InstanceUsage(models.Model):
    rax_options = models.TextField(null=True, blank=True)

    def deployment(self):
        return self.latest_raw_for_request_id().deployment.name
        raws = RawData.objects.filter(request_id=self.request_id)
        return raws and raws[0].deployment

    def latest_deployment_for_request_id(self):
        raw = self.latest_raw_for_request_id()
        return raw and raw.deployment

    def latest_raw_for_request_id(self):
        return RawData.objects.filter(
        return self.request_id and RawData.objects.filter(
            request_id=self.request_id).order_by('-id')[0]

    def host(self):
        return self.latest_raw_for_request_id().host
        raw = self.latest_raw_for_request_id()
        return raw and raw.host

    @staticmethod
    def find(instance, launched_at):
@@ -356,6 +362,7 @@ class InstanceExists(models.Model):
        exists = InstanceExists.objects.get(message_id=message_id)
        if exists.status == InstanceExists.PENDING:
            exists.status = InstanceExists.SENT_UNVERIFIED
            exists.send_status = '201'
            exists.save()
        else:
            exists_not_pending.append(message_id)
@@ -409,6 +416,22 @@ class JsonReport(models.Model):
    json = models.TextField()


class TenantType(models.Model):
    name = models.CharField(max_length=50, db_index=True)
    value = models.CharField(max_length=50, db_index=True)


class TenantInfo(models.Model):
    """This contains tenant information synced from an external source.
    It's mostly used as a cache to put things like tenant name on reports
    without making alot of calls to an external system."""
    tenant = models.CharField(max_length=50, db_index=True, unique=True)
    name = models.CharField(max_length=100, null=True,
                            blank=True, db_index=True)
    types = models.ManyToManyField(TenantType)
    last_updated = models.DateTimeField(db_index=True)


class GlanceRawData(models.Model):
    result_titles = [["#", "?", "When", "Deployment", "Event", "Host",
                      "Status"]]
@@ -575,6 +598,7 @@ class ImageExists(models.Model):
        for exists in exists_list:
            if exists.status == ImageExists.PENDING:
                exists.status = ImageExists.SENT_UNVERIFIED
                exists.send_status = '201'
                exists.save()
            else:
                exists_not_pending.append(message_id)

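A note on the InstanceUsage changes above: by returning `self.request_id and RawData.objects.filter(...)`, latest_raw_for_request_id() now yields a falsy value when request_id was never populated, instead of indexing an empty queryset, which is what lets cell_and_compute() in the nova usage audit (earlier in this diff) fall back to '-'. A minimal sketch of that call pattern, restating the patch rather than adding to it:

    usage = InstanceUsage.find(instance, launched_at)[0]
    deployment = usage.latest_deployment_for_request_id()  # falsy when request_id is missing
    cell = (deployment and deployment.name) or '-'
    compute = usage.host() or '-'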
@@ -5,9 +5,9 @@
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@@ -495,98 +495,6 @@ def do_kpi(request, tenant_id=None):
    return rsp(json.dumps(results))


def do_list_usage_launches(request):

    filter_args = {}
    if 'instance' in request.GET:
        uuid = request.GET['instance']
        if not utils.is_uuid_like(uuid):
            msg = "%s is not uuid-like" % uuid
            return error_response(400, 'Bad Request', msg)
        filter_args['instance'] = uuid

    model = models.InstanceUsage.objects
    if len(filter_args) > 0:
        launches = model_search(request, model, filter_args)
    else:
        launches = model_search(request, model, None)

    results = [["UUID", "Launched At", "Instance Type Id",
                "Instance Flavor Id"]]

    for launch in launches:
        launched = None
        if launch.launched_at:
            launched = str(dt.dt_from_decimal(launch.launched_at))
        results.append([launch.instance, launched, launch.instance_type_id,
                        launch.instance_flavor_id])

    return rsp(json.dumps(results))


def do_list_usage_deletes(request):

    filter_args = {}
    if 'instance' in request.GET:
        uuid = request.GET['instance']
        if not utils.is_uuid_like(uuid):
            msg = "%s is not uuid-like" % uuid
            return error_response(400, 'Bad Request', msg)
        filter_args['instance'] = uuid

    model = models.InstanceDeletes.objects
    if len(filter_args) > 0:
        deletes = model_search(request, model, filter_args)
    else:
        deletes = model_search(request, model, None)

    results = [["UUID", "Launched At", "Deleted At"]]

    for delete in deletes:
        launched = None
        if delete.launched_at:
            launched = str(dt.dt_from_decimal(delete.launched_at))
        deleted = None
        if delete.deleted_at:
            deleted = str(dt.dt_from_decimal(delete.deleted_at))
        results.append([delete.instance, launched, deleted])

    return rsp(json.dumps(results))

def do_list_usage_exists(request):

    filter_args = {}
    if 'instance' in request.GET:
        uuid = request.GET['instance']
        if not utils.is_uuid_like(uuid):
            msg = "%s is not uuid-like" % uuid
            return error_response(400, 'Bad Request', msg)
        filter_args['instance'] = uuid

    model = models.InstanceExists.objects
    if len(filter_args) > 0:
        exists = model_search(request, model, filter_args)
    else:
        exists = model_search(request, model, None)

    results = [["UUID", "Launched At", "Deleted At", "Instance Type Id",
                "Instance Flavor Id", "Message ID", "Status"]]

    for exist in exists:
        launched = None
        if exist.launched_at:
            launched = str(dt.dt_from_decimal(exist.launched_at))
        deleted = None
        if exist.deleted_at:
            deleted = str(dt.dt_from_decimal(exist.deleted_at))
        results.append([exist.instance, launched, deleted,
                        exist.instance_type_id, exist.instance_flavor_id,
                        exist.message_id, exist.status])

    return rsp(json.dumps(results))


def do_jsonreports(request):
    yesterday = datetime.datetime.utcnow() - datetime.timedelta(days=1)
    now = datetime.datetime.utcnow()
@@ -720,7 +628,7 @@ def do_jsonreports_search(request):
                            report.name,
                            report.version])
    except BadRequestException as be:
        return error_response(400, 'Bad Request', be.message)
        return error_response(400, 'Bad Request', str(be))
    except ValidationError as ve:
        return error_response(400, 'Bad Request', ve.messages[0])

@@ -59,12 +59,6 @@ stacky_urls = (
    url(r'^stacky/search/$', 'stacktach.stacky_server.search'),
    url(r'^stacky/kpi/$', 'stacktach.stacky_server.do_kpi'),
    url(r'^stacky/kpi/(?P<tenant_id>\w+)/$', 'stacktach.stacky_server.do_kpi'),
    url(r'^stacky/usage/launches/$',
        'stacktach.stacky_server.do_list_usage_launches'),
    url(r'^stacky/usage/deletes/$',
        'stacktach.stacky_server.do_list_usage_deletes'),
    url(r'^stacky/usage/exists/$',
        'stacktach.stacky_server.do_list_usage_exists'),
)

dbapi_urls = (
@@ -109,6 +103,11 @@ dbapi_urls = (
        'stacktach.dbapi.get_usage_exist_stats_glance'),
    url(r'^db/stats/events/', 'stacktach.dbapi.get_event_stats'),
    url(r'^db/repair/', 'stacktach.dbapi.repair_stacktach_down'),
    url(r'db/tenant/info/(?P<tenant_id>\w+)/$',
        'stacktach.dbapi.update_tenant_info'),
    url(r'db/tenant/batch_info/$',
        'stacktach.dbapi.batch_update_tenant_info'),

)

urlpatterns = patterns('', *(web_urls + stacky_urls + dbapi_urls))
@ -34,6 +34,17 @@ from utils import MESSAGE_ID_3
|
||||
from utils import MESSAGE_ID_4
|
||||
|
||||
|
||||
class Length(mox.Comparator):
|
||||
def __init__(self, l):
|
||||
self._len = l
|
||||
|
||||
def equals(self, rhs):
|
||||
return self._len == len(rhs)
|
||||
|
||||
def __repr__(self):
|
||||
return "<sequence with len %s >" % self._len
|
||||
|
||||
|
||||
class DBAPITestCase(StacktachBaseTestCase):
|
||||
def setUp(self):
|
||||
self.mox = mox.Mox()
|
||||
@ -411,6 +422,128 @@ class DBAPITestCase(StacktachBaseTestCase):
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_update_tenant_info(self):
|
||||
TEST_TENANT='test'
|
||||
|
||||
models.TenantInfo.objects = self.mox.CreateMockAnything()
|
||||
models.TenantType.objects = self.mox.CreateMockAnything()
|
||||
|
||||
fake_request = self.mox.CreateMockAnything()
|
||||
fake_request.method = 'PUT'
|
||||
body_dict = dict(tenant=TEST_TENANT,
|
||||
name='test name',
|
||||
types=dict(test_type='thingy'))
|
||||
body = json.dumps(body_dict)
|
||||
fake_request.body = body
|
||||
|
||||
info = self.mox.CreateMockAnything()
|
||||
info_result = self.mox.CreateMockAnything()
|
||||
models.TenantInfo.objects.select_for_update().AndReturn(info_result)
|
||||
info_result.get(tenant=TEST_TENANT).AndReturn(info)
|
||||
info.save()
|
||||
|
||||
ttype = self.mox.CreateMockAnything()
|
||||
models.TenantType.objects.get(name='test_type', value='thingy').AndReturn(ttype)
|
||||
ttype.__hash__().AndReturn(hash('test_type'))
|
||||
info.save()
|
||||
|
||||
self.mox.ReplayAll()
|
||||
|
||||
dbapi.update_tenant_info(fake_request, TEST_TENANT)
|
||||
|
||||
self.assertEqual(info.name, 'test name')
|
||||
self.assertEqual(info.types, [ttype])
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_batch_update_tenant_info(self):
|
||||
TEST_DATE='test date time'
|
||||
|
||||
mock_t1 = self.mox.CreateMock(models.TenantInfo)
|
||||
mock_t1.id = 1
|
||||
mock_t1.tenant = 'test_old'
|
||||
mock_t1.name = 'test old name'
|
||||
mock_t1.types = self.mox.CreateMockAnything()
|
||||
mock_t1.types.all().AndReturn([])
|
||||
mock_t1.last_updated = TEST_DATE
|
||||
|
||||
mock_t2 = self.mox.CreateMock(models.TenantInfo)
|
||||
mock_t2.id = 2
|
||||
mock_t2.tenant = 'test_new'
|
||||
mock_t2.name = 'test new name'
|
||||
mock_t2.last_updated = TEST_DATE
|
||||
mock_t2.types = self.mox.CreateMockAnything()
|
||||
mock_t2.types.all().AndReturn([])
|
||||
TEST_OBJECTS = [mock_t1, mock_t2]
|
||||
|
||||
mock_tt1 = self.mox.CreateMock(models.TenantType)
|
||||
mock_tt1.id = 1
|
||||
mock_tt1.name = 'test_type'
|
||||
mock_tt1.value = 'thingy'
|
||||
|
||||
mock_tt2 = self.mox.CreateMock(models.TenantType)
|
||||
mock_tt2.id = 2
|
||||
mock_tt2.name = 'test_type'
|
||||
mock_tt2.value = 'whatzit'
|
||||
TEST_TYPES = [mock_tt1, mock_tt2]
|
||||
|
||||
models.TenantInfo.objects = self.mox.CreateMockAnything()
|
||||
models.TenantType.objects = self.mox.CreateMockAnything()
|
||||
TypeXref = models.TenantInfo.types.through
|
||||
TypeXref.objects = self.mox.CreateMockAnything()
|
||||
|
||||
self.mox.StubOutWithMock(dbapi, 'datetime')
|
||||
dbapi.datetime.utcnow().AndReturn(TEST_DATE)
|
||||
|
||||
fake_request = self.mox.CreateMockAnything()
|
||||
fake_request.method = 'PUT'
|
||||
body_dict = dict(tenants=[dict(tenant='test_old',
|
||||
name='test old name',
|
||||
types=dict(test_type='thingy')),
|
||||
dict(tenant='test_new',
|
||||
name='test new name',
|
||||
types=dict(test_type='whatzit'))])
|
||||
body = json.dumps(body_dict)
|
||||
fake_request.body = body
|
||||
|
||||
info_values = self.mox.CreateMockAnything()
|
||||
models.TenantInfo.objects.filter(tenant__in=['test_old', 'test_new']).AndReturn(info_values)
|
||||
info_values.values('tenant').AndReturn([dict(tenant='test_old')])
|
||||
models.TenantInfo.objects.bulk_create(mox.And(
|
||||
Length(1), mox.IsA(list), mox.In(mox.And(
|
||||
mox.IsA(models.TenantInfo),
|
||||
mox.ContainsAttributeValue('tenant','test_new'),
|
||||
mox.ContainsAttributeValue('name', 'test new name'),
|
||||
mox.ContainsAttributeValue('last_updated', TEST_DATE)
|
||||
))))
|
||||
|
||||
fake_tenants = self.mox.CreateMockAnything()
|
||||
models.TenantInfo.objects.filter(tenant__in=['test_old', 'test_new'])\
|
||||
.AndReturn(fake_tenants)
|
||||
fake_tenants.update(last_updated=TEST_DATE)
|
||||
fake_tenants.__iter__().AndReturn(iter(TEST_OBJECTS))
|
||||
|
||||
models.TenantType.objects.all().AndReturn(TEST_TYPES)
|
||||
|
||||
mock_query = self.mox.CreateMockAnything()
|
||||
TypeXref.objects.filter(tenantinfo_id__in=[]).AndReturn(mock_query)
|
||||
mock_query.delete()
|
||||
|
||||
TypeXref.objects.bulk_create(mox.And(
|
||||
Length(2), mox.IsA(list),
|
||||
mox.In(mox.And(
|
||||
mox.IsA(TypeXref),
|
||||
mox.ContainsAttributeValue('tenantinfo_id', 1),
|
||||
mox.ContainsAttributeValue('tenanttype_id', 1))),
|
||||
mox.In(mox.And(
|
||||
mox.IsA(TypeXref),
|
||||
mox.ContainsAttributeValue('tenantinfo_id', 2),
|
||||
mox.ContainsAttributeValue('tenanttype_id', 2))),
|
||||
))
|
||||
|
||||
self.mox.ReplayAll()
|
||||
dbapi.batch_update_tenant_info(fake_request)
|
||||
self.mox.VerifyAll()
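
The two tests above also pin down the request bodies the new tenant endpoints expect. A minimal sketch of the payloads follows; the values are illustrative and the paths come from the db/tenant/... url patterns added above:

import json

# PUT db/tenant/info/<tenant_id>/ -- update a single tenant's cached info.
single_body = json.dumps({"tenant": "test",
                          "name": "test name",
                          "types": {"test_type": "thingy"}})

# PUT db/tenant/batch_info/ -- bulk update of several tenants in one request.
batch_body = json.dumps({"tenants": [
    {"tenant": "test_old", "name": "test old name",
     "types": {"test_type": "thingy"}},
    {"tenant": "test_new", "name": "test new name",
     "types": {"test_type": "whatzit"}},
]})
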
|
||||
|
||||
def test_send_status(self):
|
||||
fake_request = self.mox.CreateMockAnything()
|
||||
fake_request.method = 'PUT'
|
||||
|
@ -21,12 +21,14 @@ import uuid
|
||||
import kombu
|
||||
|
||||
import mox
|
||||
from tests.unit import utils
|
||||
|
||||
from stacktach import datetime_to_decimal as dt
|
||||
from stacktach import stacklog
|
||||
from stacktach import models
|
||||
from tests.unit import StacktachBaseTestCase
|
||||
from utils import IMAGE_UUID_1, SIZE_1, SIZE_2, CREATED_AT_1, CREATED_AT_2
|
||||
from utils import IMAGE_OWNER_1, IMAGE_OWNER_2, DELETED_AT_1, DELETED_AT_2
|
||||
from utils import GLANCE_VERIFIER_EVENT_TYPE
|
||||
from utils import make_verifier_config
|
||||
from verifier import glance_verifier
|
||||
@ -80,7 +82,10 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_usage_created_at_mismatch(self):
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-01 01:02:03')
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.uuid = IMAGE_UUID_1
|
||||
exist.usage = self.mox.CreateMockAnything()
|
||||
exist.created_at = CREATED_AT_1
|
||||
exist.usage.created_at = CREATED_AT_2
|
||||
@ -90,30 +95,37 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
|
||||
glance_verifier._verify_for_usage(exist)
|
||||
|
||||
exception = cm.exception
|
||||
entity_1 = {'name': 'exists', 'value': CREATED_AT_1}
|
||||
entity_2 = {'name': 'launches', 'value': CREATED_AT_2}
|
||||
self.assertEqual(exception.field_name, 'created_at')
|
||||
self.assertEqual(exception.expected, CREATED_AT_1)
|
||||
self.assertEqual(exception.actual, CREATED_AT_2)
|
||||
|
||||
self.assertEqual(exception.entity_1, entity_1)
|
||||
self.assertEqual(exception.entity_2, entity_2)
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_usage_owner_mismatch(self):
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-01 01:02:03')
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.uuid = IMAGE_UUID_1
|
||||
exist.usage = self.mox.CreateMockAnything()
|
||||
exist.owner = 'owner'
|
||||
exist.usage.owner = 'not_owner'
|
||||
exist.owner = IMAGE_OWNER_1
|
||||
exist.usage.owner = IMAGE_OWNER_2
|
||||
self.mox.ReplayAll()
|
||||
|
||||
with self.assertRaises(FieldMismatch) as cm:
|
||||
glance_verifier._verify_for_usage(exist)
|
||||
|
||||
exception = cm.exception
|
||||
entity_1 = {'name': 'exists', 'value': IMAGE_OWNER_1}
|
||||
entity_2 = {'name': 'launches', 'value': IMAGE_OWNER_2}
|
||||
self.assertEqual(exception.field_name, 'owner')
|
||||
self.assertEqual(exception.expected, 'owner')
|
||||
self.assertEqual(exception.actual, 'not_owner')
|
||||
self.assertEqual(exception.entity_1, entity_1)
|
||||
self.assertEqual(exception.entity_2, entity_2)
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_usage_size_mismatch(self):
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.uuid = IMAGE_UUID_1
|
||||
exist.size = SIZE_1
|
||||
|
||||
exist.usage = self.mox.CreateMockAnything()
|
||||
@ -123,11 +135,11 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
|
||||
with self.assertRaises(FieldMismatch) as cm:
|
||||
glance_verifier._verify_for_usage(exist)
|
||||
exception = cm.exception
|
||||
|
||||
entity_1 = {'name': 'exists', 'value': SIZE_1}
|
||||
entity_2 = {'name': 'launches', 'value': SIZE_2}
|
||||
self.assertEqual(exception.field_name, 'size')
|
||||
self.assertEqual(exception.expected, SIZE_1)
|
||||
self.assertEqual(exception.actual, SIZE_2)
|
||||
|
||||
self.assertEqual(exception.entity_1, entity_1)
|
||||
self.assertEqual(exception.entity_2, entity_2)
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_usage_for_late_usage(self):
|
||||
@ -236,30 +248,33 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_delete_deleted_at_mismatch(self):
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-02 03:04:05')
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.uuid = IMAGE_UUID_1
|
||||
exist.delete = self.mox.CreateMockAnything()
|
||||
exist.deleted_at = decimal.Decimal('5.1')
|
||||
exist.delete.deleted_at = decimal.Decimal('4.1')
|
||||
exist.deleted_at = DELETED_AT_1
|
||||
exist.delete.deleted_at = DELETED_AT_2
|
||||
self.mox.ReplayAll()
|
||||
|
||||
with self.assertRaises(FieldMismatch) as fm:
|
||||
glance_verifier._verify_for_delete(exist)
|
||||
exception = fm.exception
|
||||
entity_1 = {'name': 'exists', 'value': DELETED_AT_1}
|
||||
entity_2 = {'name': 'deletes', 'value': DELETED_AT_2}
|
||||
self.assertEqual(exception.field_name, 'deleted_at')
|
||||
self.assertEqual(exception.expected, decimal.Decimal('5.1'))
|
||||
self.assertEqual(exception.actual, decimal.Decimal('4.1'))
|
||||
self.assertEqual(exception.entity_1, entity_1)
|
||||
self.assertEqual(exception.entity_2, entity_2)
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_should_verify_that_image_size_in_exist_is_not_null(self):
|
||||
self.mox.StubOutWithMock(datetime, 'datetime')
|
||||
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
|
||||
self.mox.ReplayAll()
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-02 03:04:05')
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.id = 23
|
||||
exist.size = None
|
||||
exist.created_at = decimal.Decimal('5.1')
|
||||
exist.uuid = '1234-5678-9012-3456'
|
||||
exist.uuid = IMAGE_UUID_1
|
||||
self.mox.ReplayAll()
|
||||
|
||||
try:
|
||||
@ -269,20 +284,18 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
|
||||
self.assertEqual(nf.field_name, 'image_size')
|
||||
self.assertEqual(
|
||||
nf.reason, "Failed at 2014-01-02 03:04:05 UTC for "
|
||||
"1234-5678-9012-3456: image_size field was null for "
|
||||
"exist id 23")
|
||||
"12345678-6352-4dbc-8271-96cc54bf14cd: image_size field was "
|
||||
"null for exist id 23")
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_should_verify_that_created_at_in_exist_is_not_null(self):
|
||||
self.mox.StubOutWithMock(datetime, 'datetime')
|
||||
datetime.datetime.utcnow().AndReturn('2014-01-01 01:02:03')
|
||||
self.mox.ReplayAll()
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-01 01:02:03')
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.id = 23
|
||||
exist.size = 'size'
|
||||
exist.created_at = None
|
||||
exist.uuid = '1234-5678-9012-3456'
|
||||
exist.uuid = IMAGE_UUID_1
|
||||
self.mox.ReplayAll()
|
||||
|
||||
with self.assertRaises(NullFieldException) as nfe:
|
||||
@ -292,14 +305,12 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
|
||||
self.assertEqual(exception.field_name, 'created_at')
|
||||
self.assertEqual(exception.reason,
|
||||
"Failed at 2014-01-01 01:02:03 UTC for "
|
||||
"1234-5678-9012-3456: created_at field was "
|
||||
"null for exist id 23")
|
||||
"12345678-6352-4dbc-8271-96cc54bf14cd: created_at "
|
||||
"field was null for exist id 23")
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_should_verify_that_uuid_in_exist_is_not_null(self):
|
||||
self.mox.StubOutWithMock(datetime, 'datetime')
|
||||
datetime.datetime.utcnow().AndReturn('2014-01-01 01:02:03')
|
||||
self.mox.ReplayAll()
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-01 01:02:03')
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.id = 23
|
||||
@ -319,15 +330,13 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_should_verify_that_owner_in_exist_is_not_null(self):
|
||||
self.mox.StubOutWithMock(datetime, 'datetime')
|
||||
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
|
||||
self.mox.ReplayAll()
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-01 01:02:03')
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.id = 23
|
||||
exist.size = 1234
|
||||
exist.created_at = decimal.Decimal('5.1')
|
||||
exist.uuid = '1234-5678-9012-3456'
|
||||
exist.uuid = IMAGE_UUID_1
|
||||
exist.owner = None
|
||||
self.mox.ReplayAll()
|
||||
|
||||
@ -337,8 +346,10 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
|
||||
except NullFieldException as nf:
|
||||
self.assertEqual(nf.field_name, 'owner')
|
||||
self.assertEqual(
|
||||
nf.reason, "Failed at 2014-01-02 03:04:05 UTC for "
|
||||
"1234-5678-9012-3456: owner field was null for exist id 23")
|
||||
nf.reason,
|
||||
"Failed at 2014-01-01 01:02:03 UTC for "
|
||||
"12345678-6352-4dbc-8271-96cc54bf14cd: owner field was null "
|
||||
"for exist id 23")
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_should_verify_that_uuid_value_is_uuid_like(self):
|
||||
@ -413,9 +424,7 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_should_verify_owner_is_of_type_hex(self):
|
||||
self.mox.StubOutWithMock(datetime, 'datetime')
|
||||
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
|
||||
self.mox.ReplayAll()
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-02 03:04:05')
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.id = 23
|
||||
@ -470,9 +479,7 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
|
||||
self.assertTrue(verified)
|
||||
|
||||
def test_verify_exist_marks_exist_failed_if_field_mismatch_exception(self):
|
||||
self.mox.StubOutWithMock(datetime, 'datetime')
|
||||
datetime.datetime.utcnow().AndReturn('2014-01-01 01:01:01')
|
||||
self.mox.ReplayAll()
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-02 03:04:05')
|
||||
|
||||
exist1 = self.mox.CreateMockAnything()
|
||||
exist2 = self.mox.CreateMockAnything()
|
||||
@ -480,14 +487,15 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
|
||||
self.mox.StubOutWithMock(glance_verifier, '_verify_for_usage')
|
||||
self.mox.StubOutWithMock(glance_verifier, '_verify_for_delete')
|
||||
self.mox.StubOutWithMock(glance_verifier, '_verify_validity')
|
||||
field_mismatch_exc = FieldMismatch('field', 'expected',
|
||||
'actual', 'uuid')
|
||||
entity_1 = {'name': 'exists', 'value': 'expected'}
|
||||
entity_2 = {'name': 'launches', 'value': 'actual'}
|
||||
field_mismatch_exc = FieldMismatch('field', entity_1, entity_2, 'uuid')
|
||||
glance_verifier._verify_for_usage(exist1).AndRaise(
|
||||
exception=field_mismatch_exc)
|
||||
exist1.mark_failed(
|
||||
reason="Failed at 2014-01-01 01:01:01 UTC for uuid: Expected "
|
||||
"field to be 'expected' got 'actual'")
|
||||
|
||||
reason="Failed at 2014-01-02 03:04:05 UTC for uuid: Data mismatch "
|
||||
"for 'field' - 'exists' contains 'expected' but 'launches' "
|
||||
"contains 'actual'")
|
||||
glance_verifier._verify_for_usage(exist2)
|
||||
glance_verifier._verify_for_delete(exist2)
|
||||
glance_verifier._verify_validity(exist2)
|
||||
@ -498,7 +506,6 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
|
||||
self.mox.VerifyAll()
|
||||
self.assertFalse(verified)
|
||||
|
||||
|
||||
def test_verify_for_range_without_callback_for_sent_unverified(self):
|
||||
mock_logger = self._setup_mock_logger()
|
||||
self.mox.StubOutWithMock(mock_logger, 'info')
|
||||
|
@ -134,6 +134,9 @@ class ImageExistsTestCase(unittest.TestCase):
|
||||
results = ImageExists.mark_exists_as_sent_unverified(message_ids)
|
||||
|
||||
self.assertEqual(results, ([], []))
|
||||
self.assertEqual(exist1.send_status, '201')
|
||||
self.assertEqual(exist2.send_status, '201')
|
||||
self.assertEqual(exist3.send_status, '201')
|
||||
|
||||
self.mox.VerifyAll()
|
||||
|
||||
@ -157,6 +160,8 @@ class ImageExistsTestCase(unittest.TestCase):
|
||||
|
||||
self.assertEqual(results, (['9156b83e-f684-4ec3-8f94-7e41902f27aa'],
|
||||
[]))
|
||||
self.assertEqual(exist1.send_status, '201')
|
||||
self.assertEqual(exist2.send_status, '201')
|
||||
|
||||
self.mox.VerifyAll()
|
||||
|
||||
@ -183,7 +188,8 @@ class ImageExistsTestCase(unittest.TestCase):
|
||||
|
||||
self.assertEqual(results, ([],
|
||||
["0708cb0b-6169-4d7c-9f58-3cf3d5bf694b"]))
|
||||
|
||||
self.assertEqual(exist1.send_status, '201')
|
||||
self.assertEqual(exist3.send_status, '201')
|
||||
self.mox.VerifyAll()
|
||||
|
||||
|
||||
@ -230,7 +236,8 @@ class InstanceExistsTestCase(unittest.TestCase):
|
||||
results = InstanceExists.mark_exists_as_sent_unverified(message_ids)
|
||||
|
||||
self.assertEqual(results, ([], []))
|
||||
|
||||
self.assertEqual(exist1.send_status, '201')
|
||||
self.assertEqual(exist2.send_status, '201')
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_mark_exists_as_sent_unverified_return_absent_exists(self):
|
||||
@ -250,7 +257,7 @@ class InstanceExistsTestCase(unittest.TestCase):
|
||||
|
||||
self.assertEqual(results, (['9156b83e-f684-4ec3-8f94-7e41902f27aa'],
|
||||
[]))
|
||||
|
||||
self.assertEqual(exist1.send_status, '201')
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_mark_exists_as_sent_unverified_and_return_exist_not_pending(self):
|
||||
@ -271,6 +278,6 @@ class InstanceExistsTestCase(unittest.TestCase):
|
||||
|
||||
self.assertEqual(results, ([],
|
||||
["9156b83e-f684-4ec3-8f94-7e41902f27aa"]))
|
||||
|
||||
self.assertEqual(exist1.send_status, '201')
|
||||
self.mox.VerifyAll()
|
||||
|
||||
|
@ -28,7 +28,8 @@ from stacktach import datetime_to_decimal as dt
|
||||
from stacktach import stacklog
|
||||
from stacktach import models
|
||||
from tests.unit import StacktachBaseTestCase
|
||||
from utils import make_verifier_config, LAUNCHED_AT_1, INSTANCE_FLAVOR_ID_1
|
||||
from tests.unit import utils
|
||||
from utils import make_verifier_config, LAUNCHED_AT_1, INSTANCE_FLAVOR_ID_1
|
||||
from utils import INSTANCE_FLAVOR_ID_2, FLAVOR_FIELD_NAME, DELETED_AT_1
|
||||
from utils import LAUNCHED_AT_2, DELETED_AT_2
|
||||
from utils import INSTANCE_ID_1
|
||||
@ -97,6 +98,7 @@ class NovaVerifierVerifyForLaunchTestCase(StacktachBaseTestCase):
|
||||
def test_verify_for_launch_launched_at_in_range(self):
|
||||
self.mox.StubOutWithMock(config, 'flavor_field_name')
|
||||
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.usage = self.mox.CreateMockAnything()
|
||||
exist.launched_at = decimal.Decimal('1.0')
|
||||
@ -111,24 +113,29 @@ class NovaVerifierVerifyForLaunchTestCase(StacktachBaseTestCase):
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_for_launch_launched_at_missmatch(self):
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-02 03:04:05')
|
||||
|
||||
self.mox.StubOutWithMock(config, 'flavor_field_name')
|
||||
config.flavor_field_name().AndReturn("flavor_field_name")
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.instance = INSTANCE_ID_1
|
||||
exist.usage = self.mox.CreateMockAnything()
|
||||
exist.launched_at = decimal.Decimal('1.1')
|
||||
exist.launched_at = LAUNCHED_AT_1
|
||||
exist.dummy_flavor_field_name = 'dummy_flavor'
|
||||
exist.usage.launched_at = decimal.Decimal('2.1')
|
||||
exist.usage.launched_at = LAUNCHED_AT_2
|
||||
exist.usage.dummy_flavor_field_name = 'dummy_flavor'
|
||||
self.mox.ReplayAll()
|
||||
|
||||
try:
|
||||
with self.assertRaises(FieldMismatch) as fm:
|
||||
nova_verifier._verify_for_launch(exist)
|
||||
self.fail()
|
||||
except FieldMismatch, fm:
|
||||
self.assertEqual(fm.field_name, 'launched_at')
|
||||
self.assertEqual(fm.expected, decimal.Decimal('1.1'))
|
||||
self.assertEqual(fm.actual, decimal.Decimal('2.1'))
|
||||
exception = fm.exception
|
||||
|
||||
entity_1 = {'name': 'exists', 'value': LAUNCHED_AT_1}
|
||||
entity_2 = {'name': 'launches', 'value': LAUNCHED_AT_2}
|
||||
self.assertEqual(exception.field_name, 'launched_at')
|
||||
self.assertEqual(exception.entity_1, entity_1)
|
||||
self.assertEqual(exception.entity_2, entity_2)
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_for_launch_flavor_id_missmatch(self):
|
||||
@ -138,6 +145,7 @@ class NovaVerifierVerifyForLaunchTestCase(StacktachBaseTestCase):
|
||||
|
||||
self.mox.StubOutWithMock(config, 'flavor_field_name')
|
||||
config.flavor_field_name().AndReturn(FLAVOR_FIELD_NAME)
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.instance = INSTANCE_ID_1
|
||||
exist.usage = self.mox.CreateMockAnything()
|
||||
@ -146,25 +154,27 @@ class NovaVerifierVerifyForLaunchTestCase(StacktachBaseTestCase):
|
||||
exist.usage.launched_at = decimal.Decimal(LAUNCHED_AT_1)
|
||||
exist.usage.flavor_field_name = INSTANCE_FLAVOR_ID_2
|
||||
self.mox.ReplayAll()
|
||||
|
||||
with self.assertRaises(FieldMismatch) as fm:
|
||||
nova_verifier._verify_for_launch(exist)
|
||||
exception = fm.exception
|
||||
self.assertEqual(exception.field_name, FLAVOR_FIELD_NAME)
|
||||
self.assertEqual(exception.expected, INSTANCE_FLAVOR_ID_1)
|
||||
self.assertEqual(exception.actual, INSTANCE_FLAVOR_ID_2)
|
||||
self.assertEqual(
|
||||
exception.reason,
|
||||
"Failed at 2014-01-02 03:04:05 UTC for "
|
||||
"08f685d9-6352-4dbc-8271-96cc54bf14cd: Expected flavor_field_name "
|
||||
"to be '1' got 'performance2-120'")
|
||||
|
||||
entity_1 = {'name': 'exists', 'value': INSTANCE_FLAVOR_ID_1}
|
||||
entity_2 = {'name': 'launches', 'value': INSTANCE_FLAVOR_ID_2}
|
||||
self.assertEqual(exception.field_name, 'flavor_field_name')
|
||||
self.assertEqual(exception.entity_1, entity_1)
|
||||
self.assertEqual(exception.entity_2, entity_2)
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_for_launch_tenant_id_mismatch(self):
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-02 03:04:05')
|
||||
|
||||
self.mox.StubOutWithMock(config, 'flavor_field_name')
|
||||
config.flavor_field_name().AndReturn(FLAVOR_FIELD_NAME)
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.tenant = TENANT_ID_1
|
||||
exist.instance = INSTANCE_ID_1
|
||||
|
||||
exist.usage = self.mox.CreateMockAnything()
|
||||
exist.usage.tenant = TENANT_ID_2
|
||||
@ -174,17 +184,21 @@ class NovaVerifierVerifyForLaunchTestCase(StacktachBaseTestCase):
|
||||
nova_verifier._verify_for_launch(exist)
|
||||
exception = cm.exception
|
||||
|
||||
entity_1 = {'name': 'exists', 'value': TENANT_ID_1}
|
||||
entity_2 = {'name': 'launches', 'value': TENANT_ID_2}
|
||||
self.assertEqual(exception.field_name, 'tenant')
|
||||
self.assertEqual(exception.expected, TENANT_ID_1)
|
||||
self.assertEqual(exception.actual, TENANT_ID_2)
|
||||
|
||||
self.assertEqual(exception.entity_1, entity_1)
|
||||
self.assertEqual(exception.entity_2, entity_2)
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_for_launch_rax_options_mismatch(self):
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-02 03:04:05')
|
||||
|
||||
self.mox.StubOutWithMock(config, 'flavor_field_name')
|
||||
config.flavor_field_name().AndReturn("flavor_field_name")
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.rax_options = RAX_OPTIONS_1
|
||||
exist.instance = INSTANCE_ID_1
|
||||
|
||||
exist.usage = self.mox.CreateMockAnything()
|
||||
exist.usage.rax_options = RAX_OPTIONS_2
|
||||
@ -193,18 +207,22 @@ class NovaVerifierVerifyForLaunchTestCase(StacktachBaseTestCase):
|
||||
with self.assertRaises(FieldMismatch) as cm:
|
||||
nova_verifier._verify_for_launch(exist)
|
||||
exception = cm.exception
|
||||
|
||||
entity_1 = {'name': 'exists', 'value': RAX_OPTIONS_1}
|
||||
entity_2 = {'name': 'launches', 'value': RAX_OPTIONS_2}
|
||||
self.assertEqual(exception.field_name, 'rax_options')
|
||||
self.assertEqual(exception.expected, RAX_OPTIONS_1)
|
||||
self.assertEqual(exception.actual, RAX_OPTIONS_2)
|
||||
self.assertEqual(exception.entity_1, entity_1)
|
||||
self.assertEqual(exception.entity_2, entity_2)
|
||||
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_for_launch_os_distro_mismatch(self):
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-02 03:04:05')
|
||||
|
||||
self.mox.StubOutWithMock(config, 'flavor_field_name')
|
||||
config.flavor_field_name().AndReturn("flavor_field_name")
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.os_distro = OS_DISTRO_1
|
||||
exist.instance = INSTANCE_ID_1
|
||||
|
||||
exist.usage = self.mox.CreateMockAnything()
|
||||
exist.usage.os_distro = OS_DISTRO_2
|
||||
@ -214,16 +232,20 @@ class NovaVerifierVerifyForLaunchTestCase(StacktachBaseTestCase):
|
||||
nova_verifier._verify_for_launch(exist)
|
||||
exception = cm.exception
|
||||
|
||||
entity_1 = {'name': 'exists', 'value': OS_DISTRO_1}
|
||||
entity_2 = {'name': 'launches', 'value': OS_DISTRO_2}
|
||||
self.assertEqual(exception.field_name, 'os_distro')
|
||||
self.assertEqual(exception.expected, OS_DISTRO_1)
|
||||
self.assertEqual(exception.actual, OS_DISTRO_2)
|
||||
self.assertEqual(exception.entity_1, entity_1)
|
||||
self.assertEqual(exception.entity_2, entity_2)
|
||||
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_for_launch_os_architecture_mismatch(self):
|
||||
self.mox.StubOutWithMock(config, 'flavor_field_name')
|
||||
config.flavor_field_name().AndReturn("flavor_field_name")
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.instance = INSTANCE_ID_1
|
||||
exist.os_architecture = OS_ARCH_1
|
||||
|
||||
exist.usage = self.mox.CreateMockAnything()
|
||||
@ -234,17 +256,20 @@ class NovaVerifierVerifyForLaunchTestCase(StacktachBaseTestCase):
|
||||
nova_verifier._verify_for_launch(exist)
|
||||
exception = cm.exception
|
||||
|
||||
entity_1 = {'name': 'exists', 'value': OS_ARCH_1}
|
||||
entity_2 = {'name': 'launches', 'value': OS_ARCH_2}
|
||||
self.assertEqual(exception.field_name, 'os_architecture')
|
||||
self.assertEqual(exception.expected, OS_ARCH_1)
|
||||
self.assertEqual(exception.actual, OS_ARCH_2)
|
||||
|
||||
self.assertEqual(exception.entity_1, entity_1)
|
||||
self.assertEqual(exception.entity_2, entity_2)
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_for_launch_os_version_mismatch(self):
|
||||
self.mox.StubOutWithMock(config, 'flavor_field_name')
|
||||
config.flavor_field_name().AndReturn("flavor_field_name")
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.os_version = OS_VERSION_1
|
||||
exist.instance = INSTANCE_ID_1
|
||||
|
||||
exist.usage = self.mox.CreateMockAnything()
|
||||
exist.usage.os_version = OS_VERSION_2
|
||||
@ -254,15 +279,17 @@ class NovaVerifierVerifyForLaunchTestCase(StacktachBaseTestCase):
|
||||
nova_verifier._verify_for_launch(exist)
|
||||
exception = cm.exception
|
||||
|
||||
entity_1 = {'name': 'exists', 'value': OS_VERSION_1}
|
||||
entity_2 = {'name': 'launches', 'value': OS_VERSION_2}
|
||||
self.assertEqual(exception.field_name, 'os_version')
|
||||
self.assertEqual(exception.expected, OS_VERSION_1)
|
||||
self.assertEqual(exception.actual, OS_VERSION_2)
|
||||
|
||||
self.assertEqual(exception.entity_1, entity_1)
|
||||
self.assertEqual(exception.entity_2, entity_2)
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_for_launch_late_usage(self):
|
||||
self.mox.StubOutWithMock(config, 'flavor_field_name')
|
||||
config.flavor_field_name().AndReturn("flavor_field_name")
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.usage = None
|
||||
exist.instance = INSTANCE_ID_1
|
||||
@ -431,10 +458,13 @@ class NovaVerifierVerifyForDeleteTestCase(StacktachBaseTestCase):
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_for_delete_launched_at_mismatch(self):
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-02 03:04:05')
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.delete = self.mox.CreateMockAnything()
|
||||
exist.instance = INSTANCE_ID_1
|
||||
exist.launched_at = LAUNCHED_AT_1
|
||||
exist.deleted_at = DELETED_AT_1
|
||||
exist.delete = self.mox.CreateMockAnything()
|
||||
exist.delete.launched_at = LAUNCHED_AT_2
|
||||
exist.delete.deleted_at = DELETED_AT_1
|
||||
self.mox.ReplayAll()
|
||||
@ -442,16 +472,22 @@ class NovaVerifierVerifyForDeleteTestCase(StacktachBaseTestCase):
|
||||
with self.assertRaises(FieldMismatch) as fm:
|
||||
nova_verifier._verify_for_delete(exist)
|
||||
exception = fm.exception
|
||||
|
||||
entity_1 = {'name': 'exists', 'value': LAUNCHED_AT_1}
|
||||
entity_2 = {'name': 'deletes', 'value': LAUNCHED_AT_2}
|
||||
self.assertEqual(exception.field_name, 'launched_at')
|
||||
self.assertEqual(exception.expected, LAUNCHED_AT_1)
|
||||
self.assertEqual(exception.actual, LAUNCHED_AT_2)
|
||||
self.assertEqual(exception.entity_1, entity_1)
|
||||
self.assertEqual(exception.entity_2, entity_2)
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_verify_for_delete_deleted_at_mismatch(self):
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-02 03:04:05')
|
||||
|
||||
exist = self.mox.CreateMockAnything()
|
||||
exist.delete = self.mox.CreateMockAnything()
|
||||
exist.instance = INSTANCE_ID_1
|
||||
exist.launched_at = LAUNCHED_AT_1
|
||||
exist.deleted_at = DELETED_AT_1
|
||||
exist.delete = self.mox.CreateMockAnything()
|
||||
exist.delete.launched_at = LAUNCHED_AT_1
|
||||
exist.delete.deleted_at = DELETED_AT_2
|
||||
self.mox.ReplayAll()
|
||||
@ -459,9 +495,11 @@ class NovaVerifierVerifyForDeleteTestCase(StacktachBaseTestCase):
|
||||
with self.assertRaises(FieldMismatch) as fm:
|
||||
nova_verifier._verify_for_delete(exist)
|
||||
exception = fm.exception
|
||||
entity_1 = {'name': 'exists', 'value': DELETED_AT_1}
|
||||
entity_2 = {'name': 'deletes', 'value': DELETED_AT_2}
|
||||
self.assertEqual(exception.field_name, 'deleted_at')
|
||||
self.assertEqual(exception.expected, DELETED_AT_1)
|
||||
self.assertEqual(exception.actual, DELETED_AT_2)
|
||||
self.assertEqual(exception.entity_1, entity_1)
|
||||
self.assertEqual(exception.entity_2, entity_2)
|
||||
self.mox.VerifyAll()
|
||||
|
||||
|
||||
|
@ -1068,236 +1068,6 @@ class StackyServerTestCase(StacktachBaseTestCase):
|
||||
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_do_list_usage_launches(self):
|
||||
fake_request = self.mox.CreateMockAnything()
|
||||
fake_request.GET = {}
|
||||
results = self.mox.CreateMockAnything()
|
||||
models.InstanceUsage.objects.all().AndReturn(results)
|
||||
usage = self.mox.CreateMockAnything()
|
||||
usage.instance = INSTANCE_ID_1
|
||||
usage.launched_at = utils.decimal_utc()
|
||||
usage.instance_type_id = INSTANCE_TYPE_ID_1
|
||||
usage.instance_flavor_id = INSTANCE_FLAVOR_ID_1
|
||||
results[None:50].AndReturn(results)
|
||||
results.__iter__().AndReturn([usage].__iter__())
|
||||
self.mox.ReplayAll()
|
||||
|
||||
resp = stacky_server.do_list_usage_launches(fake_request)
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
resp_json = json.loads(resp.content)
|
||||
self.assertEqual(len(resp_json), 2)
|
||||
self.assertEqual(resp_json[0], ["UUID", "Launched At",
|
||||
"Instance Type Id",
|
||||
"Instance Flavor Id"])
|
||||
self.assertEqual(resp_json[1][0], INSTANCE_ID_1)
|
||||
time_str = dt.dt_from_decimal(usage.launched_at)
|
||||
self.assertEqual(resp_json[1][1], str(time_str))
|
||||
self.assertEqual(resp_json[1][2], INSTANCE_TYPE_ID_1)
|
||||
self.assertEqual(resp_json[1][3], INSTANCE_FLAVOR_ID_1)
|
||||
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_do_list_usage_launches_with_instance(self):
|
||||
fake_request = self.mox.CreateMockAnything()
|
||||
fake_request.GET = {'instance': INSTANCE_ID_1}
|
||||
results = self.mox.CreateMockAnything()
|
||||
models.InstanceUsage.objects.filter(instance=INSTANCE_ID_1)\
|
||||
.AndReturn(results)
|
||||
usage = self.mox.CreateMockAnything()
|
||||
usage.instance = INSTANCE_ID_1
|
||||
usage.launched_at = utils.decimal_utc()
|
||||
usage.instance_type_id = INSTANCE_TYPE_ID_1
|
||||
usage.instance_flavor_id = INSTANCE_FLAVOR_ID_1
|
||||
results[None:50].AndReturn(results)
|
||||
results.__iter__().AndReturn([usage].__iter__())
|
||||
self.mox.ReplayAll()
|
||||
|
||||
resp = stacky_server.do_list_usage_launches(fake_request)
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
resp_json = json.loads(resp.content)
|
||||
self.assertEqual(len(resp_json), 2)
|
||||
self.assertEqual(resp_json[0], ["UUID", "Launched At",
|
||||
"Instance Type Id",
|
||||
"Instance Flavor Id"])
|
||||
self.assertEqual(resp_json[1][0], INSTANCE_ID_1)
|
||||
time_str = dt.dt_from_decimal(usage.launched_at)
|
||||
self.assertEqual(resp_json[1][1], str(time_str))
|
||||
self.assertEqual(resp_json[1][2], INSTANCE_TYPE_ID_1)
|
||||
self.assertEqual(resp_json[1][3], INSTANCE_FLAVOR_ID_1)
|
||||
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_do_list_usage_launches_bad_instance(self):
|
||||
fake_request = self.mox.CreateMockAnything()
|
||||
fake_request.GET = {'instance': "obviouslybaduuid"}
|
||||
self.mox.ReplayAll()
|
||||
|
||||
resp = stacky_server.do_list_usage_launches(fake_request)
|
||||
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
resp_json = json.loads(resp.content)
|
||||
self.assertEqual(len(resp_json), 2)
|
||||
self.assertEqual(resp_json[0], ['Error', 'Message'])
|
||||
msg = 'obviouslybaduuid is not uuid-like'
|
||||
self.assertEqual(resp_json[1], ['Bad Request', msg])
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_do_list_usage_deletes(self):
|
||||
fake_request = self.mox.CreateMockAnything()
|
||||
fake_request.GET = {}
|
||||
results = self.mox.CreateMockAnything()
|
||||
models.InstanceDeletes.objects.all().AndReturn(results)
|
||||
usage = self.mox.CreateMockAnything()
|
||||
usage.instance = INSTANCE_ID_1
|
||||
usage.launched_at = utils.decimal_utc()
|
||||
usage.deleted_at = usage.launched_at + 10
|
||||
results[None:50].AndReturn(results)
|
||||
results.__iter__().AndReturn([usage].__iter__())
|
||||
self.mox.ReplayAll()
|
||||
|
||||
resp = stacky_server.do_list_usage_deletes(fake_request)
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
resp_json = json.loads(resp.content)
|
||||
self.assertEqual(len(resp_json), 2)
|
||||
self.assertEqual(resp_json[0], ["UUID", "Launched At",
|
||||
"Deleted At"])
|
||||
self.assertEqual(resp_json[1][0], INSTANCE_ID_1)
|
||||
launch_time_str = dt.dt_from_decimal(usage.launched_at)
|
||||
self.assertEqual(resp_json[1][1], str(launch_time_str))
|
||||
delete_time_str = dt.dt_from_decimal(usage.deleted_at)
|
||||
self.assertEqual(resp_json[1][2], str(delete_time_str))
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_do_list_usage_deletes_with_instance(self):
|
||||
fake_request = self.mox.CreateMockAnything()
|
||||
fake_request.GET = {'instance': INSTANCE_ID_1}
|
||||
results = self.mox.CreateMockAnything()
|
||||
models.InstanceDeletes.objects.filter(instance=INSTANCE_ID_1)\
|
||||
.AndReturn(results)
|
||||
usage = self.mox.CreateMockAnything()
|
||||
usage.instance = INSTANCE_ID_1
|
||||
usage.launched_at = utils.decimal_utc()
|
||||
usage.deleted_at = usage.launched_at + 10
|
||||
results[None:50].AndReturn(results)
|
||||
results.__iter__().AndReturn([usage].__iter__())
|
||||
self.mox.ReplayAll()
|
||||
|
||||
resp = stacky_server.do_list_usage_deletes(fake_request)
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
resp_json = json.loads(resp.content)
|
||||
self.assertEqual(len(resp_json), 2)
|
||||
self.assertEqual(resp_json[0], ["UUID", "Launched At",
|
||||
"Deleted At"])
|
||||
self.assertEqual(resp_json[1][0], INSTANCE_ID_1)
|
||||
launch_time_str = dt.dt_from_decimal(usage.launched_at)
|
||||
self.assertEqual(resp_json[1][1], str(launch_time_str))
|
||||
delete_time_str = dt.dt_from_decimal(usage.deleted_at)
|
||||
self.assertEqual(resp_json[1][2], str(delete_time_str))
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_do_list_usage_deletes_bad_instance(self):
|
||||
fake_request = self.mox.CreateMockAnything()
|
||||
fake_request.GET = {'instance': "obviouslybaduuid"}
|
||||
self.mox.ReplayAll()
|
||||
|
||||
resp = stacky_server.do_list_usage_deletes(fake_request)
|
||||
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
resp_json = json.loads(resp.content)
|
||||
self.assertEqual(len(resp_json), 2)
|
||||
self.assertEqual(resp_json[0], ['Error', 'Message'])
|
||||
msg = 'obviouslybaduuid is not uuid-like'
|
||||
self.assertEqual(resp_json[1], ['Bad Request', msg])
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_do_list_usage_exists(self):
|
||||
fake_request = self.mox.CreateMockAnything()
|
||||
fake_request.GET = {}
|
||||
results = self.mox.CreateMockAnything()
|
||||
models.InstanceExists.objects.all().AndReturn(results)
|
||||
usage = self.mox.CreateMockAnything()
|
||||
usage.instance = INSTANCE_ID_1
|
||||
usage.launched_at = utils.decimal_utc()
|
||||
usage.deleted_at = usage.launched_at + 10
|
||||
usage.instance_type_id = INSTANCE_TYPE_ID_1
|
||||
usage.instance_flavor_id = INSTANCE_FLAVOR_ID_1
|
||||
usage.message_id = 'someid'
|
||||
usage.status = 'pending'
|
||||
results[None:50].AndReturn(results)
|
||||
results.__iter__().AndReturn([usage].__iter__())
|
||||
self.mox.ReplayAll()
|
||||
|
||||
resp = stacky_server.do_list_usage_exists(fake_request)
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
resp_json = json.loads(resp.content)
|
||||
self.assertEqual(len(resp_json), 2)
|
||||
self.assertEqual(resp_json[0], ["UUID", "Launched At", "Deleted At",
|
||||
"Instance Type Id",
|
||||
"Instance Flavor Id", "Message ID",
|
||||
"Status"])
|
||||
self.assertEqual(resp_json[1][0], INSTANCE_ID_1)
|
||||
launch_time_str = dt.dt_from_decimal(usage.launched_at)
|
||||
self.assertEqual(resp_json[1][1], str(launch_time_str))
|
||||
delete_time_str = dt.dt_from_decimal(usage.deleted_at)
|
||||
self.assertEqual(resp_json[1][2], str(delete_time_str))
|
||||
self.assertEqual(resp_json[1][3], INSTANCE_TYPE_ID_1)
|
||||
self.assertEqual(resp_json[1][4], INSTANCE_FLAVOR_ID_1)
|
||||
self.assertEqual(resp_json[1][5], 'someid')
|
||||
self.assertEqual(resp_json[1][6], 'pending')
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_do_list_usage_exists_with_instance(self):
|
||||
fake_request = self.mox.CreateMockAnything()
|
||||
fake_request.GET = {'instance': INSTANCE_ID_1}
|
||||
results = self.mox.CreateMockAnything()
|
||||
models.InstanceExists.objects.filter(instance=INSTANCE_ID_1)\
|
||||
.AndReturn(results)
|
||||
usage = self.mox.CreateMockAnything()
|
||||
usage.instance = INSTANCE_ID_1
|
||||
usage.launched_at = utils.decimal_utc()
|
||||
usage.deleted_at = usage.launched_at + 10
|
||||
usage.instance_type_id = INSTANCE_TYPE_ID_1
|
||||
usage.instance_flavor_id = INSTANCE_FLAVOR_ID_1
|
||||
usage.message_id = 'someid'
|
||||
usage.status = 'pending'
|
||||
results[None:50].AndReturn(results)
|
||||
results.__iter__().AndReturn([usage].__iter__())
|
||||
self.mox.ReplayAll()
|
||||
|
||||
resp = stacky_server.do_list_usage_exists(fake_request)
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
resp_json = json.loads(resp.content)
|
||||
self.assertEqual(len(resp_json), 2)
|
||||
self.assertEqual(resp_json[0], ["UUID", "Launched At", "Deleted At",
|
||||
"Instance Type Id",
|
||||
"Instance Flavor Id", "Message ID",
|
||||
"Status"])
|
||||
self.assertEqual(resp_json[1][0], INSTANCE_ID_1)
|
||||
launch_time_str = dt.dt_from_decimal(usage.launched_at)
|
||||
self.assertEqual(resp_json[1][1], str(launch_time_str))
|
||||
delete_time_str = dt.dt_from_decimal(usage.deleted_at)
|
||||
self.assertEqual(resp_json[1][2], str(delete_time_str))
|
||||
self.assertEqual(resp_json[1][3], INSTANCE_TYPE_ID_1)
|
||||
self.assertEqual(resp_json[1][4], INSTANCE_FLAVOR_ID_1)
|
||||
self.assertEqual(resp_json[1][5], 'someid')
|
||||
self.assertEqual(resp_json[1][6], 'pending')
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_do_list_usage_exists_bad_instance(self):
|
||||
fake_request = self.mox.CreateMockAnything()
|
||||
fake_request.GET = {'instance': "obviouslybaduuid"}
|
||||
self.mox.ReplayAll()
|
||||
|
||||
resp = stacky_server.do_list_usage_exists(fake_request)
|
||||
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
resp_json = json.loads(resp.content)
|
||||
self.assertEqual(len(resp_json), 2)
|
||||
self.assertEqual(resp_json[0], ['Error', 'Message'])
|
||||
msg = 'obviouslybaduuid is not uuid-like'
|
||||
self.assertEqual(resp_json[1], ['Bad Request', msg])
|
||||
self.mox.VerifyAll()
|
||||
|
||||
def test_model_factory_for_nova(self):
|
||||
self.mox.UnsetStubs()
|
||||
nova_model = stacky_server._model_factory('nova')
|
||||
|
@ -1,7 +1,7 @@
|
||||
import datetime
|
||||
import mox
|
||||
from tests.unit import StacktachBaseTestCase
|
||||
from verifier import NotFound, AmbiguousResults, FieldMismatch, NullFieldException, WrongTypeException
|
||||
from tests.unit import StacktachBaseTestCase, utils
|
||||
from verifier import NotFound, AmbiguousResults, FieldMismatch
|
||||
from verifier import NullFieldException, WrongTypeException
|
||||
|
||||
|
||||
class VerificationExceptionTestCase(StacktachBaseTestCase):
|
||||
@ -25,36 +25,41 @@ class VerificationExceptionTestCase(StacktachBaseTestCase):
|
||||
"Ambiguous results for object_type using search_params")
|
||||
|
||||
def test_field_mismatch_exception(self):
|
||||
self.mox.StubOutWithMock(datetime, 'datetime')
|
||||
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
|
||||
self.mox.ReplayAll()
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-02 03:04:05')
|
||||
|
||||
exception = FieldMismatch('field_name', 'expected', 'actual', 'uuid')
|
||||
exception = FieldMismatch(
|
||||
'field_name',
|
||||
{'name': 'entity1', 'value': 'expected'},
|
||||
{'name': 'entity2', 'value': 'actual'},
|
||||
'uuid')
|
||||
|
||||
self.assertEqual(exception.reason,
|
||||
"Failed at 2014-01-02 03:04:05 UTC for uuid: Expected"
|
||||
" field_name to be 'expected' got 'actual'")
|
||||
self.assertEqual(
|
||||
exception.reason,
|
||||
"Failed at 2014-01-02 03:04:05 UTC for uuid: Data mismatch for "
|
||||
"'field_name' - 'entity1' contains 'expected' but 'entity2' "
|
||||
"contains 'actual'")
|
||||
|
||||
def test_null_field_exception(self):
|
||||
self.mox.StubOutWithMock(datetime, 'datetime')
|
||||
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
|
||||
self.mox.ReplayAll()
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-02 03:04:05')
|
||||
|
||||
exception = NullFieldException('field_name', '1234', 'uuid')
|
||||
exception = NullFieldException('field_name', 'exist_id', 'uuid')
|
||||
|
||||
self.assertEqual(exception.reason,
|
||||
"Failed at 2014-01-02 03:04:05 UTC for uuid: "
|
||||
"field_name field was null for exist id 1234")
|
||||
self.assertEqual(exception.field_name, 'field_name')
|
||||
self.assertEqual(
|
||||
exception.reason,
|
||||
"Failed at 2014-01-02 03:04:05 UTC for uuid: field_name field was "
|
||||
"null for exist id exist_id")
|
||||
|
||||
def test_wrong_type_exception(self):
|
||||
self.mox.StubOutWithMock(datetime, 'datetime')
|
||||
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
|
||||
self.mox.ReplayAll()
|
||||
|
||||
exception = WrongTypeException('field_name', 'value', '1234', 'uuid')
|
||||
|
||||
self.assertEqual(exception.reason,
|
||||
"Failed at 2014-01-02 03:04:05 UTC for uuid: "
|
||||
"{field_name: value} was of incorrect type for"
|
||||
" exist id 1234")
|
||||
utils.mock_datetime_utcnow(self.mox, '2014-01-02 03:04:05')
|
||||
|
||||
exception = WrongTypeException(
|
||||
'field_name', 'value', 'exist_id', 'uuid')
|
||||
self.assertEqual(exception.field_name, 'field_name')
|
||||
self.assertEqual(exception.value, 'value')
|
||||
self.assertEqual(exception.exist_id, 'exist_id')
|
||||
self.assertEqual(exception.uuid, 'uuid')
|
||||
self.assertEqual(
|
||||
exception.reason,
|
||||
"Failed at 2014-01-02 03:04:05 UTC for uuid: {field_name: value} "
|
||||
"was of incorrect type for exist id exist_id")
|
||||
|
@ -71,6 +71,9 @@ SIZE_2 = 4567
|
||||
CREATED_AT_1 = decimal.Decimal("10.1")
|
||||
CREATED_AT_2 = decimal.Decimal("11.1")
|
||||
|
||||
IMAGE_OWNER_1 = "owner_1"
|
||||
IMAGE_OWNER_2 = "owner_2"
|
||||
|
||||
TIMESTAMP_1 = "2013-06-20 17:31:57.939614"
|
||||
SETTLE_TIME = 5
|
||||
SETTLE_UNITS = "minutes"
|
||||
@ -198,3 +201,10 @@ def make_verifier_config(notifs):
|
||||
GLANCE_VERIFIER_EVENT_TYPE,
|
||||
FLAVOR_FIELD_NAME)
|
||||
return config
|
||||
|
||||
|
||||
def mock_datetime_utcnow(mox, time):
|
||||
mox.StubOutWithMock(datetime, 'datetime')
|
||||
datetime.datetime.utcnow().AndReturn(time)
|
||||
mox.ReplayAll()
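
A quick sketch of how this helper is typically used by the test cases above; the test class and assertion here are illustrative, not part of the change:

import mox

from tests.unit import StacktachBaseTestCase, utils
from verifier import NullFieldException


class ExampleTimestampTestCase(StacktachBaseTestCase):
    def setUp(self):
        self.mox = mox.Mox()

    def test_reason_uses_frozen_time(self):
        # Freeze datetime.datetime.utcnow() so the reason string is stable.
        utils.mock_datetime_utcnow(self.mox, '2014-01-02 03:04:05')
        exception = NullFieldException('owner', 23, 'some-uuid')
        self.assertIn('Failed at 2014-01-02 03:04:05 UTC', exception.reason)
        self.mox.VerifyAll()
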
|
||||
|
||||
|
17 tox.ini Normal file

@ -0,0 +1,17 @@
[tox]
envlist = py26,py27,pep8

[testenv]
deps = -r{toxinidir}/etc/test-requires.txt
       -r{toxinidir}/etc/pip-requires.txt

setenv = VIRTUAL_ENV={envdir}

commands =
    nosetests tests --exclude-dir=stacktach --with-coverage --cover-package=stacktach,worker,verifier --cover-erase

sitepackages = False

[testenv:pep8]
commands =
    true

@ -43,16 +43,22 @@ class AmbiguousResults(VerificationException):
|
||||
|
||||
|
||||
class FieldMismatch(VerificationException):
|
||||
def __init__(self, field_name, expected, actual, uuid):
|
||||
def __init__(self, field_name, entity_1, entity_2, uuid):
|
||||
#instance fields for testing ease
|
||||
self.field_name = field_name
|
||||
self.expected = expected
|
||||
self.actual = actual
|
||||
self.entity_1 = entity_1
|
||||
self.entity_2 = entity_2
|
||||
self.uuid = uuid
|
||||
|
||||
self.reason = \
|
||||
"Failed at {failed_at} UTC for {uuid}: Expected {field_name} " \
|
||||
"to be '{expected}' got '{actual}'".\
|
||||
format(failed_at=datetime.datetime.utcnow(), uuid=uuid,
|
||||
field_name=field_name, expected=expected,
|
||||
actual=actual)
|
||||
"Failed at {failed_at} UTC for {uuid}: Data mismatch for " \
|
||||
"'{field_name}' - '{name_1}' contains '{value_1}' but '{name_2}' " \
|
||||
"contains '{value_2}'".\
|
||||
format(failed_at=datetime.datetime.utcnow(), uuid=self.uuid,
|
||||
field_name=self.field_name, name_1=entity_1['name'],
|
||||
value_1=self.entity_1['value'],
|
||||
name_2=self.entity_2['name'],
|
||||
value_2=self.entity_2['value'])
|
||||
|
||||
|
||||
class NullFieldException(VerificationException):
|
||||
@ -67,11 +73,16 @@ class NullFieldException(VerificationException):
|
||||
|
||||
class WrongTypeException(VerificationException):
|
||||
def __init__(self, field_name, value, exist_id, uuid):
|
||||
#made instance fields to ease testing
|
||||
self.field_name = field_name
|
||||
self.value = value
|
||||
self.exist_id = exist_id
|
||||
self.uuid = uuid
|
||||
|
||||
self.reason = \
|
||||
"Failed at {failed_at} UTC for {uuid}: " \
|
||||
"{{{field_name}: {value}}} was of incorrect type for " \
|
||||
"exist id {exist_id}".format(
|
||||
failed_at=datetime.datetime.utcnow(), uuid=uuid,
|
||||
field_name=field_name, value=value, exist_id=exist_id)
|
||||
|
||||
failed_at=datetime.datetime.utcnow(), uuid=self.uuid,
|
||||
field_name=self.field_name, value=self.value,
|
||||
exist_id=self.exist_id)
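
Tying the refactor together, a short sketch of the new FieldMismatch signature and the reason string it now produces; the field and values are illustrative:

from verifier import FieldMismatch

mismatch = FieldMismatch(
    'launched_at',
    {'name': 'exists', 'value': '2014-01-01 01:02:03'},
    {'name': 'launches', 'value': '2014-01-01 09:08:07'},
    '08f685d9-6352-4dbc-8271-96cc54bf14cd')

# mismatch.reason now reads roughly:
#   Failed at <utcnow> UTC for 08f685d9-...: Data mismatch for
#   'launched_at' - 'exists' contains '2014-01-01 01:02:03' but
#   'launches' contains '2014-01-01 09:08:07'
print(mismatch.reason)
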
|
||||
|
@ -5,9 +5,9 @@
|
||||
# to you under the Apache License, Version 2.0 (the
|
||||
# "License"); you may not use this file except in compliance
|
||||
# with the License. You may obtain a copy of the License at
|
||||
#
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing,
|
||||
# software distributed under the License is distributed on an
|
||||
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
@ -21,7 +21,7 @@ config_filename = os.environ.get('STACKTACH_VERIFIER_CONFIG',
|
||||
'stacktach_verifier_config.json')
|
||||
try:
|
||||
from local_settings import *
|
||||
config_filename = STACKTACH_VERIFIER_CONFIG
|
||||
config_filename = config_filename
|
||||
except ImportError:
|
||||
pass
|
||||
|
||||
|
@ -48,14 +48,25 @@ def _get_child_logger():
|
||||
def _verify_field_mismatch(exists, usage):
|
||||
if not base_verifier._verify_date_field(
|
||||
usage.created_at, exists.created_at, same_second=True):
|
||||
raise FieldMismatch('created_at', exists.created_at, usage.created_at,
|
||||
exists.uuid)
|
||||
raise FieldMismatch(
|
||||
'created_at',
|
||||
{'name': 'exists', 'value': exists.created_at},
|
||||
{'name': 'launches', 'value': usage.created_at},
|
||||
exists.uuid)
|
||||
|
||||
if usage.owner != exists.owner:
|
||||
raise FieldMismatch('owner', exists.owner, usage.owner, exists.uuid)
|
||||
raise FieldMismatch(
|
||||
'owner',
|
||||
{'name': 'exists', 'value': exists.owner},
|
||||
{'name': 'launches', 'value': usage.owner},
|
||||
exists.uuid)
|
||||
|
||||
if usage.size != exists.size:
|
||||
raise FieldMismatch('size', exists.size, usage.size, exists.uuid)
|
||||
raise FieldMismatch(
|
||||
'size',
|
||||
{'name': 'exists', 'value': exists.size},
|
||||
{'name': 'launches', 'value': usage.size},
|
||||
exists.uuid)
|
||||
|
||||
|
||||
def _verify_validity(exist):
|
||||
@ -119,8 +130,11 @@ def _verify_for_delete(exist, delete=None):
|
||||
if delete:
|
||||
if not base_verifier._verify_date_field(
|
||||
delete.deleted_at, exist.deleted_at, same_second=True):
|
||||
raise FieldMismatch('deleted_at', exist.deleted_at,
|
||||
delete.deleted_at, exist.uuid)
|
||||
raise FieldMismatch(
|
||||
'deleted_at',
|
||||
{'name': 'exists', 'value': exist.deleted_at},
|
||||
{'name': 'deletes', 'value': delete.deleted_at},
|
||||
exist.uuid)
|
||||
|
||||
|
||||
def _verify(exists):
|
||||
|
@ -49,35 +49,54 @@ def _verify_field_mismatch(exists, launch):
|
||||
flavor_field_name = config.flavor_field_name()
|
||||
if not base_verifier._verify_date_field(
|
||||
launch.launched_at, exists.launched_at, same_second=True):
|
||||
raise FieldMismatch('launched_at', exists.launched_at,
|
||||
launch.launched_at, exists.instance)
|
||||
raise FieldMismatch(
|
||||
'launched_at',
|
||||
{'name': 'exists', 'value': exists.launched_at},
|
||||
{'name': 'launches', 'value': launch.launched_at},
|
||||
exists.instance)
|
||||
|
||||
if getattr(launch, flavor_field_name) != \
|
||||
getattr(exists, flavor_field_name):
|
||||
raise FieldMismatch(flavor_field_name,
|
||||
getattr(exists, flavor_field_name),
|
||||
getattr(launch, flavor_field_name),
|
||||
exists.instance)
|
||||
raise FieldMismatch(
|
||||
flavor_field_name,
|
||||
{'name': 'exists', 'value': getattr(exists, flavor_field_name)},
|
||||
{'name': 'launches', 'value': getattr(launch, flavor_field_name)},
|
||||
exists.instance)
|
||||
|
||||
if launch.tenant != exists.tenant:
|
||||
raise FieldMismatch('tenant', exists.tenant, launch.tenant,
|
||||
exists.instance)
|
||||
raise FieldMismatch(
|
||||
'tenant',
|
||||
{'name': 'exists', 'value': exists.tenant},
|
||||
{'name': 'launches', 'value': launch.tenant},
|
||||
exists.instance)
|
||||
|
||||
if launch.rax_options != exists.rax_options:
|
||||
raise FieldMismatch('rax_options', exists.rax_options,
|
||||
launch.rax_options, exists.instance)
|
||||
raise FieldMismatch(
|
||||
'rax_options',
|
||||
{'name': 'exists', 'value': exists.rax_options},
|
||||
{'name': 'launches', 'value': launch.rax_options},
|
||||
exists.instance)
|
||||
|
||||
if launch.os_architecture != exists.os_architecture:
|
||||
raise FieldMismatch('os_architecture', exists.os_architecture,
|
||||
launch.os_architecture, exists.instance)
|
||||
raise FieldMismatch(
|
||||
'os_architecture',
|
||||
{'name': 'exists', 'value': exists.os_architecture},
|
||||
{'name': 'launches', 'value': launch.os_architecture},
|
||||
exists.instance)
|
||||
|
||||
if launch.os_version != exists.os_version:
|
||||
raise FieldMismatch('os_version', exists.os_version,
|
||||
launch.os_version, exists.instance)
|
||||
raise FieldMismatch(
|
||||
'os_version',
|
||||
{'name': 'exists', 'value': exists.os_version},
|
||||
{'name': 'launches', 'value': launch.os_version},
|
||||
exists.instance)
|
||||
|
||||
if launch.os_distro != exists.os_distro:
|
||||
raise FieldMismatch('os_distro', exists.os_distro,
|
||||
launch.os_distro, exists.instance)
|
||||
raise FieldMismatch(
|
||||
'os_distro',
|
||||
{'name': 'exists', 'value': exists.os_distro},
|
||||
{'name': 'launches', 'value': launch.os_distro},
|
||||
exists.instance)
|
||||
|
||||
|
||||
def _verify_for_launch(exist, launch=None,
|
||||
@ -143,13 +162,19 @@ def _verify_for_delete(exist, delete=None,
|
||||
if delete:
|
||||
if not base_verifier._verify_date_field(
|
||||
delete.launched_at, exist.launched_at, same_second=True):
|
||||
raise FieldMismatch('launched_at', exist.launched_at,
|
||||
delete.launched_at, exist.instance)
|
||||
raise FieldMismatch(
|
||||
'launched_at',
|
||||
{'name': 'exists', 'value': exist.launched_at},
|
||||
{'name': 'deletes', 'value': delete.launched_at},
|
||||
exist.instance)
|
||||
|
||||
if not base_verifier._verify_date_field(
|
||||
delete.deleted_at, exist.deleted_at, same_second=True):
|
||||
raise FieldMismatch('deleted_at', exist.deleted_at,
|
||||
delete.deleted_at, exist.instance)
|
||||
raise FieldMismatch(
|
||||
'deleted_at',
|
||||
{'name': 'exists', 'value': exist.deleted_at},
|
||||
{'name': 'deletes', 'value': delete.deleted_at},
|
||||
exist.instance)
|
||||
|
||||
|
||||
def _verify_basic_validity(exist):
|
||||
|
@ -1,3 +1,13 @@
|
||||
#!/bin/sh
|
||||
### BEGIN INIT INFO
|
||||
# Provides: verifier
|
||||
# Required-Start:
|
||||
# Required-Stop:
|
||||
# Default-Start: 2 3 4 5
|
||||
# Default-Stop: 0 1 6
|
||||
# Short-Description: Start/stop stacktach verifier
|
||||
### END INIT INFO
|
||||
|
||||
# Licensed to the Apache Software Foundation (ASF) under one
|
||||
# or more contributor license agreements. See the NOTICE file
|
||||
# distributed with this work for additional information
|
||||
@ -15,17 +25,6 @@
|
||||
# specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
|
||||
#!/bin/sh
|
||||
### BEGIN INIT INFO
|
||||
# Provides: verifier
|
||||
# Required-Start:
|
||||
# Required-Stop:
|
||||
# Default-Start: 2 3 4 5
|
||||
# Default-Stop: 0 1 6
|
||||
# Short-Description: Start/stop stacktach verifier
|
||||
### END INIT INFO
|
||||
|
||||
. /lib/lsb/init-functions
|
||||
|
||||
WORKDIR=/srv/www/stacktach/app
|
||||
|
@ -50,7 +50,9 @@ def _get_child_logger():
|
||||
|
||||
class Consumer(kombu.mixins.ConsumerMixin):
|
||||
def __init__(self, name, connection, deployment, durable, queue_arguments,
|
||||
exchange, topics):
|
||||
exchange, topics, connect_max_retries=10):
|
||||
self.connect_max_retries = connect_max_retries
|
||||
self.retry_attempts = 0
|
||||
self.connection = connection
|
||||
self.deployment = deployment
|
||||
self.durable = durable
|
||||
@ -144,11 +146,18 @@ class Consumer(kombu.mixins.ConsumerMixin):
|
||||
shutdown_soon = True
|
||||
|
||||
def on_connection_revived(self):
|
||||
self.retry_attempts = 0
|
||||
_get_child_logger().debug("The connection to RabbitMQ was revived.")
|
||||
|
||||
def on_connection_error(self, exc, interval):
|
||||
_get_child_logger().error("RabbitMQ Broker connection error: %r. "
|
||||
"Trying again in %s seconds.", exc, interval)
|
||||
self.retry_attempts += 1
|
||||
msg = ("RabbitMQ Broker connection error: %r. "
|
||||
"Trying again in %s seconds.", exc, interval)
|
||||
if self.retry_attempts >= self.connect_max_retries:
|
||||
# If we're on the last retry
|
||||
_get_child_logger().error(*msg)
|
||||
else:
|
||||
_get_child_logger().warn(*msg)
|
||||
|
||||
def on_decode_error(self, message, exc):
|
||||
_get_child_logger().exception("Decode Error: %s" % exc)
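
The behavioural change in on_connection_error above is that the message stays the same at every attempt, but only the final allowed retry is logged at ERROR level. A standalone sketch of that decision, with the logger name and wrapper function invented for illustration:

import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger("worker")


def log_connection_error(exc, interval, retry_attempts, connect_max_retries=10):
    # Same message every time; escalate to ERROR only once the retry
    # budget is exhausted, otherwise warn.
    msg = ("RabbitMQ Broker connection error: %r. "
           "Trying again in %s seconds.", exc, interval)
    if retry_attempts >= connect_max_retries:
        logger.error(*msg)
    else:
        logger.warning(*msg)


# Illustrative call: the 10th attempt escalates to ERROR.
log_connection_error(IOError("connection refused"), 5, retry_attempts=10)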