Retire repository
Fuel (from openstack namespace) and fuel-ccp (in x namespace) repositories are
unused and ready to retire. This change removes all content from the repository
and adds the usual README file to point out that the repository is retired,
following the process from
https://docs.openstack.org/infra/manual/drivers.html#retiring-a-project

See also http://lists.openstack.org/pipermail/openstack-discuss/2019-December/011647.html

Depends-On: https://review.opendev.org/699362
Change-Id: I9c48f2cfa39e04a629e20cb320caca8fd69758fd
This commit is contained in:
parent dd3d11e643
commit 942c19a6a8
@@ -1,7 +0,0 @@
[run]
branch = True
source = tuning_box
omit = tuning_box/openstack/*

[report]
ignore_errors = True
.gitignore (vendored): 57 lines
@@ -1,57 +0,0 @@
*.py[cod]

# C extensions
*.so

# Packages
*.egg
*.egg-info
dist
build
.eggs
eggs
parts
bin
var
sdist
develop-eggs
.installed.cfg
lib
lib64

# Installer logs
pip-log.txt

# Unit test / coverage reports
.coverage
.tox
nosetests.xml
.testrepository
.venv
cover
.cache
testdb

# Translations
*.mo

# Mr Developer
.mr.developer.cfg
.project
.pydevproject

# Complexity
output/*.html
output/*/index.html

# Sphinx
doc/build

# pbr generates these
AUTHORS
ChangeLog

# Editors
*~
.*.swp
.*sw?
.mailmap: 3 lines
@@ -1,3 +0,0 @@
# Format is:
# <preferred e-mail> <other e-mail 1>
# <preferred e-mail> <other e-mail 2>
@@ -1,9 +0,0 @@
[DEFAULT]
test_command=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \
             OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \
             OS_LOG_CAPTURE=${OS_LOG_CAPTURE:-1} \
             OS_DEBUG=${OS_DEBUG:-1} \
             OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-60} \
             ${PYTHON:-python} -m subunit.run discover -t ./ . $LISTOPT $IDOPTION
test_id_option=--load-list $IDFILE
test_list_option=--list
@@ -1,17 +0,0 @@
If you would like to contribute to the development of OpenStack, you must
follow the steps in this page:

   http://docs.openstack.org/infra/manual/developers.html

If you already have a good understanding of how the system works and your
OpenStack accounts are set up, you can skip to the development workflow
section of this documentation to learn how changes to OpenStack should be
submitted for review via the Gerrit tool:

   http://docs.openstack.org/infra/manual/developers.html#development-workflow

Pull requests submitted through GitHub will be ignored.

Bugs should be filed on Launchpad, not GitHub:

   https://bugs.launchpad.net/tuning_box
@@ -1,4 +0,0 @@
tuning_box Style Commandments
===============================================

Read the OpenStack Style Commandments http://docs.openstack.org/developer/hacking/
LICENSE: 175 lines
@ -1,175 +0,0 @@
|
||||
|
||||
Apache License
|
||||
Version 2.0, January 2004
|
||||
http://www.apache.org/licenses/
|
||||
|
||||
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
|
||||
|
||||
1. Definitions.
|
||||
|
||||
"License" shall mean the terms and conditions for use, reproduction,
|
||||
and distribution as defined by Sections 1 through 9 of this document.
|
||||
|
||||
"Licensor" shall mean the copyright owner or entity authorized by
|
||||
the copyright owner that is granting the License.
|
||||
|
||||
"Legal Entity" shall mean the union of the acting entity and all
|
||||
other entities that control, are controlled by, or are under common
|
||||
control with that entity. For the purposes of this definition,
|
||||
"control" means (i) the power, direct or indirect, to cause the
|
||||
direction or management of such entity, whether by contract or
|
||||
otherwise, or (ii) ownership of fifty percent (50%) or more of the
|
||||
outstanding shares, or (iii) beneficial ownership of such entity.
|
||||
|
||||
"You" (or "Your") shall mean an individual or Legal Entity
|
||||
exercising permissions granted by this License.
|
||||
|
||||
"Source" form shall mean the preferred form for making modifications,
|
||||
including but not limited to software source code, documentation
|
||||
source, and configuration files.
|
||||
|
||||
"Object" form shall mean any form resulting from mechanical
|
||||
transformation or translation of a Source form, including but
|
||||
not limited to compiled object code, generated documentation,
|
||||
and conversions to other media types.
|
||||
|
||||
"Work" shall mean the work of authorship, whether in Source or
|
||||
Object form, made available under the License, as indicated by a
|
||||
copyright notice that is included in or attached to the work
|
||||
(an example is provided in the Appendix below).
|
||||
|
||||
"Derivative Works" shall mean any work, whether in Source or Object
|
||||
form, that is based on (or derived from) the Work and for which the
|
||||
editorial revisions, annotations, elaborations, or other modifications
|
||||
represent, as a whole, an original work of authorship. For the purposes
|
||||
of this License, Derivative Works shall not include works that remain
|
||||
separable from, or merely link (or bind by name) to the interfaces of,
|
||||
the Work and Derivative Works thereof.
|
||||
|
||||
"Contribution" shall mean any work of authorship, including
|
||||
the original version of the Work and any modifications or additions
|
||||
to that Work or Derivative Works thereof, that is intentionally
|
||||
submitted to Licensor for inclusion in the Work by the copyright owner
|
||||
or by an individual or Legal Entity authorized to submit on behalf of
|
||||
the copyright owner. For the purposes of this definition, "submitted"
|
||||
means any form of electronic, verbal, or written communication sent
|
||||
to the Licensor or its representatives, including but not limited to
|
||||
communication on electronic mailing lists, source code control systems,
|
||||
and issue tracking systems that are managed by, or on behalf of, the
|
||||
Licensor for the purpose of discussing and improving the Work, but
|
||||
excluding communication that is conspicuously marked or otherwise
|
||||
designated in writing by the copyright owner as "Not a Contribution."
|
||||
|
||||
"Contributor" shall mean Licensor and any individual or Legal Entity
|
||||
on behalf of whom a Contribution has been received by Licensor and
|
||||
subsequently incorporated within the Work.
|
||||
|
||||
2. Grant of Copyright License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
copyright license to reproduce, prepare Derivative Works of,
|
||||
publicly display, publicly perform, sublicense, and distribute the
|
||||
Work and such Derivative Works in Source or Object form.
|
||||
|
||||
3. Grant of Patent License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
(except as stated in this section) patent license to make, have made,
|
||||
use, offer to sell, sell, import, and otherwise transfer the Work,
|
||||
where such license applies only to those patent claims licensable
|
||||
by such Contributor that are necessarily infringed by their
|
||||
Contribution(s) alone or by combination of their Contribution(s)
|
||||
with the Work to which such Contribution(s) was submitted. If You
|
||||
institute patent litigation against any entity (including a
|
||||
cross-claim or counterclaim in a lawsuit) alleging that the Work
|
||||
or a Contribution incorporated within the Work constitutes direct
|
||||
or contributory patent infringement, then any patent licenses
|
||||
granted to You under this License for that Work shall terminate
|
||||
as of the date such litigation is filed.
|
||||
|
||||
4. Redistribution. You may reproduce and distribute copies of the
|
||||
Work or Derivative Works thereof in any medium, with or without
|
||||
modifications, and in Source or Object form, provided that You
|
||||
meet the following conditions:
|
||||
|
||||
(a) You must give any other recipients of the Work or
|
||||
Derivative Works a copy of this License; and
|
||||
|
||||
(b) You must cause any modified files to carry prominent notices
|
||||
stating that You changed the files; and
|
||||
|
||||
(c) You must retain, in the Source form of any Derivative Works
|
||||
that You distribute, all copyright, patent, trademark, and
|
||||
attribution notices from the Source form of the Work,
|
||||
excluding those notices that do not pertain to any part of
|
||||
the Derivative Works; and
|
||||
|
||||
(d) If the Work includes a "NOTICE" text file as part of its
|
||||
distribution, then any Derivative Works that You distribute must
|
||||
include a readable copy of the attribution notices contained
|
||||
within such NOTICE file, excluding those notices that do not
|
||||
pertain to any part of the Derivative Works, in at least one
|
||||
of the following places: within a NOTICE text file distributed
|
||||
as part of the Derivative Works; within the Source form or
|
||||
documentation, if provided along with the Derivative Works; or,
|
||||
within a display generated by the Derivative Works, if and
|
||||
wherever such third-party notices normally appear. The contents
|
||||
of the NOTICE file are for informational purposes only and
|
||||
do not modify the License. You may add Your own attribution
|
||||
notices within Derivative Works that You distribute, alongside
|
||||
or as an addendum to the NOTICE text from the Work, provided
|
||||
that such additional attribution notices cannot be construed
|
||||
as modifying the License.
|
||||
|
||||
You may add Your own copyright statement to Your modifications and
|
||||
may provide additional or different license terms and conditions
|
||||
for use, reproduction, or distribution of Your modifications, or
|
||||
for any such Derivative Works as a whole, provided Your use,
|
||||
reproduction, and distribution of the Work otherwise complies with
|
||||
the conditions stated in this License.
|
||||
|
||||
5. Submission of Contributions. Unless You explicitly state otherwise,
|
||||
any Contribution intentionally submitted for inclusion in the Work
|
||||
by You to the Licensor shall be under the terms and conditions of
|
||||
this License, without any additional terms or conditions.
|
||||
Notwithstanding the above, nothing herein shall supersede or modify
|
||||
the terms of any separate license agreement you may have executed
|
||||
with Licensor regarding such Contributions.
|
||||
|
||||
6. Trademarks. This License does not grant permission to use the trade
|
||||
names, trademarks, service marks, or product names of the Licensor,
|
||||
except as required for reasonable and customary use in describing the
|
||||
origin of the Work and reproducing the content of the NOTICE file.
|
||||
|
||||
7. Disclaimer of Warranty. Unless required by applicable law or
|
||||
agreed to in writing, Licensor provides the Work (and each
|
||||
Contributor provides its Contributions) on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
implied, including, without limitation, any warranties or conditions
|
||||
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
|
||||
PARTICULAR PURPOSE. You are solely responsible for determining the
|
||||
appropriateness of using or redistributing the Work and assume any
|
||||
risks associated with Your exercise of permissions under this License.
|
||||
|
||||
8. Limitation of Liability. In no event and under no legal theory,
|
||||
whether in tort (including negligence), contract, or otherwise,
|
||||
unless required by applicable law (such as deliberate and grossly
|
||||
negligent acts) or agreed to in writing, shall any Contributor be
|
||||
liable to You for damages, including any direct, indirect, special,
|
||||
incidental, or consequential damages of any character arising as a
|
||||
result of this License or out of the use or inability to use the
|
||||
Work (including but not limited to damages for loss of goodwill,
|
||||
work stoppage, computer failure or malfunction, or any and all
|
||||
other commercial damages or losses), even if such Contributor
|
||||
has been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability. While redistributing
|
||||
the Work or Derivative Works thereof, You may choose to offer,
|
||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||
or other liability obligations and/or rights consistent with this
|
||||
License. However, in accepting such obligations, You may act only
|
||||
on Your own behalf and on Your sole responsibility, not on behalf
|
||||
of any other Contributor, and only if You agree to indemnify,
|
||||
defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason
|
||||
of your accepting any such warranty or additional liability.
|
MAINTAINERS: 32 lines
@@ -1,32 +0,0 @@
---
description:
  For Fuel team structure and contribution policy, see [1].

  This is the repository-level MAINTAINERS file. All contributions to this
  repository must be approved by one or more Core Reviewers [2].
  If you are contributing to files (or creating new directories) in the
  root folder of this repository, please contact the Core Reviewers for
  review and merge requests.

  If you are contributing to subfolders of this repository, please
  check the 'maintainers' section of this file in order to find maintainers
  for those specific modules.

  It is mandatory to get +1 from one or more maintainers before asking
  Core Reviewers for review/merge in order to decrease the load on Core Reviewers [3].
  Exceptions are when maintainers are actually cores, or when maintainers
  are not available for some reason (e.g. on vacation).

  [1] https://specs.openstack.org/openstack/fuel-specs/policy/team-structure
  [2] https://review.openstack.org/#/admin/groups/1325,members
  [3] http://lists.openstack.org/pipermail/openstack-dev/2015-August/072406.html

  Please keep this file in YAML format so that helper scripts
  can read it as configuration data.

maintainers:

- ./:
  - name: Aleksandr Kislitsky
    email: akislitsky@mirantis.com
    IRC: akislitsky
@@ -1,7 +0,0 @@
include AUTHORS
include ChangeLog
exclude .gitignore
exclude .gitreview
recursive-include tuning_box/migrations *.py

global-exclude *.pyc
README.rst: 298 lines
@@ -1,292 +1,10 @@
==========
Tuning Box
==========

This project is no longer maintained.

Tuning Box is a configuration storage for your clouds.

The contents of this repository are still available in the Git
source code management system. To see the contents of this
repository before it reached its end of life, please check out the
previous commit with "git checkout HEAD^1".

Tuning Box can be used as a centralized storage for all configurations. It
supports Keystone authentication. By default, Tuning Box installs as a Fuel
extension, but it can also be run as a standalone service.

* Free software: `Apache license`_
* Source_
* Bugs_

.. _Source: https://github.com/openstack/tuning-box
.. _Bugs: https://bugs.launchpad.net/fuel/+bugs?field.searchtext=&orderby=-importance&search=Search&field.tag=area-configdb+
.. _Apache license: https://www.apache.org/licenses/LICENSE-2.0

Features
--------

ConfigDB entities:

- Environment
- Component
- Hierarchy level
- Resource definition
- Resource value
- Resource value override

Installation
------------

#. Download the Tuning Box RPM package or source code to the Fuel Master node.
   The package can be built from the source code using::

      $ python setup.py bdist_rpm

#. Tuning Box installs as a Fuel Nailgun extension. Therefore, perform the
   DB migration and restart the Nailgun service::

      $ nailgun_syncdb
      $ systemctl restart nailgun.service

#. Configure the Tuning Box keystone service::

      $ export OS_USERNAME=admin OS_PASSWORD=admin OS_PROJECT_NAME=admin OS_AUTH_URL=http://10.20.0.2:5000
      $ openstack service create --name tuning-box config
      $ openstack endpoint create --publicurl http://10.20.0.2:8000/api/config --region RegionOne tuning-box

Now you have enabled a set of ``config`` commands in the ``fuel2`` CLI.

Command groups for the fuel2 CLI
--------------------------------

The ``fuel2`` CLI command groups are the following:

- ``config comp`` - CRUD operations for components
- ``config def`` - CRUD operations for resource definitions
- ``config env`` - CRUD operations for environments
- ``config get``, ``config set``, ``config del`` - CRUD operations for
  resource values
- ``config override``, ``config rm override`` - operations for resource value
  overrides

API
---

Authentication is required for all operations. The auth token should be passed
in the ``X-Auth-Token`` HTTP header. Tuning Box is installed as a Fuel Nailgun
extension, so the base API URL is ``http://MASTER_NODE_IP:8000/api/v1/config``.
All operation URLs below should be appended to this base API URL.
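A minimal sketch of an authenticated request with the ``requests`` library
(the token and master node address below are placeholders, not real values):

.. code-block:: python

    import requests

    base_url = 'http://10.20.0.2:8000/api/v1/config'
    headers = {'X-Auth-Token': 'TOKEN_FROM_KEYSTONE'}  # e.g. from `openstack token issue`

    # List all environments known to the configuration storage.
    resp = requests.get(base_url + '/environments', headers=headers)
    resp.raise_for_status()
    print(resp.json())
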
Environments operations
=======================

URL: ``/environments``
Operations:

- (GET) list environments
- (POST) create environment

For environment creation, POST:

.. code-block:: python

    {
        'hierarchy_levels': [
            # list of hierarchy levels
        ],
        'components': [
            # list of component ids
        ]
    }

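A concrete payload of this shape (it mirrors the sample ``environment.json``
shipped elsewhere in this repository):

.. code-block:: python

    {
        "name": "env1",
        "components": [1],
        "hierarchy_levels": ["node"]
    }
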
Environment operations
======================

URL: ``/environments/<int:environment_id>``
Operations:

- (GET) get environment
- (PUT/PATCH) update environment
- (DELETE) delete environment

Components operations
=====================

URL: ``/components``
Operations:

- (GET) list components
- (POST) create component

For component creation, POST:

.. code-block:: python

    {
        'name': str,
        'resource_definitions': [
            {
                'name': str, 'content': str
            }
        ]
    }

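A concrete payload of this shape (it mirrors the sample ``component.json``
shipped elsewhere in this repository):

.. code-block:: python

    {
        "name": "comp1",
        "resource_definitions": [
            {"name": "resource1", "content": {}},
            {"name": "slashed/resource", "content": {}}
        ]
    }
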
Component operations
====================

URL: ``/components/<int:component_id>``
Operations:

- (GET) get component
- (PUT/PATCH) update component
- (DELETE) delete component

Hierarchy levels operations
===========================

URL: ``/environments/<int:environment_id>/hierarchy_levels``
Operations:

- (GET) list environment hierarchy levels

Hierarchy level modifications are performed via environment
modifications.

Hierarchy level operations
==========================

URL: ``/environments/<int:component_id>/<string:level>``
Operations:

- (GET) get hierarchy level

.. _`keys operations`:

Keys operations
===============

To perform a keys operation, send a PATCH request to the appropriate URL. As
data, use a list of keys written in the order of access. For instance, suppose
you have the following data:

.. code-block:: python

    {
        'k0': {
            'k1': 'val01',
            'k2': 'val02',
            'k3': [{'k4': 'val030'}]
        }
    }

To access 'val02', the key path will be: ['k0', 'k2']
If you want to modify a value, append the new value to the key path. For
instance, if you want to change 'val02' to 'val02_new', the key path will be:
['k0', 'k2', 'val02_new']

If you want to delete the 'k4' key, use the key path ['k0', 'k3', 0, 'k4']

Key operations work only in batch mode, so you should pass a list of key paths
to the appropriate API handler::

    [['k0', 'k1', 'val01_new'], ['k0', 'k2', 'val02_new']]

To add a new key 'new_k' to the data, you should send the following key paths::

    [['new_k', 'new_val']]

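A short sketch of sending such a batch with the ``requests`` library (the URL
is a placeholder for one of the keys-operation endpoints described below, and
the token is illustrative):

.. code-block:: python

    import requests

    # Each entry is a key path; for updates the last element is the new value.
    key_paths = [
        ['k0', 'k1', 'val01_new'],
        ['k0', 'k2', 'val02_new'],
    ]
    keys_url = 'http://10.20.0.2:8000/api/v1/config/...'  # one of the keys URLs below
    resp = requests.patch(keys_url, json=key_paths,
                          headers={'X-Auth-Token': 'TOKEN_FROM_KEYSTONE'})
    resp.raise_for_status()
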
Resource definitions operations
===============================

URL: ``/resource_definitions``
Operations:

- (GET) list resource definitions
- (POST) create resource definition

For resource definition creation, POST:

.. code-block:: python

    {
        'name': str,
        'component_id': int,
        'content': str
    }


Resource definition operations
==============================

URL: ``/resource_definitions/<int:resource_definition_id>``
Operations:

- (GET) get resource definition
- (PUT/PATCH) update resource definition
- (DELETE) delete resource definition

Resource definition keys operations
===================================

Key operations modify the resource definition content only.
These operations support nested keys. For details see: `keys operations`_.

URL: ``/resource_definitions/<int:resource_definition_id>/keys/<keys_operation:operation>``
Handled keys operations:

- get resource definition key
- update resource definition key
- delete resource definition key

Resource values operations
==========================

URL: ``/environments/<int:environment_id>/<levels:levels>resources/<id_or_name:resource_id_or_name>/values``
Operations:

- (GET) get resource value
- (PUT) create/update resource value

To create a resource value, send a PUT HTTP request with the data as payload.
This data will be stored in the resource values bound to the appropriate
level value.

To merge data from all levels, specify the 'effective' parameter in the GET
HTTP request.

To trace the level from which each value comes, specify the 'show_lookup'
parameter in the GET HTTP request. Lookup only makes sense when you are
fetching the effective values.

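A sketch of fetching merged values with ``requests`` (the environment and
resource ids, the query-string form, and the token are illustrative):

.. code-block:: python

    import requests

    values_url = ('http://10.20.0.2:8000/api/v1/config/'
                  'environments/1/resources/1/values')
    # 'effective' merges all levels; 'show_lookup' traces where each value came from.
    resp = requests.get(values_url + '?effective&show_lookup',
                        headers={'X-Auth-Token': 'TOKEN_FROM_KEYSTONE'})
    print(resp.json())
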
Resource values keys operations
===============================

Key operations modify the resource values only.
These operations support nested keys. For details see: `keys operations`_.

URL: ``/environments/<int:environment_id>/<levels:levels>resources/<id_or_name:resource_id_or_name>/values/keys/<keys_operation:operation>``
Handled keys operations:

- get resource values key
- update resource values key
- delete resource values key

Resource overrides operations
=============================

URL: ``/environments/<int:environment_id>/<levels:levels>resources/<id_or_name:resource_id_or_name>/overrides``
Operations:

- (GET) get resource overrides
- (PUT) create/update resource overrides

To create resource overrides, send a PUT HTTP request with the data as payload.
This data will be stored in the resource overrides bound to the appropriate
level value.

Resource overrides keys operations
==================================

Key operations modify the resource overrides only.
These operations support nested keys. For details see: `keys operations`_.

URL: ``/environments/<int:environment_id>/<levels:levels>resources/<id_or_name:resource_id_or_name>/overrides/keys/<keys_operation:operation>``
Handled keys operations:

- get resource overrides key
- update resource overrides key
- delete resource overrides key

For any further questions, please email
openstack-discuss@lists.openstack.org or join #openstack-dev on
Freenode.

TODO: 7 lines
@@ -1,7 +0,0 @@
* API input validation
* properly handle collections/elements in the API (currently all operations are
  allowed on both collection and element, which leads to bad error codes)
* add cascade deletes or something similar
* verify that the schema/template is actually related to the environment
* add component priorities
* add versioning of all data
alembic.ini: 41 lines
@ -1,41 +0,0 @@
|
||||
[alembic]
|
||||
script_location = tuning_box/migrations
|
||||
# use in-memory sqlite to generate revisions
|
||||
sqlalchemy.url = sqlite:///
|
||||
version_table = alembic_version
|
||||
|
||||
|
||||
# Logging configuration
|
||||
[loggers]
|
||||
keys = root,sqlalchemy,alembic
|
||||
|
||||
[handlers]
|
||||
keys = console
|
||||
|
||||
[formatters]
|
||||
keys = generic
|
||||
|
||||
[logger_root]
|
||||
level = WARN
|
||||
handlers = console
|
||||
qualname =
|
||||
|
||||
[logger_sqlalchemy]
|
||||
level = WARN
|
||||
handlers =
|
||||
qualname = sqlalchemy.engine
|
||||
|
||||
[logger_alembic]
|
||||
level = INFO
|
||||
handlers =
|
||||
qualname = alembic
|
||||
|
||||
[handler_console]
|
||||
class = StreamHandler
|
||||
args = (sys.stderr,)
|
||||
level = NOTSET
|
||||
formatter = generic
|
||||
|
||||
[formatter_generic]
|
||||
format = %(levelname)-5.5s [%(name)s] %(message)s
|
||||
datefmt = %H:%M:%S
|
bindep.txt: 10 lines
@@ -1,10 +0,0 @@
# This is a cross-platform list tracking distribution packages needed by tests;
# see http://docs.openstack.org/infra/bindep/ for additional information.

# Requirements for DB migrations check
libpq-dev [platform:dpkg]
postgresql-devel [platform:rpm]
mysql-server [platform:dpkg]
mariadb-server [platform:rpm]
postgresql
postgresql-server [platform:rpm]
@ -1,75 +0,0 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
# implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
import os
|
||||
import sys
|
||||
|
||||
sys.path.insert(0, os.path.abspath('../..'))
|
||||
# -- General configuration ----------------------------------------------------
|
||||
|
||||
# Add any Sphinx extension module names here, as strings. They can be
|
||||
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
|
||||
extensions = [
|
||||
'sphinx.ext.autodoc',
|
||||
#'sphinx.ext.intersphinx',
|
||||
'oslosphinx'
|
||||
]
|
||||
|
||||
# autodoc generation is a bit aggressive and a nuisance when doing heavy
|
||||
# text edit cycles.
|
||||
# execute "export SPHINX_DEBUG=1" in your terminal to disable
|
||||
|
||||
# The suffix of source filenames.
|
||||
source_suffix = '.rst'
|
||||
|
||||
# The master toctree document.
|
||||
master_doc = 'index'
|
||||
|
||||
# General information about the project.
|
||||
project = u'tuning_box'
|
||||
copyright = u'2013, OpenStack Foundation'
|
||||
|
||||
# If true, '()' will be appended to :func: etc. cross-reference text.
|
||||
add_function_parentheses = True
|
||||
|
||||
# If true, the current module name will be prepended to all description
|
||||
# unit titles (such as .. function::).
|
||||
add_module_names = True
|
||||
|
||||
# The name of the Pygments (syntax highlighting) style to use.
|
||||
pygments_style = 'sphinx'
|
||||
|
||||
# -- Options for HTML output --------------------------------------------------
|
||||
|
||||
# The theme to use for HTML and HTML Help pages. Major themes that come with
|
||||
# Sphinx are currently 'default' and 'sphinxdoc'.
|
||||
# html_theme_path = ["."]
|
||||
# html_theme = '_theme'
|
||||
# html_static_path = ['static']
|
||||
|
||||
# Output file base name for HTML help builder.
|
||||
htmlhelp_basename = '%sdoc' % project
|
||||
|
||||
# Grouping the document tree into LaTeX files. List of tuples
|
||||
# (source start file, target name, title, author, documentclass
|
||||
# [howto/manual]).
|
||||
latex_documents = [
|
||||
('index',
|
||||
'%s.tex' % project,
|
||||
u'%s Documentation' % project,
|
||||
u'OpenStack Foundation', 'manual'),
|
||||
]
|
||||
|
||||
# Example configuration for intersphinx: refer to the Python standard library.
|
||||
#intersphinx_mapping = {'http://docs.python.org/': None}
|
@ -1,4 +0,0 @@
|
||||
============
|
||||
Contributing
|
||||
============
|
||||
.. include:: ../../CONTRIBUTING.rst
|
@ -1,24 +0,0 @@
|
||||
.. tuning_box documentation master file, created by
|
||||
sphinx-quickstart on Tue Jul 9 22:26:36 2013.
|
||||
You can adapt this file completely to your liking, but it should at least
|
||||
contain the root `toctree` directive.
|
||||
|
||||
Welcome to tuning_box's documentation!
|
||||
========================================================
|
||||
|
||||
Contents:
|
||||
|
||||
.. toctree::
|
||||
:maxdepth: 2
|
||||
|
||||
readme
|
||||
installation
|
||||
usage
|
||||
contributing
|
||||
|
||||
Indices and tables
|
||||
==================
|
||||
|
||||
* :ref:`genindex`
|
||||
* :ref:`modindex`
|
||||
* :ref:`search`
|
@ -1,12 +0,0 @@
|
||||
============
|
||||
Installation
|
||||
============
|
||||
|
||||
At the command line::
|
||||
|
||||
$ pip install tuning_box
|
||||
|
||||
Or, if you have virtualenvwrapper installed::
|
||||
|
||||
$ mkvirtualenv tuning_box
|
||||
$ pip install tuning_box
|
@ -1 +0,0 @@
|
||||
.. include:: ../../README.rst
|
@ -1,14 +0,0 @@
|
||||
{
|
||||
"name": "comp1",
|
||||
"templates": [
|
||||
{
|
||||
"name": "temp1",
|
||||
"content": {"asd": "ns1.a"}
|
||||
}
|
||||
],
|
||||
"schemas": [{
|
||||
"name": "schema1",
|
||||
"content": {},
|
||||
"namespace_id": 1
|
||||
}]
|
||||
}
|
@ -1,5 +0,0 @@
|
||||
{
|
||||
"name": "env1",
|
||||
"components": [1],
|
||||
"hierarchy_levels": ["node"]
|
||||
}
|
@ -1 +0,0 @@
|
||||
{"name": "ns1"}
|
File diff suppressed because it is too large
@ -1,13 +0,0 @@
|
||||
{
|
||||
"name": "comp1",
|
||||
"resource_definitions": [
|
||||
{
|
||||
"name": "resource1",
|
||||
"content": {}
|
||||
},
|
||||
{
|
||||
"name": "slashed/resource",
|
||||
"content": {}
|
||||
}
|
||||
]
|
||||
}
|
@ -1,5 +0,0 @@
|
||||
{
|
||||
"name": "env1",
|
||||
"components": ["comp1"],
|
||||
"hierarchy_levels": ["nodes"]
|
||||
}
|
@ -1 +0,0 @@
|
||||
{"name": "ns1"}
|
File diff suppressed because it is too large
@ -1,36 +0,0 @@
|
||||
yum install git python-pip python-alembic python-flask
|
||||
git clone https://git.openstack.org/openstack/tuning-box.git
|
||||
pip install -e tuning-box
|
||||
sudo -u postgres psql -c '\dt' nailgun | grep tuning_box
|
||||
nailgun_syncdb
|
||||
sudo -u postgres psql -c '\dt' nailgun | grep tuning_box
|
||||
service nailgun restart
|
||||
token_id=$(openstack token issue -c id -f value)
|
||||
|
||||
curl -H "X-Auth-Token: $token_id" http://10.20.0.2:8000/api/config/
|
||||
- components
|
||||
- environments
|
||||
- components -H "Content-type: application/json" -d @component.json
|
||||
- environments -H "Content-type: application/json" -d @environment.json
|
||||
- environments/1/resources/resource1/values
|
||||
- environments/1/resources/slashed/resource/values
|
||||
- environments/1/resources/1/values
|
||||
- environments/1/nodes/node-1.domain.tld/resources/1/values
|
||||
- environments/1/resources/1/values -H "Content-type: application/json" -X PUT -d '{"global_key": "global_value"}'
|
||||
- environments/1/nodes/node-1.domain.tld/resources/1/values -H "Content-type: application/json" -X PUT -d '{"node_key": "node_value"}'
|
||||
- environments/1/resources/1/values
|
||||
- environments/1/resources/1/values?effective
|
||||
- environments/1/nodes/node-1.domain.tld/resources/1/values
|
||||
- environments/1/nodes/node-1.domain.tld/resources/1/values?effective
|
||||
- environments/1/resources/1/overrides -H "Content-type: application/json" -X PUT -d '{"global_key": "global_override"}'
|
||||
- environments/1/nodes/node-1.domain.tld/resources/1/values?effective
|
||||
- environments/1/resources/1/values -H "Content-type: application/json" -X PUT -d '{"global_key": "global_value_new"}'
|
||||
- environments/1/nodes/node-1.domain.tld/resources/1/values?effective
|
||||
- environments/1/nodes/node-1.domain.tld/resources/1/overrides -H "Content-type: application/json" -X PUT -d '{"global_key": "node_override"}'
|
||||
- environments/1/nodes/node-1.domain.tld/resources/1/values?effective
|
||||
- environments/1/nodes/node-2.domain.tld/resources/1/values?effective
|
||||
- environments/1/resources/1/values
|
||||
- environments/1/resources/1/values?effective
|
||||
- environments/1/nodes/node-1.domain.tld/resources/1/values
|
||||
|
||||
curl -H "X-Auth-Token: $token_id" http://10.20.0.2:8000/api/config/
|
@ -1,13 +0,0 @@
|
||||
{
|
||||
"name": "comp1",
|
||||
"resource_definitions": [
|
||||
{
|
||||
"name": "resource1",
|
||||
"content": {}
|
||||
},
|
||||
{
|
||||
"name": "slashed/resource",
|
||||
"content": {}
|
||||
}
|
||||
]
|
||||
}
|
@ -1,4 +0,0 @@
|
||||
{
|
||||
"components": ["comp1"],
|
||||
"hierarchy_levels": ["nodes"]
|
||||
}
|
File diff suppressed because it is too large
@ -1,34 +0,0 @@
|
||||
rpm --import http://mirror.fuel-infra.org/mos-repos/centos/mos9.0-centos7/os/RPM-GPG-KEY-mos9.0
|
||||
yum-config-manager --add-repo http://mirror.fuel-infra.org/mos-repos/centos/mos9.0-centos7/os/x86_64/
|
||||
yum-config-manager --add-repo http://packages.fuel-infra.org/review/FUEL-304811//repositories/centos/master-centos7/os/x86_64
|
||||
yum install -y tuning-box
|
||||
nailgun_syncdb
|
||||
service nailgun restart
|
||||
|
||||
export OS_USERNAME=admin OS_PASSWORD=admin OS_PROJECT_NAME=admin OS_AUTH_URL=http://10.20.0.2:5000
|
||||
openstack service create --name tuning-box config
|
||||
openstack endpoint create --publicurl http://10.20.0.2:8000/api/config --region RegionOne tuning-box
|
||||
openstack catalog list
|
||||
|
||||
token_id=$(openstack token issue -c id -f value)
|
||||
|
||||
curl -H "X-Auth-Token: $token_id" http://10.20.0.2:8000/api/config/components -H "Content-type: application/json" -d @component.json
|
||||
curl -H "X-Auth-Token: $token_id" http://10.20.0.2:8000/api/config/environments -H "Content-type: application/json" -d @environment.json
|
||||
|
||||
fuel2 config get --env 1 --resource resource1 --format yaml
|
||||
|
||||
echo '{"a": 1, "b": null}' | fuel2 config set --env 1 --resource resource1 --format json
|
||||
fuel2 config get --env 1 --resource resource1 --format yaml
|
||||
fuel2 config get --env 1 --resource resource1 --format yaml --level nodes=1
|
||||
|
||||
echo '{"a": 2}' | fuel2 config set --env 1 --resource resource1 --format json --level nodes=1
|
||||
fuel2 config get --env 1 --resource resource1 --format yaml
|
||||
fuel2 config get --env 1 --resource resource1 --format yaml --level nodes=1
|
||||
|
||||
fuel2 config override --env 1 --resource resource1 --key b --value s --type str
|
||||
fuel2 config get --env 1 --resource resource1 --format yaml
|
||||
fuel2 config get --env 1 --resource resource1 --format yaml --level nodes=1
|
||||
|
||||
echo '{"a": 1, "b": "s3"}' | fuel2 config set --env 1 --resource resource1 --format json
|
||||
fuel2 config get --env 1 --resource resource1 --format yaml
|
||||
fuel2 config get --env 1 --resource resource1 --format yaml --level nodes=1
|
@ -1,90 +0,0 @@
|
||||
import collections
|
||||
import decimal
|
||||
import json
|
||||
import os
|
||||
import shutil
|
||||
import sys
|
||||
|
||||
def remove_bs(data):
|
||||
new_stdout = []
|
||||
for i, e in enumerate(data["stdout"]):
|
||||
if '\b' in e[1]:
|
||||
c = e[1].count('\b')
|
||||
for j, e1 in enumerate(reversed(new_stdout)):
|
||||
if len(e1[1]) <= c:
|
||||
c -= len(e1[1])
|
||||
if not c:
|
||||
break
|
||||
else:
|
||||
e1[1] = e1[1][:-c]
|
||||
j += 1
|
||||
break
|
||||
new_stdout[-j-1:] = []
|
||||
else:
|
||||
new_stdout.append(e)
|
||||
data["stdout"] = new_stdout
|
||||
|
||||
|
||||
def speedup_typing(data, delay):
|
||||
delay = decimal.Decimal(delay)
|
||||
it = iter(data["stdout"])
|
||||
try:
|
||||
e = next(it)
|
||||
while True:
|
||||
while not e[1].endswith("]# "):
|
||||
e = next(it)
|
||||
e = next(it)
|
||||
if e[1] != "#":
|
||||
continue
|
||||
while True:
|
||||
e = next(it)
|
||||
e[0] = delay
|
||||
if '\r' in e[1]:
|
||||
break
|
||||
except StopIteration:
|
||||
pass
|
||||
|
||||
|
||||
def set_max_delay(data, delay):
|
||||
delay = decimal.Decimal(delay)
|
||||
for e in data["stdout"]:
|
||||
if e[0] > delay:
|
||||
e[0] = delay
|
||||
|
||||
|
||||
def correct_duration(data):
|
||||
data["duration"] = sum(e[0] for e in data["stdout"])
|
||||
|
||||
|
||||
def main(fname):
|
||||
with open(fname) as f:
|
||||
data = json.load(f, object_pairs_hook=collections.OrderedDict, parse_float=decimal.Decimal)
|
||||
|
||||
remove_bs(data)
|
||||
speedup_typing(data, '0.05')
|
||||
set_max_delay(data, '0.5')
|
||||
correct_duration(data)
|
||||
|
||||
shutil.copy(fname, fname + ".tmp")
|
||||
with open(fname + ".tmp", "w") as f:
|
||||
json.dump(data, f, indent=2, cls=Encoder)
|
||||
os.rename(fname + ".tmp", fname)
|
||||
|
||||
|
||||
class DecIntWrapper(int):
|
||||
def __init__(self, d):
|
||||
self.d = d
|
||||
|
||||
def __str__(self):
|
||||
return str(self.d)
|
||||
|
||||
|
||||
class Encoder(json.JSONEncoder):
|
||||
def default(self, o):
|
||||
if isinstance(o, decimal.Decimal):
|
||||
#import pdb; pdb.set_trace()
|
||||
return DecIntWrapper(o)
|
||||
return json.JSONEncoder.default(self, o)
|
||||
|
||||
if __name__ == '__main__':
|
||||
main(sys.argv[1])
|
@@ -1,7 +0,0 @@
========
Usage
========

To use tuning_box in a project::

    import tuning_box
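A slightly fuller sketch for running the service from Python directly, based on
the ``build_app`` helper defined in ``tuning_box/app.py`` (disabling the
Keystone middleware and the host/port values are illustrative choices for local
experimentation):

.. code-block:: python

    # Build the Flask application and serve it with the development server.
    from tuning_box import app

    application = app.build_app(configure_logging=True, with_keystone=False)
    application.run(host='127.0.0.1', port=8082)  # port mirrors the uwsgi example
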
@ -1,26 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
LOG_LEVEL = 'DEBUG'
|
||||
PROPAGATE_EXCEPTIONS = True
|
||||
|
||||
SQLALCHEMY_DATABASE_URI = \
|
||||
'postgresql://tuningbox:tuningbox@localhost/tuningbox'
|
||||
|
||||
AUTH = {
|
||||
'auth_host': '127.0.0.1',
|
||||
'auth_protocol': 'http',
|
||||
'auth_version': 'v2.0',
|
||||
'admin_user': 'tuningbox',
|
||||
'admin_password': 'tuningbox',
|
||||
'admin_tenant_name': 'services'
|
||||
}
|
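The application reads such a settings file through the ``TUNINGBOX_SETTINGS``
environment variable (see ``app.config.from_envvar`` in ``tuning_box/app.py``
and the uwsgi example below); a minimal sketch, with an illustrative path:

.. code-block:: python

    import os

    # Point the app at a settings file like the one above before building it.
    os.environ['TUNINGBOX_SETTINGS'] = '/etc/tuningbox/tuningbox_config.py'

    from tuning_box import app
    application = app.build_app()
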
@ -1,8 +0,0 @@
|
||||
uwsgi:
|
||||
socket: :8082
|
||||
protocol: http
|
||||
module: tuning_box.app:build_app()
|
||||
pythonpath: %d../../..
|
||||
env: TUNINGBOX_SETTINGS=%d/tuningbox_config.py
|
||||
# logto: /var/log/tuningbox/tuningbox.log
|
||||
# pid: /var/run/tuningbox.pid
|
@ -1,12 +0,0 @@
|
||||
[Unit]
|
||||
Description=Tuning Box service
|
||||
ConditionPathExists=/etc/tuningbox/uwsgi_tuningbox.yaml
|
||||
|
||||
[Service]
|
||||
ExecStart=/usr/sbin/uwsgi -y /etc/tuningbox/uwsgi_tuningbox.yaml
|
||||
ExecReload=/usr/sbin/uwsgi --reload /var/run/tuningbox.pid
|
||||
ExecStop=/usr/sbin/uwsgi --stop /var/run/tuningbox.pid
|
||||
ExecStopPost=/usr/bin/rm -f /var/run/tuningbox.pid
|
||||
|
||||
[Install]
|
||||
WantedBy=multi-user.target
|
@ -1,6 +0,0 @@
|
||||
[DEFAULT]
|
||||
|
||||
# The list of modules to copy from oslo-incubator.git
|
||||
|
||||
# The base module to hold the copy of openstack.common
|
||||
base=tuning_box
|
@ -1,13 +0,0 @@
|
||||
# The order of packages is significant, because pip processes them in the order
|
||||
# of appearance. Changing the order has an impact on the overall integration
|
||||
# process, which may cause wedges in the gate later.
|
||||
|
||||
pbr>=1.6
|
||||
flask
|
||||
flask-sqlalchemy
|
||||
flask-restful
|
||||
alembic
|
||||
cliff
|
||||
requests
|
||||
keystonemiddleware>=4.0.0,!=4.1.0,!=4.5.0
|
||||
six>=1.9.0
|
setup.cfg: 98 lines
@ -1,98 +0,0 @@
|
||||
[metadata]
|
||||
name = tuning_box
|
||||
summary = Tuning Box - configuration storage for your clouds
|
||||
description-file =
|
||||
README.rst
|
||||
author = OpenStack
|
||||
author-email = openstack-dev@lists.openstack.org
|
||||
home-page = http://www.openstack.org/
|
||||
classifier =
|
||||
Environment :: OpenStack
|
||||
Intended Audience :: Information Technology
|
||||
Intended Audience :: System Administrators
|
||||
License :: OSI Approved :: Apache Software License
|
||||
Operating System :: POSIX :: Linux
|
||||
Programming Language :: Python
|
||||
Programming Language :: Python :: 2
|
||||
Programming Language :: Python :: 2.7
|
||||
Programming Language :: Python :: 3
|
||||
Programming Language :: Python :: 3.4
|
||||
|
||||
[files]
|
||||
packages =
|
||||
tuning_box
|
||||
|
||||
[build_sphinx]
|
||||
source-dir = doc/source
|
||||
build-dir = doc/build
|
||||
all_files = 1
|
||||
|
||||
[upload_sphinx]
|
||||
upload-dir = doc/build/html
|
||||
|
||||
[compile_catalog]
|
||||
directory = tuning_box/locale
|
||||
domain = tuning_box
|
||||
|
||||
[update_catalog]
|
||||
domain = tuning_box
|
||||
output_dir = tuning_box/locale
|
||||
input_file = tuning_box/locale/tuning_box.pot
|
||||
|
||||
[extract_messages]
|
||||
keywords = _ gettext ngettext l_ lazy_gettext
|
||||
mapping_file = babel.cfg
|
||||
output_file = tuning_box/locale/tuning_box.pot
|
||||
|
||||
[entry_points]
|
||||
nailgun.extensions =
|
||||
tuning_box = tuning_box.nailgun:Extension
|
||||
tuning_box.cli =
|
||||
get = tuning_box.cli.resources:Get
|
||||
set = tuning_box.cli.resources:Set
|
||||
del = tuning_box.cli.resources:Delete
|
||||
override = tuning_box.cli.resources:Override
|
||||
rm_override = tuning_box.cli.resources:DeleteOverride
|
||||
env_create = tuning_box.cli.environments:CreateEnvironment
|
||||
env_list = tuning_box.cli.environments:ListEnvironments
|
||||
env_show = tuning_box.cli.environments:ShowEnvironment
|
||||
env_delete = tuning_box.cli.environments:DeleteEnvironment
|
||||
env_update = tuning_box.cli.environments:UpdateEnvironment
|
||||
comp_create = tuning_box.cli.components:CreateComponent
|
||||
comp_list = tuning_box.cli.components:ListComponents
|
||||
comp_show = tuning_box.cli.components:ShowComponent
|
||||
comp_delete = tuning_box.cli.components:DeleteComponent
|
||||
comp_update = tuning_box.cli.components:UpdateComponent
|
||||
def_create = tuning_box.cli.resource_definitions:CreateResourceDefinition
|
||||
def_list = tuning_box.cli.resource_definitions:ListResourceDefinitions
|
||||
def_show = tuning_box.cli.resource_definitions:ShowResourceDefinition
|
||||
def_delete = tuning_box.cli.resource_definitions:DeleteResourceDefinition
|
||||
def_update = tuning_box.cli.resource_definitions:UpdateResourceDefinition
|
||||
lvl_list = tuning_box.cli.hierarchy_levels:ListHierarchyLevels
|
||||
lvl_show = tuning_box.cli.hierarchy_levels:ShowHierarchyLevel
|
||||
fuelclient =
|
||||
config_get = tuning_box.fuelclient:Get
|
||||
config_set = tuning_box.fuelclient:Set
|
||||
config_del = tuning_box.fuelclient:Delete
|
||||
config_override = tuning_box.fuelclient:Override
|
||||
config_rm_override = tuning_box.fuelclient:DeleteOverride
|
||||
config_env_create = tuning_box.fuelclient:CreateEnvironment
|
||||
config_env_list = tuning_box.fuelclient:ListEnvironments
|
||||
config_env_show = tuning_box.fuelclient:ShowEnvironment
|
||||
config_env_delete = tuning_box.fuelclient:DeleteEnvironment
|
||||
config_env_update = tuning_box.fuelclient:UpdateEnvironment
|
||||
config_comp_create = tuning_box.fuelclient:CreateComponent
|
||||
config_comp_list = tuning_box.fuelclient:ListComponents
|
||||
config_comp_show = tuning_box.fuelclient:ShowComponent
|
||||
config_comp_delete = tuning_box.fuelclient:DeleteComponent
|
||||
config_comp_update = tuning_box.fuelclient:UpdateComponent
|
||||
config_def_create = tuning_box.fuelclient:CreateResourceDefinition
|
||||
config_def_list = tuning_box.fuelclient:ListResourceDefinitions
|
||||
config_def_show = tuning_box.fuelclient:ShowResourceDefinition
|
||||
config_def_delete = tuning_box.fuelclient:DeleteResourceDefinition
|
||||
config_def_update = tuning_box.fuelclient:UpdateResourceDefinition
|
||||
config_lvl_list = tuning_box.fuelclient:ListHierarchyLevels
|
||||
config_lvl_show = tuning_box.fuelclient:ShowHierarchyLevel
|
||||
console_scripts =
|
||||
tuningbox_db_upgrade = tuning_box.migration:upgrade
|
||||
tuningbox_db_downgrade = tuning_box.migration:downgrade
|
setup.py: 29 lines
@ -1,29 +0,0 @@
|
||||
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
# implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
# THIS FILE IS MANAGED BY THE GLOBAL REQUIREMENTS REPO - DO NOT EDIT
|
||||
import setuptools
|
||||
|
||||
# In python < 2.7.4, a lazy loading of package `pbr` will break
|
||||
# setuptools if some other modules registered functions in `atexit`.
|
||||
# solution from: http://bugs.python.org/issue15881#msg170215
|
||||
try:
|
||||
import multiprocessing # noqa
|
||||
except ImportError:
|
||||
pass
|
||||
|
||||
setuptools.setup(
|
||||
setup_requires=['pbr'],
|
||||
pbr=True)
|
@ -1,52 +0,0 @@
|
||||
%define name tuning-box
|
||||
%{!?version: %define version 0.1.0}
|
||||
%{!?release: %define release 1}
|
||||
|
||||
Name: %{name}
|
||||
Version: %{version}
|
||||
Release: %{release}
|
||||
Source0: %{name}-%{version}.tar.gz
|
||||
Summary: Fuel ConfigDB extension package
|
||||
URL: http://openstack.org
|
||||
License: Apache
|
||||
Group: Development/Libraries
|
||||
Prefix: %{_prefix}
|
||||
BuildRequires: git
|
||||
BuildRequires: python-setuptools
|
||||
BuildRequires: python2-devel
|
||||
BuildArch: noarch
|
||||
|
||||
Requires: fuel-nailgun
|
||||
Requires: python-alembic
|
||||
Requires: python-flask
|
||||
Requires: python2-flask-sqlalchemy
|
||||
Requires: python2-flask-restful
|
||||
|
||||
%description
|
||||
This package provides an extension to the Nailgun API. This extension allows
managing deployment information and facilitates the exchange of such information
between Nailgun and 3rd-party deployment and configuration management services
(e.g. Puppet Master).
|
||||
|
||||
%prep
|
||||
%setup -cq -n %{name}-%{version}
|
||||
|
||||
%build
|
||||
cd %{_builddir}/%{name}-%{version} && PBR_VERSION=%{version} %{__python} setup.py build
|
||||
|
||||
%install
|
||||
cd %{_builddir}/%{name}-%{version} && PBR_VERSION=%{version} %{__python} setup.py install --single-version-externally-managed -O1 --root=$RPM_BUILD_ROOT
|
||||
|
||||
%clean
|
||||
rm -rf $RPM_BUILD_ROOT
|
||||
|
||||
%files
|
||||
%defattr(-,root,root)
|
||||
%{python_sitelib}/tuning_box/
|
||||
%{python_sitelib}/tuning_box-%{version}*.egg-info/
|
||||
/usr/bin/tuningbox_db_downgrade
|
||||
/usr/bin/tuningbox_db_upgrade
|
||||
|
||||
%changelog
|
||||
* Fri Mar 18 2016 Oleg Gelbukh <ogelbukh@mirantis.com> 9.0.0
|
||||
- Initial version of package spec
|
@ -1,18 +0,0 @@
|
||||
# The order of packages is significant, because pip processes them in the order
|
||||
# of appearance. Changing the order has an impact on the overall integration
|
||||
# process, which may cause wedges in the gate later.
|
||||
|
||||
hacking<0.11,>=0.10.0
|
||||
|
||||
coverage>=3.6
|
||||
discover
|
||||
python-subunit>=0.0.18
|
||||
sphinx!=1.2.0,!=1.3b1,<1.3,>=1.1.2
|
||||
oslo.db[fixtures,mysql,postgresql]
|
||||
oslosphinx>=2.5.0 # Apache-2.0
|
||||
oslotest>=1.10.0 # Apache-2.0
|
||||
testrepository>=0.0.18
|
||||
testscenarios>=0.4
|
||||
testtools>=1.4.0
|
||||
requests-mock
|
||||
python-fuelclient
|
@ -1,54 +0,0 @@
|
||||
#!/bin/bash -xe
|
||||
|
||||
# This script will be run by OpenStack CI before unit tests are run,
|
||||
# it sets up the test system as needed.
|
||||
# Developers should setup their test systems in a similar way.
|
||||
|
||||
# This setup needs to be run as a user that can run sudo.
|
||||
|
||||
# The root password for the MySQL database; pass it in via
|
||||
# MYSQL_ROOT_PW.
|
||||
DB_ROOT_PW=${MYSQL_ROOT_PW:-insecure_slave}
|
||||
|
||||
# This user and its password are used by the tests, if you change it,
|
||||
# your tests might fail.
|
||||
DB_USER=openstack_citest
|
||||
DB_PW=openstack_citest
|
||||
|
||||
sudo -H mysqladmin -u root password $DB_ROOT_PW
|
||||
|
||||
# It's best practice to remove anonymous users from the database. If
|
||||
# an anonymous user exists, then it matches first for connections and
|
||||
# other connections from that host will not work.
|
||||
sudo -H mysql -u root -p$DB_ROOT_PW -h localhost -e "
|
||||
DELETE FROM mysql.user WHERE User='';
|
||||
FLUSH PRIVILEGES;
|
||||
GRANT ALL PRIVILEGES ON *.*
|
||||
TO '$DB_USER'@'%' identified by '$DB_PW' WITH GRANT OPTION;"
|
||||
|
||||
# Now create our database.
|
||||
mysql -u $DB_USER -p$DB_PW -h 127.0.0.1 -e "
|
||||
SET default_storage_engine=MYISAM;
|
||||
DROP DATABASE IF EXISTS openstack_citest;
|
||||
CREATE DATABASE openstack_citest CHARACTER SET utf8;"
|
||||
|
||||
# Same for PostgreSQL
|
||||
|
||||
# Setup user
|
||||
root_roles=$(sudo -H -u postgres psql -t -c "
|
||||
SELECT 'HERE' from pg_roles where rolname='$DB_USER'")
|
||||
if [[ ${root_roles} == *HERE ]];then
|
||||
sudo -H -u postgres psql -c "ALTER ROLE $DB_USER WITH SUPERUSER LOGIN PASSWORD '$DB_PW'"
|
||||
else
|
||||
sudo -H -u postgres psql -c "CREATE ROLE $DB_USER WITH SUPERUSER LOGIN PASSWORD '$DB_PW'"
|
||||
fi
|
||||
|
||||
# Store password for tests
|
||||
cat << EOF > $HOME/.pgpass
|
||||
*:*:*:$DB_USER:$DB_PW
|
||||
EOF
|
||||
chmod 0600 $HOME/.pgpass
|
||||
|
||||
# Now create our database
|
||||
psql -h 127.0.0.1 -U $DB_USER -d template1 -c "DROP DATABASE IF EXISTS openstack_citest"
|
||||
createdb -h 127.0.0.1 -U $DB_USER -l C -T template0 -E utf8 openstack_citest
|
tox.ini: 37 lines
@ -1,37 +0,0 @@
|
||||
[tox]
|
||||
minversion = 1.6
|
||||
envlist = py34,py27,pep8
|
||||
skipsdist = True
|
||||
|
||||
[testenv]
|
||||
usedevelop = True
|
||||
install_command = pip install -U {opts} {packages}
|
||||
setenv =
|
||||
VIRTUAL_ENV={envdir}
|
||||
PYTHONWARNINGS=ignore:Unmanaged access of declarative attribute __tablename__ from non-mapped class ModelMixin
|
||||
OS_TEST_DBAPI_ADMIN_CONNECTION=mysql+pymysql://openstack_citest:openstack_citest@localhost/;postgresql://openstack_citest:openstack_citest@localhost/postgres;sqlite:///testdb
|
||||
deps = -r{toxinidir}/test-requirements.txt
|
||||
commands = python setup.py test --slowest --testr-args='{posargs}'
|
||||
|
||||
[testenv:pep8]
|
||||
commands = flake8
|
||||
|
||||
[testenv:venv]
|
||||
commands = {posargs}
|
||||
|
||||
[testenv:cover]
|
||||
commands = python setup.py test --coverage --testr-args='{posargs}'
|
||||
|
||||
[testenv:docs]
|
||||
commands = python setup.py build_sphinx
|
||||
|
||||
[testenv:debug]
|
||||
commands = oslo_debug_helper {posargs}
|
||||
|
||||
[flake8]
|
||||
# E123, E125 skipped as they are invalid PEP-8.
|
||||
|
||||
show-source = True
|
||||
ignore = E123,E125
|
||||
builtins = _
|
||||
exclude=.venv,.git,.tox,dist,doc,*openstack/common*,*lib/python*,*egg,build
|
@ -1,24 +0,0 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import pbr.version
|
||||
import pkg_resources
|
||||
|
||||
|
||||
__version__ = pbr.version.VersionInfo(
|
||||
'tuning_box').version_string()
|
||||
|
||||
|
||||
def get_migrations_dir():
|
||||
return pkg_resources.resource_filename('tuning_box', 'migrations')
|
@ -1,161 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import flask
|
||||
import flask_restful
|
||||
from sqlalchemy import exc as sa_exc
|
||||
|
||||
from tuning_box import converters
|
||||
from tuning_box import db
|
||||
from tuning_box import errors
|
||||
from tuning_box.library import components
|
||||
from tuning_box.library import environments
|
||||
from tuning_box.library import hierarchy_levels
|
||||
from tuning_box.library import resource_definitions
|
||||
from tuning_box.library import resource_overrides
|
||||
from tuning_box.library import resource_values
|
||||
from tuning_box import logger
|
||||
from tuning_box.middleware import keystone
|
||||
|
||||
# These handlers work if PROPAGATE_EXCEPTIONS is off (non-Nailgun case)
|
||||
api_errors = {
|
||||
'IntegrityError': {'status': 409}, # sqlalchemy IntegrityError
|
||||
'TuningboxIntegrityError': {'status': 409},
|
||||
'KeysOperationError': {'status': 409},
|
||||
'RequestValidationError': {'status': 409},
|
||||
'TuningboxNotFound': {'status': 404}
|
||||
}
|
||||
api = flask_restful.Api(errors=api_errors)
|
||||
|
||||
# Components
|
||||
api.add_resource(components.ComponentsCollection, '/components')
|
||||
api.add_resource(components.Component, '/components/<int:component_id>')
|
||||
|
||||
# Resource definitions
|
||||
api.add_resource(
|
||||
resource_definitions.ResourceDefinitionsCollection,
|
||||
'/resource_definitions',
|
||||
)
|
||||
api.add_resource(
|
||||
resource_definitions.ResourceDefinition,
|
||||
'/resource_definitions/<int:resource_definition_id>'
|
||||
)
|
||||
api.add_resource(
|
||||
resource_definitions.ResourceDefinitionKeys,
|
||||
'/resource_definitions/<int:resource_definition_id>/'
|
||||
'keys/<keys_operation:operation>'
|
||||
)
|
||||
|
||||
# Resource values
|
||||
api.add_resource(
|
||||
resource_values.ResourceValues,
|
||||
'/environments/<int:environment_id>/<levels:levels>resources/'
|
||||
'<id_or_name:resource_id_or_name>/values'
|
||||
)
|
||||
api.add_resource(
|
||||
resource_values.ResourceValuesKeys,
|
||||
'/environments/<int:environment_id>/<levels:levels>resources/'
|
||||
'<id_or_name:resource_id_or_name>/values/'
|
||||
'keys/<keys_operation:operation>'
|
||||
)
|
||||
|
||||
# Resource overrides
|
||||
api.add_resource(
|
||||
resource_overrides.ResourceOverrides,
|
||||
'/environments/<int:environment_id>/'
|
||||
'<levels:levels>resources/<id_or_name:resource_id_or_name>/overrides'
|
||||
)
|
||||
api.add_resource(
|
||||
resource_overrides.ResourceOverridesKeys,
|
||||
'/environments/<int:environment_id>/'
|
||||
'<levels:levels>resources/<id_or_name:resource_id_or_name>/overrides/'
|
||||
'keys/<keys_operation:operation>'
|
||||
)
|
||||
|
||||
# Environments
|
||||
api.add_resource(environments.EnvironmentsCollection, '/environments')
|
||||
api.add_resource(
|
||||
environments.Environment,
|
||||
'/environments/<int:environment_id>'
|
||||
)
|
||||
|
||||
# Hierarchy levels
|
||||
api.add_resource(
|
||||
hierarchy_levels.EnvironmentHierarchyLevelsCollection,
|
||||
'/environments/<int:environment_id>/hierarchy_levels'
|
||||
)
|
||||
api.add_resource(
|
||||
hierarchy_levels.EnvironmentHierarchyLevels,
|
||||
'/environments/<int:environment_id>/hierarchy_levels/'
|
||||
'<id_or_name:id_or_name>'
|
||||
)
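# Taken together, these registrations yield a URL layout along these lines
# (the identifiers below are hypothetical):
#
#     GET /components
#     GET /environments/1/hierarchy_levels
#     GET /environments/1/lvl1/val1/resources/my_res/values
#     PUT /environments/1/lvl1/val1/resources/my_res/overrides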
|
||||
|
||||
|
||||
def handle_request_validation_error(exc):
|
||||
response = flask.jsonify(msg=exc.args[0])
|
||||
response.status_code = 409
|
||||
return response
|
||||
|
||||
|
||||
def handle_integrity_error(exc):
|
||||
response = flask.jsonify(msg=exc.args[0])
|
||||
response.status_code = 409
|
||||
return response
|
||||
|
||||
|
||||
def handle_object_not_found(exc):
|
||||
response = flask.jsonify(msg=exc.args[0])
|
||||
response.status_code = 404
|
||||
return response
|
||||
|
||||
|
||||
def handle_keys_operation_error(exc):
|
||||
response = flask.jsonify(msg=exc.args[0])
|
||||
response.status_code = 409
|
||||
return response
|
||||
|
||||
|
||||
def build_app(configure_logging=True, with_keystone=True):
|
||||
app = flask.Flask(__name__)
|
||||
app.url_map.converters.update(converters.ALL)
|
||||
api.init_app(app) # init_app spoils Api object if app is a blueprint
|
||||
app.config["SQLALCHEMY_TRACK_MODIFICATIONS"] = False # silence warning
|
||||
# TUNINGBOX_SETTINGS is the path to the file with tuning_box configuration
|
||||
app.config.from_envvar('TUNINGBOX_SETTINGS', silent=True)
|
||||
# These handlers work if PROPAGATE_EXCEPTIONS is on (Nailgun case)
|
||||
app.register_error_handler(sa_exc.IntegrityError, handle_integrity_error)
|
||||
app.register_error_handler(errors.TuningboxIntegrityError,
|
||||
handle_integrity_error)
|
||||
app.register_error_handler(errors.TuningboxNotFound,
|
||||
handle_object_not_found)
|
||||
app.register_error_handler(errors.RequestValidationError,
|
||||
handle_request_validation_error)
|
||||
app.register_error_handler(errors.KeysOperationError,
|
||||
handle_keys_operation_error)
|
||||
db.db.init_app(app)
|
||||
if configure_logging:
|
||||
log_level = app.config.get('LOG_LEVEL', 'DEBUG')
|
||||
logger.init_logger(app, log_level)
|
||||
if with_keystone:
|
||||
app.wsgi_app = keystone.KeystoneMiddleware(app)
|
||||
return app
|
||||
|
||||
|
||||
def main():
|
||||
import logging
|
||||
logging.basicConfig(level=logging.DEBUG)
|
||||
|
||||
app = build_app()
|
||||
app.run()
|
||||
|
||||
if __name__ == '__main__':
|
||||
main()
|
@ -1,27 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
from cliff import app
|
||||
from cliff import commandmanager
|
||||
|
||||
import tuning_box
|
||||
|
||||
|
||||
class TuningBoxApp(app.App):
|
||||
def __init__(self, client, **kwargs):
|
||||
super(TuningBoxApp, self).__init__(
|
||||
description='Tuning Box - configuration storage for your cloud',
|
||||
version=tuning_box.__version__,
|
||||
command_manager=commandmanager.CommandManager('tuning_box.cli'),
|
||||
**kwargs
|
||||
)
|
||||
self.client = client
|
@ -1,165 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import abc
|
||||
import json
|
||||
import yaml
|
||||
|
||||
from cliff import command
|
||||
from cliff import lister
|
||||
from cliff import show
|
||||
from fuelclient.cli import error as fc_error
|
||||
from fuelclient.common import data_utils
|
||||
import six
|
||||
|
||||
|
||||
def level_converter(value):
|
||||
levels = []
|
||||
for part in value.split(','):
|
||||
spl = part.split("=", 1)
|
||||
if len(spl) != 2:
|
||||
raise TypeError("Levels list should be in format "
|
||||
"level1=value1,level2=value2")
|
||||
levels.append(tuple(spl))
|
||||
return levels
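# A minimal sketch of the accepted format (level names below are
# hypothetical):
#
#     >>> level_converter("region=us,rack=rack1")
#     [('region', 'us'), ('rack', 'rack1')]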
|
||||
|
||||
try:
|
||||
text_type = unicode
|
||||
except NameError:
|
||||
text_type = str
|
||||
|
||||
|
||||
def format_output(output, format_):
|
||||
if format_ == 'plain':
|
||||
if output is None:
|
||||
return ''
|
||||
if isinstance(output, text_type):
|
||||
if text_type is str:
|
||||
return output
|
||||
else:
|
||||
return output.encode('utf-8')
|
||||
format_ = 'json'
|
||||
# numbers, booleans, lists and dicts will be represented as JSON
|
||||
if format_ == 'json':
|
||||
return json.dumps(output)
|
||||
if format_ == 'yaml':
|
||||
# Usage of safe_dump here is crucial since PyYAML emits
|
||||
# "!!python/unicode" objects from unicode strings by defaul
|
||||
return yaml.safe_dump(output, default_flow_style=False)
|
||||
raise RuntimeError("Unknown format '{}'".format(format_))
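# A short sketch of the expected behaviour for each format:
#
#     >>> format_output({'a': 1}, 'json')
#     '{"a": 1}'
#     >>> format_output({'a': 1}, 'yaml')
#     'a: 1\n'
#     >>> format_output('text', 'plain')
#     'text'
#     >>> format_output(42, 'plain')  # non-strings fall back to JSON
#     '42'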
|
||||
|
||||
|
||||
class BaseCommand(command.Command):
|
||||
|
||||
def get_client(self):
|
||||
return self.app.client
|
||||
|
||||
def _parse_comma_separated(self, parsed_args, param_name, cast_to):
|
||||
param = getattr(parsed_args, param_name)
|
||||
if param is None or param == '[]':
|
||||
return []
|
||||
result = six.moves.map(six.text_type.strip,
|
||||
six.text_type(param).split(','))
|
||||
return list(six.moves.map(cast_to, result))
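# For example (hypothetical input): a parameter value of "1, 2,3" with
# cast_to=int yields [1, 2, 3], while the literal string "[]" yields [].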
|
||||
|
||||
def read_json(self):
|
||||
return json.load(self.app.stdin)
|
||||
|
||||
def read_yaml(self):
|
||||
docs_gen = yaml.safe_load_all(self.app.stdin)
|
||||
doc = next(docs_gen)
|
||||
guard = object()
|
||||
if next(docs_gen, guard) is not guard:
|
||||
self.app.stderr.write("Warning: will use only first "
|
||||
"document from YAML stream")
|
||||
return doc
|
||||
|
||||
|
||||
class BaseOneCommand(BaseCommand):
|
||||
|
||||
@abc.abstractproperty
|
||||
def base_url(self):
|
||||
"""Base url for request operations"""
|
||||
|
||||
@abc.abstractproperty
|
||||
def entity_name(self):
|
||||
"""Name of the TuningBox entity"""
|
||||
|
||||
def get_parser(self, *args, **kwargs):
|
||||
parser = super(BaseOneCommand, self).get_parser(*args, **kwargs)
|
||||
parser.add_argument(
|
||||
'id',
|
||||
type=int,
|
||||
help='Id of the {0}'.format(self.entity_name))
|
||||
return parser
|
||||
|
||||
def get_url(self, parsed_args):
|
||||
return '{0}/{1}'.format(self.base_url, parsed_args.id)
|
||||
|
||||
def get_deletion_message(self, parsed_args):
|
||||
return '{0} with id {1} was deleted\n'.format(
|
||||
self.entity_name.capitalize(), parsed_args.id)
|
||||
|
||||
def get_update_message(self, parsed_args):
|
||||
return '{0} with id {1} was updated\n'.format(
|
||||
self.entity_name.capitalize(), parsed_args.id)
|
||||
|
||||
|
||||
class BaseDeleteCommand(BaseOneCommand):
|
||||
"""Deletes entity with the specified id."""
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
result = self.get_client().delete(self.get_url(parsed_args))
|
||||
if result is None:
|
||||
result = self.get_deletion_message(parsed_args)
|
||||
self.app.stdout.write(six.text_type(result))
|
||||
|
||||
|
||||
class BaseListCommand(BaseCommand, lister.Lister):
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
result = self.get_client().get(self.base_url)
|
||||
try:
|
||||
data = data_utils.get_display_data_multi(self.columns, result)
|
||||
return self.columns, data
|
||||
except fc_error.BadDataException:
|
||||
return zip(*result.items())
|
||||
|
||||
|
||||
class BaseShowCommand(BaseOneCommand, show.ShowOne):
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
result = self.get_client().get(self.get_url(parsed_args))
|
||||
try:
|
||||
data = data_utils.get_display_data_single(self.columns, result)
|
||||
return self.columns, data
|
||||
except fc_error.BadDataException:
|
||||
return zip(*result.items())
|
||||
|
||||
|
||||
class FormattedCommand(BaseCommand):
|
||||
format_choices = ('json', 'yaml', 'plain')
|
||||
|
||||
def get_parser(self, *args, **kwargs):
|
||||
parser = super(FormattedCommand, self).get_parser(*args, **kwargs)
|
||||
parser.add_argument(
|
||||
'-f', '--format',
|
||||
choices=self.format_choices,
|
||||
default='json',
|
||||
help="Desired format for return value",
|
||||
)
|
||||
return parser
|
||||
|
||||
def run(self, parsed_args):
|
||||
res = self.take_action(parsed_args)
|
||||
self.app.stdout.write(format_output(res, parsed_args.format))
|
||||
return 0
|
@ -1,89 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
from cliff import show
|
||||
import six
|
||||
|
||||
from tuning_box.cli import base
|
||||
|
||||
|
||||
class ComponentsCommand(base.BaseCommand):
|
||||
entity_name = 'component'
|
||||
base_url = '/components'
|
||||
columns = ('id', 'name', 'resource_definitions')
|
||||
|
||||
|
||||
class ListComponents(ComponentsCommand, base.BaseListCommand):
|
||||
pass
|
||||
|
||||
|
||||
class ShowComponent(ComponentsCommand, base.BaseShowCommand):
|
||||
pass
|
||||
|
||||
|
||||
class DeleteComponent(ComponentsCommand, base.BaseDeleteCommand):
|
||||
pass
|
||||
|
||||
|
||||
class CreateComponent(ComponentsCommand, show.ShowOne):
|
||||
|
||||
def get_parser(self, *args, **kwargs):
|
||||
parser = super(CreateComponent, self).get_parser(
|
||||
*args, **kwargs)
|
||||
parser.add_argument(
|
||||
'-n', '--name',
|
||||
type=str,
|
||||
help="Component name"
|
||||
)
|
||||
return parser
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
result = self.get_client().post(
|
||||
self.base_url, {'name': parsed_args.name,
|
||||
'resource_definitions': []})
|
||||
return zip(*result.items())
|
||||
|
||||
|
||||
class UpdateComponent(ComponentsCommand, base.BaseOneCommand):
|
||||
|
||||
def get_parser(self, *args, **kwargs):
|
||||
parser = super(UpdateComponent, self).get_parser(
|
||||
*args, **kwargs)
|
||||
parser.add_argument(
|
||||
'-n', '--name',
|
||||
type=str,
|
||||
help="Component name"
|
||||
)
|
||||
parser.add_argument(
|
||||
'-r', '--resource-definitions',
|
||||
dest='resources',
|
||||
type=str,
|
||||
help="Comma separated resource definitions IDs. "
|
||||
"Set parameter to [] if you want to pass empty list",
|
||||
)
|
||||
return parser
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
data = {}
|
||||
if parsed_args.name is not None:
|
||||
data['name'] = parsed_args.name
|
||||
if parsed_args.resources is not None:
|
||||
data['resource_definitions'] = []
|
||||
res_def_ids = self._parse_comma_separated(
|
||||
parsed_args, 'resources', int)
|
||||
for res_def_id in res_def_ids:
|
||||
data['resource_definitions'].append({'id': res_def_id})
|
||||
|
||||
result = self.get_client().patch(self.get_url(parsed_args), data)
|
||||
if result is None:
|
||||
result = self.get_update_message(parsed_args)
|
||||
self.app.stdout.write(six.text_type(result))
|
@ -1,100 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
from cliff import show
|
||||
import six
|
||||
|
||||
from tuning_box.cli import base
|
||||
|
||||
|
||||
class EnvironmentsCommand(base.BaseCommand):
|
||||
entity_name = 'environment'
|
||||
base_url = '/environments'
|
||||
columns = ('id', 'components', 'hierarchy_levels')
|
||||
|
||||
|
||||
class ListEnvironments(EnvironmentsCommand, base.BaseListCommand):
|
||||
pass
|
||||
|
||||
|
||||
class ShowEnvironment(EnvironmentsCommand, base.BaseShowCommand):
|
||||
pass
|
||||
|
||||
|
||||
class DeleteEnvironment(EnvironmentsCommand, base.BaseDeleteCommand):
|
||||
pass
|
||||
|
||||
|
||||
class CreateEnvironment(EnvironmentsCommand, show.ShowOne):
|
||||
|
||||
def get_parser(self, *args, **kwargs):
|
||||
parser = super(CreateEnvironment, self).get_parser(
|
||||
*args, **kwargs)
|
||||
parser.add_argument(
|
||||
'-i', '--components',
|
||||
type=str,
|
||||
help="Comma separated components IDs",
|
||||
)
|
||||
parser.add_argument(
|
||||
'-l', '--levels',
|
||||
type=str,
|
||||
help="Comma separated levels names",
|
||||
)
|
||||
return parser
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
levels = self._parse_comma_separated(
|
||||
parsed_args, 'levels', six.text_type)
|
||||
components = self._parse_comma_separated(
|
||||
parsed_args, 'components', int)
|
||||
|
||||
result = self.get_client().post(
|
||||
self.base_url,
|
||||
{'hierarchy_levels': levels, 'components': components}
|
||||
)
|
||||
return zip(*result.items())
|
||||
|
||||
|
||||
class UpdateEnvironment(EnvironmentsCommand, base.BaseOneCommand):
|
||||
|
||||
def get_parser(self, *args, **kwargs):
|
||||
parser = super(UpdateEnvironment, self).get_parser(
|
||||
*args, **kwargs)
|
||||
parser.add_argument(
|
||||
'-i', '--components',
|
||||
dest='components',
|
||||
type=str,
|
||||
help="Comma separated components IDs. "
|
||||
"Set parameter to [] if you want to pass empty list",
|
||||
)
|
||||
parser.add_argument(
|
||||
'-l', '--levels',
|
||||
type=str,
|
||||
dest='levels',
|
||||
help="Comma separated levels names "
|
||||
"Set parameter to [] if you want to pass empty list",
|
||||
)
|
||||
return parser
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
data = {}
|
||||
if parsed_args.levels is not None:
|
||||
data['hierarchy_levels'] = self._parse_comma_separated(
|
||||
parsed_args, 'levels', six.text_type)
|
||||
if parsed_args.components is not None:
|
||||
data['components'] = self._parse_comma_separated(
|
||||
parsed_args, 'components', int)
|
||||
|
||||
result = self.get_client().patch(self.get_url(parsed_args), data)
|
||||
if result is None:
|
||||
result = self.get_update_message(parsed_args)
|
||||
self.app.stdout.write(six.text_type(result))
|
@ -1,23 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
|
||||
class TuningBoxCliError(Exception):
|
||||
pass
|
||||
|
||||
|
||||
class IncompatibleParams(TuningBoxCliError):
|
||||
pass
|
||||
|
||||
|
||||
class UnsupportedDataType(TuningBoxCliError):
|
||||
pass
|
@ -1,71 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
from cliff import lister
|
||||
from cliff import show
|
||||
from fuelclient.cli import error as fc_error
|
||||
from fuelclient.common import data_utils
|
||||
|
||||
from tuning_box.cli import base
|
||||
|
||||
|
||||
class HierarchyLevelsCommand(base.BaseCommand):
|
||||
columns = ('id', 'name', 'parent', 'values')
|
||||
|
||||
def get_parser(self, *args, **kwargs):
|
||||
parser = super(HierarchyLevelsCommand, self).get_parser(
|
||||
*args, **kwargs)
|
||||
parser.add_argument(
|
||||
'-e', '--env',
|
||||
type=int,
|
||||
required=True,
|
||||
help="ID of environment to get data from",
|
||||
)
|
||||
return parser
|
||||
|
||||
def get_base_url(self, parsed_args):
|
||||
return '/environments/{}/hierarchy_levels'.format(parsed_args.env)
|
||||
|
||||
|
||||
class ListHierarchyLevels(HierarchyLevelsCommand, lister.Lister):
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
result = self.get_client().get(self.get_base_url(parsed_args))
|
||||
try:
|
||||
data = data_utils.get_display_data_multi(self.columns, result)
|
||||
return self.columns, data
|
||||
except fc_error.BadDataException:
|
||||
return zip(*result.items())
|
||||
|
||||
|
||||
class ShowHierarchyLevel(HierarchyLevelsCommand, show.ShowOne):
|
||||
|
||||
def get_parser(self, *args, **kwargs):
|
||||
parser = super(ShowHierarchyLevel, self).get_parser(*args, **kwargs)
|
||||
parser.add_argument(
|
||||
'name',
|
||||
type=str,
|
||||
help='Hierarchy level name'
|
||||
)
|
||||
return parser
|
||||
|
||||
def get_url(self, parsed_args):
|
||||
base_url = self.get_base_url(parsed_args)
|
||||
return base_url + '/{0}'.format(parsed_args.name)
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
result = self.get_client().get(self.get_url(parsed_args))
|
||||
try:
|
||||
data = data_utils.get_display_data_single(self.columns, result)
|
||||
return self.columns, data
|
||||
except fc_error.BadDataException:
|
||||
return zip(*result.items())
|
@ -1,150 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import json
|
||||
import yaml
|
||||
|
||||
from cliff import show
|
||||
import six
|
||||
|
||||
from tuning_box.cli import base
|
||||
from tuning_box.cli import errors
|
||||
|
||||
|
||||
class ResourceDefinitionsCommand(base.BaseCommand):
|
||||
entity_name = 'resource_definition'
|
||||
base_url = '/resource_definitions'
|
||||
columns = ('id', 'name', 'component_id', 'content')
|
||||
|
||||
|
||||
class ListResourceDefinitions(ResourceDefinitionsCommand,
|
||||
base.BaseListCommand):
|
||||
pass
|
||||
|
||||
|
||||
class ShowResourceDefinition(ResourceDefinitionsCommand,
|
||||
base.BaseShowCommand):
|
||||
pass
|
||||
|
||||
|
||||
class DeleteResourceDefinition(ResourceDefinitionsCommand,
|
||||
base.BaseDeleteCommand):
|
||||
pass
|
||||
|
||||
|
||||
class ModifyResourceDefinitionCommand(ResourceDefinitionsCommand):
|
||||
|
||||
def get_parser(self, *args, **kwargs):
|
||||
parser = super(ModifyResourceDefinitionCommand, self).get_parser(
|
||||
*args, **kwargs)
|
||||
parser.add_argument(
|
||||
'-n', '--name',
|
||||
dest='name',
|
||||
type=str,
|
||||
help="Resource definition name"
|
||||
)
|
||||
parser.add_argument(
|
||||
'-i', '--component-id',
|
||||
dest='component_id',
|
||||
type=int,
|
||||
help="Component Id"
|
||||
)
|
||||
parser.add_argument(
|
||||
'-p', '--content',
|
||||
dest='content',
|
||||
type=str,
|
||||
help="Content to be set"
|
||||
)
|
||||
parser.add_argument(
|
||||
'-t', '--type',
|
||||
choices=('json', 'yaml'),
|
||||
help="Content type"
|
||||
)
|
||||
parser.add_argument(
|
||||
'-d', '--data-format',
|
||||
dest='data_format',
|
||||
choices=('json', 'yaml'),
|
||||
help="Format of data passed to stdin to be set to content"
|
||||
)
|
||||
return parser
|
||||
|
||||
def verify_arguments(self, parsed_args):
|
||||
if parsed_args.content is not None:
|
||||
if parsed_args.data_format is not None:
|
||||
raise errors.IncompatibleParams(
|
||||
"You shouldn't specify --data-format if you pass "
|
||||
"content in command line, specify --type instead."
|
||||
)
|
||||
elif parsed_args.type is None:
|
||||
raise errors.IncompatibleParams(
|
||||
"You should specify --type if you pass "
|
||||
"content in command line."
|
||||
)
|
||||
elif parsed_args.data_format is None:
|
||||
raise errors.IncompatibleParams(
|
||||
"You should specify --data-format for stdin data if you "
|
||||
"don't pass content in command line."
|
||||
)
|
||||
elif parsed_args.type is not None:
|
||||
raise errors.IncompatibleParams(
|
||||
"--type and --data-format parameters can't "
|
||||
"be used together."
|
||||
)
|
||||
|
||||
def get_content(self, parsed_args):
|
||||
type_ = parsed_args.type
|
||||
if type_ == 'json':
|
||||
return json.loads(parsed_args.content)
|
||||
elif type_ == 'yaml':
|
||||
return yaml.safe_load(parsed_args.content)
|
||||
elif type_ is None:
|
||||
data_format = parsed_args.data_format
|
||||
if data_format == 'json':
|
||||
return self.read_json()
|
||||
elif data_format == 'yaml':
|
||||
return self.read_yaml()
|
||||
raise errors.UnsupportedDataType(
|
||||
"Unsupported format: {0}".format(data_format)
|
||||
)
|
||||
raise errors.UnsupportedDataType("Unsupported type: {0}".format(type_))
|
||||
|
||||
|
||||
class CreateResourceDefinition(ModifyResourceDefinitionCommand, show.ShowOne):
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
self.verify_arguments(parsed_args)
|
||||
data = {
|
||||
'name': parsed_args.name,
|
||||
'component_id': parsed_args.component_id,
|
||||
'content': self.get_content(parsed_args)
|
||||
}
|
||||
result = self.get_client().post(self.base_url, data)
|
||||
return zip(*result.items())
|
||||
|
||||
|
||||
class UpdateResourceDefinition(ModifyResourceDefinitionCommand,
|
||||
base.BaseOneCommand):
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
data = {}
|
||||
if parsed_args.name is not None:
|
||||
data['name'] = parsed_args.name
|
||||
if parsed_args.component_id is not None:
|
||||
data['component_id'] = parsed_args.component_id
|
||||
if (parsed_args.content is not None
|
||||
or parsed_args.data_format is not None):
|
||||
data['content'] = self.get_content(parsed_args)
|
||||
|
||||
result = self.get_client().patch(self.get_url(parsed_args), data)
|
||||
if result is None:
|
||||
result = self.get_update_message(parsed_args)
|
||||
self.app.stdout.write(six.text_type(result))
|
@ -1,257 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import json
|
||||
import six
|
||||
import yaml
|
||||
|
||||
from cliff import show
|
||||
from fuelclient.cli import error as fc_error
|
||||
from fuelclient.common import data_utils
|
||||
|
||||
from tuning_box.cli.base import BaseCommand
|
||||
from tuning_box.cli.base import level_converter
|
||||
from tuning_box.library.resource_values import ResourceValues
|
||||
|
||||
|
||||
class ResourcesCommand(BaseCommand):
|
||||
def get_parser(self, *args, **kwargs):
|
||||
parser = super(ResourcesCommand, self).get_parser(*args, **kwargs)
|
||||
parser.add_argument(
|
||||
'-e', '--env',
|
||||
type=int,
|
||||
required=True,
|
||||
help="ID of environment to get data from",
|
||||
)
|
||||
parser.add_argument(
|
||||
'-l', '--level',
|
||||
type=level_converter,
|
||||
default=[],
|
||||
help=("Level to get data from. Should be in format "
|
||||
"parent_level=parent1,level=value2"),
|
||||
)
|
||||
parser.add_argument(
|
||||
'-r', '--resource',
|
||||
type=str,
|
||||
required=True,
|
||||
help="Name or ID of resource to get data from",
|
||||
)
|
||||
return parser
|
||||
|
||||
def get_resource_url(self, parsed_args, last_part='values'):
|
||||
return '/environments/{}/{}resources/{}/{}'.format(
|
||||
parsed_args.env,
|
||||
''.join('{}/{}/'.format(*e) for e in parsed_args.level),
|
||||
parsed_args.resource,
|
||||
last_part,
|
||||
)
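# For example (hypothetical values): env=1, level=[("lvl1", "val1")] and
# resource="my_res" build '/environments/1/lvl1/val1/resources/my_res/values'.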
|
||||
|
||||
|
||||
class Get(show.ShowOne, ResourcesCommand):
|
||||
|
||||
def get_parser(self, *args, **kwargs):
|
||||
parser = super(Get, self).get_parser(*args, **kwargs)
|
||||
parser.add_argument(
|
||||
'-k', '--key',
|
||||
type=str,
|
||||
help="Name of key to get from the resource. For fetching nested "
|
||||
"key value use '{0}' as delimiter. Example: "
|
||||
"k1{0}k2{0}k3".format(ResourceValues.KEYS_PATH_DELIMITER),
|
||||
)
|
||||
parser.add_argument(
|
||||
'-s', '--show-lookup',
|
||||
dest='show_lookup',
|
||||
help="Show lookup path for the value in the result",
|
||||
action='store_true'
|
||||
)
|
||||
return parser
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
params = {'effective': True}
|
||||
if parsed_args.show_lookup:
|
||||
params['show_lookup'] = True
|
||||
if parsed_args.key:
|
||||
params['key'] = parsed_args.key
|
||||
response = self.get_client().get(
|
||||
self.get_resource_url(parsed_args),
|
||||
params=params
|
||||
)
|
||||
if parsed_args.key:
|
||||
result = {parsed_args.key: response}
|
||||
else:
|
||||
result = response
|
||||
columns = sorted(result)
|
||||
try:
|
||||
data = data_utils.get_display_data_single(columns, result)
|
||||
return columns, data
|
||||
except fc_error.BadDataException:
|
||||
return zip(*response.items())
|
||||
|
||||
|
||||
class Set(ResourcesCommand):
|
||||
|
||||
url_last_part = 'values'
|
||||
entity_name = 'ResourceValue'
|
||||
|
||||
def get_parser(self, *args, **kwargs):
|
||||
parser = super(Set, self).get_parser(*args, **kwargs)
|
||||
parser.add_argument(
|
||||
'-k', '--key',
|
||||
type=str,
|
||||
help="Name of key to get from the resource. For set nested "
|
||||
"key value use '{0}' as delimiter. Example: "
|
||||
"k1{0}k2{0}k3".format(ResourceValues.KEYS_PATH_DELIMITER),
|
||||
)
|
||||
parser.add_argument(
|
||||
'-v', '--value',
|
||||
type=str,
|
||||
help="Value for a key to set in the resource",
|
||||
)
|
||||
parser.add_argument(
|
||||
'-t', '--type',
|
||||
choices=('null', 'int', 'str', 'json', 'yaml', 'bool'),
|
||||
help="Type of value passed in --value",
|
||||
)
|
||||
parser.add_argument(
|
||||
'-f', '--format',
|
||||
choices=('json', 'yaml'),
|
||||
help="Format of data passed to stdin",
|
||||
)
|
||||
return parser
|
||||
|
||||
def verify_arguments(self, parsed_args):
|
||||
if parsed_args.value is not None: # have value
|
||||
if parsed_args.format is not None:
|
||||
raise Exception("You shouldn't specify --format if you pass "
|
||||
"value in command line, specify --type "
|
||||
"instead.")
|
||||
if parsed_args.type == 'null':
|
||||
raise Exception("You shouldn't specify a value for 'null' type"
|
||||
" because there can be only one.")
|
||||
if parsed_args.type is None:
|
||||
raise Exception("Please specify type of value passed in "
|
||||
"--value argument to properly represent it"
|
||||
" in the storage.")
|
||||
elif parsed_args.type != 'null': # have no value
|
||||
if parsed_args.type is not None:
|
||||
raise Exception("--type specifies type for value provided in "
|
||||
"--value but there is not --value argument")
|
||||
if parsed_args.format is None:
|
||||
raise Exception("Please specify format of data passed to stdin"
|
||||
" to replace the key.")
|
||||
|
||||
def get_value_to_set(self, parsed_args):
|
||||
type_ = parsed_args.type
|
||||
if type_ == 'null':
|
||||
return None
|
||||
elif type_ == 'bool':
|
||||
if parsed_args.value.lower() in ('1', 'true'):
|
||||
return True
|
||||
elif parsed_args.value.lower() in ('0', 'false'):
|
||||
return False
|
||||
else:
|
||||
raise Exception(
|
||||
"Bad value for 'bool' type: '{}'. Should be one of '0', "
|
||||
"'1', 'false', 'true'.".format(parsed_args.value))
|
||||
elif type_ == 'int':
|
||||
return int(parsed_args.value)
|
||||
elif type_ == 'str':
|
||||
return parsed_args.value
|
||||
elif type_ == 'json':
|
||||
return json.loads(parsed_args.value)
|
||||
elif type_ == 'yaml':
|
||||
return yaml.safe_load(parsed_args.value)
|
||||
elif type_ is None:
|
||||
if parsed_args.format == 'json':
|
||||
return json.load(self.app.stdin)
|
||||
elif parsed_args.format == 'yaml':
|
||||
docs_gen = yaml.safe_load_all(self.app.stdin)
|
||||
doc = next(docs_gen)
|
||||
guard = object()
|
||||
if next(docs_gen, guard) is not guard:
|
||||
self.app.stderr.write("Warning: will use only first "
|
||||
"document from YAML stream")
|
||||
return doc
|
||||
assert False, "Shouldn't get here"
|
||||
|
||||
def get_update_message(self, parsed_args):
|
||||
if parsed_args.key is None:
|
||||
message = '{0} was set\n'.format(self.entity_name)
|
||||
else:
|
||||
message = '{0} for key {1} was set\n'.format(
|
||||
self.entity_name, parsed_args.key)
|
||||
return message
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
self.verify_arguments(parsed_args)
|
||||
value = self.get_value_to_set(parsed_args)
|
||||
|
||||
client = self.get_client()
|
||||
resource_url = self.get_resource_url(parsed_args, self.url_last_part)
|
||||
if parsed_args.key:
|
||||
keys_path = parsed_args.key.split(
|
||||
ResourceValues.KEYS_PATH_DELIMITER)
|
||||
keys_path.append(value)
|
||||
resource_url += '/keys/set'
|
||||
result = client.patch(resource_url, [keys_path])
|
||||
else:
|
||||
result = client.put(resource_url, value)
|
||||
if result is None:
|
||||
result = self.get_update_message(parsed_args)
|
||||
self.app.stdout.write(six.text_type(result))
|
||||
|
||||
|
||||
class Delete(ResourcesCommand):
|
||||
|
||||
url_last_part = 'values'
|
||||
entity_name = 'ResourceValue'
|
||||
|
||||
def get_parser(self, *args, **kwargs):
|
||||
parser = super(Delete, self).get_parser(*args, **kwargs)
|
||||
parser.add_argument(
|
||||
'-k', '--key',
|
||||
type=str,
|
||||
help="Name of key to delete from the resource. For nested "
|
||||
"key deletion use '{0}' as delimiter. Example: "
|
||||
"k1{0}k2{0}k3".format(ResourceValues.KEYS_PATH_DELIMITER),
|
||||
required=True
|
||||
)
|
||||
return parser
|
||||
|
||||
def get_deletion_message(self, parsed_args):
|
||||
return '{0} for key {1} was deleted\n'.format(
|
||||
self.entity_name, parsed_args.key)
|
||||
|
||||
def get_resource_url(self, parsed_args, last_part='values'):
|
||||
url = super(Delete, self).get_resource_url(
|
||||
parsed_args, last_part=last_part)
|
||||
return url + '/keys/delete'
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
client = self.get_client()
|
||||
resource_url = self.get_resource_url(parsed_args, self.url_last_part)
|
||||
keys_path = parsed_args.key.split(
|
||||
ResourceValues.KEYS_PATH_DELIMITER)
|
||||
result = client.patch(resource_url, [keys_path])
|
||||
if result is None:
|
||||
result = self.get_deletion_message(parsed_args)
|
||||
self.app.stdout.write(six.text_type(result))
|
||||
|
||||
|
||||
class Override(Set):
|
||||
url_last_part = 'overrides'
|
||||
entity_name = 'ResourceOverride'
|
||||
|
||||
|
||||
class DeleteOverride(Delete):
|
||||
url_last_part = 'overrides'
|
||||
entity_name = 'ResourceOverride'
|
@ -1,54 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import requests
|
||||
|
||||
|
||||
class HTTPClient(object):
|
||||
def __init__(self, base_url):
|
||||
self.base_url = base_url
|
||||
self.session = self.get_session()
|
||||
|
||||
def get_session(self):
|
||||
session = requests.Session()
|
||||
session.headers.update(self.default_headers())
|
||||
return session
|
||||
|
||||
def default_headers(self):
|
||||
return {
|
||||
"Content-Type": "application/json",
|
||||
"Accept": "application/json",
|
||||
}
|
||||
|
||||
def request(self, method, url, **kwargs):
|
||||
full_url = self.base_url + url
|
||||
resp = self.session.request(method, full_url, **kwargs)
|
||||
if resp.headers.get('Content-Type') == 'application/json' and \
|
||||
resp.content:
|
||||
return resp.json()
|
||||
else:
|
||||
return None
|
||||
|
||||
def get(self, url, params=None):
|
||||
return self.request('GET', url, params=params)
|
||||
|
||||
def put(self, url, body):
|
||||
return self.request('PUT', url, json=body)
|
||||
|
||||
def post(self, url, body):
|
||||
return self.request('POST', url, json=body)
|
||||
|
||||
def patch(self, url, body):
|
||||
return self.request('PATCH', url, json=body)
|
||||
|
||||
def delete(self, url):
|
||||
return self.request('DELETE', url)
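# A minimal usage sketch (the base URL below is hypothetical):
#
#     client = HTTPClient('http://127.0.0.1:8082/api')
#     client.get('/environments')
#     client.post('/components', {'name': 'c1', 'resource_definitions': []})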
|
@ -1,77 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import itertools
|
||||
|
||||
from werkzeug import routing
|
||||
from werkzeug import urls
|
||||
|
||||
from tuning_box.library import resource_keys_operation
|
||||
|
||||
|
||||
class Levels(routing.BaseConverter):
|
||||
"""Converter that maps nested levels to list of tuples.
|
||||
|
||||
For example, "level1/value1/level2/value2/" is mapped to
|
||||
[("level1", "value1"), ("level2", "value2")].
|
||||
|
||||
Note that since it can be empty it includes following "/":
|
||||
|
||||
Rule('/smth/<levels:levels>values')
|
||||
|
||||
will parse "/smth/values" and "/smth/level1/value1/values".
|
||||
"""
|
||||
|
||||
regex = "([^/]+/[^/]+/)*"
|
||||
|
||||
def to_python(self, value):
|
||||
spl = value.split('/')
|
||||
return list(zip(spl[::2], spl[1::2]))
|
||||
|
||||
def to_url(self, value):
|
||||
parts = itertools.chain.from_iterable(value)
|
||||
quoted_parts = (urls.url_quote(p, charset=self.map.charset, safe='')
|
||||
for p in parts)
|
||||
return ''.join(p + '/' for p in quoted_parts)
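# A small sketch of the mapping described above (level names are
# hypothetical):
#
#     >>> Levels(None).to_python("level1/value1/level2/value2/")
#     [('level1', 'value1'), ('level2', 'value2')]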
|
||||
|
||||
|
||||
class IdOrName(routing.BaseConverter):
|
||||
"""Converter that matches either int or URL part including "/" as string"""
|
||||
|
||||
regex = '[^/].*?'
|
||||
|
||||
def to_python(self, value):
|
||||
try:
|
||||
return int(value)
|
||||
except ValueError:
|
||||
return value
|
||||
|
||||
def to_url(self, value):
|
||||
return super(IdOrName, self).to_url(str(value))
|
||||
|
||||
|
||||
class KeysOperation(routing.BaseConverter):
|
||||
"""Converter that matches keys operations
|
||||
|
||||
Allowed operations: add, delete, erase
|
||||
"""
|
||||
|
||||
regex = '(' + ')|('.join(
|
||||
resource_keys_operation.KeysOperationMixin.OPERATIONS
|
||||
) + ')'
|
||||
|
||||
|
||||
ALL = {
|
||||
'levels': Levels,
|
||||
'id_or_name': IdOrName,
|
||||
'keys_operation': KeysOperation
|
||||
}
|
280
tuning_box/db.py
@ -1,280 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import functools
|
||||
import json
|
||||
import re
|
||||
|
||||
import flask_sqlalchemy
|
||||
import sqlalchemy
|
||||
import sqlalchemy.event
|
||||
import sqlalchemy.ext.declarative as sa_decl
|
||||
from sqlalchemy.orm import exc as orm_exc
|
||||
from sqlalchemy import types
|
||||
|
||||
from tuning_box import errors
|
||||
|
||||
try:
|
||||
from importlib import reload
|
||||
except ImportError:
|
||||
pass # in 2.x reload is builtin
|
||||
|
||||
db = flask_sqlalchemy.SQLAlchemy(session_options={'autocommit': True})
|
||||
pk_type = db.Integer
|
||||
pk = functools.partial(db.Column, pk_type, primary_key=True)
|
||||
|
||||
|
||||
def with_transaction(f):
|
||||
@functools.wraps(f)
|
||||
def inner(*args, **kwargs):
|
||||
with db.session.begin():
|
||||
return f(*args, **kwargs)
|
||||
|
||||
return inner
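# A minimal usage sketch (the decorated function below is hypothetical):
#
#     @with_transaction
#     def rename_component(component, new_name):
#         component.name = new_name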
|
||||
|
||||
|
||||
def fk(cls, **kwargs):
|
||||
ondelete = kwargs.pop('ondelete', None)
|
||||
return db.Column(pk_type, db.ForeignKey(cls.id, ondelete=ondelete),
|
||||
**kwargs)
|
||||
|
||||
|
||||
class BaseQuery(flask_sqlalchemy.BaseQuery):
|
||||
def get_by_id_or_name(self, id_or_name, fail_on_none=True):
|
||||
if isinstance(id_or_name, int):
|
||||
result = self.get(id_or_name)
|
||||
else:
|
||||
result = self.filter_by(name=id_or_name).one_or_none()
|
||||
if fail_on_none and result is None:
|
||||
raise errors.TuningboxNotFound(
|
||||
"Object not found by name or id {0}".format(id_or_name)
|
||||
)
|
||||
return result
|
||||
|
||||
# one_or_none is not present in sqlalchemy < 1.0.9
|
||||
def one_or_none(self):
|
||||
ret = list(self)
|
||||
l = len(ret)
|
||||
if l == 1:
|
||||
return ret[0]
|
||||
elif l == 0:
|
||||
return None
|
||||
else:
|
||||
raise orm_exc.MultipleResultsFound(
|
||||
"Multiple rows were found for one_or_none()")
|
||||
|
||||
|
||||
def _tablename(cls_name):
|
||||
def repl(match):
|
||||
res = match.group().lower()
|
||||
if match.start():
|
||||
res = "_" + res
|
||||
return res
|
||||
|
||||
return ModelMixin.table_prefix + re.sub("[A-Z]", repl, cls_name)
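# A small sketch of the naming scheme, assuming an empty table_prefix:
#
#     >>> _tablename('ResourceDefinition')
#     'resource_definition'
#     >>> _tablename('EnvironmentHierarchyLevelValue')
#     'environment_hierarchy_level_value'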
|
||||
|
||||
|
||||
class ModelMixin(object):
|
||||
query_class = BaseQuery
|
||||
id = db.Column(pk_type, primary_key=True)
|
||||
|
||||
try:
|
||||
table_prefix = ModelMixin.table_prefix # keep prefix during reload
|
||||
except NameError:
|
||||
table_prefix = "" # first import, not reload
|
||||
|
||||
@sa_decl.declared_attr
|
||||
def __tablename__(cls):
|
||||
return _tablename(cls.__name__)
|
||||
|
||||
def __repr__(self):
|
||||
args = []
|
||||
for attr in self.__repr_attrs__:
|
||||
value = getattr(self, attr)
|
||||
if attr == 'content' and value is not None and len(value) > 15:
|
||||
value = value[:10] + '<...>'
|
||||
args.append('{}={!r}'.format(attr, value))
|
||||
return '{}({})'.format(type(self).__name__, ','.join(args))
|
||||
|
||||
|
||||
class Json(types.TypeDecorator):
|
||||
impl = db.Text
|
||||
|
||||
def process_bind_param(self, value, dialect):
|
||||
return json.dumps(value)
|
||||
|
||||
def process_result_value(self, value, dialect):
|
||||
return json.loads(value)
|
||||
|
||||
|
||||
# Component registry
|
||||
|
||||
|
||||
class Component(ModelMixin, db.Model):
|
||||
name = db.Column(db.String(128), unique=True)
|
||||
|
||||
__repr_attrs__ = ('id', 'name')
|
||||
|
||||
|
||||
class ResourceDefinition(ModelMixin, db.Model):
|
||||
name = db.Column(db.String(128))
|
||||
component_id = fk(Component, ondelete='CASCADE')
|
||||
component = db.relationship(
|
||||
Component,
|
||||
backref=sqlalchemy.orm.backref('resource_definitions',
|
||||
cascade='all, delete-orphan')
|
||||
)
|
||||
|
||||
content = db.Column(Json)
|
||||
|
||||
__repr_attrs__ = ('id', 'name', 'component', 'content')
|
||||
|
||||
# Environment data storage
|
||||
|
||||
|
||||
class Environment(ModelMixin, db.Model):
|
||||
@sa_decl.declared_attr
|
||||
def environment_components_table(cls):
|
||||
return db.Table(
|
||||
_tablename('environment_components'),
|
||||
db.Column('environment_id', pk_type,
|
||||
db.ForeignKey(cls.id, ondelete='CASCADE')),
|
||||
db.Column('component_id', pk_type,
|
||||
db.ForeignKey(Component.id, ondelete='CASCADE')),
|
||||
)
|
||||
|
||||
@sa_decl.declared_attr
|
||||
def components(cls):
|
||||
return db.relationship(
|
||||
Component, secondary=cls.environment_components_table)
|
||||
|
||||
__repr_attrs__ = ('id',)
|
||||
|
||||
|
||||
class EnvironmentHierarchyLevel(ModelMixin, db.Model):
|
||||
environment_id = fk(Environment, ondelete='CASCADE')
|
||||
environment = db.relationship(
|
||||
Environment,
|
||||
backref=sqlalchemy.orm.backref('hierarchy_levels',
|
||||
cascade="all, delete-orphan")
|
||||
)
|
||||
name = db.Column(db.String(128))
|
||||
|
||||
@sa_decl.declared_attr
|
||||
def parent_id(cls):
|
||||
return db.Column(pk_type, db.ForeignKey(cls.id))
|
||||
|
||||
@sa_decl.declared_attr
|
||||
def parent(cls):
|
||||
return db.relationship(cls,
|
||||
backref=db.backref('child', uselist=False),
|
||||
remote_side=cls.id)
|
||||
|
||||
__table_args__ = (
|
||||
db.UniqueConstraint('environment_id', 'name'),
|
||||
db.UniqueConstraint('environment_id', 'parent_id'),
|
||||
)
|
||||
__repr_attrs__ = ('id', 'environment', 'parent', 'name')
|
||||
|
||||
@classmethod
|
||||
def get_for_environment(cls, environment):
|
||||
query = cls.query.filter_by(environment=environment, parent=None)
|
||||
root_level = query.one_or_none()
|
||||
if not root_level:
|
||||
return []
|
||||
env_levels = [root_level]
|
||||
while env_levels[-1].child:
|
||||
env_levels.append(env_levels[-1].child)
|
||||
return env_levels
|
||||
|
||||
values = db.relationship('EnvironmentHierarchyLevelValue')
|
||||
|
||||
|
||||
class EnvironmentHierarchyLevelValue(ModelMixin, db.Model):
|
||||
level_id = fk(EnvironmentHierarchyLevel, ondelete='CASCADE')
|
||||
level = db.relationship(EnvironmentHierarchyLevel)
|
||||
value = db.Column(db.String(128))
|
||||
|
||||
__table_args__ = (
|
||||
db.UniqueConstraint('level_id', 'value'),
|
||||
)
|
||||
|
||||
__repr_attrs__ = ('id', 'level', 'value')
|
||||
|
||||
|
||||
class ResourceValues(ModelMixin, db.Model):
|
||||
environment_id = fk(Environment, ondelete='CASCADE')
|
||||
environment = db.relationship(Environment)
|
||||
resource_definition_id = fk(ResourceDefinition, ondelete='CASCADE')
|
||||
resource_definition = db.relationship(ResourceDefinition)
|
||||
level_value_id = fk(EnvironmentHierarchyLevelValue, ondelete='CASCADE')
|
||||
level_value = db.relationship('EnvironmentHierarchyLevelValue')
|
||||
values = db.Column(Json, server_default='{}')
|
||||
overrides = db.Column(Json, server_default='{}')
|
||||
|
||||
__table_args__ = (
|
||||
db.UniqueConstraint(environment_id, resource_definition_id,
|
||||
level_value_id),
|
||||
)
|
||||
__repr_attrs__ = ('id', 'environment', 'resource_definition',
|
||||
'level_value', 'values')
|
||||
|
||||
|
||||
def get_or_create(cls, **attrs):
|
||||
with db.session.begin(nested=True):
|
||||
item = cls.query.filter_by(**attrs).one_or_none()
|
||||
if not item:
|
||||
item = cls(**attrs)
|
||||
db.session.add(item)
|
||||
# TODO(yorik-sar): handle constraints failure in case of
|
||||
# race condition
|
||||
return item
|
||||
|
||||
|
||||
def fix_sqlite():
|
||||
engine = db.engine
|
||||
|
||||
@sqlalchemy.event.listens_for(engine, "connect")
|
||||
def _connect(dbapi_connection, connection_record):
|
||||
dbapi_connection.isolation_level = None
|
||||
|
||||
@sqlalchemy.event.listens_for(engine, "begin")
|
||||
def _begin(conn):
|
||||
conn.execute("BEGIN")
|
||||
|
||||
|
||||
def prefix_tables(module, prefix):
|
||||
ModelMixin.table_prefix = prefix
|
||||
reload(module)
|
||||
|
||||
|
||||
def unprefix_tables(module):
|
||||
ModelMixin.table_prefix = ""
|
||||
reload(module)
|
||||
|
||||
|
||||
def get_or_404(cls, ident):
|
||||
result = cls.query.get(ident)
|
||||
if result is None:
|
||||
raise errors.TuningboxNotFound(
|
||||
"{0} not found by id {1}".format(cls.__name__, ident)
|
||||
)
|
||||
return result
|
||||
|
||||
|
||||
def find_or_404(cls, **attrs):
|
||||
item = cls.query.filter_by(**attrs).one_or_none()
|
||||
if not item:
|
||||
raise errors.TuningboxNotFound(
|
||||
"{0} not found by {1}".format(cls.__name__, attrs)
|
||||
)
|
||||
return item
|
@ -1,47 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
|
||||
class BaseTuningboxError(Exception):
|
||||
pass
|
||||
|
||||
|
||||
class TuningboxIntegrityError(BaseTuningboxError):
|
||||
pass
|
||||
|
||||
|
||||
class TuningboxNotFound(BaseTuningboxError):
|
||||
pass
|
||||
|
||||
|
||||
class RequestValidationError(BaseTuningboxError):
|
||||
pass
|
||||
|
||||
|
||||
class KeysOperationError(BaseTuningboxError):
|
||||
pass
|
||||
|
||||
|
||||
class UnknownKeysOperation(KeysOperationError):
|
||||
pass
|
||||
|
||||
|
||||
class KeysPathNotExisted(KeysOperationError):
|
||||
pass
|
||||
|
||||
|
||||
class KeysPathInvalid(KeysOperationError):
|
||||
pass
|
||||
|
||||
|
||||
class KeysPathUnreachable(KeysOperationError):
|
||||
pass
|
@ -1,174 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
from __future__ import absolute_import
|
||||
|
||||
from cliff import command
|
||||
from fuelclient import client as fc_client
|
||||
|
||||
from tuning_box import cli
|
||||
from tuning_box.cli import base as cli_base
|
||||
from tuning_box.cli import components
|
||||
from tuning_box.cli import environments
|
||||
from tuning_box.cli import hierarchy_levels
|
||||
from tuning_box.cli import resource_definitions
|
||||
from tuning_box.cli import resources
|
||||
from tuning_box import client as tb_client
|
||||
|
||||
|
||||
class FuelHTTPClient(tb_client.HTTPClient):
|
||||
if hasattr(fc_client, 'DefaultAPIClient'):
|
||||
# Handling python-fuelclient version >= 10.0
|
||||
fc_api = fc_client.DefaultAPIClient
|
||||
else:
|
||||
# Handling python-fuelclient version <= 9.0
|
||||
fc_api = fc_client.APIClient
|
||||
|
||||
def __init__(self):
|
||||
service_catalog = self.fc_api.keystone_client.service_catalog
|
||||
base_url = service_catalog.url_for(
|
||||
service_type='config',
|
||||
endpoint_type='publicURL',
|
||||
)
|
||||
super(FuelHTTPClient, self).__init__(base_url)
|
||||
|
||||
def default_headers(self):
|
||||
headers = super(FuelHTTPClient, self).default_headers()
|
||||
if self.fc_api.auth_token:
|
||||
headers['X-Auth-Token'] = self.fc_api.auth_token
|
||||
return headers
|
||||
|
||||
|
||||
class FuelBaseCommand(cli_base.BaseCommand):
|
||||
def get_client(self):
|
||||
return FuelHTTPClient()
|
||||
|
||||
|
||||
class Get(FuelBaseCommand, resources.Get):
|
||||
pass
|
||||
|
||||
|
||||
class Set(FuelBaseCommand, resources.Set):
|
||||
pass
|
||||
|
||||
|
||||
class Delete(FuelBaseCommand, resources.Delete):
|
||||
pass
|
||||
|
||||
|
||||
class Override(FuelBaseCommand, resources.Override):
|
||||
pass
|
||||
|
||||
|
||||
class DeleteOverride(FuelBaseCommand, resources.DeleteOverride):
|
||||
pass
|
||||
|
||||
|
||||
class CreateEnvironment(FuelBaseCommand, environments.CreateEnvironment):
|
||||
pass
|
||||
|
||||
|
||||
class ListEnvironments(FuelBaseCommand, environments.ListEnvironments):
|
||||
pass
|
||||
|
||||
|
||||
class ShowEnvironment(FuelBaseCommand, environments.ShowEnvironment):
|
||||
pass
|
||||
|
||||
|
||||
class DeleteEnvironment(FuelBaseCommand, environments.DeleteEnvironment):
|
||||
pass
|
||||
|
||||
|
||||
class UpdateEnvironment(FuelBaseCommand, environments.UpdateEnvironment):
|
||||
pass
|
||||
|
||||
|
||||
class CreateComponent(FuelBaseCommand, components.CreateComponent):
|
||||
pass
|
||||
|
||||
|
||||
class ListComponents(FuelBaseCommand, components.ListComponents):
|
||||
pass
|
||||
|
||||
|
||||
class ShowComponent(FuelBaseCommand, components.ShowComponent):
|
||||
pass
|
||||
|
||||
|
||||
class DeleteComponent(FuelBaseCommand, components.DeleteComponent):
|
||||
pass
|
||||
|
||||
|
||||
class UpdateComponent(FuelBaseCommand, components.UpdateComponent):
|
||||
pass
|
||||
|
||||
|
||||
class CreateResourceDefinition(
|
||||
FuelBaseCommand,
|
||||
resource_definitions.CreateResourceDefinition
|
||||
):
|
||||
pass
|
||||
|
||||
|
||||
class ListResourceDefinitions(
|
||||
FuelBaseCommand,
|
||||
resource_definitions.ListResourceDefinitions
|
||||
):
|
||||
pass
|
||||
|
||||
|
||||
class ShowResourceDefinition(
|
||||
FuelBaseCommand,
|
||||
resource_definitions.ShowResourceDefinition
|
||||
):
|
||||
pass
|
||||
|
||||
|
||||
class DeleteResourceDefinition(
|
||||
FuelBaseCommand,
|
||||
resource_definitions.DeleteResourceDefinition
|
||||
):
|
||||
pass
|
||||
|
||||
|
||||
class UpdateResourceDefinition(
|
||||
FuelBaseCommand,
|
||||
resource_definitions.UpdateResourceDefinition
|
||||
):
|
||||
pass
|
||||
|
||||
|
||||
class ListHierarchyLevels(
|
||||
FuelBaseCommand,
|
||||
hierarchy_levels.ListHierarchyLevels
|
||||
):
|
||||
pass
|
||||
|
||||
|
||||
class ShowHierarchyLevel(
|
||||
FuelBaseCommand,
|
||||
hierarchy_levels.ShowHierarchyLevel
|
||||
):
|
||||
pass
|
||||
|
||||
|
||||
class Config(command.Command):
|
||||
def get_parser(self, *args, **kwargs):
|
||||
parser = super(Config, self).get_parser(*args, **kwargs)
|
||||
parser.add_argument('args', nargs='*')
|
||||
return parser
|
||||
|
||||
def take_action(self, parsed_args):
|
||||
client = FuelHTTPClient()
|
||||
app = cli.TuningBoxApp(client)
|
||||
app.run(parsed_args.args)
|
@ -1,100 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
from sqlalchemy.orm import exc as sa_exc
|
||||
|
||||
from tuning_box import db
|
||||
from tuning_box import errors
|
||||
from tuning_box.library import hierarchy_levels
|
||||
|
||||
|
||||
def load_objects(model, ids):
|
||||
if ids is None:
|
||||
return None
|
||||
result = []
|
||||
for obj_id in ids:
|
||||
obj = model.query.filter_by(id=obj_id).one_or_none()
|
||||
if obj is None:
|
||||
raise errors.TuningboxNotFound(
|
||||
"{0} not found by identifier: "
|
||||
"{1}".format(model.__tablename__, obj_id)
|
||||
)
|
||||
result.append(obj)
|
||||
return result
|
||||
|
||||
|
||||
def load_objects_by_id_or_name(model, identifiers):
|
||||
if identifiers is None:
|
||||
return None
|
||||
result = []
|
||||
for identifier in identifiers:
|
||||
obj = model.query.get_by_id_or_name(
|
||||
identifier, fail_on_none=False)
|
||||
if obj is None:
|
||||
raise errors.TuningboxNotFound(
|
||||
"{0} not found by identifier: "
|
||||
"{1}".format(model.__tablename__, identifier)
|
||||
)
|
||||
result.append(obj)
|
||||
return result
|
||||
|
||||
|
||||
def get_resource_definition(id_or_name, environment_id):
|
||||
query = db.ResourceDefinition.query.join(db.Component). \
|
||||
join(db.Environment.environment_components_table). \
|
||||
filter_by(environment_id=environment_id)
|
||||
|
||||
if isinstance(id_or_name, int):
|
||||
query = query.filter(db.ResourceDefinition.id == id_or_name)
|
||||
else:
|
||||
query = query.filter(db.ResourceDefinition.name == id_or_name)
|
||||
|
||||
result = query.all()
|
||||
|
||||
if not result:
|
||||
raise errors.TuningboxNotFound(
|
||||
"{0} not found by {1} in environment {2}".format(
|
||||
db.ResourceDefinition.__tablename__,
|
||||
id_or_name,
|
||||
environment_id
|
||||
)
|
||||
)
|
||||
elif len(result) > 1:
|
||||
raise sa_exc.MultipleResultsFound
|
||||
|
||||
return result[0]
|
||||
|
||||
|
||||
def get_resource_values(environment, levels, res_def):
|
||||
level_value = hierarchy_levels.get_environment_level_value(
|
||||
environment, levels)
|
||||
res_values = db.ResourceValues.query.filter_by(
|
||||
environment_id=environment.id,
|
||||
resource_definition_id=res_def.id,
|
||||
level_value=level_value,
|
||||
).all()
|
||||
|
||||
if not res_values:
|
||||
raise errors.TuningboxNotFound(
|
||||
"Resource values not found by environment {0}, "
|
||||
"resource definition {1} for levels {2}".format(
|
||||
environment.id, res_def.id, levels
|
||||
)
|
||||
)
|
||||
elif len(res_values) > 1:
|
||||
raise errors.TuningboxIntegrityError(
|
||||
"Found more than one resource values for environment {0}, "
|
||||
"resource definition {1} for levels {2}".format(
|
||||
environment.id, res_def.id, levels
|
||||
)
|
||||
)
|
||||
return res_values[0]
|
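The helpers above share one contract: resolve each identifier to a row and raise TuningboxNotFound naming the table and the failing identifier as soon as a lookup misses. A rough, dictionary-based sketch of that contract, with no SQLAlchemy involved (all names below are illustrative, not part of the retired module):

class NotFound(Exception):
    pass


def load_by_id(table, ids):
    # same shape as load_objects, with a plain dict standing in for the model
    if ids is None:
        return None
    result = []
    for obj_id in ids:
        obj = table.get(obj_id)   # stand-in for query.filter_by(id=...).one_or_none()
        if obj is None:
            raise NotFound("component not found by identifier: {0}".format(obj_id))
        result.append(obj)
    return result


components = {1: {'id': 1, 'name': 'keystone'}}
load_by_id(components, [1])       # [{'id': 1, 'name': 'keystone'}]
# load_by_id(components, [2]) raises NotFound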
@ -1,77 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
|
||||
import flask
|
||||
import flask_restful
|
||||
from flask_restful import fields
|
||||
|
||||
from tuning_box import db
|
||||
from tuning_box import library
|
||||
from tuning_box.library import resource_definitions
|
||||
|
||||
component_fields = {
|
||||
'id': fields.Integer,
|
||||
'name': fields.String,
|
||||
'resource_definitions': fields.List(fields.Nested(
|
||||
resource_definitions.resource_definition_fields
|
||||
))
|
||||
}
|
||||
|
||||
|
||||
class ComponentsCollection(flask_restful.Resource):
|
||||
method_decorators = [flask_restful.marshal_with(component_fields)]
|
||||
|
||||
def get(self):
|
||||
return db.Component.query.order_by(db.Component.id).all()
|
||||
|
||||
@db.with_transaction
|
||||
def post(self):
|
||||
component = db.Component(name=flask.request.json['name'])
|
||||
component.resource_definitions = []
|
||||
for res_def_data in flask.request.json.get('resource_definitions', []):
|
||||
res_def = db.ResourceDefinition(
|
||||
name=res_def_data['name'], content=res_def_data.get('content'))
|
||||
component.resource_definitions.append(res_def)
|
||||
db.db.session.add(component)
|
||||
return component, 201
|
||||
|
||||
|
||||
class Component(flask_restful.Resource):
|
||||
method_decorators = [flask_restful.marshal_with(component_fields)]
|
||||
|
||||
def get(self, component_id):
|
||||
return db.get_or_404(db.Component, component_id)
|
||||
|
||||
@db.with_transaction
|
||||
def _perform_update(self, component_id):
|
||||
component = db.get_or_404(db.Component, component_id)
|
||||
update_by = flask.request.json
|
||||
component.name = update_by.get('name', component.name)
|
||||
res_definitions = update_by.get('resource_definitions')
|
||||
if res_definitions is not None:
|
||||
ids = [data['id'] for data in res_definitions]
|
||||
resources = library.load_objects(db.ResourceDefinition, ids)
|
||||
component.resource_definitions = resources
|
||||
|
||||
def put(self, component_id):
|
||||
return self.patch(component_id)
|
||||
|
||||
def patch(self, component_id):
|
||||
self._perform_update(component_id)
|
||||
return None, 204
|
||||
|
||||
@db.with_transaction
|
||||
def delete(self, component_id):
|
||||
component = db.get_or_404(db.Component, component_id)
|
||||
db.db.session.delete(component)
|
||||
return None, 204
|
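To illustrate the collection endpoint above: a component with nested resource definitions is created from a single JSON payload, and the new component comes back marshalled with component_fields. The URL below is only a guess at how the resource might be mounted; the routing is defined elsewhere in the application:

import requests

payload = {
    'name': 'keystone',
    'resource_definitions': [
        {'name': 'keystone_config', 'content': {'DEFAULT': {'debug': True}}},
    ],
}
resp = requests.post('http://127.0.0.1:5000/components', json=payload)
assert resp.status_code == 201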
@ -1,142 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import flask
|
||||
import flask_restful
|
||||
from flask_restful import fields
|
||||
|
||||
from tuning_box import db
|
||||
from tuning_box import errors
|
||||
from tuning_box import library
|
||||
|
||||
environment_fields = {
|
||||
'id': fields.Integer,
|
||||
'components': fields.List(fields.Integer(attribute='id')),
|
||||
'hierarchy_levels': fields.List(fields.String(attribute='name')),
|
||||
}
|
||||
|
||||
|
||||
def prepare_env_for_output(env):
|
||||
# Proper order of levels can't be provided by ORM backref
|
||||
hierarchy_levels = db.EnvironmentHierarchyLevel.get_for_environment(env)
|
||||
return {'id': env.id, 'components': env.components,
|
||||
'hierarchy_levels': hierarchy_levels}
|
||||
|
||||
|
||||
class EnvironmentsCollection(flask_restful.Resource):
|
||||
method_decorators = [flask_restful.marshal_with(environment_fields)]
|
||||
|
||||
def get(self):
|
||||
envs = db.Environment.query.order_by(db.Environment.id).all()
|
||||
result = []
|
||||
for env in envs:
|
||||
result.append(prepare_env_for_output(env))
|
||||
return result, 200
|
||||
|
||||
def _check_components(self, components):
|
||||
identities = set()
|
||||
duplicates = set()
|
||||
id_names = ('id', 'name')
|
||||
for component in components:
|
||||
for id_name in id_names:
|
||||
value = getattr(component, id_name)
|
||||
|
||||
if value not in identities:
|
||||
identities.add(value)
|
||||
else:
|
||||
duplicates.add(value)
|
||||
if duplicates:
|
||||
raise errors.TuningboxIntegrityError(
|
||||
"Components duplicates: {0}".format(duplicates))
|
||||
|
||||
@db.with_transaction
|
||||
def post(self):
|
||||
component_ids = flask.request.json['components']
|
||||
components = [db.Component.query.get_by_id_or_name(i)
|
||||
for i in component_ids]
|
||||
self._check_components(components)
|
||||
|
||||
hierarchy_levels = []
|
||||
level = None
|
||||
for name in flask.request.json['hierarchy_levels']:
|
||||
level = db.EnvironmentHierarchyLevel(name=name, parent=level)
|
||||
hierarchy_levels.append(level)
|
||||
|
||||
environment = db.Environment(components=components,
|
||||
hierarchy_levels=hierarchy_levels)
|
||||
if 'id' in flask.request.json:
|
||||
environment.id = flask.request.json['id']
|
||||
db.db.session.add(environment)
|
||||
return prepare_env_for_output(environment), 201
|
||||
|
||||
|
||||
class Environment(flask_restful.Resource):
|
||||
method_decorators = [flask_restful.marshal_with(environment_fields)]
|
||||
|
||||
def get(self, environment_id):
|
||||
env = db.get_or_404(db.Environment, environment_id)
|
||||
return prepare_env_for_output(env), 200
|
||||
|
||||
def _update_components(self, environment, components):
|
||||
if components is not None:
|
||||
new_components = library.load_objects_by_id_or_name(
|
||||
db.Component, components)
|
||||
environment.components = new_components
|
||||
|
||||
def _update_hierarchy_levels(self, environment, hierarchy_levels_names):
|
||||
if hierarchy_levels_names is not None:
|
||||
old_hierarchy_levels = db.EnvironmentHierarchyLevel.query.filter(
|
||||
db.EnvironmentHierarchyLevel.environment_id == environment.id
|
||||
).all()
|
||||
|
||||
new_hierarchy_levels = []
|
||||
|
||||
for level_name in hierarchy_levels_names:
|
||||
level = db.get_or_create(
|
||||
db.EnvironmentHierarchyLevel,
|
||||
name=level_name,
|
||||
environment=environment
|
||||
)
|
||||
new_hierarchy_levels.append(level)
|
||||
|
||||
parent_id = None
|
||||
for level in new_hierarchy_levels:
|
||||
level.parent_id = parent_id
|
||||
parent_id = level.id
|
||||
for old_level in old_hierarchy_levels:
|
||||
if old_level not in new_hierarchy_levels:
|
||||
db.db.session.delete(old_level)
|
||||
environment.hierarchy_levels = new_hierarchy_levels
|
||||
|
||||
@db.with_transaction
|
||||
def _perform_update(self, environment_id):
|
||||
environment = db.get_or_404(db.Environment, environment_id)
|
||||
update_by = flask.request.json
|
||||
|
||||
components = update_by.get('components')
|
||||
self._update_components(environment, components)
|
||||
|
||||
hierarchy_levels = update_by.get('hierarchy_levels')
|
||||
self._update_hierarchy_levels(environment, hierarchy_levels)
|
||||
|
||||
def put(self, environment_id):
|
||||
return self.patch(environment_id)
|
||||
|
||||
def patch(self, environment_id):
|
||||
self._perform_update(environment_id)
|
||||
return None, 204
|
||||
|
||||
@db.with_transaction
|
||||
def delete(self, environment_id):
|
||||
environment = db.get_or_404(db.Environment, environment_id)
|
||||
db.db.session.delete(environment)
|
||||
return None, 204
|
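For reference, the POST handler above takes component ids or names plus an ordered list of hierarchy level names, where the order defines the parent-to-child chain and duplicates are rejected. A sketch of such a payload (endpoint path assumed, as with the component example above):

import requests

payload = {
    'components': ['keystone', 2],            # names and ids may be mixed
    'hierarchy_levels': ['region', 'node'],   # order defines parent -> child
}
resp = requests.post('http://127.0.0.1:5000/environments', json=payload)
assert resp.status_code == 201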
@ -1,110 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import flask
|
||||
from flask import current_app as app
|
||||
|
||||
import flask_restful
|
||||
from flask_restful import fields
|
||||
|
||||
from tuning_box import db
|
||||
from tuning_box import errors
|
||||
|
||||
|
||||
def iter_environment_level_values(environment, levels):
|
||||
app.logger.debug("Getting environment level values. Env: %s, "
|
||||
"levels: %s", environment.id, levels)
|
||||
env_levels = db.EnvironmentHierarchyLevel.get_for_environment(environment)
|
||||
app.logger.debug("Environment levels got. Env: %s, levels: %s",
|
||||
environment.id, [l.name for l in env_levels])
|
||||
|
||||
if len(env_levels) < len(levels):
|
||||
raise errors.TuningboxNotFound(
|
||||
"Levels {0} can't be matched with "
|
||||
"environment {1} levels: {2}".format(
|
||||
[l[0] for l in levels],
|
||||
environment.id,
|
||||
[l.name for l in env_levels]
|
||||
)
|
||||
)
|
||||
level_pairs = zip(env_levels, levels)
|
||||
for env_level, (level_name, level_value) in level_pairs:
|
||||
if env_level.name != level_name:
|
||||
raise errors.TuningboxNotFound(
|
||||
"Unexpected level name '{0}'. Expected '{1}'.".format(
|
||||
level_name, env_level.name)
|
||||
)
|
||||
|
||||
level_value_db = db.get_or_create(
|
||||
db.EnvironmentHierarchyLevelValue,
|
||||
level=env_level,
|
||||
value=level_value,
|
||||
)
|
||||
yield level_value_db
|
||||
|
||||
|
||||
def get_environment_level_value(environment, levels):
|
||||
level_value = None
|
||||
for level_value in iter_environment_level_values(environment, levels):
|
||||
pass
|
||||
return level_value
|
||||
|
||||
|
||||
environment_hierarchy_level_fields = {
|
||||
'id': fields.Integer,
|
||||
'name': fields.String,
|
||||
'environment_id': fields.Integer,
|
||||
'parent': fields.String(attribute='parent.name'),
|
||||
'values': fields.List(fields.String(attribute='value'))
|
||||
}
|
||||
|
||||
|
||||
class EnvironmentHierarchyLevelsCollection(flask_restful.Resource):
|
||||
method_decorators = [
|
||||
flask_restful.marshal_with(environment_hierarchy_level_fields)
|
||||
]
|
||||
|
||||
def get(self, environment_id):
|
||||
env = db.get_or_404(db.Environment, environment_id)
|
||||
return db.EnvironmentHierarchyLevel.get_for_environment(env)
|
||||
|
||||
|
||||
class EnvironmentHierarchyLevels(flask_restful.Resource):
|
||||
method_decorators = [
|
||||
flask_restful.marshal_with(environment_hierarchy_level_fields)
|
||||
]
|
||||
|
||||
def _get_query_params(self, environment_id, id_or_name):
|
||||
params = {'environment_id': environment_id}
|
||||
if isinstance(id_or_name, int):
|
||||
params['id'] = id_or_name
|
||||
else:
|
||||
params['name'] = id_or_name
|
||||
return params
|
||||
|
||||
def get(self, environment_id, id_or_name):
|
||||
params = self._get_query_params(environment_id, id_or_name)
|
||||
level = db.find_or_404(db.EnvironmentHierarchyLevel, **params)
|
||||
return level
|
||||
|
||||
@db.with_transaction
|
||||
def _do_update(self, environment_id, id_or_name):
|
||||
params = self._get_query_params(environment_id, id_or_name)
|
||||
level = db.find_or_404(db.EnvironmentHierarchyLevel, **params)
|
||||
level.name = flask.request.json.get('name', level.name)
|
||||
|
||||
def put(self, environment_id, id_or_name):
|
||||
return self.patch(environment_id, id_or_name)
|
||||
|
||||
def patch(self, environment_id, id_or_name):
|
||||
self._do_update(environment_id, id_or_name)
|
||||
return None, 204
|
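The levels argument consumed above is an ordered list of (level_name, level_value) pairs and must be a prefix of the environment's own level ordering. A small, database-free sketch that mirrors the validation in iter_environment_level_values (names are illustrative):

def check_levels(env_level_names, levels):
    if len(env_level_names) < len(levels):
        raise ValueError("more levels passed than the environment defines")
    for env_name, (name, value) in zip(env_level_names, levels):
        if env_name != name:
            raise ValueError(
                "unexpected level {0!r}, expected {1!r}".format(name, env_name))


check_levels(['region', 'node'], [('region', 'us-east-1')])   # accepted
# check_levels(['region', 'node'], [('node', 'node-7')]) raises ValueError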
@ -1,108 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import flask
|
||||
import flask_restful
|
||||
from flask_restful import fields
|
||||
|
||||
from tuning_box import db
|
||||
from tuning_box.library import resource_keys_operation
|
||||
|
||||
resource_definition_fields = {
|
||||
'id': fields.Integer,
|
||||
'name': fields.String,
|
||||
'component_id': fields.Integer(default=None),
|
||||
'content': fields.Raw,
|
||||
}
|
||||
|
||||
|
||||
class ResourceDefinitionsCollection(flask_restful.Resource):
|
||||
method_decorators = [
|
||||
flask_restful.marshal_with(resource_definition_fields)
|
||||
]
|
||||
|
||||
def get(self):
|
||||
query = db.ResourceDefinition.query
|
||||
if 'component_id' in flask.request.args:
|
||||
component_id = flask.request.args.get('component_id')
|
||||
component_id = component_id or None
|
||||
query = query.filter(
|
||||
db.ResourceDefinition.component_id == component_id
|
||||
)
|
||||
return query.all()
|
||||
|
||||
@db.with_transaction
|
||||
def post(self):
|
||||
data = dict()
|
||||
for field_name in resource_definition_fields.keys():
|
||||
data[field_name] = flask.request.json.get(field_name, None)
|
||||
resource_definition = db.ResourceDefinition(**data)
|
||||
db.db.session.add(resource_definition)
|
||||
return resource_definition, 201
|
||||
|
||||
|
||||
class ResourceDefinition(flask_restful.Resource):
|
||||
method_decorators = [
|
||||
flask_restful.marshal_with(resource_definition_fields)]
|
||||
|
||||
def get(self, resource_definition_id):
|
||||
return db.get_or_404(db.ResourceDefinition, resource_definition_id)
|
||||
|
||||
@db.with_transaction
|
||||
def _perform_update(self, resource_definition_id):
|
||||
res_definition = db.get_or_404(
|
||||
db.ResourceDefinition, resource_definition_id)
|
||||
update_by = flask.request.json
|
||||
skip_fields = ('id', )
|
||||
|
||||
for field_name in resource_definition_fields.keys():
|
||||
|
||||
if field_name in skip_fields:
|
||||
continue
|
||||
if field_name in update_by:
|
||||
setattr(
|
||||
res_definition, field_name,
|
||||
update_by.get(field_name)
|
||||
)
|
||||
|
||||
def put(self, resource_definition_id):
|
||||
return self.patch(resource_definition_id)
|
||||
|
||||
def patch(self, resource_definition_id):
|
||||
self._perform_update(resource_definition_id)
|
||||
return None, 204
|
||||
|
||||
@db.with_transaction
|
||||
def delete(self, resource_definition_id):
|
||||
res_definition = db.get_or_404(
|
||||
db.ResourceDefinition, resource_definition_id)
|
||||
db.db.session.delete(res_definition)
|
||||
return None, 204
|
||||
|
||||
|
||||
class ResourceDefinitionKeys(flask_restful.Resource,
|
||||
resource_keys_operation.KeysOperationMixin):
|
||||
|
||||
@db.with_transaction
|
||||
def _do_update(self, resource_definition_id, operation):
|
||||
res_definition = db.get_or_404(
|
||||
db.ResourceDefinition, resource_definition_id)
|
||||
result = self.perform_operation(operation, res_definition.content,
|
||||
flask.request.json)
|
||||
res_definition.content = result
|
||||
|
||||
def put(self, resource_definition_id, operation):
|
||||
return self.patch(resource_definition_id, operation)
|
||||
|
||||
def patch(self, resource_definition_id, operation):
|
||||
self._do_update(resource_definition_id, operation)
|
||||
return None, 204
|
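One detail of the GET handler above is easy to miss: an empty component_id query value is coerced to None, so it selects definitions that are not attached to any component. Assuming the collection is served at /resource_definitions (the routing lives elsewhere), the two filters would be:

import requests

base = 'http://127.0.0.1:5000/resource_definitions'
attached = requests.get(base, params={'component_id': 1}).json()
unattached = requests.get(base + '?component_id=').json()   # component_id IS NULL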
@ -1,208 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import copy
|
||||
|
||||
import flask
|
||||
|
||||
from tuning_box import db
|
||||
from tuning_box import errors
|
||||
from tuning_box import library
|
||||
|
||||
|
||||
class KeysOperationMixin(object):
|
||||
|
||||
OPERATION_GET = 'get'
|
||||
OPERATION_SET = 'set'
|
||||
OPERATION_DELETE = 'delete'
|
||||
|
||||
OPERATIONS = (OPERATION_GET, OPERATION_SET, OPERATION_DELETE)
|
||||
|
||||
def _check_out_of_index(self, cur_point, key, keys_path):
|
||||
if isinstance(cur_point, (list, tuple)) and key >= len(cur_point):
|
||||
raise errors.KeysPathNotExisted(
|
||||
"Keys path doesn't exist {0}. "
|
||||
"Failed on the key {1}".format(keys_path, key)
|
||||
)
|
||||
|
||||
def _check_key_existed(self, cur_point, key, keys_path):
|
||||
if isinstance(cur_point, dict) and key not in cur_point:
|
||||
raise errors.KeysPathNotExisted(
|
||||
"Keys path doesn't exist {0}. "
|
||||
"Failed on the key {1}".format(keys_path, key)
|
||||
)
|
||||
|
||||
def _check_path_is_reachable(self, cur_point, key, keys_path):
|
||||
if not isinstance(cur_point, (list, tuple, dict)):
|
||||
raise errors.KeysPathUnreachable(
|
||||
"Leaf value {0} found on key {1} "
|
||||
"in keys path {2}".format(cur_point, key, keys_path)
|
||||
)
|
||||
|
||||
def _cast_key(self, key, cur_point):
|
||||
"""Casts indexes of lists and tuples to integer.
|
||||
|
||||
Keys paths can be passed as part of url or as command line
|
||||
parameter: k1.k2.0.k4. So we need to cast list and tuple
|
||||
indexes to integers
|
||||
|
||||
:param key: key
|
||||
:param cur_point: data structure where key should be set
|
||||
:return:
|
||||
"""
|
||||
if isinstance(cur_point, (list, tuple)):
|
||||
key = int(key)
|
||||
return key
|
||||
|
||||
def do_get(self, storage, keys_paths):
|
||||
"""Gets values from storage by keys paths.
|
||||
|
||||
Keys path is list of keys paths. If we have keys_paths
|
||||
[['a', 'b']], then storage['a']['b'] will be get as result.
|
||||
|
||||
:param storage: original data
|
||||
:param keys_paths: lists of keys paths to be set
|
||||
:returns: value from storage specified by keys_paths
|
||||
"""
|
||||
|
||||
# Removing show lookup information from the data
|
||||
show_lookup = 'show_lookup' in flask.request.args
|
||||
effective = 'effective' in flask.request.args
|
||||
|
||||
if effective and show_lookup:
|
||||
storage_copy = copy.deepcopy(storage)
|
||||
            for k in list(storage_copy):  # iterate over a key snapshot; works on Python 2 and 3
|
||||
storage_copy[k] = storage[k][0]
|
||||
else:
|
||||
storage_copy = storage
|
||||
|
||||
result = []
|
||||
for keys_path in keys_paths:
|
||||
cur_point = storage_copy
|
||||
if not keys_path:
|
||||
continue
|
||||
|
||||
try:
|
||||
for key in keys_path[:-1]:
|
||||
key = self._cast_key(key, cur_point)
|
||||
cur_point = cur_point[key]
|
||||
key = keys_path[-1]
|
||||
key = self._cast_key(key, cur_point)
|
||||
self._check_path_is_reachable(cur_point, key, keys_path)
|
||||
|
||||
if effective and show_lookup:
|
||||
result.append([cur_point[key], storage[keys_path[0]][1]])
|
||||
else:
|
||||
result.append(cur_point[key])
|
||||
|
||||
except (KeyError, IndexError):
|
||||
raise errors.KeysPathNotExisted(
|
||||
"Keys path doesn't exist {0}. "
|
||||
"Failed on the key {1}".format(keys_path, key)
|
||||
)
|
||||
return result
|
||||
|
||||
def do_set(self, storage, keys_paths):
|
||||
"""Sets values from keys paths to storage.
|
||||
|
||||
Keys path is list of keys paths. If we have keys_paths
|
||||
[['a', 'b', 'val']], then storage['a']['b'] will be set to 'val'.
|
||||
Last value in the keys path is value to be set.
|
||||
|
||||
:param storage: original data
|
||||
:param keys_paths: lists of keys paths to be set
|
||||
:returns: result of merging keys_paths and storage
|
||||
"""
|
||||
|
||||
storage_copy = copy.deepcopy(storage)
|
||||
for keys_path in keys_paths:
|
||||
cur_point = storage_copy
|
||||
if len(keys_path) < 2:
|
||||
raise errors.KeysPathInvalid(
|
||||
"Keys path {0} invalid. Keys path should contain "
|
||||
"at least one key and value".format(keys_path)
|
||||
)
|
||||
|
||||
for key in keys_path[:-2]:
|
||||
key = self._cast_key(key, cur_point)
|
||||
self._check_path_is_reachable(cur_point, key, keys_path)
|
||||
self._check_out_of_index(cur_point, key, keys_path)
|
||||
self._check_key_existed(cur_point, key, keys_path)
|
||||
cur_point = cur_point[key]
|
||||
|
||||
assign_to = self._cast_key(keys_path[-2], cur_point)
|
||||
self._check_path_is_reachable(cur_point, assign_to, keys_path)
|
||||
self._check_out_of_index(cur_point, assign_to, keys_path)
|
||||
cur_point[assign_to] = keys_path[-1]
|
||||
|
||||
return storage_copy
|
||||
|
||||
def do_delete(self, storage, keys_paths):
|
||||
"""Deletes keys paths from storage.
|
||||
|
||||
Keys path is list of keys paths. If we have keys_paths
|
||||
[['a', 'b']], then storage['a']['b'] will be removed.
|
||||
|
||||
:param storage: data
|
||||
:param keys_paths: lists of keys paths to be deleted
|
||||
:returns: result of keys_paths deletion from storage
|
||||
"""
|
||||
|
||||
storage_copy = copy.deepcopy(storage)
|
||||
for keys_path in keys_paths:
|
||||
cur_point = storage_copy
|
||||
if not keys_path:
|
||||
continue
|
||||
|
||||
try:
|
||||
for key in keys_path[:-1]:
|
||||
key = self._cast_key(key, cur_point)
|
||||
cur_point = cur_point[key]
|
||||
key = keys_path[-1]
|
||||
key = self._cast_key(key, cur_point)
|
||||
self._check_path_is_reachable(cur_point, key, keys_path)
|
||||
del cur_point[key]
|
||||
except (KeyError, IndexError):
|
||||
raise errors.KeysPathNotExisted(
|
||||
"Keys path doesn't exist {0}. "
|
||||
"Failed on the key {1}".format(keys_path, key)
|
||||
)
|
||||
return storage_copy
|
||||
|
||||
def perform_operation(self, operation, storage, keys_paths):
|
||||
if operation == self.OPERATION_SET:
|
||||
return self.do_set(storage, keys_paths)
|
||||
elif operation == self.OPERATION_DELETE:
|
||||
return self.do_delete(storage, keys_paths)
|
||||
elif operation == self.OPERATION_GET:
|
||||
return self.do_get(storage, keys_paths)
|
||||
else:
|
||||
raise errors.UnknownKeysOperation(
|
||||
"Unknown operation: {0}. "
|
||||
"Allowed operations: {1}".format(operation, self.OPERATIONS)
|
||||
)
|
||||
|
||||
|
||||
class ResourceKeysMixin(KeysOperationMixin):
|
||||
|
||||
@db.with_transaction
|
||||
def _do_update(self, environment_id, levels,
|
||||
resource_id_or_name, operation, storage_name):
|
||||
|
||||
environment = db.Environment.query.get_or_404(environment_id)
|
||||
res_def = library.get_resource_definition(
|
||||
resource_id_or_name, environment_id)
|
||||
|
||||
res_values = library.get_resource_values(environment, levels, res_def)
|
||||
result = self.perform_operation(
|
||||
operation, getattr(res_values, storage_name), flask.request.json)
|
||||
setattr(res_values, storage_name, result)
|
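Outside of a request context, the mixin defined above can be exercised directly on plain nested data; do_set and do_delete never touch Flask, while do_get additionally reads flask.request.args, so it is left out of this quick sketch of the keys-path semantics:

ops = KeysOperationMixin()

storage = {'glance': {'api_workers': 2, 'backends': ['file', 'swift']}}

# a set path is the chain of keys with the new value appended as the last item
updated = ops.do_set(storage, [['glance', 'api_workers', 4]])
updated['glance']['api_workers']              # 4

# list indexes arrive as strings (e.g. from URLs) and are cast to int
updated = ops.do_set(updated, [['glance', 'backends', '1', 'ceph']])
updated['glance']['backends']                 # ['file', 'ceph']

trimmed = ops.do_delete(updated, [['glance', 'backends']])
'backends' in trimmed['glance']               # False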
@ -1,69 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import flask
|
||||
import flask_restful
|
||||
|
||||
from tuning_box import db
|
||||
from tuning_box import library
|
||||
from tuning_box.library import hierarchy_levels
|
||||
from tuning_box.library import resource_keys_operation
|
||||
|
||||
|
||||
class ResourceOverrides(flask_restful.Resource):
|
||||
|
||||
@db.with_transaction
|
||||
def put(self, environment_id, levels, resource_id_or_name):
|
||||
environment = db.Environment.query.get_or_404(environment_id)
|
||||
res_def = library.get_resource_definition(
|
||||
resource_id_or_name, environment_id)
|
||||
|
||||
level_value = hierarchy_levels.get_environment_level_value(
|
||||
environment, levels)
|
||||
esv = db.get_or_create(
|
||||
db.ResourceValues,
|
||||
environment=environment,
|
||||
resource_definition=res_def,
|
||||
level_value=level_value,
|
||||
)
|
||||
esv.overrides = flask.request.json
|
||||
return None, 204
|
||||
|
||||
@db.with_transaction
|
||||
def get(self, environment_id, resource_id_or_name, levels):
|
||||
environment = db.Environment.query.get_or_404(environment_id)
|
||||
res_def = library.get_resource_definition(
|
||||
resource_id_or_name, environment_id)
|
||||
|
||||
level_value = hierarchy_levels.get_environment_level_value(
|
||||
environment, levels)
|
||||
res_values = db.ResourceValues.query.filter_by(
|
||||
resource_definition=res_def,
|
||||
environment=environment,
|
||||
level_value=level_value,
|
||||
).one_or_none()
|
||||
if not res_values:
|
||||
return {}
|
||||
return res_values.overrides
|
||||
|
||||
|
||||
class ResourceOverridesKeys(flask_restful.Resource,
|
||||
resource_keys_operation.ResourceKeysMixin):
|
||||
|
||||
def put(self, environment_id, levels, resource_id_or_name, operation):
|
||||
return self.patch(environment_id, levels,
|
||||
resource_id_or_name, operation)
|
||||
|
||||
def patch(self, environment_id, levels, resource_id_or_name, operation):
|
||||
self._do_update(environment_id, levels, resource_id_or_name,
|
||||
operation, 'overrides')
|
||||
return None, 204
|
@ -1,168 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import flask
|
||||
from flask import current_app as app
|
||||
import flask_restful
|
||||
import six
|
||||
from sqlalchemy import or_
|
||||
|
||||
from tuning_box import db
|
||||
from tuning_box import errors
|
||||
from tuning_box import library
|
||||
from tuning_box.library import hierarchy_levels
|
||||
from tuning_box.library import resource_keys_operation
|
||||
from tuning_box.library.resource_keys_operation import KeysOperationMixin
|
||||
|
||||
|
||||
class ResourceValues(flask_restful.Resource, KeysOperationMixin):
|
||||
|
||||
KEYS_PATH_DELIMITER = '.'
|
||||
|
||||
@db.with_transaction
|
||||
def put(self, environment_id, levels, resource_id_or_name):
|
||||
environment = db.Environment.query.get_or_404(environment_id)
|
||||
res_def = library.get_resource_definition(
|
||||
resource_id_or_name, environment_id)
|
||||
|
||||
level_value = hierarchy_levels.get_environment_level_value(
|
||||
environment, levels)
|
||||
esv = db.get_or_create(
|
||||
db.ResourceValues,
|
||||
environment=environment,
|
||||
resource_definition=res_def,
|
||||
level_value=level_value,
|
||||
)
|
||||
esv.values = flask.request.json
|
||||
return None, 204
|
||||
|
||||
def _calculate_effective_values(self, result, level_value,
|
||||
resource_values_idx, show_lookup,
|
||||
lookup_path):
|
||||
level_value_id = getattr(level_value, 'id', None)
|
||||
if level_value_id in resource_values_idx:
|
||||
resource_value = resource_values_idx[level_value_id]
|
||||
if show_lookup:
|
||||
values = ((k, (v, lookup_path)) for k, v in
|
||||
six.iteritems(resource_value.values))
|
||||
overrides = ((k, (v, lookup_path)) for k, v in
|
||||
six.iteritems(resource_value.overrides))
|
||||
else:
|
||||
values = resource_value.values
|
||||
overrides = resource_value.overrides
|
||||
result.update(values)
|
||||
result.update(overrides)
|
||||
|
||||
@db.with_transaction
|
||||
def get(self, environment_id, resource_id_or_name, levels):
|
||||
app.logger.debug("Getting resource value. Env: %s, "
|
||||
"resource: %s, levels: %s", environment_id,
|
||||
resource_id_or_name, levels)
|
||||
|
||||
effective = 'effective' in flask.request.args
|
||||
show_lookup = 'show_lookup' in flask.request.args
|
||||
|
||||
if show_lookup and not effective:
|
||||
raise errors.RequestValidationError(
|
||||
"Lookup path tracing can be done only for effective values")
|
||||
|
||||
environment = db.Environment.query.get_or_404(environment_id)
|
||||
res_def = library.get_resource_definition(
|
||||
resource_id_or_name, environment_id)
|
||||
|
||||
level_values = list(hierarchy_levels.iter_environment_level_values(
|
||||
environment, levels))
|
||||
|
||||
level_values_ids = [l.id for l in level_values]
|
||||
app.logger.debug("Got level values ids: %s", level_values_ids)
|
||||
|
||||
if effective:
|
||||
app.logger.debug("Getting effective resource value. Env: %s, "
|
||||
"resource: %s, levels: %s", environment_id,
|
||||
resource_id_or_name, levels)
|
||||
resource_values = db.ResourceValues.query.filter(
|
||||
or_(
|
||||
db.ResourceValues.level_value_id.in_(level_values_ids),
|
||||
db.ResourceValues.level_value_id.is_(None)
|
||||
),
|
||||
db.ResourceValues.resource_definition == res_def,
|
||||
db.ResourceValues.environment == environment
|
||||
).all()
|
||||
app.logger.debug("Processing values for resource: %s, env: %s. "
|
||||
"Loaded resource values: %s",
|
||||
res_def.id, environment.id, len(resource_values))
|
||||
# Creating index of resource_values by level_value_id
|
||||
resource_values_idx = {r.level_value_id: r
|
||||
for r in resource_values}
|
||||
app.logger.debug("Resource values index size: %s",
|
||||
len(resource_values_idx))
|
||||
|
||||
result = {}
|
||||
lookup_path = '/'
|
||||
self._calculate_effective_values(
|
||||
result, None, resource_values_idx, show_lookup,
|
||||
lookup_path)
|
||||
|
||||
for level_value in level_values:
|
||||
name = level_value.level.name
|
||||
value = level_value.value
|
||||
lookup_path += name + '/' + value + '/'
|
||||
|
||||
self._calculate_effective_values(
|
||||
result, level_value, resource_values_idx, show_lookup,
|
||||
lookup_path)
|
||||
|
||||
app.logger.debug("Effective values got for resource: "
|
||||
"%s, env: %s", res_def.id, environment.id)
|
||||
else:
|
||||
if not level_values:
|
||||
level_value = None
|
||||
else:
|
||||
level_value = level_values[-1]
|
||||
resource_values = db.ResourceValues.query.filter_by(
|
||||
resource_definition=res_def,
|
||||
environment=environment,
|
||||
level_value=level_value,
|
||||
).one_or_none()
|
||||
app.logger.debug("Values got for resource: "
|
||||
"%s, env: %s", res_def.id, environment.id)
|
||||
if resource_values:
|
||||
result = resource_values.values
|
||||
else:
|
||||
result = {}
|
||||
return self._extract_keys_paths(result)
|
||||
|
||||
def _extract_keys_paths(self, data):
|
||||
if 'key' not in flask.request.args:
|
||||
return data
|
||||
keys_path = flask.request.args['key'].split(self.KEYS_PATH_DELIMITER)
|
||||
app.logger.debug("Extracting data by keys paths: %s", keys_path)
|
||||
result = self.do_get(data, [keys_path])
|
||||
# Single keys path is passed as GET request parameter, so we need
|
||||
# only first result
|
||||
result = result[0]
|
||||
app.logger.debug("Extracted data by keys paths: %s is: %s",
|
||||
keys_path, result)
|
||||
return result
|
||||
|
||||
|
||||
class ResourceValuesKeys(flask_restful.Resource,
|
||||
resource_keys_operation.ResourceKeysMixin):
|
||||
|
||||
def put(self, environment_id, levels, resource_id_or_name, operation):
|
||||
return self.patch(environment_id, levels,
|
||||
resource_id_or_name, operation)
|
||||
|
||||
def patch(self, environment_id, levels, resource_id_or_name, operation):
|
||||
self._do_update(environment_id, levels, resource_id_or_name,
|
||||
operation, 'values')
|
||||
return None, 204
|
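The effective branch above assembles the answer by walking from the environment root down to the requested level and applying values before overrides at each step, so deeper levels win over shallower ones and overrides win over values. A dictionary-only sketch of that precedence (data is illustrative):

layers = [
    ({'debug': False, 'workers': 2}, {}),     # root, i.e. level_value None
    ({'workers': 4}, {'debug': True}),        # level 'region' = 'us'
    ({}, {'workers': 8}),                     # level 'node' = 'node-3'
]

effective = {}
for values, overrides in layers:
    effective.update(values)
    effective.update(overrides)               # overrides applied last, so they win
effective                                     # {'debug': True, 'workers': 8}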
@ -1,27 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import logging


def get_formatter():
    date_format = "%Y-%m-%d %H:%M:%S"
    log_format = "%(asctime)s.%(msecs)03d %(levelname)s " \
                 "(%(module)s) %(message)s"
    return logging.Formatter(fmt=log_format, datefmt=date_format)


def init_logger(app, log_level):
    handler = logging.StreamHandler()
    handler.setFormatter(get_formatter())
    app.logger.addHandler(handler)
    app.logger.setLevel(log_level)
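Typical use of the two helpers above is a single call while the Flask application is being built; a minimal sketch:

import logging

import flask

app = flask.Flask(__name__)
init_logger(app, logging.DEBUG)            # attaches the stream handler defined above
app.logger.debug("tuning_box logging configured")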
@ -1,21 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from keystonemiddleware import auth_token


class KeystoneMiddleware(auth_token.AuthProtocol):

    def __init__(self, app):
        self.app = app.wsgi_app
        auth_settings = app.config.get('AUTH')
        super(KeystoneMiddleware, self).__init__(self.app, auth_settings)
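The wrapper above swaps the Flask WSGI callable for keystonemiddleware's AuthProtocol, configured from the app's AUTH setting. Wiring it up would look roughly like the following; the AUTH keys are ordinary keystonemiddleware auth_token options and are shown only as placeholders:

import flask

app = flask.Flask(__name__)
app.config['AUTH'] = {
    'auth_url': 'http://keystone:5000/v3',   # placeholder endpoint
    'auth_type': 'password',
    # ... remaining auth_token options for the service user
}
app.wsgi_app = KeystoneMiddleware(app)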
@ -1,39 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from alembic import command as alembic_command
from alembic import config as alembic_config

import tuning_box
from tuning_box import app
from tuning_box import db


def get_alembic_config(engine):
    config = alembic_config.Config()
    config.set_main_option('sqlalchemy.url', str(engine.url))
    config.set_main_option(
        'script_location', tuning_box.get_migrations_dir())
    config.set_main_option('version_table', 'alembic_version')
    return config


def upgrade():
    with app.build_app(with_keystone=False).app_context():
        config = get_alembic_config(db.db.engine)
        alembic_command.upgrade(config, 'head')


def downgrade():
    with app.build_app(with_keystone=False).app_context():
        config = get_alembic_config(db.db.engine)
        alembic_command.downgrade(config, 'base')
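Both entry points above build a short-lived app context only to obtain a configured engine, so applying or rolling back the schema is a one-liner (assuming the module keeps its tuning_box.migration import path):

from tuning_box import migration

migration.upgrade()      # applies every Alembic revision up to 'head'
# migration.downgrade() walks back to 'base', dropping the schema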
@ -1,76 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import logging.config
|
||||
|
||||
from alembic import context
|
||||
import sqlalchemy
|
||||
|
||||
from tuning_box import db
|
||||
|
||||
config = context.config
|
||||
if config.get_main_option('table_prefix') is None:
|
||||
config.set_main_option('table_prefix', '')
|
||||
if config.config_file_name:
|
||||
logging.config.fileConfig(config.config_file_name)
|
||||
target_metadata = db.db.metadata
|
||||
|
||||
|
||||
def run_migrations_offline():
|
||||
"""Run migrations in 'offline' mode.
|
||||
|
||||
This configures the context with just a URL
|
||||
and not an Engine, though an Engine is acceptable
|
||||
here as well. By skipping the Engine creation
|
||||
we don't even need a DBAPI to be available.
|
||||
|
||||
Calls to context.execute() here emit the given string to the
|
||||
script output.
|
||||
|
||||
"""
|
||||
context.configure(
|
||||
url=config.get_main_option('sqlalchemy.url'),
|
||||
version_table=config.get_main_option('version_table'),
|
||||
literal_binds=True,
|
||||
)
|
||||
|
||||
with context.begin_transaction():
|
||||
context.run_migrations()
|
||||
|
||||
|
||||
def run_migrations_online():
|
||||
"""Run migrations in 'online' mode.
|
||||
|
||||
In this scenario we need to create an Engine
|
||||
and associate a connection with the context.
|
||||
|
||||
"""
|
||||
connectable = sqlalchemy.engine_from_config(
|
||||
config.get_section(config.config_ini_section),
|
||||
prefix='sqlalchemy.',
|
||||
poolclass=sqlalchemy.pool.NullPool,
|
||||
)
|
||||
|
||||
with connectable.connect() as connection:
|
||||
context.configure(
|
||||
connection=connection,
|
||||
target_metadata=target_metadata,
|
||||
version_table=config.get_main_option('version_table'),
|
||||
)
|
||||
|
||||
with context.begin_transaction():
|
||||
context.run_migrations()
|
||||
|
||||
if context.is_offline_mode():
|
||||
run_migrations_offline()
|
||||
else:
|
||||
run_migrations_online()
|
@ -1,41 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
"""${message}
|
||||
|
||||
Revision ID: ${up_revision}
|
||||
Revises:${" " if down_revision else ""}${down_revision | comma,n}
|
||||
Create Date: ${create_date}
|
||||
|
||||
"""
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = ${repr(up_revision)}
|
||||
down_revision = ${repr(down_revision)}
|
||||
branch_labels = ${repr(branch_labels)}
|
||||
depends_on = ${repr(depends_on)}
|
||||
|
||||
from alembic import context
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
|
||||
import tuning_box.db
|
||||
${imports if imports else ""}
|
||||
|
||||
def upgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
${upgrades if upgrades else "pass"}
|
||||
|
||||
|
||||
def downgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
${downgrades if downgrades else "pass"}
|
@ -1,174 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
"""Cascade deletion
|
||||
|
||||
Revision ID: 0c586adad733
|
||||
Revises: 9ae15c85fa92
|
||||
Create Date: 2016-08-11 10:05:51.127370
|
||||
|
||||
"""
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = '0c586adad733'
|
||||
down_revision = '9ae15c85fa92'
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
from alembic import context
|
||||
from alembic import op
|
||||
|
||||
|
||||
def upgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
|
||||
# Environment components
|
||||
with op.batch_alter_table(table_prefix + 'environment_components') \
|
||||
as batch:
|
||||
batch.drop_constraint(
|
||||
table_prefix + 'environment_components_component_id_fkey',
|
||||
'foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_prefix + 'environment_components_component_id_fkey',
|
||||
table_prefix + 'component',
|
||||
['component_id'], ['id'], ondelete='CASCADE'
|
||||
)
|
||||
|
||||
batch.drop_constraint(
|
||||
table_prefix + 'environment_components_environment_id_fkey',
|
||||
'foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_prefix + 'environment_components_environment_id_fkey',
|
||||
table_prefix + 'environment',
|
||||
['environment_id'], ['id'], ondelete='CASCADE'
|
||||
)
|
||||
|
||||
# Resource values
|
||||
with op.batch_alter_table(table_prefix + 'resource_values') \
|
||||
as batch:
|
||||
batch.drop_constraint(
|
||||
table_prefix + 'environment_schema_values_environment_id_fkey',
|
||||
'foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_prefix + 'resource_values_environment_id_fkey',
|
||||
table_prefix + 'environment',
|
||||
['environment_id'], ['id'], ondelete='CASCADE'
|
||||
)
|
||||
|
||||
batch.drop_constraint(
|
||||
table_prefix + 'resource_values_resource_definition_id_fkey',
|
||||
'foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_prefix + 'resource_values_resource_definition_id_fkey',
|
||||
table_prefix + 'resource_definition',
|
||||
['resource_definition_id'], ['id'], ondelete='CASCADE'
|
||||
)
|
||||
|
||||
batch.drop_constraint(
|
||||
table_prefix + 'environment_schema_values_level_value_id_fkey',
|
||||
'foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_prefix + 'environment_resource_values_level_value_id_fkey',
|
||||
table_prefix + 'environment_hierarchy_level_value',
|
||||
['level_value_id'], ['id'], ondelete='CASCADE'
|
||||
)
|
||||
|
||||
# Resource definition
|
||||
with op.batch_alter_table(table_prefix + 'resource_definition') \
|
||||
as batch:
|
||||
batch.drop_constraint(
|
||||
table_prefix + 'schema_component_id_fkey',
|
||||
'foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_prefix + 'resource_definition_component_id_fkey',
|
||||
table_prefix + 'component',
|
||||
['component_id'], ['id'], ondelete='CASCADE'
|
||||
)
|
||||
|
||||
|
||||
def downgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
|
||||
# Resource definition
|
||||
with op.batch_alter_table(table_prefix + 'resource_definition') \
|
||||
as batch:
|
||||
batch.drop_constraint(
|
||||
table_prefix + 'resource_definition_component_id_fkey',
|
||||
'foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_prefix + 'schema_component_id_fkey',
|
||||
table_prefix + 'component',
|
||||
['component_id'], ['id']
|
||||
)
|
||||
|
||||
# Resource values
|
||||
with op.batch_alter_table(table_prefix + 'resource_values') \
|
||||
as batch:
|
||||
batch.drop_constraint(
|
||||
table_prefix + 'environment_resource_values_level_value_id_fkey',
|
||||
'foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_prefix + 'environment_schema_values_level_value_id_fkey',
|
||||
table_prefix + 'environment_hierarchy_level_value',
|
||||
['level_value_id'], ['id']
|
||||
)
|
||||
|
||||
batch.drop_constraint(
|
||||
table_prefix + 'resource_values_resource_definition_id_fkey',
|
||||
'foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_prefix + 'resource_values_resource_definition_id_fkey',
|
||||
table_prefix + 'resource_definition',
|
||||
['resource_definition_id'], ['id']
|
||||
)
|
||||
|
||||
batch.drop_constraint(
|
||||
table_prefix + 'resource_values_environment_id_fkey',
|
||||
'foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_prefix + 'environment_schema_values_environment_id_fkey',
|
||||
table_prefix + 'environment',
|
||||
['environment_id'], ['id']
|
||||
)
|
||||
|
||||
# Environment components
|
||||
with op.batch_alter_table(table_prefix + 'environment_components') \
|
||||
as batch:
|
||||
batch.drop_constraint(
|
||||
table_prefix + 'environment_components_environment_id_fkey',
|
||||
'foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_prefix + 'environment_components_environment_id_fkey',
|
||||
table_prefix + 'environment',
|
||||
['environment_id'], ['id']
|
||||
)
|
||||
|
||||
batch.drop_constraint(
|
||||
table_prefix + 'environment_components_component_id_fkey',
|
||||
'foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_prefix + 'environment_components_component_id_fkey',
|
||||
table_prefix + 'component',
|
||||
['component_id'], ['id']
|
||||
)
|
@ -1,102 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
"""Switch to new API
|
||||
|
||||
Revision ID: 3b2a0f134e45
|
||||
Revises: f16eb4eff7c
|
||||
Create Date: 2016-03-17 16:30:11.989340
|
||||
|
||||
"""
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = '3b2a0f134e45'
|
||||
down_revision = 'f16eb4eff7c'
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
from alembic import context
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
|
||||
import tuning_box.db
|
||||
|
||||
|
||||
def upgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
op.drop_table(table_prefix + 'template')
|
||||
table_name = table_prefix + 'environment_schema_values'
|
||||
with op.batch_alter_table(table_name) as batch:
|
||||
batch.drop_constraint(table_name + '_schema_id_fkey', 'foreignkey')
|
||||
batch.alter_column(
|
||||
'schema_id',
|
||||
new_column_name='resource_definition_id',
|
||||
existing_type=sa.Integer(),
|
||||
)
|
||||
op.rename_table(table_name, table_prefix + 'resource_values')
|
||||
op.rename_table(table_prefix + 'schema',
|
||||
table_prefix + 'resource_definition')
|
||||
with op.batch_alter_table(table_prefix + 'resource_definition') as batch:
|
||||
batch.drop_column('namespace_id')
|
||||
op.drop_table(table_prefix + 'namespace')
|
||||
table_name = table_prefix + 'resource_values'
|
||||
with op.batch_alter_table(table_name) as batch:
|
||||
batch.create_foreign_key(
|
||||
table_name + '_resource_definition_id_fkey',
|
||||
table_prefix + 'resource_definition',
|
||||
['resource_definition_id'],
|
||||
['id'],
|
||||
)
|
||||
|
||||
|
||||
def downgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
table_name = table_prefix + 'resource_values'
|
||||
with op.batch_alter_table(table_name) as batch:
|
||||
batch.drop_constraint(table_name + '_resource_definition_id_fkey',
|
||||
'foreignkey')
|
||||
op.create_table(
|
||||
table_prefix + 'namespace',
|
||||
sa.Column('id', sa.Integer(), nullable=False, primary_key=True),
|
||||
sa.Column('name', sa.String(length=128), nullable=True),
|
||||
)
|
||||
table_name = table_prefix + 'schema'
|
||||
op.rename_table(table_prefix + 'resource_definition', table_name)
|
||||
with op.batch_alter_table(table_name) as batch:
|
||||
batch.add_column(
|
||||
sa.Column('namespace_id', sa.Integer(), nullable=True))
|
||||
table_name = table_prefix + 'environment_schema_values'
|
||||
op.rename_table(table_prefix + 'resource_values', table_name)
|
||||
with op.batch_alter_table(table_name) as batch:
|
||||
batch.alter_column(
|
||||
'resource_definition_id',
|
||||
new_column_name='schema_id',
|
||||
existing_type=sa.Integer(),
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_name + '_schema_id_fkey',
|
||||
table_prefix + 'schema',
|
||||
['schema_id'],
|
||||
['id'],
|
||||
)
|
||||
table_name = table_prefix + 'template'
|
||||
op.create_table(
|
||||
table_name,
|
||||
sa.Column('id', sa.Integer(), nullable=False, primary_key=True),
|
||||
sa.Column('name', sa.String(length=128), nullable=True),
|
||||
sa.Column('component_id', sa.Integer(), nullable=True),
|
||||
sa.Column('content', tuning_box.db.Json(), nullable=True),
|
||||
sa.ForeignKeyConstraint(
|
||||
['component_id'], [table_prefix + 'component.id'],
|
||||
name=table_name + '_component_id_fkey',
|
||||
),
|
||||
)
|
@ -1,52 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
"""Add server_default to resource_values.values
|
||||
|
||||
Revision ID: 967a44dd16d5
|
||||
Revises: 3b2a0f134e45
|
||||
Create Date: 2016-03-25 21:12:28.939719
|
||||
|
||||
"""
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = '967a44dd16d5'
|
||||
down_revision = '3b2a0f134e45'
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
from alembic import context
|
||||
from alembic import op
|
||||
|
||||
import tuning_box.db
|
||||
|
||||
|
||||
def upgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
table_name = table_prefix + 'resource_values'
|
||||
with op.batch_alter_table(table_name) as batch:
|
||||
batch.alter_column(
|
||||
'values',
|
||||
server_default='{}',
|
||||
existing_type=tuning_box.db.Json(),
|
||||
)
|
||||
|
||||
|
||||
def downgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
table_name = table_prefix + 'resource_values'
|
||||
with op.batch_alter_table(table_name) as batch:
|
||||
batch.alter_column(
|
||||
'values',
|
||||
server_default=None,
|
||||
existing_type=tuning_box.db.Json(),
|
||||
)
|
@ -1,90 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
"""Remove fake root hierarchy level values
|
||||
|
||||
Revision ID: 9ae15c85fa92
|
||||
Revises: d054eefc4c5b
|
||||
Create Date: 2016-04-12 20:22:06.323291
|
||||
|
||||
"""
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = '9ae15c85fa92'
|
||||
down_revision = 'd054eefc4c5b'
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
from alembic import context
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
from sqlalchemy.ext import automap
|
||||
|
||||
|
||||
def _get_autobase(table_prefix, bind):
|
||||
metadata = sa.MetaData(bind=bind)
|
||||
table_name = table_prefix + 'environment_hierarchy_level_value'
|
||||
metadata.reflect(only=[table_name])
|
||||
AutoBase = automap.automap_base(metadata=metadata)
|
||||
|
||||
def classname_for_table(base, refl_table_name, table):
|
||||
assert refl_table_name.startswith(table_prefix)
|
||||
noprefix_name = refl_table_name[len(table_prefix):]
|
||||
uname = u"".join(s.capitalize() for s in noprefix_name.split('_'))
|
||||
if not isinstance(uname, str):
|
||||
return uname.encode('utf-8')
|
||||
else:
|
||||
return uname
|
||||
|
||||
AutoBase.prepare(classname_for_table=classname_for_table)
|
||||
return AutoBase
|
||||
|
||||
|
||||
def _get_ehlv_class():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
bind = op.get_bind()
|
||||
AutoBase = _get_autobase(table_prefix, bind)
|
||||
return AutoBase.classes.EnvironmentHierarchyLevelValue
|
||||
|
||||
|
||||
def _get_session():
|
||||
return sa.orm.Session(bind=op.get_bind(), autocommit=True)
|
||||
|
||||
|
||||
def upgrade():
|
||||
EHLV = _get_ehlv_class()
|
||||
session = _get_session()
|
||||
with session.begin():
|
||||
fake_roots = session.query(EHLV) \
|
||||
.filter_by(level_id=None, parent_id=None, value=None) \
|
||||
.all()
|
||||
if fake_roots:
|
||||
fake_root_ids = [r.id for r in fake_roots]
|
||||
session.query(EHLV) \
|
||||
.filter(EHLV.parent_id.in_(fake_root_ids)) \
|
||||
.update({EHLV.parent_id: None}, synchronize_session=False)
|
||||
for r in fake_roots:
|
||||
session.delete(r)
|
||||
|
||||
|
||||
def downgrade():
|
||||
EHLV = _get_ehlv_class()
|
||||
session = _get_session()
|
||||
with session.begin():
|
||||
fake_root = EHLV(level_id=None, parent_id=None, value=None)
|
||||
session.add(fake_root)
|
||||
session.flush()
|
||||
session.query(EHLV) \
|
||||
.filter(EHLV.parent_id == None, # noqa
|
||||
EHLV.id != fake_root.id) \
|
||||
.update({EHLV.parent_id: fake_root.id},
|
||||
synchronize_session=False)
|
@ -1,78 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
"""Remove hierarchy for level values
|
||||
|
||||
Revision ID: a86472389a70
|
||||
Revises: 0c586adad733
|
||||
Create Date: 2016-08-18 14:00:03.197693
|
||||
|
||||
"""
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = 'a86472389a70'
|
||||
down_revision = '0c586adad733'
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
from alembic import context
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
|
||||
|
||||
def upgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
table_name = table_prefix + 'environment_hierarchy_level_value'
|
||||
with op.batch_alter_table(table_name) as batch:
|
||||
batch.drop_column('parent_id')
|
||||
|
||||
batch.drop_constraint(
|
||||
table_name + '_level_id_fkey',
|
||||
type_='foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_name + '_level_id_fkey',
|
||||
table_prefix + 'environment_hierarchy_level',
|
||||
['level_id'], ['id'], ondelete='CASCADE'
|
||||
)
|
||||
|
||||
batch.create_unique_constraint(
|
||||
table_name + '_level_id_value_unique',
|
||||
['level_id', 'value']
|
||||
)
|
||||
|
||||
|
||||
def downgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
table_name = table_prefix + 'environment_hierarchy_level_value'
|
||||
with op.batch_alter_table(table_name) as batch:
|
||||
batch.drop_constraint(
|
||||
table_name + '_level_id_value_unique',
|
||||
type_='unique'
|
||||
)
|
||||
|
||||
batch.drop_constraint(
|
||||
table_name + '_level_id_fkey',
|
||||
type_='foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_name + '_level_id_fkey',
|
||||
table_prefix + 'environment_hierarchy_level',
|
||||
['level_id'], ['id']
|
||||
)
|
||||
|
||||
batch.add_column(sa.Column('parent_id', sa.Integer(), nullable=True))
|
||||
batch.create_foreign_key(
|
||||
table_name + '_parent_id_fkey',
|
||||
table_name,
|
||||
['parent_id'], ['id']
|
||||
)
|
@ -1,46 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
"""Add overrides to resource_values
|
||||
|
||||
Revision ID: ad192a40fd68
|
||||
Revises: 967a44dd16d5
|
||||
Create Date: 2016-03-25 21:26:19.170101
|
||||
|
||||
"""
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = 'ad192a40fd68'
|
||||
down_revision = '967a44dd16d5'
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
from alembic import context
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
|
||||
import tuning_box.db
|
||||
|
||||
|
||||
def upgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
op.add_column(table_prefix + 'resource_values', sa.Column(
|
||||
'overrides',
|
||||
tuning_box.db.Json(),
|
||||
server_default='{}',
|
||||
nullable=True,
|
||||
))
|
||||
|
||||
|
||||
def downgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
op.drop_column(table_prefix + 'resource_values', 'overrides')
|
@ -1,59 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
"""Level cascade deletion on environment removal
|
||||
|
||||
Revision ID: adf671eddeb4
|
||||
Revises: a86472389a70
|
||||
Create Date: 2016-08-19 16:39:46.745113
|
||||
|
||||
"""
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = 'adf671eddeb4'
|
||||
down_revision = 'a86472389a70'
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
from alembic import context
|
||||
from alembic import op
|
||||
|
||||
|
||||
def upgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
table_name = table_prefix + 'environment_hierarchy_level'
|
||||
|
||||
with op.batch_alter_table(table_name) as batch:
|
||||
batch.drop_constraint(
|
||||
table_prefix + 'environment_hierarchy_level_environment_id_fkey',
|
||||
type_='foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_prefix + 'environment_hierarchy_level_environment_id_fkey',
|
||||
table_prefix + 'environment',
|
||||
['environment_id'], ['id'], ondelete='CASCADE'
|
||||
)
|
||||
|
||||
|
||||
def downgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
table_name = table_prefix + 'environment_hierarchy_level'
|
||||
with op.batch_alter_table(table_name) as batch:
|
||||
batch.drop_constraint(
|
||||
table_prefix + 'environment_hierarchy_level_environment_id_fkey',
|
||||
type_='foreignkey'
|
||||
)
|
||||
batch.create_foreign_key(
|
||||
table_prefix + 'environment_hierarchy_level_environment_id_fkey',
|
||||
table_prefix + 'environment',
|
||||
['environment_id'], ['id']
|
||||
)
|
@ -1,48 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
"""Add unique constraint on component.name
|
||||
|
||||
Revision ID: d054eefc4c5b
|
||||
Revises: ad192a40fd68
|
||||
Create Date: 2016-03-25 21:34:54.487361
|
||||
|
||||
"""
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = 'd054eefc4c5b'
|
||||
down_revision = 'ad192a40fd68'
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
from alembic import context
|
||||
from alembic import op
|
||||
|
||||
|
||||
def upgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
table_name = table_prefix + 'component'
|
||||
with op.batch_alter_table(table_name) as batch:
|
||||
batch.create_unique_constraint(
|
||||
table_name + '_component_name_unique',
|
||||
['name'],
|
||||
)
|
||||
|
||||
|
||||
def downgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
table_name = table_prefix + 'component'
|
||||
with op.batch_alter_table(table_name) as batch:
|
||||
batch.drop_constraint(
|
||||
table_name + '_component_name_unique',
|
||||
type_='unique',
|
||||
)
|
@ -1,169 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
"""Initial revision
|
||||
|
||||
Revision ID: f16eb4eff7c
|
||||
Revises:
|
||||
Create Date: 2016-03-02 17:10:04.750584
|
||||
|
||||
"""
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = 'f16eb4eff7c'
|
||||
down_revision = None
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
from alembic import context
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
|
||||
import tuning_box.db
|
||||
|
||||
|
||||
def upgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
op.create_table(
|
||||
table_prefix + 'component',
|
||||
sa.Column('id', sa.Integer(), nullable=False, primary_key=True),
|
||||
sa.Column('name', sa.String(length=128), nullable=True),
|
||||
)
|
||||
op.create_table(
|
||||
table_prefix + 'environment',
|
||||
sa.Column('id', sa.Integer(), nullable=False, primary_key=True),
|
||||
)
|
||||
op.create_table(
|
||||
table_prefix + 'namespace',
|
||||
sa.Column('id', sa.Integer(), nullable=False, primary_key=True),
|
||||
sa.Column('name', sa.String(length=128), nullable=True),
|
||||
)
|
||||
table_name = table_prefix + 'environment_components'
|
||||
op.create_table(
|
||||
table_name,
|
||||
sa.Column('environment_id', sa.Integer(), nullable=True),
|
||||
sa.Column('component_id', sa.Integer(), nullable=True),
|
||||
sa.ForeignKeyConstraint(
|
||||
['component_id'], [table_prefix + 'component.id'],
|
||||
name=table_name + '_component_id_fkey',
|
||||
),
|
||||
sa.ForeignKeyConstraint(
|
||||
['environment_id'], [table_prefix + 'environment.id'],
|
||||
name=table_name + '_environment_id_fkey',
|
||||
),
|
||||
)
|
||||
table_name = table_prefix + 'environment_hierarchy_level'
|
||||
op.create_table(
|
||||
table_name,
|
||||
sa.Column('id', sa.Integer(), nullable=False, primary_key=True),
|
||||
sa.Column('environment_id', sa.Integer(), nullable=True),
|
||||
sa.Column('name', sa.String(length=128), nullable=True),
|
||||
sa.Column('parent_id', sa.Integer(), nullable=True),
|
||||
sa.ForeignKeyConstraint(
|
||||
['environment_id'], [table_prefix + 'environment.id'],
|
||||
name=table_name + '_environment_id_fkey',
|
||||
),
|
||||
sa.ForeignKeyConstraint(
|
||||
['parent_id'], [table_prefix + 'environment_hierarchy_level.id'],
|
||||
name=table_name + '_parent_id_fkey',
|
||||
),
|
||||
sa.UniqueConstraint(
|
||||
'environment_id', 'name',
|
||||
name=table_name + '_environment_id_name_key',
|
||||
),
|
||||
sa.UniqueConstraint(
|
||||
'environment_id', 'parent_id',
|
||||
name=table_name[:-4] + '_environment_id_parent_id_key',
|
||||
),
|
||||
)
|
||||
table_name = table_prefix + 'schema'
|
||||
op.create_table(
|
||||
table_name,
|
||||
sa.Column('id', sa.Integer(), nullable=False, primary_key=True),
|
||||
sa.Column('name', sa.String(length=128), nullable=True),
|
||||
sa.Column('component_id', sa.Integer(), nullable=True),
|
||||
sa.Column('namespace_id', sa.Integer(), nullable=True),
|
||||
sa.Column('content', tuning_box.db.Json(), nullable=True),
|
||||
sa.ForeignKeyConstraint(
|
||||
['component_id'], [table_prefix + 'component.id'],
|
||||
name=table_name + '_component_id_fkey'),
|
||||
sa.ForeignKeyConstraint(
|
||||
['namespace_id'], [table_prefix + 'namespace.id'],
|
||||
name=table_name + '_namespace_id_fkey'),
|
||||
)
|
||||
table_name = table_prefix + 'template'
|
||||
op.create_table(
|
||||
table_name,
|
||||
sa.Column('id', sa.Integer(), nullable=False, primary_key=True),
|
||||
sa.Column('name', sa.String(length=128), nullable=True),
|
||||
sa.Column('component_id', sa.Integer(), nullable=True),
|
||||
sa.Column('content', tuning_box.db.Json(), nullable=True),
|
||||
sa.ForeignKeyConstraint(
|
||||
['component_id'], [table_prefix + 'component.id'],
|
||||
name=table_name + '_component_id_fkey',
|
||||
),
|
||||
)
|
||||
table_name = table_prefix + 'environment_hierarchy_level_value'
|
||||
op.create_table(
|
||||
table_name,
|
||||
sa.Column('id', sa.Integer(), nullable=False, primary_key=True),
|
||||
sa.Column('level_id', sa.Integer(), nullable=True),
|
||||
sa.Column('parent_id', sa.Integer(), nullable=True),
|
||||
sa.Column('value', sa.String(length=128), nullable=True),
|
||||
sa.ForeignKeyConstraint(
|
||||
['level_id'], [table_prefix + 'environment_hierarchy_level.id'],
|
||||
name=table_name + '_level_id_fkey',
|
||||
),
|
||||
sa.ForeignKeyConstraint(
|
||||
['parent_id'], [table_name + '.id'],
|
||||
name=table_name + '_parent_id_fkey',
|
||||
),
|
||||
)
|
||||
table_name = table_prefix + 'environment_schema_values'
|
||||
op.create_table(
|
||||
table_name,
|
||||
sa.Column('id', sa.Integer(), nullable=False, primary_key=True),
|
||||
sa.Column('environment_id', sa.Integer(), nullable=True),
|
||||
sa.Column('schema_id', sa.Integer(), nullable=True),
|
||||
sa.Column('level_value_id', sa.Integer(), nullable=True),
|
||||
sa.Column('values', tuning_box.db.Json(), nullable=True),
|
||||
sa.ForeignKeyConstraint(
|
||||
['environment_id'], [table_prefix + 'environment.id'],
|
||||
name=table_name + '_environment_id_fkey',
|
||||
),
|
||||
sa.ForeignKeyConstraint(
|
||||
['level_value_id'],
|
||||
[table_prefix + 'environment_hierarchy_level_value.id'],
|
||||
name=table_name + '_level_value_id_fkey',
|
||||
),
|
||||
sa.ForeignKeyConstraint(
|
||||
['schema_id'], [table_prefix + 'schema.id'],
|
||||
name=table_name + '_schema_id_fkey',
|
||||
),
|
||||
sa.UniqueConstraint(
|
||||
'environment_id', 'schema_id', 'level_value_id',
|
||||
name=table_name[:-6] + 'environment_id_schema_id_leve_key',
|
||||
),
|
||||
)
|
||||
|
||||
|
||||
def downgrade():
|
||||
table_prefix = context.config.get_main_option('table_prefix')
|
||||
op.drop_table(table_prefix + 'environment_schema_values')
|
||||
op.drop_table(table_prefix + 'environment_hierarchy_level_value')
|
||||
op.drop_table(table_prefix + 'template')
|
||||
op.drop_table(table_prefix + 'schema')
|
||||
op.drop_table(table_prefix + 'environment_hierarchy_level')
|
||||
op.drop_table(table_prefix + 'environment_components')
|
||||
op.drop_table(table_prefix + 'namespace')
|
||||
op.drop_table(table_prefix + 'environment')
|
||||
op.drop_table(table_prefix + 'component')
|
@ -1,87 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
from __future__ import absolute_import
|
||||
|
||||
import itertools
|
||||
import threading
|
||||
|
||||
from nailgun.db import sqlalchemy as nailgun_sa
|
||||
from nailgun import extensions
|
||||
import web
|
||||
|
||||
import tuning_box
|
||||
from tuning_box import app as tb_app
|
||||
from tuning_box import db as tb_db
|
||||
|
||||
|
||||
class App2WebPy(web.application):
|
||||
def __init__(self):
|
||||
web.application.__init__(self)
|
||||
self.__name__ = self
|
||||
self.app = None
|
||||
self.lock = threading.Lock()
|
||||
|
||||
def create_app(self):
|
||||
raise NotImplementedError
|
||||
|
||||
def get_app(self):
|
||||
with self.lock:
|
||||
if not self.app:
|
||||
self.app = self.create_app()
|
||||
return self.app
|
||||
|
||||
def handle(self):
|
||||
written_data = []
|
||||
|
||||
def write(data):
|
||||
assert start_response.called
|
||||
written_data.append(data)
|
||||
|
||||
def start_response(status, headers, exc_info=None):
|
||||
assert not start_response.called
|
||||
assert not exc_info
|
||||
start_response.called = True
|
||||
web.ctx.status = status
|
||||
web.ctx.headers.extend(headers)
|
||||
return write
|
||||
|
||||
start_response.called = False
|
||||
|
||||
app = self.get_app()
|
||||
environ = dict(web.ctx.environ)
|
||||
environ["SCRIPT_NAME"] = environ["PATH_INFO"][:-len(web.ctx.path)]
|
||||
environ["PATH_INFO"] = environ["REQUEST_URI"] = web.ctx.path
|
||||
result = app(environ, start_response)
|
||||
return itertools.chain(written_data, result)
|
||||
|
||||
|
||||
class TB2WebPy(App2WebPy):
|
||||
def create_app(self):
|
||||
# Nailgun API already contains keystone middleware
|
||||
app = tb_app.build_app(with_keystone=False)
|
||||
tb_db.prefix_tables(tb_db, Extension.table_prefix())
|
||||
app.config["PROPAGATE_EXCEPTIONS"] = True
|
||||
app.config["SQLALCHEMY_DATABASE_URI"] = nailgun_sa.db_str
|
||||
return app
|
||||
|
||||
|
||||
class Extension(extensions.BaseExtension):
|
||||
name = 'tuning_box'
|
||||
version = tuning_box.__version__
|
||||
description = 'Plug tuning_box endpoints into Nailgun itself'
|
||||
|
||||
urls = [{'uri': '/config', 'handler': TB2WebPy()}]
|
||||
|
||||
@classmethod
|
||||
def alembic_migrations_path(cls):
|
||||
return tuning_box.get_migrations_dir()
|
@ -1,32 +0,0 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
# Copyright 2010-2011 OpenStack Foundation
|
||||
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
from oslotest import base
|
||||
|
||||
from tuning_box import db
|
||||
|
||||
|
||||
class TestCase(base.BaseTestCase):
|
||||
|
||||
"""Test case base class for all unit tests."""
|
||||
|
||||
|
||||
class PrefixedTestCaseMixin(object):
|
||||
def setUp(self):
|
||||
db.prefix_tables(db, 'test_prefix_')
|
||||
self.addCleanup(db.unprefix_tables, db)
|
||||
super(PrefixedTestCaseMixin, self).setUp()
|
@ -1,99 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
from requests_mock.contrib import fixture as req_fixture
|
||||
import six
|
||||
import testscenarios
|
||||
|
||||
from tuning_box import cli
|
||||
from tuning_box.cli import base as cli_base
|
||||
from tuning_box import client as tb_client
|
||||
from tuning_box.tests import base
|
||||
|
||||
|
||||
class FormatOutputTest(testscenarios.WithScenarios, base.TestCase):
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('output', 'format_', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('none,plain', (None, 'plain', '')),
|
||||
('none,json', (None, 'json', 'null')),
|
||||
# TODO(yorik-sar): look into why YAML return those elipsis
|
||||
('none,yaml', (None, 'yaml', 'null\n...\n')),
|
||||
('str,plain', (u"a string", 'plain', 'a string')),
|
||||
('str,json', (u"a string", 'json', '"a string"')),
|
||||
('str,yaml', (u"a string", 'yaml', 'a string\n...\n')),
|
||||
('int,plain', (42, 'plain', '42')),
|
||||
('int,json', (42, 'json', '42')),
|
||||
('int,yaml', (42, 'yaml', '42\n...\n')),
|
||||
('float,plain', (1.2, 'plain', '1.2')),
|
||||
('float,json', (1.2, 'json', '1.2')),
|
||||
('float,yaml', (1.2, 'yaml', '1.2\n...\n')),
|
||||
('list,plain', ([1, 2], 'plain', '[1, 2]')),
|
||||
('list,json', ([1, 2], 'json', '[1, 2]')),
|
||||
('list,yaml', ([1, 2], 'yaml', '- 1\n- 2\n')),
|
||||
('dict,plain', ({'a': 1}, 'plain', '{"a": 1}')),
|
||||
('dict,json', ({'a': 1}, 'json', '{"a": 1}')),
|
||||
('dict,yaml', ({'a': 1}, 'yaml', 'a: 1\n')),
|
||||
]
|
||||
]
|
||||
|
||||
output = None
|
||||
format_ = None
|
||||
expected_result = None
|
||||
|
||||
def test_format_output(self):
|
||||
res = cli_base.format_output(self.output, self.format_)
|
||||
self.assertEqual(self.expected_result, res)
|
||||
|
||||
|
||||
class SafeTuningBoxApp(cli.TuningBoxApp):
|
||||
def __init__(self, client):
|
||||
super(SafeTuningBoxApp, self).__init__(
|
||||
client=client,
|
||||
**self.get_std_streams()
|
||||
)
|
||||
|
||||
@staticmethod
|
||||
def get_std_streams():
|
||||
if bytes is str:
|
||||
io_cls = six.BytesIO
|
||||
else:
|
||||
io_cls = six.StringIO
|
||||
return {k: io_cls() for k in ('stdin', 'stdout', 'stderr')}
|
||||
|
||||
def build_option_parser(self, description, version, argparse_kwargs=None):
|
||||
parser = super(SafeTuningBoxApp, self).build_option_parser(
|
||||
description, version, argparse_kwargs)
|
||||
parser.set_defaults(debug=True)
|
||||
return parser
|
||||
|
||||
def get_fuzzy_matches(self, cmd):
|
||||
# Turn off guessing, we need exact failures in tests
|
||||
return []
|
||||
|
||||
def run(self, argv):
|
||||
try:
|
||||
super(SafeTuningBoxApp, self).run(argv)
|
||||
except SystemExit as e:
|
||||
# We should check exit code only if system exit was called.
|
||||
exit_code = e.code
|
||||
assert exit_code == 0
|
||||
|
||||
|
||||
class _BaseCLITest(base.TestCase):
|
||||
BASE_URL = 'http://somehost/prefix'
|
||||
|
||||
def setUp(self):
|
||||
super(_BaseCLITest, self).setUp()
|
||||
client = tb_client.HTTPClient(self.BASE_URL)
|
||||
self.req_mock = self.useFixture(req_fixture.Fixture())
|
||||
self.cli = SafeTuningBoxApp(client=client)
|
@ -1,145 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import testscenarios
|
||||
|
||||
from tuning_box.tests.cli import _BaseCLITest
|
||||
|
||||
|
||||
class TestCreateComponent(testscenarios.WithScenarios, _BaseCLITest):
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('json', ('/components',
|
||||
'comp create --name comp_name --format json',
|
||||
'{\n "a": "b"\n}')),
|
||||
('yaml', ('/components',
|
||||
'comp create -n comp_name -f yaml',
|
||||
'a: b\n')),
|
||||
]
|
||||
]
|
||||
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_post(self):
|
||||
self.req_mock.post(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json={'a': 'b'},
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
||||
|
||||
|
||||
class TestListComponents(testscenarios.WithScenarios, _BaseCLITest):
|
||||
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('json', ('/components', 'comp list -f json', '[]')),
|
||||
('yaml', ('/components', 'comp list --format yaml', '[]\n')),
|
||||
]
|
||||
]
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_get(self):
|
||||
self.req_mock.get(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json=[],
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
||||
|
||||
|
||||
class TestShowComponent(testscenarios.WithScenarios, _BaseCLITest):
|
||||
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('yaml', ('/components/9', 'comp show 9 -f yaml',
|
||||
'id: 1\nname: n\nresource_definitions: []\n')),
|
||||
]
|
||||
]
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_get(self):
|
||||
self.req_mock.get(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json={'id': 1, 'name': 'n', 'resource_definitions': []},
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
||||
|
||||
|
||||
class TestDeleteComponent(testscenarios.WithScenarios, _BaseCLITest):
|
||||
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('', ('/components/9', 'comp delete 9',
|
||||
'Component with id 9 was deleted\n')),
|
||||
]
|
||||
]
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_delete(self):
|
||||
self.req_mock.delete(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'}
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
||||
|
||||
|
||||
class TestUpdateComponent(testscenarios.WithScenarios, _BaseCLITest):
|
||||
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('no_data', ('/components/9', 'comp update 9', '{}')),
|
||||
('s_name', ('/components/9',
|
||||
'comp update 9 -n comp_name', '{}')),
|
||||
('l_name', ('/components/9',
|
||||
'comp update 9 --name comp_name', '{}')),
|
||||
('s_r_defs', ('/components/9',
|
||||
'comp update 9 -r 1,2 ', '{}')),
|
||||
('l_r_ders', ('/components/9',
|
||||
'comp update 9 --resource-definitions 1,2', '{}')),
|
||||
('empty_s_r_defs', ('/components/9',
|
||||
'comp update 9 -r [] -n comp_name', '{}')),
|
||||
('empty_l_r_defs', ('/components/9',
|
||||
'comp update 9 --resource-definitions []',
|
||||
'{}'))
|
||||
]
|
||||
]
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_update(self):
|
||||
self.req_mock.patch(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json={}
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
@ -1,153 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import testscenarios
|
||||
|
||||
from tuning_box.tests.cli import _BaseCLITest
|
||||
|
||||
|
||||
class TestCreateEnvironment(testscenarios.WithScenarios, _BaseCLITest):
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('json', ('/environments',
|
||||
'env create -l lvl1 -i 1 -f json',
|
||||
'{\n "a": "b"\n}')),
|
||||
('yaml', ('/environments',
|
||||
'env create -l lvl1,lvl2 -i 1 -f yaml',
|
||||
'a: b\n')),
|
||||
('multi', ('/environments',
|
||||
'env create -l lvl1,lvl2 -i 1,2,3 -f yaml',
|
||||
'a: b\n')),
|
||||
('no_data', ('/environments',
|
||||
'env create -f yaml',
|
||||
'a: b\n'))
|
||||
|
||||
]
|
||||
]
|
||||
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_post(self):
|
||||
self.req_mock.post(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json={'a': 'b'},
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
||||
|
||||
|
||||
class TestListEnvironments(testscenarios.WithScenarios, _BaseCLITest):
|
||||
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('json', ('/environments', 'env list -f json', '[]')),
|
||||
('yaml', ('/environments', 'env list -f yaml', '[]\n'))
|
||||
]
|
||||
]
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_get(self):
|
||||
self.req_mock.get(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json={},
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
||||
|
||||
|
||||
class TestShowEnvironment(testscenarios.WithScenarios, _BaseCLITest):
|
||||
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('json', ('/environments/9', 'env show 9 -f json -c id',
|
||||
'{\n "id": 1\n}')),
|
||||
('yaml', ('/environments/9', 'env show 9 -f yaml -c id',
|
||||
'id: 1\n'))
|
||||
]
|
||||
]
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_get(self):
|
||||
self.req_mock.get(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json={'id': 1},
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
||||
|
||||
|
||||
class TestDeleteEnvironment(testscenarios.WithScenarios, _BaseCLITest):
|
||||
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('json', ('/environments/9', 'env delete 9',
|
||||
'Environment with id 9 was deleted\n'))
|
||||
]
|
||||
]
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_delete(self):
|
||||
self.req_mock.delete(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'}
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
||||
|
||||
|
||||
class TestUpdateEnvironment(testscenarios.WithScenarios, _BaseCLITest):
|
||||
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('no_data', ('/environments/9', 'env update 9', '{}')),
|
||||
('level', ('/environments/9', 'env update 9 -l lvl1', '{}')),
|
||||
('levels', ('/environments/9',
|
||||
'env update 9 -l lvl1,lvl2',
|
||||
'{}')),
|
||||
('component', ('/environments/9',
|
||||
'env update 9 -l lvl1,lvl2 -i 1',
|
||||
'{}')),
|
||||
('components', ('/environments/9',
|
||||
'env update 9 -l lvl1,lvl2 -i 1,2',
|
||||
'{}')),
|
||||
('erase', ('/environments/9', 'env update 9 -l [] -i 1,2', '{}')),
|
||||
('erase', ('/environments/9', 'env update 9 -l [] -i []', '{}')),
|
||||
]
|
||||
]
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_update(self):
|
||||
self.req_mock.patch(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json={}
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
@ -1,20 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
from tuning_box.tests.cli import _BaseCLITest
|
||||
|
||||
|
||||
class TestApp(_BaseCLITest):
|
||||
def test_help(self):
|
||||
self.cli.run(["--help"])
|
||||
self.assertEqual('', self.cli.stderr.getvalue())
|
||||
self.assertNotIn('Could not', self.cli.stdout.getvalue())
|
@ -1,67 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import testscenarios
|
||||
|
||||
from tuning_box.tests.cli import _BaseCLITest
|
||||
|
||||
|
||||
class TestHierarchyLevels(testscenarios.WithScenarios, _BaseCLITest):
|
||||
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('json', ('/environments/9/hierarchy_levels',
|
||||
'lvl list -e 9 -f json', '[]')),
|
||||
('yaml', ('/environments/9/hierarchy_levels',
|
||||
'lvl list -e 9 -f yaml', '[]\n'))
|
||||
]
|
||||
]
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_get(self):
|
||||
self.req_mock.get(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json={},
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
||||
|
||||
|
||||
class TestShowHierarchyLevel(testscenarios.WithScenarios, _BaseCLITest):
|
||||
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('json', ('/environments/9/hierarchy_levels/n',
|
||||
'lvl show -e 9 -f json -c id n',
|
||||
'{\n "id": 1\n}')),
|
||||
('yaml', ('/environments/9/hierarchy_levels/nn',
|
||||
'lvl show -e 9 -f yaml -c id nn',
|
||||
'id: 1\n'))
|
||||
]
|
||||
]
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_get(self):
|
||||
self.req_mock.get(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json={'id': 1},
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
@ -1,150 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import testscenarios
|
||||
import yaml
|
||||
|
||||
from tuning_box.tests.cli import _BaseCLITest
|
||||
|
||||
|
||||
class TestCreateResourceDefinition(testscenarios.WithScenarios, _BaseCLITest):
|
||||
scenarios = [
|
||||
(s[0],
|
||||
dict(zip(('args', 'expected_body', 'stdin'), s[1])))
|
||||
for s in [
|
||||
('json', ('def create -n n -i 1 -d json -f yaml',
|
||||
'content: {}\ncomponent_id: 1\nid: 1\nname: n\n',
|
||||
'{"a": 3}')),
|
||||
('yaml', ('def create -n n -i 1 -d yaml -f yaml',
|
||||
'content: {}\ncomponent_id: 1\nid: 1\nname: n\n',
|
||||
'a: b\n')),
|
||||
]
|
||||
]
|
||||
|
||||
args = None
|
||||
expected_body = None
|
||||
stdin = None
|
||||
|
||||
def test_post(self):
|
||||
url = self.BASE_URL + '/resource_definitions'
|
||||
self.req_mock.post(
|
||||
url,
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json={'id': 1, 'component_id': 1, 'name': 'n', 'content': {}}
|
||||
)
|
||||
if self.stdin:
|
||||
self.cli.stdin.write(self.stdin)
|
||||
self.cli.stdin.seek(0)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(
|
||||
yaml.safe_load(self.expected_body),
|
||||
yaml.safe_load(self.cli.stdout.getvalue())
|
||||
)
|
||||
|
||||
|
||||
class TestListResourceDefinitions(testscenarios.WithScenarios, _BaseCLITest):
|
||||
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('json', ('/resource_definitions', 'def list -f json', '[]')),
|
||||
('yaml', ('/resource_definitions', 'def list --format yaml',
|
||||
'[]\n')),
|
||||
]
|
||||
]
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_get(self):
|
||||
self.req_mock.get(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json=[],
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
||||
|
||||
|
||||
class TestShowComponent(testscenarios.WithScenarios, _BaseCLITest):
|
||||
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('yaml', ('/resource_definitions/9', 'def show 9 -f yaml',
|
||||
'id: 1\nname: n\ncomponent_id: 2\ncontent: {}\n')),
|
||||
]
|
||||
]
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_get(self):
|
||||
self.req_mock.get(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json={'id': 1, 'name': 'n', 'component_id': 2, 'content': {}},
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
||||
|
||||
|
||||
class TestDeleteComponent(testscenarios.WithScenarios, _BaseCLITest):
|
||||
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('', ('/resource_definitions/9', 'def delete 9',
|
||||
'Resource_definition with id 9 was deleted\n')),
|
||||
]
|
||||
]
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_delete(self):
|
||||
self.req_mock.delete(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'}
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
||||
|
||||
|
||||
class TestUpdateResourceDefinition(testscenarios.WithScenarios, _BaseCLITest):
|
||||
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('args', 'expected_result', 'stdin'), s[1])))
|
||||
for s in [
|
||||
('no_data', ('def update 9', '{}')),
|
||||
('name', ('def update 9 -n comp_name', '{}', False)),
|
||||
('component_id', ('def update 9 -i 1', '{}', False)),
|
||||
('content', ('def update 9 -p "a" -t yaml', '{}', False)),
|
||||
('stdin_content', ('def update 9 -d yaml', '{}', 'a: b')),
|
||||
('stdin_content', ('def update 9 -d yaml', '{}', 'a: b')),
|
||||
]
|
||||
]
|
||||
args = None
|
||||
expected_result = None
|
||||
stdin = None
|
||||
|
||||
def test_update(self):
|
||||
self.req_mock.patch(
|
||||
self.BASE_URL + '/resource_definitions/9',
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json={}
|
||||
)
|
||||
if self.stdin:
|
||||
self.cli.stdin.write(self.stdin)
|
||||
self.cli.stdin.seek(0)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
@ -1,268 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import testscenarios
|
||||
|
||||
from tuning_box.cli import base as cli_base
|
||||
from tuning_box.tests import base
|
||||
from tuning_box.tests.cli import _BaseCLITest
|
||||
|
||||
|
||||
class TestLevelsConverter(testscenarios.WithScenarios, base.TestCase):
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('input', 'expected_result', 'expected_error'), s[1])))
|
||||
for s in [
|
||||
('empty', ('', None, TypeError)),
|
||||
('one', ('lvl=val', [('lvl', 'val')])),
|
||||
('two', ('lvl1=val1,lvl2=val2', [('lvl1', 'val1'),
|
||||
('lvl2', 'val2')])),
|
||||
('no_eq', ('val', None, TypeError)),
|
||||
('no_eq2', ('lvl1=val2,val', None, TypeError)),
|
||||
('two_eq', ('lvl1=foo=baz', [('lvl1', 'foo=baz')])),
|
||||
]
|
||||
]
|
||||
|
||||
input = None
|
||||
expected_result = None
|
||||
expected_error = None
|
||||
|
||||
def test_levels(self):
|
||||
if self.expected_error:
|
||||
self.assertRaises(
|
||||
self.expected_error, cli_base.level_converter, self.input)
|
||||
else:
|
||||
res = cli_base.level_converter(self.input)
|
||||
self.assertEqual(self.expected_result, res)
|
||||
|
||||
|
||||
class TestGet(testscenarios.WithScenarios, _BaseCLITest):
|
||||
scenarios = [
|
||||
(s[0], dict(zip(('mock_url', 'args', 'expected_result'), s[1])))
|
||||
for s in [
|
||||
('global,json', (
|
||||
'/environments/1/resources/1/values?effective',
|
||||
'get --env 1 --resource 1 --format=json',
|
||||
'{\n "hello": "world"\n}',
|
||||
)),
|
||||
('global,lookup', (
|
||||
'/environments/1/resources/1/values?effective',
|
||||
'get --env 1 --resource 1 --format=json --show-lookup',
|
||||
'{\n "hello": "world"\n}',
|
||||
)),
|
||||
('lowlevel,json', (
|
||||
'/environments/1/lvl1/value1/resources/1/values?effective',
|
||||
'get --env 1 --level lvl1=value1 --resource 1 --format=json',
|
||||
'{\n "hello": "world"\n}',
|
||||
)),
|
||||
('global,yaml', (
|
||||
'/environments/1/resources/1/values?effective',
|
||||
'get --env 1 --resource 1 --format yaml',
|
||||
'hello: world\n',
|
||||
)),
|
||||
('lowlevel,yaml', (
|
||||
'/environments/1/lvl1/value1/resources/1/values?effective',
|
||||
'get --env 1 --level lvl1=value1 --resource 1 --format yaml',
|
||||
'hello: world\n',
|
||||
)),
|
||||
('key,json', (
|
||||
'/environments/1/resources/1/values?effective&key=k',
|
||||
'get --env 1 --resource 1 --key k --format json',
|
||||
'{\n "k": {\n "hello": "world"\n }\n}'
|
||||
)),
|
||||
('key,lookup', (
|
||||
'/environments/1/resources/1/values?effective&key=k',
|
||||
'get --env 1 --resource 1 --key k --format json -s',
|
||||
'{\n "k": {\n "hello": "world"\n }\n}'
|
||||
)),
|
||||
('key,yaml', (
|
||||
'/environments/1/resources/1/values?effective&key=k',
|
||||
'get --env 1 --resource 1 --key k --format yaml',
|
||||
"k:\n hello: world\n"
|
||||
)),
|
||||
('no_key,json', (
|
||||
'/environments/1/resources/1/values?key=k&effective',
|
||||
'get --env 1 --resource 1 --key k --format json',
|
||||
'{\n "k": {\n "hello": "world"\n }\n}'
|
||||
)),
|
||||
('no_key,yaml', (
|
||||
'/environments/1/resources/1/values?key=no.key&effective',
|
||||
'get --env 1 --resource 1 --key no.key --format yaml',
|
||||
"no.key:\n hello: world\n"
|
||||
))
|
||||
]
|
||||
]
|
||||
|
||||
mock_url = None
|
||||
args = None
|
||||
expected_result = None
|
||||
|
||||
def test_get(self):
|
||||
self.req_mock.get(
|
||||
self.BASE_URL + self.mock_url,
|
||||
headers={'Content-Type': 'application/json'},
|
||||
json={'hello': 'world'},
|
||||
)
|
||||
self.cli.run(self.args.split())
|
||||
self.assertEqual(self.expected_result, self.cli.stdout.getvalue())
|
||||
|
||||
|
||||
class TestSetWithStdin(testscenarios.WithScenarios, _BaseCLITest):
|
||||
scenarios = [
|
||||
(s[0],
|
||||
dict(zip(('args', 'expected_body', 'stdin'), s[1])))
|
||||
for s in [
|
||||
('json', ('--format json', {'a': 3}, '{"a": 3}')),
|
||||
('yaml', ('--format yaml', {'a': 3}, 'a: 3'))
|
||||
]
|
||||
]
|
||||
|
||||
args = None
|
||||
expected_body = None
|
||||
stdin = None
|
||||
|
||||
url_last_part = 'values'
|
||||
cmd = 'set'
|
||||
|
||||
def test_set(self):
|
||||
url = self.BASE_URL + '/environments/1/lvl1/value1/resources/1/' + \
|
||||
self.url_last_part
|
||||
self.req_mock.put(url)
|
||||
args = [self.cmd] + ("--env 1 --level lvl1=value1 --resource 1 " +
|
||||
self.args).split()
|
||||
if self.stdin:
|
||||
self.cli.stdin.write(self.stdin)
|
||||
self.cli.stdin.seek(0)
|
||||
self.cli.run(args)
|
||||
req_history = self.req_mock.request_history
|
||||
self.assertEqual('PUT', req_history[-1].method)
|
||||
self.assertEqual(self.expected_body, req_history[-1].json())
|
||||
|
||||
|
||||
class TestSet(testscenarios.WithScenarios, _BaseCLITest):
|
||||
scenarios = [
|
||||
(s[0],
|
||||
dict(zip(('args', 'expected_body'), s[1])))
|
||||
for s in [
|
||||
('json', ('--type json --value "aaa"', 'aaa')),
|
||||
('yaml', ('--type yaml --value "aaa"', 'aaa'))
|
||||
]
|
||||
]
|
||||
|
||||
args = None
|
||||
expected_body = None
|
||||
|
||||
url_last_part = 'values'
|
||||
cmd = 'set'
|
||||
|
||||
def test_set(self):
|
||||
url = self.BASE_URL + '/environments/1/lvl1/value1/resources/1/' + \
|
||||
self.url_last_part
|
||||
self.req_mock.put(url)
|
||||
args = [self.cmd] + ("--env 1 --level lvl1=value1 --resource 1 " +
|
||||
self.args).split()
|
||||
self.cli.run(args)
|
||||
req_history = self.req_mock.request_history
|
||||
self.assertEqual('PUT', req_history[-1].method)
|
||||
self.assertEqual(self.expected_body, req_history[-1].json())
|
||||
|
||||
|
||||
class TestSetKeys(testscenarios.WithScenarios, _BaseCLITest):
|
||||
scenarios = [
|
||||
(s[0],
|
||||
dict(zip(('args', 'expected_body', 'stdin'), s[1])))
|
||||
for s in [
|
||||
('key,json', ('--key c --format json', [['c', {'a': 3}]],
|
||||
'{"a": 3}')),
|
||||
('key,yaml', ('--key c --format yaml', [['c', {'a': 3}]],
|
||||
'a: 3')),
|
||||
('key,null', ('--key c --type null', [['c', None]])),
|
||||
('key,str', ('--key c --type str --value 4', [['c', '4']]))
|
||||
]
|
||||
]
|
||||
|
||||
args = None
|
||||
expected_body = None
|
||||
stdin = None
|
||||
|
||||
url_last_part = 'values'
|
||||
cmd = 'set'
|
||||
|
||||
def test_set(self):
|
||||
url = self.BASE_URL + '/environments/1/lvl1/value1/resources/1/' + \
|
||||
self.url_last_part + '/keys/set'
|
||||
self.req_mock.patch(url)
|
||||
args = [self.cmd] + ("--env 1 --level lvl1=value1 --resource 1 " +
|
||||
self.args).split()
|
||||
if self.stdin:
|
||||
self.cli.stdin.write(self.stdin)
|
||||
self.cli.stdin.seek(0)
|
||||
self.cli.run(args)
|
||||
req_history = self.req_mock.request_history
|
||||
self.assertEqual('PATCH', req_history[-1].method)
|
||||
self.assertEqual(self.expected_body, req_history[-1].json())
|
||||
|
||||
|
||||
class TestDelete(testscenarios.WithScenarios, _BaseCLITest):
|
||||
scenarios = [
|
||||
(s[0],
|
||||
dict(zip(('args', 'expected_body'), s[1])))
|
||||
for s in [
|
||||
('k1', ('-k k1', "ResourceValue for key k1 was deleted\n")),
|
||||
('xx', ('-k xx', "ResourceValue for key xx was deleted\n")),
|
||||
('x.x', ('-k x.x', "ResourceValue for key x.x was deleted\n")),
|
||||
('x.0', ('-k x.0', "ResourceValue for key x.0 was deleted\n"))
|
||||
]
|
||||
]
|
||||
|
||||
args = None
|
||||
expected_body = None
|
||||
url_last_part = 'values'
|
||||
cmd = 'del'
|
||||
|
||||
def test_delete(self):
|
||||
url = self.BASE_URL + '/environments/1/lvl1/value1/resources/1/' + \
|
||||
self.url_last_part + '/keys/delete'
|
||||
self.req_mock.patch(url)
|
||||
args = [self.cmd] + ("--env 1 --level lvl1=value1 --resource 1 " +
|
||||
self.args).split()
|
||||
self.cli.run(args)
|
||||
self.assertEqual(self.expected_body, self.cli.stdout.getvalue())
|
||||
|
||||
|
||||
class TestOverride(TestSet):
|
||||
url_last_part = 'overrides'
|
||||
cmd = 'override'
|
||||
|
||||
|
||||
class TestDeleteOverride(testscenarios.WithScenarios, _BaseCLITest):
|
||||
scenarios = [
|
||||
(s[0],
|
||||
dict(zip(('args', 'expected_body'), s[1])))
|
||||
for s in [
|
||||
('k1', ('-k k1', "ResourceOverride for key k1 was deleted\n")),
|
||||
('xx', ('-k xx', "ResourceOverride for key xx was deleted\n")),
|
||||
]
|
||||
]
|
||||
|
||||
args = None
|
||||
expected_body = None
|
||||
url_last_part = 'overrides'
|
||||
cmd = 'rm override'
|
||||
|
||||
def test_delete(self):
|
||||
url = self.BASE_URL + '/environments/1/lvl1/value1/resources/1/' + \
|
||||
self.url_last_part + '/keys/delete'
|
||||
self.req_mock.patch(url)
|
||||
args = [self.cmd] + ("--env 1 --level lvl1=value1 --resource 1 " +
|
||||
self.args).split()
|
||||
self.cli.run(args)
|
||||
self.assertEqual(self.expected_body, self.cli.stdout.getvalue())
|
@ -1,204 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import copy
|
||||
|
||||
from tuning_box import db
|
||||
from tuning_box.library import components
|
||||
from tuning_box.tests.test_app import BaseTest
|
||||
|
||||
|
||||
class TestComponents(BaseTest):
|
||||
|
||||
@property
|
||||
def _component_json(self):
|
||||
return {
|
||||
'id': 7,
|
||||
'name': 'component1',
|
||||
'resource_definitions': [{
|
||||
'id': 5,
|
||||
'name': 'resdef1',
|
||||
'component_id': 7,
|
||||
'content': {'key': 'nsname.key'},
|
||||
}],
|
||||
}
|
||||
|
||||
def test_get_components_empty(self):
|
||||
res = self.client.get('/components')
|
||||
self.assertEqual(res.status_code, 200)
|
||||
self.assertEqual(res.json, [])
|
||||
|
||||
def test_get_components(self):
|
||||
self._fixture()
|
||||
res = self.client.get('/components')
|
||||
self.assertEqual(200, res.status_code)
|
||||
self.assertEqual(self._component_json, res.json[0])
|
||||
|
||||
def test_get_one_component(self):
|
||||
self._fixture()
|
||||
res = self.client.get('/components/7')
|
||||
self.assertEqual(200, res.status_code)
|
||||
self.assertEqual(self._component_json, res.json)
|
||||
|
||||
def test_get_one_component_404(self):
|
||||
res = self.client.get('/components/7')
|
||||
self.assertEqual(res.status_code, 404)
|
||||
|
||||
def test_post_component(self):
|
||||
self._fixture() # Just for namespace
|
||||
json = self._component_json
|
||||
del json['id']
|
||||
del json['resource_definitions'][0]['id']
|
||||
del json['resource_definitions'][0]['component_id']
|
||||
json['name'] = 'component2'
|
||||
res = self.client.post('/components', data=json)
|
||||
self.assertEqual(201, res.status_code)
|
||||
json['id'] = res.json['id']
|
||||
json['resource_definitions'][0]['component_id'] = json['id']
|
||||
json['resource_definitions'][0]['id'] = 6
|
||||
self.assertEqual(res.json, json)
|
||||
self._assert_db_effect(db.Component, res.json['id'],
|
||||
components.component_fields, json)
|
||||
|
||||
def test_post_component_conflict(self):
|
||||
self._fixture() # Just for namespace
|
||||
json = self._component_json
|
||||
del json['id']
|
||||
del json['resource_definitions'][0]['id']
|
||||
del json['resource_definitions'][0]['component_id']
|
||||
res = self.client.post('/components', data=json)
|
||||
self.assertEqual(res.status_code, 409)
|
||||
self._assert_not_in_db(db.Component, 8)
|
||||
|
||||
def test_post_component_conflict_propagate_exc(self):
|
||||
self.app.config["PROPAGATE_EXCEPTIONS"] = True
|
||||
self._fixture() # Just for namespace
|
||||
json = self._component_json
|
||||
del json['id']
|
||||
del json['resource_definitions'][0]['id']
|
||||
del json['resource_definitions'][0]['component_id']
|
||||
res = self.client.post('/components', data=json)
|
||||
self.assertEqual(res.status_code, 409)
|
||||
self._assert_not_in_db(db.Component, 8)
|
||||
|
||||
def test_post_component_no_resdef_content(self):
|
||||
self._fixture() # Just for namespace
|
||||
json = self._component_json
|
||||
del json['id']
|
||||
del json['resource_definitions'][0]['id']
|
||||
del json['resource_definitions'][0]['component_id']
|
||||
del json['resource_definitions'][0]['content']
|
||||
json['name'] = 'component2'
|
||||
res = self.client.post('/components', data=json)
|
||||
        self.assertEqual(res.status_code, 201)
        json['id'] = res.json['id']
        json['resource_definitions'][0]['component_id'] = json['id']
        json['resource_definitions'][0]['id'] = 6
        json['resource_definitions'][0]['content'] = None
        self.assertEqual(json, res.json)
        self._assert_db_effect(db.Component, res.json['id'],
                               components.component_fields, json)

    def test_delete_component(self):
        self._fixture()
        res = self.client.delete('/components/7')
        self.assertEqual(res.status_code, 204)
        self.assertEqual(res.data, b'')
        self._assert_not_in_db(db.Component, 7)

        with self.app.app_context():
            actual_res_defs = db.ResourceDefinition.query.all()
            self.assertEqual([], actual_res_defs)

    def test_delete_component_404(self):
        res = self.client.delete('/components/7')
        self.assertEqual(res.status_code, 404)

    def test_put_component_404(self):
        res = self.client.put('/components/7')
        self.assertEqual(res.status_code, 404)

    def test_put_component(self):
        self._fixture()
        component_url = '/components/7'
        initial_data = self._component_json
        new_name = 'new_{0}'.format(initial_data['name'])

        # Updating name
        res = self.client.put(component_url, data={'name': new_name})
        self.assertEqual(204, res.status_code)
        actual_component = self.client.get(component_url).json
        self.assertEqual(new_name, actual_component['name'])
        self.assertItemsEqual(initial_data['resource_definitions'],
                              actual_component['resource_definitions'])

        # Updating resource_definitions
        res = self.client.put(component_url,
                              data={'resource_definitions': []})
        self.assertEqual(204, res.status_code)
        actual_component = self.client.get(component_url).json
        self.assertEqual([], actual_component['resource_definitions'])

        # Restoring resource_definitions and name
        res_def = {
            'name': 'resdef1',
            'component_id': 7,
            'content': {'key': 'nsname.key'}
        }
        res = self.client.post(
            '/resource_definitions',
            data=res_def
        )
        self.assertEqual(201, res.status_code)

        res = self.client.put(
            component_url,
            data={'name': initial_data['name']}
        )
        self.assertEqual(204, res.status_code)
        actual_component = self.client.get(component_url).json
        self.assertEqual(initial_data['name'],
                         actual_component['name'])
        self.assertItemsEqual(
            (d['name'] for d in initial_data['resource_definitions']),
            (d['name'] for d in actual_component['resource_definitions'])
        )

    def test_put_component_resource_not_found(self):
        self._fixture()
        component_url = '/components/7'
        initial_data = self._component_json

        resource_definition = copy.deepcopy(
            initial_data['resource_definitions'][0])
        resource_definition['id'] = None

        res = self.client.put(
            component_url,
            data={'resource_definitions': [resource_definition]}
        )
        self.assertEqual(404, res.status_code)

    def test_put_component_ignore_changing_id(self):
        self._fixture()
        component_url = '/components/7'
        initial_data = self._component_json
        new_name = 'new_{0}'.format(initial_data['name'])

        res = self.client.put(component_url,
                              data={'name': new_name, 'id': None,
                                    'fake': 'xxxx'})
        self.assertEqual(204, res.status_code)
        actual_component = self.client.get(component_url).json
        self.assertEqual(new_name, actual_component['name'])
        self.assertItemsEqual(initial_data['resource_definitions'],
                              actual_component['resource_definitions'])
@ -1,322 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from tuning_box import db
from tuning_box import library
from tuning_box.library import environments
from tuning_box.tests.test_app import BaseTest


class TestEnvironments(BaseTest):

    collection_url = '/environments'
    object_url = collection_url + '/{0}'

    def test_get_environments_empty(self):
        res = self.client.get(self.collection_url)
        self.assertEqual(res.status_code, 200)
        self.assertEqual(res.json, [])

    def test_get_environments(self):
        self._fixture()
        res = self.client.get(self.collection_url)
        self.assertEqual(200, res.status_code)
        self.assertEqual(1, len(res.json))
        self.assertEqual(
            {'id': 9, 'components': [7], 'hierarchy_levels': ['lvl1', 'lvl2']},
            res.json[0]
        )

    def test_get_one_environment(self):
        self._fixture()
        env_id = 9
        res = self.client.get(self.object_url.format(env_id))
        self.assertEqual(200, res.status_code)
        self.assertEqual(
            {'id': 9, 'components': [7], 'hierarchy_levels': ['lvl1', 'lvl2']},
            res.json
        )

    def test_post_environment_hierarchy_levels_order(self):
        self._fixture()
        levels = ['lvla', 'lvlb']
        data = {'components': [7], 'hierarchy_levels': levels}
        res = self.client.post(self.collection_url, data=data)
        self.assertEqual(201, res.status_code)
        self.assertEqual(levels, res.json['hierarchy_levels'])

    def test_get_one_environment_404(self):
        env_id = 9
        res = self.client.get(self.object_url.format(env_id))
        self.assertEqual(res.status_code, 404)

    def test_post_environment(self):
        self._fixture()
        json = {'components': [7], 'hierarchy_levels': ['lvla', 'lvlb']}
        res = self.client.post(self.collection_url, data=json)
        self.assertEqual(res.status_code, 201)
        json['id'] = res.json['id']
        self.assertEqual(json, res.json)
        self._assert_db_effect(
            db.Environment, res.json['id'],
            environments.environment_fields, json)

    def test_post_environment_preserve_id(self):
        self._fixture()
        json = {
            'id': 42,
            'components': [7],
            'hierarchy_levels': ['lvla', 'lvlb'],
        }
        res = self.client.post(self.collection_url, data=json)
        self.assertEqual(201, res.status_code)
        self.assertEqual(json, res.json)
        self._assert_db_effect(
            db.Environment, 42, environments.environment_fields, json)

    def test_post_environment_preserve_id_conflict(self):
        self._fixture()
        json = {
            'id': 9,
            'components': [7],
            'hierarchy_levels': ['lvla', 'lvlb'],
        }
        res = self.client.post(self.collection_url, data=json)
        self.assertEqual(res.status_code, 409)

    def test_post_environment_preserve_id_conflict_propagate_exc(self):
        self.app.config["PROPAGATE_EXCEPTIONS"] = True
        self._fixture()
        json = {
            'id': 9,
            'components': [7],
            'hierarchy_levels': ['lvla', 'lvlb'],
        }
        res = self.client.post(self.collection_url, data=json)
        self.assertEqual(res.status_code, 409)

    def test_post_environment_by_component_name(self):
        self._fixture()
        json = {
            'components': ['component1'],
            'hierarchy_levels': ['lvla', 'lvlb'],
        }
        res = self.client.post(self.collection_url, data=json)
        self.assertEqual(res.status_code, 201)
        json['id'] = res.json['id']
        json['components'] = [7]
        self.assertEqual(json, res.json)
        self._assert_db_effect(
            db.Environment, res.json['id'],
            environments.environment_fields, json)

    def test_post_components_duplication(self):
        self._fixture()
        json = {
            'components': ['component1', 7],
            'hierarchy_levels': ['lvl'],
        }
        res = self.client.post(self.collection_url, data=json)
        self.assertEqual(409, res.status_code)

    def test_post_components_no_duplication(self):
        self._fixture()
        components_url = '/components'
        res = self.client.get(components_url)
        self.assertEqual(200, res.status_code)
        component = res.json[0]

        # Creating component with name equal to id of existing component
        res = self.client.post(
            components_url,
            data={
                'name': component['id'],
                'resource_definitions': []
            }
        )
        self.assertEqual(201, res.status_code)
        new_component = res.json

        # Checking no components duplication detected
        json = {
            'components': [component['id'], new_component['name']],
            'hierarchy_levels': ['lvl'],
        }
        res = self.client.post(self.collection_url, data=json)
        self.assertEqual(201, res.status_code)

    def test_post_environment_404(self):
        self._fixture()
        json = {'components': [8], 'hierarchy_levels': ['lvla', 'lvlb']}
        res = self.client.post(self.collection_url, data=json)
        self.assertEqual(res.status_code, 404)
        self._assert_not_in_db(db.Environment, 10)

    def test_post_environment_by_component_name_404(self):
        self._fixture()
        json = {
            'components': ['component2'],
            'hierarchy_levels': ['lvla', 'lvlb'],
        }
        res = self.client.post(self.collection_url, data=json)
        self.assertEqual(res.status_code, 404)
        self._assert_not_in_db(db.Environment, 10)

    def test_delete_environment(self):
        self._fixture()
        env_id = 9
        env_url = self.object_url.format(env_id)
        res = self.client.get(env_url)
        self.assertEqual(200, res.status_code)
        levels = ['lvl1', 'lvl2']
        self.assertEqual(levels, res.json['hierarchy_levels'])

        res = self.client.delete(env_url)
        self.assertEqual(204, res.status_code)
        self.assertEqual(res.data, b'')
        self._assert_not_in_db(db.Environment, 9)

        with self.app.app_context():
            for name in levels:
                obj = db.EnvironmentHierarchyLevel.query.filter(
                    db.EnvironmentHierarchyLevel.name == name
                ).first()
                self.assertIsNone(obj)

    def test_delete_environment_404(self):
        env_id = 9
        res = self.client.delete(self.object_url.format(env_id))
        self.assertEqual(res.status_code, 404)

    def test_put_environment_404(self):
        env_id = 7
        res = self.client.put(self.object_url.format(env_id))
        self.assertEqual(res.status_code, 404)

    def test_put_environment_components(self):
        self._fixture()
        environment_url = '/environments/9'
        initial = self.client.get(environment_url).json

        # Updating components
        res = self.client.put(environment_url,
                              data={'components': []})
        self.assertEqual(204, res.status_code)
        actual = self.client.get(environment_url).json
        self.assertEqual([], actual['components'])

        # Restoring components
        res = self.client.put(
            environment_url,
            data={'components': initial['components']}
        )
        self.assertEqual(204, res.status_code)
        actual = self.client.get(environment_url).json
        self.assertEqual(initial, actual)

    def test_put_environment_component_not_found(self):
        self._fixture()
        env_id = 9
        res = self.client.put(
            self.object_url.format(env_id),
            data={'components': [None]}
        )
        self.assertEqual(404, res.status_code)

    def check_hierarchy_levels(self, hierarchy_levels_names):
        with self.app.app_context():
            hierarchy_levels = library.load_objects_by_id_or_name(
                db.EnvironmentHierarchyLevel, hierarchy_levels_names)
            parent_id = None
            for level in hierarchy_levels:
                self.assertEqual(parent_id, level.parent_id)
                parent_id = level.id

    def test_put_environment_hierarchy_levels(self):
        self._fixture()
        env_id = 9
        environment_url = self.object_url.format(env_id)
        initial = self.client.get(environment_url).json

        # Updating hierarchy levels
        res = self.client.put(environment_url,
                              data={'hierarchy_levels': []})
        self.assertEqual(204, res.status_code)
        actual = self.client.get(environment_url).json
        self.assertEqual([], actual['hierarchy_levels'])

        # Restoring levels
        res = self.client.put(
            environment_url,
            data={'hierarchy_levels': initial['hierarchy_levels']}
        )
        self.assertEqual(204, res.status_code)
        actual = self.client.get(environment_url).json
        self.assertEqual(initial, actual)
        self.check_hierarchy_levels(actual['hierarchy_levels'])

    def test_put_environment_hierarchy_levels_remove_level(self):
        self._fixture()
        env_id = 9
        environment_url = self.object_url.format(env_id)
        initial = self.client.get(environment_url).json
        expected_levels = initial['hierarchy_levels'][1:]

        # Updating hierarchy levels
        res = self.client.put(
            environment_url,
            data={'hierarchy_levels': expected_levels}
        )
        self.assertEqual(204, res.status_code)
        actual = self.client.get(environment_url).json
        self.assertEqual(expected_levels, actual['hierarchy_levels'])
        self.check_hierarchy_levels(actual['hierarchy_levels'])

    def test_put_environment_hierarchy_levels_reverse(self):
        self._fixture()
        env_id = 9
        env_url = self.object_url.format(env_id)
        initial = self.client.get(env_url).json
        expected_levels = initial['hierarchy_levels']
        expected_levels.reverse()

        # Updating hierarchy levels
        res = self.client.put(
            env_url,
            data={'hierarchy_levels': expected_levels}
        )
        self.assertEqual(204, res.status_code)
        actual = self.client.get(env_url).json
        self.assertEqual(expected_levels, actual['hierarchy_levels'])
        self.check_hierarchy_levels(actual['hierarchy_levels'])

    def test_put_environment_hierarchy_levels_with_new_level(self):
        self._fixture()
        env_id = 9
        env_url = self.object_url.format(env_id)
        initial = self.client.get(env_url).json
        expected_levels = ['root'] + initial['hierarchy_levels']

        res = self.client.put(
            env_url,
            data={'hierarchy_levels': expected_levels}
        )
        self.assertEqual(204, res.status_code)

        res = self.client.get('/environments/9/hierarchy_levels')
        self.assertEqual(200, res.status_code)

        res = self.client.get(env_url)
        self.assertEqual(200, res.status_code)
        actual = res.json
        self.assertEqual(expected_levels, actual['hierarchy_levels'])
        self.check_hierarchy_levels(actual['hierarchy_levels'])
@ -1,149 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import six

from tuning_box import db
from tuning_box import errors
from tuning_box.library import hierarchy_levels
from tuning_box.tests.test_app import BaseTest


class TestLevelsHierarchy(BaseTest):

    collection_url = '/environments/{0}/hierarchy_levels'
    object_url = collection_url + '/{1}'

    def test_get_environment_level_value_root(self):
        self._fixture()
        with self.app.app_context(), db.db.session.begin():
            level_value = hierarchy_levels.get_environment_level_value(
                db.Environment(id=9),
                [],
            )
        self.assertIsNone(level_value)

    def test_get_environment_level_value_deep(self):
        self._fixture()
        with self.app.app_context(), db.db.session.begin():
            level_value = hierarchy_levels.get_environment_level_value(
                db.Environment(id=9),
                [('lvl1', 'val1'), ('lvl2', 'val2')],
            )
            self.assertIsNotNone(level_value)
            self.assertEqual(level_value.level.name, 'lvl2')
            self.assertEqual(level_value.value, 'val2')
            level = level_value.level.parent
            self.assertIsNotNone(level)
            self.assertEqual(level.name, 'lvl1')
            self.assertIsNone(level.parent)

    def test_get_environment_level_values(self):
        self._fixture()
        env_id = 9
        with self.app.app_context(), db.db.session.begin():
            # Creating level values
            hierarchy_levels.get_environment_level_value(
                db.Environment(id=env_id),
                [('lvl1', 'val11'), ('lvl2', 'val21')],
            )
            hierarchy_levels.get_environment_level_value(
                db.Environment(id=env_id),
                [('lvl1', 'val11'), ('lvl2', 'val22')],
            )
            hierarchy_levels.get_environment_level_value(
                db.Environment(id=env_id),
                [('lvl1', 'val12'), ('lvl2', 'val23')],
            )

        res = self.client.get(self.collection_url.format(9))
        lvl1 = res.json[0]
        self.assertItemsEqual(['val11', 'val12'], lvl1['values'])

        lvl2 = res.json[1]
        self.assertItemsEqual(['val21', 'val22', 'val23'], lvl2['values'])

    def test_get_environment_level_value_bad_level(self):
        self._fixture()
        with self.app.app_context(), db.db.session.begin():
            exc = self.assertRaises(
                errors.TuningboxNotFound,
                hierarchy_levels.get_environment_level_value,
                db.Environment(id=9),
                [('lvlx', 'val1')],
            )
            self.assertEqual(
                six.text_type(exc),
                "Unexpected level name 'lvlx'. Expected 'lvl1'.",
            )

    def test_get_hierarchy_levels(self):
        self._fixture()
        environment_id = 9
        expected_levels = ['lvl1', 'lvl2']
        res = self.client.get(self.collection_url.format(environment_id))
        self.assertEqual(200, res.status_code)
        self.assertEqual(expected_levels, [d['name'] for d in res.json])

    def test_get_hierarchy_levels_not_found(self):
        environment_id = 9
        res = self.client.get(self.collection_url.format(environment_id))
        self.assertEqual(404, res.status_code)

    def test_get_hierarchy_level(self):
        self._fixture()
        environment_id = 9
        levels = ['lvl1', 'lvl2']
        for level in levels:
            res = self.client.get(self.object_url.format(environment_id,
                                                         level))
            self.assertEqual(200, res.status_code)
            self.assertEqual(level, res.json['name'])

            res = self.client.get(self.object_url.format(environment_id,
                                                         res.json['id']))
            self.assertEqual(200, res.status_code)
            self.assertEqual(level, res.json['name'])

    def test_get_hierarchy_level_not_found(self):
        levels = ['lvl1', 'lvl2']
        for level in levels:
            res = self.client.get(self.object_url.format(9, level))
            self.assertEqual(404, res.status_code)

    def test_put_hierarchy_level(self):
        self._fixture()
        environment_id = 9
        level = 'lvl1'
        new_name = 'new_{0}'.format(level)
        res = self.client.put(self.object_url.format(environment_id, level),
                              data={'name': new_name})
        self.assertEqual(204, res.status_code)

        res = self.client.get(self.object_url.format(environment_id, new_name))
        self.assertEqual(200, res.status_code)
        self.assertEqual(new_name, res.json['name'])

    def test_put_hierarchy_level_not_found(self):
        self._fixture()
        environment_id = 9
        res = self.client.put(self.object_url.format(environment_id, 'xx'),
                              data={'name': 'new_name'})
        self.assertEqual(404, res.status_code)

        res = self.client.put(self.object_url.format(1, 'lvl1'),
                              data={'name': 'new_name'})
        self.assertEqual(404, res.status_code)

        res = self.client.put(self.object_url.format(1, 'xx'),
                              data={'name': 'new_name'})
        self.assertEqual(404, res.status_code)
@ -1,111 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from tuning_box.app import db
from tuning_box import errors
from tuning_box import library
from tuning_box.tests.test_app import BaseTest


class TestLibrary(BaseTest):

    def add_res_def_to_another_env(self, res_name):
        component_data = {
            'name': 'component2',
            'resource_definitions': [{
                'name': res_name,
                'content': {'key': 'value'}
            }]
        }
        res = self.client.post('/components', data=component_data)
        self.assertEqual(201, res.status_code)
        component_id = res.json['id']

        env_data = {
            'components': [component_id],
            'hierarchy_levels': [],
        }
        res = self.client.post('/environments', data=env_data)
        self.assertEqual(201, res.status_code)

    def test_get_resource_definition(self):
        self._fixture()
        res_name = 'resdef1'
        res_id = 5
        environment_id = 9
        component_id = 7

        # Creating resource definition with the same name in another
        # environment
        self.add_res_def_to_another_env(res_name)
        res = self.client.get('/resource_definitions')
        self.assertEqual(200, res.status_code)
        self.assertTrue(all(res_def['name'] == res_name
                            for res_def in res.json))

        with self.app.app_context():
            self.assertRaises(errors.TuningboxNotFound,
                              library.get_resource_definition, res_id, None)
            self.assertRaises(errors.TuningboxNotFound,
                              library.get_resource_definition, res_name, None)
            self.assertRaises(errors.TuningboxNotFound,
                              library.get_resource_definition, '',
                              environment_id)
            self.assertRaises(errors.TuningboxNotFound,
                              library.get_resource_definition, None, None)
            self.assertRaises(errors.TuningboxNotFound,
                              library.get_resource_definition, None,
                              environment_id)

            actual_res = library.get_resource_definition(res_id,
                                                         environment_id)
            self.assertEqual(res_id, actual_res.id)
            self.assertEqual(res_name, actual_res.name)
            self.assertEqual(component_id, actual_res.component_id)

            actual_res = library.get_resource_definition(res_id,
                                                         environment_id)
            self.assertEqual(res_id, actual_res.id)
            self.assertEqual(res_name, actual_res.name)
            self.assertEqual(component_id, actual_res.component_id)

    def test_get_resource_values(self):
        self._fixture()
        res_def_id = 5
        environment_id = 9
        values = {'k': 'v'}
        levels = (('lvl1', 'val1'), ('lvl2', 'val2'))
        self._add_resource_values(environment_id, res_def_id, levels, values)

        with self.app.app_context(), db.db.session.begin():

            environment = db.Environment.query.get(environment_id)
            res_def = db.ResourceDefinition.query.get(res_def_id)
            res_values = library.get_resource_values(
                environment, levels, res_def)
            self.assertEqual(values, res_values.values)

    def test_get_resource_values_not_found(self):
        self._fixture()
        res_def_id = 5
        environment_id = 9
        levels = (('lvl1', 'val1'), ('lvl2', 'val2'))
        with self.app.app_context(), db.db.session.begin():
            environment = db.Environment.query.get(environment_id)
            res_def = db.ResourceDefinition.query.get(res_def_id)
            self.assertRaises(errors.TuningboxNotFound,
                              library.get_resource_values, environment,
                              levels, res_def)
            # Test for empty levels
            self.assertRaises(errors.TuningboxNotFound,
                              library.get_resource_values, environment,
                              (), res_def)
@ -1,213 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from tuning_box import db
from tuning_box.library import resource_definitions
from tuning_box.tests.test_app import BaseTest


class TestResourceDefinitions(BaseTest):

    collection_url = '/resource_definitions'
    object_url = '/resource_definitions/{0}'
    object_keys_url = object_url + '/keys/{1}'

    @property
    def _resource_json(self):
        return {
            'id': 5,
            'name': 'resdef1',
            'component_id': 7,
            'content': {'key': 'nsname.key'},
        }

    def test_post_resource_definition(self):
        data = self._resource_json
        data['component_id'] = None

        res = self.client.post(self.collection_url, data=data)
        self.assertEqual(201, res.status_code)
        data['id'] = res.json['id']

        self.assertEqual(data, res.json)
        self._assert_db_effect(
            db.ResourceDefinition,
            res.json['id'],
            resource_definitions.resource_definition_fields,
            data
        )

    def test_get_resource_definitions_empty(self):
        res = self.client.get(self.collection_url)
        self.assertEqual(res.status_code, 200)
        self.assertEqual(res.json, [])

    def test_get_definitions(self):
        self._fixture()
        res = self.client.get(self.collection_url)
        self.assertEqual(200, res.status_code)
        self.assertEqual(1, len(res.json))
        self.assertEqual(self._resource_json, res.json[0])

    def test_get_definitions_filtration(self):
        self._fixture()

        resource_data = {
            'name': 'resdef2',
            'content': {'key': 'service.key'},
        }

        res = self.client.post(self.collection_url, data=resource_data)
        self.assertEqual(201, res.status_code)
        resource_data = res.json

        component_id = self._resource_json['component_id']
        res = self.client.get(self.collection_url,
                              query_string={'component_id': component_id})
        self.assertEqual(200, res.status_code)
        self.assertNotIn(resource_data['id'], (d['id'] for d in res.json))

        res = self.client.get(self.collection_url + '?component_id=')
        self.assertEqual(200, res.status_code)
        self.assertFalse(any(d['component_id'] for d in res.json))
        self.assertIn(resource_data['id'], (d['id'] for d in res.json))

        res = self.client.get(self.collection_url)
        self.assertEqual(200, res.status_code)
        self.assertIn(resource_data['id'], (d['id'] for d in res.json))
        self.assertIn(self._resource_json['id'], (d['id'] for d in res.json))

    def test_get_one_resource_definition(self):
        self._fixture()
        res_id = self._resource_json['id']
        res = self.client.get(self.object_url.format(res_id))
        self.assertEqual(200, res.status_code)
        self.assertEqual(self._resource_json, res.json)

    def test_get_one_resource_definition_404(self):
        res_id = self._resource_json['id']
        res = self.client.get(
            self.object_url.format(res_id))
        self.assertEqual(res.status_code, 404)

    def test_delete_resource_definition(self):
        self._fixture()
        res_id = self._resource_json['id']
        res = self.client.delete(self.object_url.format(res_id))
        self.assertEqual(res.status_code, 204)
        self.assertEqual(res.data, b'')
        self._assert_not_in_db(db.ResourceDefinition, res_id)

    def test_delete_resource_definition_404(self):
        res_id = self._resource_json['id']
        res = self.client.delete(self.object_url.format(res_id))
        self.assertEqual(res.status_code, 404)

    def test_put_resource_definition_404(self):
        res_id = self._resource_json['id']
        res = self.client.delete(self.object_url.format(res_id))
        self.assertEqual(res.status_code, 404)

    def test_put_resource_definition(self):
        self._fixture()
        res_id = self._resource_json['id']

        data = self._resource_json
        data['name'] = 'new_{0}'.format(data['name'])
        data['component_id'] = None
        data['content'] = {'x': 'y'}

        res = self.client.put(self.object_url.format(res_id),
                              data=data)
        self.assertEqual(204, res.status_code)
        actual_res_def = self.client.get(self.object_url.format(res_id)).json
        self.assertEqual(data, actual_res_def)

        # Restoring resource_definition values
        res = self.client.put(
            self.object_url.format(res_id),
            data=self._resource_json
        )
        self.assertEqual(204, res.status_code)
        actual_res_def = self.client.get(self.object_url.format(res_id)).json
        self.assertEqual(self._resource_json, actual_res_def)

    def test_put_resource_definition_ignore_changing_id(self):
        self._fixture()
        res_id = self._resource_json['id']

        data = self._resource_json
        data['id'] = None
        res = self.client.put(self.object_url.format(res_id), data=data)
        self.assertEqual(204, res.status_code)
        actual_res_def = self.client.get(self.object_url.format(res_id)).json
        self.assertEqual(self._resource_json, actual_res_def)

    def test_put_resource_definition_set_operation_error(self):
        self.app.config["PROPAGATE_EXCEPTIONS"] = True
        self._fixture()
        res_id = self._resource_json['id']

        data = [['a', 'b', 'c', 'value']]
        res = self.client.put(self.object_keys_url.format(res_id, 'set'),
                              data=data)
        self.assertEqual(409, res.status_code)

    def test_put_resource_definition_set(self):
        self._fixture()
        res_id = self._resource_json['id']

        data = [['key', 'key_value'], ['key_x', 'key_x_value']]
        res = self.client.put(
            self.object_keys_url.format(res_id, 'set'),
            data=data
        )
        self.assertEqual(204, res.status_code)

        res = self.client.get(self.object_url.format(res_id))
        self.assertEqual(200, res.status_code)
        actual = res.json
        self.assertEqual({'key': 'key_value', 'key_x': 'key_x_value'},
                         actual['content'])

    def test_put_resource_definition_delete(self):
        self._fixture()
        res_id = self._resource_json['id']

        data = [['key']]
        res = self.client.put(
            self.object_keys_url.format(res_id, 'delete'),
            data=data
        )
        self.assertEqual(204, res.status_code)

        res = self.client.get(self.object_url.format(res_id))
        self.assertEqual(200, res.status_code)
        actual = res.json
        self.assertEqual({}, actual['content'])

    def test_put_resource_definition_delete_no_key(self):
        self.app.config["PROPAGATE_EXCEPTIONS"] = True
        self._fixture()
        res_id = self._resource_json['id']

        data = [['fake_key']]
        res = self.client.put(
            self.object_keys_url.format(res_id, 'delete'),
            data=data
        )
        self.assertEqual(409, res.status_code)

        res = self.client.get(self.object_url.format(res_id))
        self.assertEqual(200, res.status_code)
        actual = res.json
        self.assertEqual(self._resource_json['content'], actual['content'])
@ -1,370 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from tuning_box import errors
from tuning_box.library import resource_keys_operation
from tuning_box.tests.test_app import BaseTest


class TestResourceKeysOperations(BaseTest):

    processor = resource_keys_operation.KeysOperationMixin()
    object_url = '/environments/{0}/{1}resources/{2}/values'
    object_keys_url = object_url + '/keys/{3}'

    def test_unknown_operation(self):
        self.assertRaises(errors.UnknownKeysOperation,
                          self.processor.perform_operation,
                          'fake_operation', {}, [])

    def test_set_new(self):
        keys = [['a', {}]]
        data = {}
        result = self.processor.do_set(data, keys)
        self.assertEqual({'a': {}}, result)

        keys = [['a', {}], ['a', 'b', []]]
        data = {}
        result = self.processor.do_set(data, keys)
        self.assertEqual({'a': {'b': []}}, result)

        keys = [['a', 0, 'b', 'c_updated']]
        data = {'a': [{'b': 'c'}]}
        result = self.processor.do_set(data, keys)
        self.assertEqual({'a': [{'b': 'c_updated'}]}, result)

        keys = [['a', 'b']]
        data = {'a': {'b': 'c'}}
        result = self.processor.do_set(data, keys)
        self.assertEqual({'a': 'b'}, result)

    def test_set_empty(self):
        keys = [['a', 'b', '']]
        data = {'a': {'b': 'value'}}
        result = self.processor.do_set(data, keys)
        self.assertEqual({'a': {'b': ''}}, result)

    def test_set_not_modifies_storage(self):
        keys = [['a', 'c', 'value_c']]
        data = {'a': {'b': 'value_b'}}
        result = self.processor.do_set(data, keys)
        self.assertEqual({'a': {'b': 'value_b'}}, data)
        self.assertEqual({'a': {'c': 'value_c', 'b': 'value_b'}}, result)

    def test_set_invalid_keys_path(self):
        self.assertRaises(errors.KeysPathInvalid, self.processor.do_set,
                          {}, [[]])
        self.assertRaises(errors.KeysPathInvalid, self.processor.do_set,
                          {}, [['a']])

    def test_set_key_path_not_existed(self):
        keys = [['a', 'b', 'c']]
        data = {}
        self.assertRaises(errors.KeysPathNotExisted, self.processor.do_set,
                          data, keys)

        keys = [['a', 1, 'b']]
        data = {'a': [{'b': 'c'}]}
        self.assertRaises(errors.KeysPathNotExisted, self.processor.do_set,
                          data, keys)

    def test_set_key_path_unreachable(self):
        keys = [['a', 'b', 'c', 'd', 'e']]
        data = {'a': {'b': 'c'}}
        self.assertRaises(errors.KeysPathUnreachable, self.processor.do_set,
                          data, keys)

        keys = [['a', 'k1', 'v1']]
        data = {'a': 'v'}
        self.assertRaises(errors.KeysPathUnreachable, self.processor.do_set,
                          data, keys)

    def test_delete_key_path_not_existed(self):
        keys = [['a', 'b']]
        data = {}
        self.assertRaises(errors.KeysPathNotExisted, self.processor.do_delete,
                          data, keys)

        keys = [[1]]
        data = ['a']
        self.assertRaises(errors.KeysPathNotExisted, self.processor.do_delete,
                          data, keys)

    def test_delete_key_path_unreachable(self):
        keys = [['a', 'b', 'value_b']]
        data = {'a': {'b': 'value_b'}}
        self.assertRaises(errors.KeysPathUnreachable, self.processor.do_delete,
                          data, keys)

        keys = [['a', 'b', 'value_c']]
        data = {'a': {'b': 'value_b'}}
        self.assertRaises(errors.KeysPathUnreachable, self.processor.do_delete,
                          data, keys)

    def test_delete(self):
        keys = [['a']]
        data = {'a': 'val_a', 'b': {'a': 'val_b_a'}}
        result = self.processor.do_delete(data, keys)
        self.assertEqual({'b': {'a': 'val_b_a'}}, result)

        keys = [[0]]
        data = ['a']
        result = self.processor.do_delete(data, keys)
        self.assertEqual([], result)

        keys = [['a', 0, 'b']]
        data = {'a': [{'b': 'val_a_0_b', 'c': 'val_a_0_c'}, 'd']}
        result = self.processor.do_delete(data, keys)
        self.assertEqual({'a': [{'c': 'val_a_0_c'}, 'd']}, result)

        keys = [['a', 'b'], ['a']]
        data = {'a': {'b': 'val_a_b', 'c': 'val_a_c'}, 'b': 'val_b'}
        result = self.processor.do_delete(data, keys)
        self.assertEqual({'b': 'val_b'}, result)

    def test_delete_not_modifies_storage(self):
        keys = [['a', 'b']]
        data = {'a': {'b': 'value_b'}}
        result = self.processor.do_delete(data, keys)
        self.assertEqual({'a': {'b': 'value_b'}}, data)
        self.assertEqual({'a': {}}, result)

    def test_put_resource_values_delete(self):
        self._fixture()
        environment_id = 9
        res_def_id = 5
        levels = (('lvl1', 'val1'), ('lvl2', 'val2'))
        values = {'key_0': 'val_0', 'key_1': 'val_1'}
        self._add_resource_values(environment_id, res_def_id, levels, values)

        obj_url = self.object_url.format(
            environment_id,
            self.get_levels_path(levels),
            res_def_id
        )
        obj_keys_url = obj_url + '/keys/delete'

        data = [['key_0']]
        res = self.client.put(obj_keys_url, data=data)
        self.assertEqual(204, res.status_code)

        res = self.client.get(obj_url)
        self.assertEqual(200, res.status_code)
        actual = res.json
        self.assertEqual({'key_1': 'val_1'}, actual)

    def test_put_resource_values_delete_nested_keys(self):
        self._fixture()
        environment_id = 9
        res_def_id = 5
        levels = (('lvl1', 'val1'), ('lvl2', 'val2'))
        values = {'k0': [{'k1': 'v01'}, 'b'], 'k2': {'k3': 'v23'}}
        self._add_resource_values(environment_id, res_def_id, levels, values)

        obj_url = self.object_url.format(
            environment_id,
            self.get_levels_path(levels),
            res_def_id
        )
        obj_keys_url = obj_url + '/keys/delete'

        data = [['k0', '0'], ['k2', 'k3']]
        res = self.client.put(obj_keys_url, data=data)
        self.assertEqual(204, res.status_code)

        res = self.client.get(obj_url)
        self.assertEqual(200, res.status_code)
        actual = res.json
        self.assertEqual({'k0': ['b'], 'k2': {}}, actual)

    def test_put_resource_values_not_found(self):
        self.app.config["PROPAGATE_EXCEPTIONS"] = True
        self._fixture()

        res = self.client.put(
            '/environments/9/lvl1/val1/resources/5/values/keys/set',
            data={}
        )
        self.assertEqual(404, res.status_code)

    def test_put_resource_values_set_operation_error(self):
        self.app.config["PROPAGATE_EXCEPTIONS"] = True
        self._fixture()

        environment_id = 9
        res_def_id = 5
        levels = (('lvl1', 'val1'), ('lvl2', 'val2'))
        values = {'key': 'val'}
        self._add_resource_values(environment_id, res_def_id, levels, values)

        data = [['a', 'b', 'c', 'value']]
        obj_keys_url = self.object_keys_url.format(
            environment_id,
            self.get_levels_path(levels),
            res_def_id,
            'set'
        )

        res = self.client.put(obj_keys_url, data=data)
        self.assertEqual(409, res.status_code)

    def test_put_resource_values_delete_operation_error(self):
        self.app.config["PROPAGATE_EXCEPTIONS"] = True
        self._fixture()
        environment_id = 9
        res_def_id = 5
        levels = (('lvl1', 'val1'), ('lvl2', 'val2'))
        values = {'key_0': 'val_0', 'key_1': 'val_1'}
        self._add_resource_values(environment_id, res_def_id, levels, values)

        obj_keys_url = self.object_keys_url.format(
            environment_id,
            self.get_levels_path(levels),
            res_def_id,
            'delete'
        )
        data = [['fake_key']]
        res = self.client.put(obj_keys_url, data=data)
        self.assertEqual(409, res.status_code)

        data = [['key_0', 'val_0']]
        res = self.client.put(obj_keys_url, data=data)
        self.assertEqual(409, res.status_code)

    def test_put_resource_values_set(self):
        self._fixture()
        environment_id = 9
        res_def_id = 5
        levels = (('lvl1', 'val1'), ('lvl2', 'val2'))
        values = {'key': 'val'}
        self._add_resource_values(environment_id, res_def_id, levels, values)

        obj_url = self.object_url.format(
            environment_id,
            self.get_levels_path(levels),
            res_def_id
        )
        obj_keys_url = obj_url + '/keys/set'

        data = [['key', 'key_value'], ['key_x', 'key_x_value']]
        res = self.client.put(obj_keys_url, data=data)
        self.assertEqual(204, res.status_code)

        res = self.client.get(obj_url)
        self.assertEqual(200, res.status_code)
        actual = res.json
        self.assertEqual({'key': 'key_value', 'key_x': 'key_x_value'},
                         actual)

    def test_put_resource_values_set_no_levels(self):
        self._fixture()
        environment_id = 9
        res_def_id = 5
        values = {'key': 'val'}
        self._add_resource_values(environment_id, res_def_id, (), values)

        obj_url = '/environments/{0}/resources/{1}/values'.format(
            environment_id, res_def_id)
        obj_keys_url = obj_url + '/keys/set'

        data = [['key', 'key_value'], ['key_x', 'key_x_value']]
        res = self.client.put(obj_keys_url, data=data)
        self.assertEqual(204, res.status_code)

        res = self.client.get(obj_url)
        self.assertEqual(200, res.status_code)
        actual = res.json
        self.assertEqual({'key': 'key_value', 'key_x': 'key_x_value'},
                         actual)

    def test_put_resource_values_delete_by_name(self):
        self._fixture()
        environment_id = 9
        res_def_id = 5
        res_def_name = 'resdef1'
        levels = (('lvl1', 'val1'), ('lvl2', 'val2'))
        values = {'key_0': 'val_0', 'key_1': 'val_1'}
        self._add_resource_values(environment_id, res_def_id, levels, values)

        obj_url = self.object_url.format(
            environment_id,
            self.get_levels_path(levels),
            res_def_name
        )
        obj_keys_url = obj_url + '/keys/delete'

        data = [['key_0']]
        res = self.client.put(obj_keys_url, data=data)
        self.assertEqual(204, res.status_code)

        obj_url = self.object_url.format(
            environment_id,
            self.get_levels_path(levels),
            res_def_id
        )

        res = self.client.get(obj_url)
        self.assertEqual(200, res.status_code)
        actual = res.json
        self.assertEqual({'key_1': 'val_1'}, actual)

    def test_put_resource_values_set_consistency(self):
        self.app.config["PROPAGATE_EXCEPTIONS"] = True
        self._fixture()
        environment_id = 9
        res_def_id = 5
        levels = (('lvl1', 'val1'), ('lvl2', 'val2'))
        values = {'k0': {'k1': 'v01'}}
        self._add_resource_values(environment_id, res_def_id, levels, values)

        obj_url = self.object_url.format(
            environment_id,
            self.get_levels_path(levels),
            res_def_id
        )
        obj_keys_url = obj_url + '/keys/set'

        # One keys path is invalid
        data = [['kk0', 'v'], ['k0', 'k1', 'k2', 'val']]
        res = self.client.put(obj_keys_url, data=data)
        self.assertEqual(409, res.status_code)

        # Checking no changes in the resource value
        res = self.client.get(obj_url)
        self.assertEqual(200, res.status_code)
        actual = res.json
        self.assertEqual(values, actual)

    def test_put_resource_values_set_nested_keys(self):
        self.app.config["PROPAGATE_EXCEPTIONS"] = True
        self._fixture()
        environment_id = 9
        res_def_id = 5
        levels = (('lvl1', 'val1'), ('lvl2', 'val2'))
        values = {'k0': {'k1': 'v01'}}
        self._add_resource_values(environment_id, res_def_id, levels, values)

        obj_url = self.object_url.format(
            environment_id,
            self.get_levels_path(levels),
            res_def_id
        )
        obj_keys_url = obj_url + '/keys/set'

        data = [['k0', 'k1', 'k2', 'val']]
        res = self.client.put(obj_keys_url, data=data)
        self.assertEqual(409, res.status_code)

        res = self.client.get(obj_url)
        self.assertEqual(200, res.status_code)
        actual = res.json
        self.assertEqual(values, actual)
Some files were not shown because too many files have changed in this diff.