Apply cookiecutter on collect-logs

It adds all the required files for ansible-role-collect-logs
in order to consume it as a role.

It also moves README.md to README.rst,
adds pre-commit, and removes ansible-lint.sh.

https://tree.taiga.io/project/tripleo-ci-board/task/890

Change-Id: I259f148e03c09e27edfb1bcc251cc62f0f1f59e9
Signed-off-by: Chandan Kumar <chkumar@redhat.com>
Chandan Kumar
2019-04-08 11:00:24 +05:30
parent 2c29415b76
commit 67821d2c49
32 changed files with 657 additions and 182 deletions

.ansible-lint Normal file

@@ -0,0 +1,2 @@
---
parseable: true

.gitignore vendored Normal file

@@ -0,0 +1,71 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
# C extensions
*.so
# Distribution / packaging
.Python
env/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
sdist/
var/
container_registry.egg-info/
.installed.cfg
*.egg
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*,cover
# Translations
*.mo
*.pot
# Django stuff:
*.log
# Sphinx documentation
doc/build/
# PyBuilder
target/
# virtualenv
.venv/
# jenkins config
jenkins/config.ini
playbooks/debug.yml
# Files created by releasenotes build
releasenotes/build
# Editors
.*.sw[klmnop]
# ansible retry files
*.retry

.gitreview Normal file

@@ -0,0 +1,4 @@
[gerrit]
host=review.openstack.org
port=29418
project=openstack/ansible-role-collect-logs.git

.pre-commit-config.yaml Normal file

@@ -0,0 +1,40 @@
---
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v2.1.0
hooks:
- id: end-of-file-fixer
- id: trailing-whitespace
- id: mixed-line-ending
- id: check-byte-order-marker
- id: check-executables-have-shebangs
- id: check-merge-conflict
- id: debug-statements
- id: flake8
- id: check-yaml
files: .*\.(yaml|yml)$
- repo: https://github.com/adrienverge/yamllint.git
rev: v1.15.0
hooks:
- id: yamllint
files: \.(yaml|yml)$
types: [file, yaml]
entry: yamllint --strict -f parsable
- repo: https://github.com/ansible/ansible-lint
rev: v4.1.0a0
hooks:
- id: ansible-lint
files: \.(yaml|yml)$
entry: ansible-lint --force-color -v
- repo: https://github.com/openstack-dev/bashate.git
rev: 0.6.0
hooks:
- id: bashate
entry: bashate --error . --verbose --ignore=E006,E040
# Run bashate check for all bash scripts
# Ignores the following rules:
# E006: Line longer than 79 columns (as many scripts use jinja
# templating, this is very difficult)
# E040: Syntax error determined using `bash -n` (as many scripts
# use jinja templating, this will often fail and the syntax
# error will be discovered in execution anyway)
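The `files:` values in the hooks above are Python regular expressions that pre-commit search-matches against each repository path. A quick sketch of what the yamllint filter selects (the file names below are illustrative):

```python
import re

# The yamllint hook's "files" filter from the config above.
yaml_filter = re.compile(r"\.(yaml|yml)$")

candidates = ["zuul.d/layout.yaml", "meta/main.yml", ".yamllint", "tox.ini"]
# re.search anywhere in the path, anchored to the extension by the $.
matched = [path for path in candidates if yaml_filter.search(path)]
# matched == ["zuul.d/layout.yaml", "meta/main.yml"]
```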

.yamllint Normal file

@@ -0,0 +1,7 @@
---
extends: default
rules:
line-length:
# matches hardcoded 160 value from ansible-lint
max: 160

LICENSE Normal file

@@ -0,0 +1,175 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

README.md

@@ -1,157 +0,0 @@
collect-logs
============
An Ansible role for aggregating logs from TripleO nodes.
Requirements
------------
This role gathers logs and debug information from a target system and
collates them in a designated directory, `artcl_collect_dir`, on the localhost.
Additionally, the role will convert templated bash scripts, created and used by
TripleO-Quickstart during deployment, into rST files. These rST files are
combined with static rST files and fed into Sphinx to create user friendly
post-build-documentation specific to an original deployment.
Finally, the role optionally handles uploading these logs to a rsync server or
to an OpenStack Swift object storage. Logs from Swift can be exposed with
[os-loganalyze](https://github.com/openstack-infra/os-loganalyze).
Role Variables
--------------
### Collection related
* `artcl_collect_list` -- A list of files and directories to gather from
the target. Directories are collected recursively and need to end with a
"/" to get collected. Should be specified as a YaML list, e.g.:
```yaml
artcl_collect_list:
- /etc/nova/
- /home/stack/*.log
- /var/log/
```
* `artcl_collect_list_append` -- A list of files and directories to be appended
in the default list. This is useful for users that want to keep the original
list and just add more relevant paths.
* `artcl_exclude_list` -- A list of files and directories to exclude from
collecting. This list is passed to rsync as an exclude filter and it takes
precedence over the collection list. For details see the "FILTER RULES" topic
in the rsync man page.
* `artcl_collect_dir` -- A local directory where the logs should be
gathered, without a trailing slash.
* `artcl_gzip_only`: false/true -- When true, gathered files are gzipped one
by one in `artcl_collect_dir`, when false, a tar.gz file will contain all the
logs.
### Documentation generation related
* `artcl_gen_docs`: false/true -- If true, the role will use build artifacts
and Sphinx and produce user friendly documentation (default: false)
* `artcl_docs_source_dir` -- a local directory that serves as the Sphinx source
directory.
* `artcl_docs_build_dir` -- A local directory that serves as the Sphinx build
output directory.
* `artcl_create_docs_payload` -- Dictionary of lists that direct what and how
to construct documentation.
* `included_deployment_scripts` -- List of templated bash scripts to be
converted to rST files.
* `included_deployment_scripts` -- List of static rST files that will be
included in the output documentation.
* `table_of_contents` -- List that defines the order in which rST files
will be laid out in the output documentation.
* `artcl_verify_sphinx_build` -- false/true -- If true, verify items defined
in `artcl_create_docs_payload.table_of_contents` exist in sphinx generated
index.html (default: false)
```yaml
artcl_create_docs_payload:
included_deployment_scripts:
- undercloud-install
- undercloud-post-install
included_static_docs:
- env-setup-virt
table_of_contents:
- env-setup-virt
- undercloud-install
- undercloud-post-install
```
### Publishing related
* `artcl_publish`: true/false -- If true, the role will attempt to rsync logs
to the target specified by `artcl_rsync_url`. Uses `BUILD_URL`, `BUILD_TAG`
vars from the environment (set during a Jenkins job run) and requires the
next to variables to be set.
* `artcl_txt_rename`: false/true -- rename text based file to end in .txt.gz to
make upstream log servers display them in the browser instead of offering
them to download
* `artcl_publish_timeout`: the maximum seconds the role can spend uploading the
logs, the default is 1800 (30 minutes)
* `artcl_use_rsync`: false/true -- use rsync to upload the logs
* `artcl_rsync_use_daemon`: false/true -- use rsync daemon instead of ssh to connect
* `artcl_rsync_url` -- rsync target for uploading the logs. The localhost
needs to have passwordless authentication to the target or the
`PROVISIONER_KEY` Var specificed in the environment.
* `artcl_use_swift`: false/true -- use swift object storage to publish the logs
* `artcl_swift_auth_url` -- the OpenStack auth URL for Swift
* `artcl_swift_username` -- OpenStack username for Swift
* `artcl_swift_password` -- password for the Swift user
* `artcl_swift_tenant_name` -- OpenStack tenant name for Swift
* `artcl_swift_container` -- the name of the Swift container to use,
default is `logs`
* `artcl_swift_delete_after` -- The number of seconds after which Swift will
remove the uploaded objects, the default is 2678400 seconds = 31 days.
* `artcl_artifact_url` -- a HTTP URL at which the uploaded logs will be
accessible after upload.
* `artcl_collect_sosreport` -- true/false -- If true, create and collect a
sosreport for each host.
Example Playbook
----------------
```yaml
---
- name: Gather logs
hosts: all:!localhost
roles:
- collect-logs
```
Templated Bash to rST Conversion Notes
--------------------------------------
Templated bash scripts used during deployment are converted to rST files
during the `create-docs` portion of the role's call. Shell scripts are
fed into an awk script and output as restructured text. The awk script
has several simple rules:
1. Only lines between `### ---start_docs` and `### ---stop_docs` will be
parsed.
2. Lines containing `# nodoc` will be excluded.
3. Lines containing `## ::` indicate subsequent lines should be formatted
as code blocks
4. Other lines beginning with `## <anything else>` will have the prepended
`## ` removed. This is how and where general rST formatting is added.
5. All other lines, including shell comments, will be indented by four spaces.
Enabling sosreport Collection
-----------------------------
[sosreport](https://github.com/sosreport/sos) is a unified tool for collecting
system logs and other debug information. To enable creation of sosreport(s)
with this role, create a custom config (you can use centosci-logs.yml
as a template) and ensure that `artcl_collect_sosreport: true` is set.
License
-------
Apache 2.0
Author Information
------------------
RDO-CI Team

README.rst Normal file

@@ -0,0 +1,173 @@
collect-logs
============
An Ansible role for aggregating logs from different nodes.
Requirements
------------
This role gathers logs and debug information from a target system and
collates them in a designated directory, ``artcl_collect_dir``, on the
localhost.
Additionally, the role will convert templated bash scripts, created and
used by TripleO-Quickstart during deployment, into rST files. These rST
files are combined with static rST files and fed into Sphinx to create
user friendly post-build-documentation specific to an original
deployment.
Finally, the role optionally handles uploading these logs to an rsync
server or to OpenStack Swift object storage. Logs from Swift can be
exposed with
`os-loganalyze <https://github.com/openstack-infra/os-loganalyze>`__.
Role Variables
--------------
Collection related
~~~~~~~~~~~~~~~~~~
- ``artcl_collect_list`` A list of files and directories to gather
from the target. Directories are collected recursively and need to
end with a “/” to get collected. Should be specified as a YAML list,
e.g.:
.. code:: yaml
artcl_collect_list:
- /etc/nova/
- /home/stack/*.log
- /var/log/
- ``artcl_collect_list_append`` A list of files and directories to be
appended in the default list. This is useful for users that want to
keep the original list and just add more relevant paths.
- ``artcl_exclude_list`` A list of files and directories to exclude
from collecting. This list is passed to rsync as an exclude filter
and it takes precedence over the collection list. For details see the
“FILTER RULES” topic in the rsync man page.
- ``artcl_collect_dir`` A local directory where the logs should be
gathered, without a trailing slash.
- ``artcl_gzip_only``: false/true When true, gathered files are
gzipped one by one in ``artcl_collect_dir``, when false, a tar.gz
file will contain all the logs.
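The exclude precedence described above can be sketched in Python. This is an illustration only — the role hands the real filtering to rsync, and the globs below are ``fnmatch`` patterns, not rsync filter-rule syntax:

```python
import fnmatch

def select_files(candidates, collect_list, exclude_list):
    """Keep a path when it matches the collect list and no exclude pattern."""
    selected = []
    for path in candidates:
        # The exclude list takes precedence over the collection list.
        if any(fnmatch.fnmatch(path, pat) for pat in exclude_list):
            continue
        if any(fnmatch.fnmatch(path, pat) for pat in collect_list):
            selected.append(path)
    return selected

files = ["/var/log/messages", "/var/log/audit/audit.log",
         "/home/stack/install.log", "/etc/hosts"]
kept = select_files(files, ["/var/log/*", "/home/stack/*.log"],
                    ["/var/log/audit/*"])
# kept == ["/var/log/messages", "/home/stack/install.log"]
```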
Documentation generation related
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
- ``artcl_gen_docs``: false/true If true, the role will use build
artifacts and Sphinx and produce user friendly documentation
(default: false)
- ``artcl_docs_source_dir`` a local directory that serves as the
Sphinx source directory.
- ``artcl_docs_build_dir`` A local directory that serves as the
Sphinx build output directory.
- ``artcl_create_docs_payload`` Dictionary of lists that direct what
and how to construct documentation.
- ``included_deployment_scripts`` List of templated bash scripts
to be converted to rST files.
- ``included_deployment_scripts`` List of static rST files that
will be included in the output documentation.
- ``table_of_contents`` List that defines the order in which rST
files will be laid out in the output documentation.
- ``artcl_verify_sphinx_build`` false/true If true, verify items
defined in ``artcl_create_docs_payload.table_of_contents`` exist in
sphinx generated index.html (default: false)
.. code:: yaml
artcl_create_docs_payload:
included_deployment_scripts:
- undercloud-install
- undercloud-post-install
included_static_docs:
- env-setup-virt
table_of_contents:
- env-setup-virt
- undercloud-install
- undercloud-post-install
Publishing related
~~~~~~~~~~~~~~~~~~
- ``artcl_publish``: true/false If true, the role will attempt to
rsync logs to the target specified by ``artcl_rsync_url``. Uses
``BUILD_URL``, ``BUILD_TAG`` vars from the environment (set during a
Jenkins job run) and requires the following variables to be set.
- ``artcl_txt_rename``: false/true rename text-based files to end in
.txt.gz so upstream log servers display them in the browser
instead of offering them for download
- ``artcl_publish_timeout``: the maximum number of seconds the role may
spend uploading the logs; the default is 1800 (30 minutes)
- ``artcl_use_rsync``: false/true use rsync to upload the logs
- ``artcl_rsync_use_daemon``: false/true use rsync daemon instead of
ssh to connect
- ``artcl_rsync_url`` rsync target for uploading the logs. The
localhost needs to have passwordless authentication to the target or
the ``PROVISIONER_KEY`` variable specified in the environment.
- ``artcl_use_swift``: false/true use swift object storage to publish
the logs
- ``artcl_swift_auth_url`` the OpenStack auth URL for Swift
- ``artcl_swift_username`` OpenStack username for Swift
- ``artcl_swift_password`` password for the Swift user
- ``artcl_swift_tenant_name`` OpenStack tenant name for Swift
- ``artcl_swift_container`` the name of the Swift container to use,
default is ``logs``
- ``artcl_swift_delete_after`` The number of seconds after which
Swift will remove the uploaded objects; the default is 2678400
seconds (31 days).
- ``artcl_artifact_url`` an HTTP URL at which the uploaded logs will
be accessible after upload.
- ``artcl_collect_sosreport`` true/false If true, create and
collect a sosreport for each host.
Example Playbook
----------------
.. code:: yaml
---
- name: Gather logs
hosts: all:!localhost
roles:
- collect-logs
Templated Bash to rST Conversion Notes
--------------------------------------
Templated bash scripts used during deployment are converted to rST files
during the ``create-docs`` portion of the role's call. Shell scripts are
fed into an awk script and output as restructured text. The awk script
has several simple rules:
1. Only lines between ``### ---start_docs`` and ``### ---stop_docs``
will be parsed.
2. Lines containing ``# nodoc`` will be excluded.
3. Lines containing ``## ::`` indicate subsequent lines should be
formatted as code blocks
4. Other lines beginning with ``## <anything else>`` will have the
prepended ``## `` removed. This is how and where general rST
formatting is added.
5. All other lines, including shell comments, will be indented by four
spaces.
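A minimal Python sketch of the five rules above — an illustration only, since the role's actual converter is an awk script:

```python
def bash_to_rst(lines):
    """Apply the five conversion rules to a templated bash script."""
    out, in_docs = [], False
    for line in lines:
        if "### ---start_docs" in line:   # rule 1: start parsing here
            in_docs = True
        elif "### ---stop_docs" in line:  # rule 1: stop parsing here
            in_docs = False
        elif not in_docs or "# nodoc" in line:
            continue                      # rules 1 and 2: skip the line
        elif "## ::" in line:
            out += ["::", ""]             # rule 3: open an rST literal block
        elif line.startswith("## "):
            out.append(line[3:])          # rule 4: strip the "## " prefix
        else:
            out.append("    " + line)     # rule 5: indent by four spaces
    return "\n".join(out)

script = [
    "#!/bin/bash",
    "### ---start_docs",
    "## Install the undercloud",
    "## ::",
    "openstack undercloud install",
    "### ---stop_docs",
]
rst = bash_to_rst(script)
# rst contains "Install the undercloud", "::", and the indented command
```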
Enabling sosreport Collection
-----------------------------
`sosreport <https://github.com/sosreport/sos>`__ is a unified tool for
collecting system logs and other debug information. To enable creation
of sosreport(s) with this role, create a custom config (you can use
centosci-logs.yml as a template) and ensure that
``artcl_collect_sosreport: true`` is set.
License
-------
Apache 2.0
Author Information
------------------
RDO-CI Team

ansible.cfg Normal file

@@ -0,0 +1,8 @@
[defaults]
gathering = smart
command_warnings = False
retry_files_enabled = False
callback_whitelist = profile_tasks
# Attempt to load custom modules whether it's installed system-wide or from a virtual environment
roles_path = roles:$VIRTUAL_ENV/usr/share/local/ansible/roles/tripleo-collect-logs:$VIRTUAL_ENV/usr/local/share/local


@@ -107,4 +107,3 @@ ready section to copy custom nic-configs files.
The ``ansible-role-tripleo-overcloud-prep-config`` repo includes a task that copies the nic-configs
files if they are defined,
<https://github.com/redhat-openstack/ansible-role-tripleo-overcloud-prep-config/blob/master/tasks/main.yml#L15>


@@ -18,4 +18,3 @@ to triple-quickstart as in the following example:
ovs_bridge: br-ctlplane
ovs_options: '"tag=102"'
tag: 102


@@ -18,4 +18,3 @@ but can be overwritten by passing custom settings to tripleo-quickstart in a set
undercloud_dhcp_start: 10.0.5.5
undercloud_dhcp_end: 10.0.5.24
undercloud_inspection_iprange: 10.0.5.100,10.0.5.120


@@ -40,4 +40,3 @@ on baremetal overcloud nodes:
--release $RELEASE \
$VIRTHOST
popd


@@ -13,4 +13,3 @@ Some examples of additional steps are:
- Adding disk hints per node, supporting all Ironic hints
- Adjusting MTU values
- Rerunning introspection on failure


@@ -94,4 +94,3 @@ Explanation of Directory Contents
- requirements_files (required)
Multiple requirements files can be passed to quickstart.sh to include additional repos. For example, to include IPMI validation, the requirements files would need to include are `here <https://github.com/redhat-openstack/ansible-role-tripleo-validate-ipmi>`_


@@ -18,4 +18,3 @@ You can download the `quickstart.sh` script with `wget`:
::
wget https://raw.githubusercontent.com/openstack/tripleo-quickstart/master/quickstart.sh


@@ -20,4 +20,3 @@ The undercloud VM or baremetal machine requires:
* 1 quad core CPU
* 16 GB free memory
* 80 GB disk space


@@ -14,4 +14,3 @@ The advantages of using a VM undercloud are:
.. note:: When using a VM undercloud, but baremetal nodes for the overcloud
deployment, the ``overcloud_nodes`` variable in tripleo-quickstart
must overwritten and set to empty.


@@ -85,4 +85,3 @@ Below are a list of example settings (overwriting defaults) that would be passed
public_net_pool_start: 10.0.0.50
public_net_pool_end: 10.0.0.100
public_net_gateway: 10.0.0.1


@@ -12,4 +12,3 @@ You can download the `quickstart.sh` script with `wget`:
::
wget https://raw.githubusercontent.com/openstack/tripleo-quickstart/master/quickstart.sh


@@ -81,5 +81,3 @@ tripleo-quickstart undercloud and overcloud deployment with network isolation:
## Uncomment to deploy a quintupleo environment without an undercloud.
# OS::OVB::UndercloudEnvironment: OS::Heat::None


@@ -12,6 +12,10 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import ast
import datetime
import socket
DOCUMENTATION = '''
---
module: ara_graphite
@@ -51,9 +55,6 @@ EXAMPLES = '''
ara_mapping:
- "Name of task that deploys overcloud": overcloud.deploy.seconds
'''
import ast
import datetime
import socket
def stamp(x):


@@ -1,3 +1,27 @@
---
dependencies:
- extras-common
galaxy_info:
author: OpenStack
description: An Ansible role for aggregating logs from different nodes
company: Red Hat
license: Apache 2.0
min_ansible_version: 2.5
platforms:
- name: EL
versions:
- 7
- name: Fedora
versions:
- 28
galaxy_tags:
- docker
- buildah
- container
- openstack
- tripleo
- packaging
- system
dependencies: []

requirements.txt Normal file

@@ -0,0 +1,2 @@
pbr>=1.6
ansible>=2.5

setup.cfg Normal file

@@ -0,0 +1,42 @@
[metadata]
name = ansible-role-tripleo-collect-logs
summary = ansible-role-tripleo-collect-logs - Ansible collect-logs role for the TripleO project.
description-file =
README.rst
author = TripleO Team
author-email = openstack-discuss@lists.openstack.org
home-page = https://git.openstack.org/cgit/openstack/ansible-role-tripleo-collect-logs
classifier =
License :: OSI Approved :: Apache Software License
Development Status :: 4 - Beta
Intended Audience :: Developers
Intended Audience :: System Administrators
Intended Audience :: Information Technology
Topic :: Utilities
[global]
setup-hooks =
pbr.hooks.setup_hook
[files]
data_files =
usr/local/share/ansible/roles/collect-logs/defaults = defaults/*
usr/local/share/ansible/roles/collect-logs/meta = meta/*
usr/local/share/ansible/roles/collect-logs/tasks = tasks/*
usr/local/share/ansible/roles/collect-logs/templates = templates/*
usr/local/share/ansible/roles/collect-logs/files = files/*
usr/local/share/ansible/roles/collect-logs/library = library/*
usr/local/share/ansible/roles/collect-logs/scripts = scripts/*
[wheel]
universal = 1
[pbr]
skip_authors = True
skip_changelog = True
[flake8]
# E123, E125 skipped as they are invalid PEP-8.
show-source = True
ignore = E123,E125
builtins = _

setup.py Normal file

@@ -0,0 +1,19 @@
# Copyright Red Hat, Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import setuptools
setuptools.setup(
setup_requires=['pbr'],
pbr=True)


@@ -9,7 +9,7 @@
influxdb_port: "{{ influxdb_port }}"
influxdb_user: "{{ influxdb_user }}"
influxdb_password: "{{ influxdb_password }}"
influxdb_db: "{{ influxdb_dbname}}"
influxdb_db: "{{ influxdb_dbname }}"
ara_data: "{{ ara_data.stdout }}"
measurement: "{{ influxdb_measurement }}"
data_file: "{{ influxdb_data_file_path }}"
@@ -26,7 +26,7 @@
influxdb_port: "{{ influxdb_port }}"
influxdb_user: "{{ influxdb_user }}"
influxdb_password: "{{ influxdb_password }}"
influxdb_db: "{{ influxdb_dbname}}"
influxdb_db: "{{ influxdb_dbname }}"
ara_data: "{{ ara_root_data.stdout }}"
measurement: "undercloud"
data_file: "{{ influxdb_data_file_path }}"
@@ -48,7 +48,7 @@
influxdb_port: "{{ influxdb_port }}"
influxdb_user: "{{ influxdb_user }}"
influxdb_password: "{{ influxdb_password }}"
influxdb_db: "{{ influxdb_dbname}}"
influxdb_db: "{{ influxdb_dbname }}"
ara_data: "{{ ara_oc_data.stdout }}"
measurement: "overcloud"
data_file: "{{ influxdb_data_file_path }}"


@@ -21,7 +21,7 @@
shell: "{{ ansible_pkg_mgr }} list installed >/var/log/extra/package-list-installed.txt"
- name: Collecting /proc/cpuinfo|meminfo|swaps
shell: "cat /proc/{{item}} &> /var/log/extra/{{item}}.txt"
shell: "cat /proc/{{ item }} &> /var/log/extra/{{ item }}.txt"
with_items:
- cpuinfo
- meminfo


@@ -13,7 +13,7 @@
set -o pipefail &&
curl -k "{{ lookup('env', 'BUILD_URL') }}/timestamps/?time=yyyy-MM-dd%20HH:mm:ss.SSS%20|&appendLog&locale=en_GB"
| gzip > {{ artcl_collect_dir }}/console.txt.gz
when: lookup('env', 'BUILD_URL') != ""
when: lookup('env', 'BUILD_URL')
- when: ara_generate_html|bool
block:
@@ -126,7 +126,10 @@
when: zuul is defined
- name: upload to the artifact server using pubkey auth
shell: rsync -av --quiet -e "ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" {{ artcl_collect_dir }}/ {{ artcl_rsync_path }}/{{ lookup('env', 'BUILD_TAG') }}
shell: >
rsync -av
--quiet -e "ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null"
{{ artcl_collect_dir }}/ {{ artcl_rsync_path }}/{{ lookup('env', 'BUILD_TAG') }}
async: "{{ artcl_publish_timeout }}"
poll: 15
retries: 5
@@ -158,7 +161,9 @@
when: artcl_use_swift|bool
- name: use zuul_swift_upload.py to publish the files
shell: "{{ artcl_zuul_swift_upload_path }}/zuul_swift_upload.py --name {{ artcl_swift_container }} --delete-after {{ artcl_swift_delete_after }} {{ artcl_collect_dir }}"
shell: >
"{{ artcl_zuul_swift_upload_path }}/zuul_swift_upload.py --name {{ artcl_swift_container }}
--delete-after {{ artcl_swift_delete_after }} {{ artcl_collect_dir }}"
async: "{{ artcl_publish_timeout }}"
poll: 15
when: artcl_use_zuul_swift_upload|bool
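The simplified ``when: lookup('env', 'BUILD_URL')`` condition in the hunk above relies on truthiness: the ``env`` lookup returns an empty string for an unset variable, and an empty string is falsy, so the explicit ``!= ""`` comparison was redundant. The same behavior, mirrored in Python:

```python
import os

# lookup('env', 'BUILD_URL') yields "" when the variable is unset; ""
# is falsy, so the task is skipped without an explicit != "" check.
# (The variable name below is arbitrary and assumed to be unset.)
build_url = os.environ.get("BUILD_URL_EXAMPLE_UNSET", "")
should_publish = bool(build_url)  # False when unset or empty
```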

test-requirements.txt Normal file

@@ -0,0 +1 @@
pre-commit # MIT

tox.ini Normal file

@@ -0,0 +1,59 @@
[tox]
minversion = 2.0
envlist = docs, linters
skipdist = True
[testenv]
usedevelop = True
install_command = pip install -c{env:UPPER_CONSTRAINTS_FILE:https://git.openstack.org/cgit/openstack/requirements/plain/upper-constraints.txt} {opts} {packages}
setenv = VIRTUAL_ENV={envdir}
deps = -r{toxinidir}/test-requirements.txt
whitelist_externals = bash
[testenv:bindep]
basepython = python3
# Do not install any requirements. We want this to be fast and work even if
# system dependencies are missing, since it's used to tell you what system
# dependencies are missing! This also means that bindep must be installed
# separately, outside of the requirements files.
deps = bindep
commands = bindep test
[testenv:pep8]
envdir = {toxworkdir}/linters
commands =
python -m pre_commit run flake8 -a
[testenv:ansible-lint]
setenv =
ANSIBLE_LIBRARY=./library
envdir = {toxworkdir}/linters
commands =
python -m pre_commit run ansible-lint -a
[testenv:linters]
basepython = python3
commands =
# check only modified files:
python -m pre_commit run -a
[testenv:releasenotes]
basepython = python3
whitelist_externals = bash
commands = bash -c ci-scripts/releasenotes_tox.sh
[testenv:bashate]
envdir = {toxworkdir}/linters
commands =
python -m pre_commit run bashate -a
[testenv:venv]
basepython = python3
commands = {posargs}
[flake8]
# E123, E125 skipped as they are invalid PEP-8.
# E265 deals with spaces inside of comments
show-source = True
ignore = E123,E125,E265
builtins = _

zuul.d/layout.yaml Normal file

@@ -0,0 +1,12 @@
---
- project:
check:
jobs:
- openstack-tox-linters
gate:
jobs:
- openstack-tox-linters
queue: tripleo
post:
jobs:
- publish-openstack-python-branch-tarball