| column | type | values / range |
|---|---|---|
| status | string | 1 distinct value |
| repo_name | string | 31 distinct values |
| repo_url | string | 31 distinct values |
| issue_id | int64 | 1 to 104k |
| title | string | length 4 to 369 |
| body | string | length 0 to 254k (nullable) |
| issue_url | string | length 37 to 56 |
| pull_url | string | length 37 to 54 |
| before_fix_sha | string | length 40 |
| after_fix_sha | string | length 40 |
| report_datetime | timestamp[us, tz=UTC] | |
| language | string | 5 distinct values |
| commit_datetime | timestamp[us, tz=UTC] | |
| updated_file | string | length 4 to 188 |
| file_content | string | length 0 to 5.12M |
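The column listing above matches the schema that the Hugging Face `datasets` library exposes for this kind of bug-fix dataset. A minimal sketch of loading and inspecting such a dataset (the dataset identifier below is a placeholder, not the real one):

```python
# Sketch only: "org/bugfix-commits" is a placeholder dataset id, not the actual one.
from datasets import load_dataset

ds = load_dataset("org/bugfix-commits", split="train")
row = ds[0]
# Each row pairs one GitHub issue with one file touched by the fixing pull request.
print(row["repo_name"], row["issue_id"], row["updated_file"])
print(row["title"])
```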
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 59,617 |
grafana_dashboard: add dashboard report "send() got multiple values for keyword argument 'MESSAGE'"
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
Add dashboard report "send() got multiple values for keyword argument 'MESSAGE'"
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
grafana_dashboard
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.8.3
config file = /home/namnh/workspace/ansible-setup/ansible.cfg
configured module search path = ['/home/namnh/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/namnh/.venvs/ansible/lib/python3.6/site-packages/ansible
executable location = /home/namnh/.venvs/ansible/bin/ansible
python version = 3.6.7 (default, Oct 22 2018, 11:32:17) [GCC 8.2.0]
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
DEFAULT_HOST_LIST(/home/namnh/workspace/ansible-setup/ansible.cfg) = ['/home/namnh/workspace/ansible-setup/inventory']
DEFAULT_STDOUT_CALLBACK(/home/namnh/workspace/ansible-setup/ansible.cfg) = yaml
INTERPRETER_PYTHON(/home/namnh/workspace/ansible-setup/ansible.cfg) = /usr/bin/python3
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
- hosts: localhost
tasks:
- name: Import Grafana telegraf dashboard
ignore_errors: yes
become: false
grafana_dashboard:
grafana_url: "{{ grafana_url }}"
grafana_user: "{{ grafana_admin_user }}"
grafana_password: "{{ grafana_admin_password }}"
message: telegraf
overwrite: yes
state: present
path: "{{ playbook_dir }}/configs/telegraf-system.json"
delegate_to: localhost
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
New datasource is added
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
Traceback (most recent call last):
File "/home/namnh/.ansible/tmp/ansible-tmp-1564113599.2774515-158744002373584/AnsiballZ_grafana_dashboard.py", line 114, in <module>
_ansiballz_main()
File "/home/namnh/.ansible/tmp/ansible-tmp-1564113599.2774515-158744002373584/AnsiballZ_grafana_dashboard.py", line 106, in _ansiballz_main
invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
File "/home/namnh/.ansible/tmp/ansible-tmp-1564113599.2774515-158744002373584/AnsiballZ_grafana_dashboard.py", line 49, in invoke_module
imp.load_module('__main__', mod, module, MOD_DESC)
File "/usr/lib/python3.6/imp.py", line 235, in load_module
return load_source(name, filename, file)
File "/usr/lib/python3.6/imp.py", line 170, in load_source
module = _exec(spec, sys.modules[name])
File "<frozen importlib._bootstrap>", line 618, in _exec
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/tmp/ansible_grafana_dashboard_payload_apewcyne/__main__.py", line 451, in <module>
File "/tmp/ansible_grafana_dashboard_payload_apewcyne/__main__.py", line 408, in main
File "/tmp/ansible_grafana_dashboard_payload_apewcyne/ansible_grafana_dashboard_payload.zip/ansible/module_utils/basic.py", line 691, in __init__
File "/tmp/ansible_grafana_dashboard_payload_apewcyne/ansible_grafana_dashboard_payload.zip/ansible/module_utils/basic.py", line 1946, in _log_invocation
File "/tmp/ansible_grafana_dashboard_payload_apewcyne/ansible_grafana_dashboard_payload.zip/ansible/module_utils/basic.py", line 1904, in log
TypeError: send() got multiple values for keyword argument 'MESSAGE'
msg: |-
MODULE FAILURE
See stdout/stderr for the exact error
rc: 1
```
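
For context, the `TypeError` above arises when module parameters are forwarded to a journal-style logger as upper-cased keyword arguments: an option literally named `message` becomes a second `MESSAGE` keyword. A self-contained sketch of that failure mode (assuming a logger with the same calling convention as `systemd.journal.send`; this is not the actual `module_utils/basic.py` code):

```python
# Stand-in for a journal-style logger; any callable taking a MESSAGE keyword behaves the same.
def send(MESSAGE=None, **kwargs):
    print(MESSAGE, kwargs)

# The module's 'message: telegraf' option is upper-cased into MESSAGE='telegraf'
# and collides with the explicit MESSAGE argument used for the log line itself.
log_args = {'MESSAGE': 'telegraf'}
try:
    send(MESSAGE="Invoked with message=telegraf ...", **log_args)
except TypeError as exc:
    print(exc)  # send() got multiple values for keyword argument 'MESSAGE'
```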
|
https://github.com/ansible/ansible/issues/59617
|
https://github.com/ansible/ansible/pull/60051
|
00bed0eb1c2ed22a7b56078b9e7911756182ac92
|
b6753b46a987a319ff062a8adcdcd4e0000353ed
| 2019-07-26T04:36:45Z |
python
| 2020-02-18T12:00:16Z |
lib/ansible/modules/monitoring/bigpanda.py
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright: Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
ANSIBLE_METADATA = {'metadata_version': '1.1',
'status': ['preview'],
'supported_by': 'community'}
DOCUMENTATION = '''
---
module: bigpanda
author: "Hagai Kariti (@hkariti)"
short_description: Notify BigPanda about deployments
version_added: "1.8"
description:
- Notify BigPanda when deployments start and end (successfully or not). Returns a deployment object containing all the parameters for future module calls.
options:
component:
description:
- "The name of the component being deployed. Ex: billing"
required: true
aliases: ['name']
version:
description:
- The deployment version.
required: true
token:
description:
- API token.
required: true
state:
description:
- State of the deployment.
required: true
choices: ['started', 'finished', 'failed']
hosts:
description:
- Name of affected host name. Can be a list.
required: false
default: machine's hostname
aliases: ['host']
env:
description:
- The environment name, typically 'production', 'staging', etc.
required: false
owner:
description:
- The person responsible for the deployment.
required: false
description:
description:
- Free text description of the deployment.
required: false
url:
description:
- Base URL of the API server.
required: False
default: https://api.bigpanda.io
validate_certs:
description:
- If C(no), SSL certificates for the target url will not be validated. This should only be used
on personally controlled sites using self-signed certificates.
required: false
default: 'yes'
type: bool
# informational: requirements for nodes
requirements: [ ]
'''
EXAMPLES = '''
- bigpanda:
component: myapp
version: '1.3'
token: '{{ bigpanda_token }}'
state: started
- bigpanda:
component: myapp
version: '1.3'
token: '{{ bigpanda_token }}'
state: finished
# If outside servers aren't reachable from your machine, use delegate_to and override hosts:
- bigpanda:
component: myapp
version: '1.3'
token: '{{ bigpanda_token }}'
hosts: '{{ ansible_hostname }}'
state: started
delegate_to: localhost
register: deployment
- bigpanda:
component: '{{ deployment.component }}'
version: '{{ deployment.version }}'
token: '{{ deployment.token }}'
state: finished
delegate_to: localhost
'''
# ===========================================
# Module execution.
#
import json
import socket
import traceback
from ansible.module_utils.basic import AnsibleModule
from ansible.module_utils._text import to_native
from ansible.module_utils.urls import fetch_url
def main():
module = AnsibleModule(
argument_spec=dict(
component=dict(required=True, aliases=['name']),
version=dict(required=True),
token=dict(required=True, no_log=True),
state=dict(required=True, choices=['started', 'finished', 'failed']),
hosts=dict(required=False, default=[socket.gethostname()], aliases=['host']),
env=dict(required=False),
owner=dict(required=False),
description=dict(required=False),
message=dict(required=False),
source_system=dict(required=False, default='ansible'),
validate_certs=dict(default='yes', type='bool'),
url=dict(required=False, default='https://api.bigpanda.io'),
),
supports_check_mode=True,
)
token = module.params['token']
state = module.params['state']
url = module.params['url']
# Build the common request body
body = dict()
for k in ('component', 'version', 'hosts'):
v = module.params[k]
if v is not None:
body[k] = v
if not isinstance(body['hosts'], list):
body['hosts'] = [body['hosts']]
# Insert state-specific attributes to body
if state == 'started':
for k in ('source_system', 'env', 'owner', 'description'):
v = module.params[k]
if v is not None:
body[k] = v
request_url = url + '/data/events/deployments/start'
else:
message = module.params['message']
if message is not None:
body['errorMessage'] = message
if state == 'finished':
body['status'] = 'success'
else:
body['status'] = 'failure'
request_url = url + '/data/events/deployments/end'
# Build the deployment object we return
deployment = dict(token=token, url=url)
deployment.update(body)
if 'errorMessage' in deployment:
message = deployment.pop('errorMessage')
deployment['message'] = message
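# Note: the HTTP body sent to BigPanda uses the API field name 'errorMessage',
# while the object returned to the playbook (and reused in later bigpanda tasks)
# exposes the same value under the module's own 'message' key.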
# If we're in check mode, just exit pretending like we succeeded
if module.check_mode:
module.exit_json(changed=True, **deployment)
# Send the data to bigpanda
data = json.dumps(body)
headers = {'Authorization': 'Bearer %s' % token, 'Content-Type': 'application/json'}
try:
response, info = fetch_url(module, request_url, data=data, headers=headers)
if info['status'] == 200:
module.exit_json(changed=True, **deployment)
else:
module.fail_json(msg=json.dumps(info))
except Exception as e:
module.fail_json(msg=to_native(e), exception=traceback.format_exc())
if __name__ == '__main__':
main()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 59,617 |
grafana_dashboard: add dashboard report "send() got multiple values for keyword argument 'MESSAGE'"
|
(issue body identical to the first row above)
|
https://github.com/ansible/ansible/issues/59617
|
https://github.com/ansible/ansible/pull/60051
|
00bed0eb1c2ed22a7b56078b9e7911756182ac92
|
b6753b46a987a319ff062a8adcdcd4e0000353ed
| 2019-07-26T04:36:45Z |
python
| 2020-02-18T12:00:16Z |
lib/ansible/modules/monitoring/datadog/datadog_monitor.py
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright: (c) 2015, Sebastian Kornehl <[email protected]>
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
ANSIBLE_METADATA = {'metadata_version': '1.1',
'status': ['preview'],
'supported_by': 'community'}
DOCUMENTATION = '''
---
module: datadog_monitor
short_description: Manages Datadog monitors
description:
- Manages monitors within Datadog.
- Options as described on https://docs.datadoghq.com/api/.
version_added: "2.0"
author: Sebastian Kornehl (@skornehl)
requirements: [datadog]
options:
api_key:
description:
- Your Datadog API key.
required: true
type: str
api_host:
description:
- The URL to the Datadog API. Default value is C(https://api.datadoghq.com).
- This value can also be set with the C(DATADOG_HOST) environment variable.
required: false
type: str
version_added: '2.10'
app_key:
description:
- Your Datadog app key.
required: true
type: str
state:
description:
- The designated state of the monitor.
required: true
choices: ['present', 'absent', 'mute', 'unmute']
type: str
tags:
description:
- A list of tags to associate with your monitor when creating or updating.
- This can help you categorize and filter monitors.
version_added: "2.2"
type: list
type:
description:
- The type of the monitor.
choices: ['metric alert', 'service check', 'event alert', 'process alert']
type: str
query:
description:
- The monitor query to notify on.
- Syntax varies depending on what type of monitor you are creating.
type: str
name:
description:
- The name of the alert.
required: true
type: str
message:
description:
- A message to include with notifications for this monitor.
- Email notifications can be sent to specific users by using the same '@username' notation as events.
- Monitor message template variables can be accessed by using double square brackets, i.e '[[' and ']]'.
type: str
silenced:
description:
- Dictionary of scopes to silence, with timestamps or None.
- Each scope will be muted until the given POSIX timestamp or forever if the value is None.
default: ""
notify_no_data:
description:
- Whether this monitor will notify when data stops reporting.
type: bool
default: 'no'
no_data_timeframe:
description:
- The number of minutes before a monitor will notify when data stops reporting.
- Must be at least 2x the monitor timeframe for metric alerts or 2 minutes for service checks.
default: 2x timeframe for metric, 2 minutes for service
type: str
timeout_h:
description:
- The number of hours of the monitor not reporting data before it will automatically resolve from a triggered state.
type: str
renotify_interval:
description:
- The number of minutes after the last notification before a monitor will re-notify on the current status.
- It will only re-notify if it is not resolved.
type: str
escalation_message:
description:
- A message to include with a re-notification. Supports the '@username' notification we allow elsewhere.
- Not applicable if I(renotify_interval=None).
type: str
notify_audit:
description:
- Whether tagged users will be notified on changes to this monitor.
type: bool
default: 'no'
thresholds:
description:
- A dictionary of thresholds by status.
- Only available for service checks and metric alerts.
- Because each of them can have multiple thresholds, we do not define them directly in the query.
default: {'ok': 1, 'critical': 1, 'warning': 1}
locked:
description:
- Whether changes to this monitor should be restricted to the creator or admins.
type: bool
default: 'no'
version_added: "2.2"
require_full_window:
description:
- Whether this monitor needs a full window of data before it gets evaluated.
- We highly recommend you set this to False for sparse metrics, otherwise some evaluations will be skipped.
version_added: "2.3"
type: bool
new_host_delay:
description:
- A positive integer representing the number of seconds to wait before evaluating the monitor for new hosts.
- This gives the host time to fully initialize.
version_added: "2.4"
type: str
evaluation_delay:
description:
- Time to delay evaluation (in seconds).
- Effective for sparse values.
version_added: "2.7"
type: str
id:
description:
- The ID of the alert.
- If set, will be used instead of the name to locate the alert.
version_added: "2.3"
type: str
'''
EXAMPLES = '''
# Create a metric monitor
- datadog_monitor:
type: "metric alert"
name: "Test monitor"
state: "present"
query: "datadog.agent.up.over('host:host1').last(2).count_by_status()"
message: "Host [[host.name]] with IP [[host.ip]] is failing to report to datadog."
api_key: "9775a026f1ca7d1c6c5af9d94d9595a4"
app_key: "87ce4a24b5553d2e482ea8a8500e71b8ad4554ff"
# Deletes a monitor
- datadog_monitor:
name: "Test monitor"
state: "absent"
api_key: "9775a026f1ca7d1c6c5af9d94d9595a4"
app_key: "87ce4a24b5553d2e482ea8a8500e71b8ad4554ff"
# Mutes a monitor
- datadog_monitor:
name: "Test monitor"
state: "mute"
silenced: '{"*":None}'
api_key: "9775a026f1ca7d1c6c5af9d94d9595a4"
app_key: "87ce4a24b5553d2e482ea8a8500e71b8ad4554ff"
# Unmutes a monitor
- datadog_monitor:
name: "Test monitor"
state: "unmute"
api_key: "9775a026f1ca7d1c6c5af9d94d9595a4"
app_key: "87ce4a24b5553d2e482ea8a8500e71b8ad4554ff"
# Use datadoghq.eu platform instead of datadoghq.com
- datadog_monitor:
name: "Test monitor"
state: "absent"
api_host: https://api.datadoghq.eu
api_key: "9775a026f1ca7d1c6c5af9d94d9595a4"
app_key: "87ce4a24b5553d2e482ea8a8500e71b8ad4554ff"
'''
import traceback
# Import Datadog
DATADOG_IMP_ERR = None
try:
from datadog import initialize, api
HAS_DATADOG = True
except Exception:
DATADOG_IMP_ERR = traceback.format_exc()
HAS_DATADOG = False
from ansible.module_utils.basic import AnsibleModule, missing_required_lib
from ansible.module_utils._text import to_native
def main():
module = AnsibleModule(
argument_spec=dict(
api_key=dict(required=True, no_log=True),
api_host=dict(required=False),
app_key=dict(required=True, no_log=True),
state=dict(required=True, choices=['present', 'absent', 'mute', 'unmute']),
type=dict(required=False, choices=['metric alert', 'service check', 'event alert', 'process alert']),
name=dict(required=True),
query=dict(required=False),
message=dict(required=False, default=None),
silenced=dict(required=False, default=None, type='dict'),
notify_no_data=dict(required=False, default=False, type='bool'),
no_data_timeframe=dict(required=False, default=None),
timeout_h=dict(required=False, default=None),
renotify_interval=dict(required=False, default=None),
escalation_message=dict(required=False, default=None),
notify_audit=dict(required=False, default=False, type='bool'),
thresholds=dict(required=False, type='dict', default=None),
tags=dict(required=False, type='list', default=None),
locked=dict(required=False, default=False, type='bool'),
require_full_window=dict(required=False, default=None, type='bool'),
new_host_delay=dict(required=False, default=None),
evaluation_delay=dict(required=False, default=None),
id=dict(required=False)
)
)
# Prepare Datadog
if not HAS_DATADOG:
module.fail_json(msg=missing_required_lib('datadogpy'), exception=DATADOG_IMP_ERR)
options = {
'api_key': module.params['api_key'],
'api_host': module.params['api_host'],
'app_key': module.params['app_key']
}
initialize(**options)
# Check if api_key and app_key is correct or not
# if not, then fail here.
response = api.Monitor.get_all()
if isinstance(response, dict):
msg = response.get('errors', None)
if msg:
module.fail_json(msg="Failed to connect Datadog server using given app_key and api_key : {0}".format(msg[0]))
if module.params['state'] == 'present':
install_monitor(module)
elif module.params['state'] == 'absent':
delete_monitor(module)
elif module.params['state'] == 'mute':
mute_monitor(module)
elif module.params['state'] == 'unmute':
unmute_monitor(module)
def _fix_template_vars(message):
if message:
return message.replace('[[', '{{').replace(']]', '}}')
return message
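# Illustrative example: _fix_template_vars("Host [[host.name]] is down")
# returns "Host {{host.name}} is down", i.e. the [[...]] notation documented
# above is rewritten into the {{...}} template form used by Datadog.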
def _get_monitor(module):
if module.params['id'] is not None:
monitor = api.Monitor.get(module.params['id'])
if 'errors' in monitor:
module.fail_json(msg="Failed to retrieve monitor with id %s, errors are %s" % (module.params['id'], str(monitor['errors'])))
return monitor
else:
monitors = api.Monitor.get_all()
for monitor in monitors:
if monitor['name'] == _fix_template_vars(module.params['name']):
return monitor
return {}
def _post_monitor(module, options):
try:
kwargs = dict(type=module.params['type'], query=module.params['query'],
name=_fix_template_vars(module.params['name']),
message=_fix_template_vars(module.params['message']),
escalation_message=_fix_template_vars(module.params['escalation_message']),
options=options)
if module.params['tags'] is not None:
kwargs['tags'] = module.params['tags']
msg = api.Monitor.create(**kwargs)
if 'errors' in msg:
module.fail_json(msg=str(msg['errors']))
else:
module.exit_json(changed=True, msg=msg)
except Exception as e:
module.fail_json(msg=to_native(e), exception=traceback.format_exc())
def _equal_dicts(a, b, ignore_keys):
ka = set(a).difference(ignore_keys)
kb = set(b).difference(ignore_keys)
return ka == kb and all(a[k] == b[k] for k in ka)
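# True when both dicts share the same keys outside ignore_keys and agree on the
# values of those keys; _update_monitor() uses this to report changed=False when
# an update was effectively a no-op.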
def _update_monitor(module, monitor, options):
try:
kwargs = dict(id=monitor['id'], query=module.params['query'],
name=_fix_template_vars(module.params['name']),
message=_fix_template_vars(module.params['message']),
escalation_message=_fix_template_vars(module.params['escalation_message']),
options=options)
if module.params['tags'] is not None:
kwargs['tags'] = module.params['tags']
msg = api.Monitor.update(**kwargs)
if 'errors' in msg:
module.fail_json(msg=str(msg['errors']))
elif _equal_dicts(msg, monitor, ['creator', 'overall_state', 'modified', 'matching_downtimes', 'overall_state_modified']):
module.exit_json(changed=False, msg=msg)
else:
module.exit_json(changed=True, msg=msg)
except Exception as e:
module.fail_json(msg=to_native(e), exception=traceback.format_exc())
def install_monitor(module):
options = {
"silenced": module.params['silenced'],
"notify_no_data": module.boolean(module.params['notify_no_data']),
"no_data_timeframe": module.params['no_data_timeframe'],
"timeout_h": module.params['timeout_h'],
"renotify_interval": module.params['renotify_interval'],
"escalation_message": module.params['escalation_message'],
"notify_audit": module.boolean(module.params['notify_audit']),
"locked": module.boolean(module.params['locked']),
"require_full_window": module.params['require_full_window'],
"new_host_delay": module.params['new_host_delay'],
"evaluation_delay": module.params['evaluation_delay']
}
if module.params['type'] == "service check":
options["thresholds"] = module.params['thresholds'] or {'ok': 1, 'critical': 1, 'warning': 1}
if module.params['type'] in ["metric alert", "log alert"] and module.params['thresholds'] is not None:
options["thresholds"] = module.params['thresholds']
monitor = _get_monitor(module)
if not monitor:
_post_monitor(module, options)
else:
_update_monitor(module, monitor, options)
def delete_monitor(module):
monitor = _get_monitor(module)
if not monitor:
module.exit_json(changed=False)
try:
msg = api.Monitor.delete(monitor['id'])
module.exit_json(changed=True, msg=msg)
except Exception as e:
module.fail_json(msg=to_native(e), exception=traceback.format_exc())
def mute_monitor(module):
monitor = _get_monitor(module)
if not monitor:
module.fail_json(msg="Monitor %s not found!" % module.params['name'])
elif monitor['options']['silenced']:
module.fail_json(msg="Monitor is already muted. Datadog does not allow to modify muted alerts, consider unmuting it first.")
elif (module.params['silenced'] is not None and len(set(monitor['options']['silenced']) ^ set(module.params['silenced'])) == 0):
module.exit_json(changed=False)
try:
if module.params['silenced'] is None or module.params['silenced'] == "":
msg = api.Monitor.mute(id=monitor['id'])
else:
msg = api.Monitor.mute(id=monitor['id'], silenced=module.params['silenced'])
module.exit_json(changed=True, msg=msg)
except Exception as e:
module.fail_json(msg=to_native(e), exception=traceback.format_exc())
def unmute_monitor(module):
monitor = _get_monitor(module)
if not monitor:
module.fail_json(msg="Monitor %s not found!" % module.params['name'])
elif not monitor['options']['silenced']:
module.exit_json(changed=False)
try:
msg = api.Monitor.unmute(monitor['id'])
module.exit_json(changed=True, msg=msg)
except Exception as e:
module.fail_json(msg=to_native(e), exception=traceback.format_exc())
if __name__ == '__main__':
main()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 59,617 |
grafana_dashboard: add dashboard report "send() got multiple values for keyword argument 'MESSAGE'"
|
(issue body identical to the first row above)
|
https://github.com/ansible/ansible/issues/59617
|
https://github.com/ansible/ansible/pull/60051
|
00bed0eb1c2ed22a7b56078b9e7911756182ac92
|
b6753b46a987a319ff062a8adcdcd4e0000353ed
| 2019-07-26T04:36:45Z |
python
| 2020-02-18T12:00:16Z |
lib/ansible/modules/monitoring/grafana/grafana_dashboard.py
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright: (c) 2017, Thierry Sallé (@seuf)
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
ANSIBLE_METADATA = {
'status': ['preview'],
'supported_by': 'community',
'metadata_version': '1.1'
}
DOCUMENTATION = '''
---
module: grafana_dashboard
author:
- Thierry Sallé (@seuf)
version_added: "2.5"
short_description: Manage Grafana dashboards
description:
- Create, update, delete, export Grafana dashboards via API.
options:
url:
description:
- The Grafana URL.
required: true
aliases: [ grafana_url ]
version_added: 2.7
url_username:
description:
- The Grafana API user.
default: admin
aliases: [ grafana_user ]
version_added: 2.7
url_password:
description:
- The Grafana API password.
default: admin
aliases: [ grafana_password ]
version_added: 2.7
grafana_api_key:
description:
- The Grafana API key.
- If set, I(grafana_user) and I(grafana_password) will be ignored.
org_id:
description:
- The Grafana Organisation ID where the dashboard will be imported / exported.
- Not used when I(grafana_api_key) is set, because the grafana_api_key only belongs to one organisation.
default: 1
folder:
description:
- The Grafana folder where this dashboard will be imported to.
default: General
version_added: '2.10'
state:
description:
- State of the dashboard.
required: true
choices: [ absent, export, present ]
default: present
slug:
description:
- Deprecated since Grafana 5. Use grafana dashboard uid instead.
- slug of the dashboard. It's the friendly url name of the dashboard.
- When C(state) is C(present), this parameter can override the slug in the meta section of the json file.
- If you want to import a json dashboard exported directly from the interface (not from the api),
you have to specify the slug parameter because there is no meta section in the exported json.
uid:
version_added: 2.7
description:
- uid of the dashboard to export when C(state) is C(export) or C(absent).
path:
description:
- The path to the json file containing the Grafana dashboard to import or export.
overwrite:
description:
- Override existing dashboard when state is present.
type: bool
default: 'no'
message:
description:
- Set a commit message for the version history.
- Only used when C(state) is C(present).
validate_certs:
description:
- If C(no), SSL certificates will not be validated.
- This should only be used on personally controlled sites using self-signed certificates.
type: bool
default: 'yes'
client_cert:
description:
- PEM formatted certificate chain file to be used for SSL client authentication.
- This file can also include the key, and if the key is included, client_key is not required
version_added: 2.7
client_key:
description:
- PEM formatted file that contains your private key to be used for SSL client
- authentication. If client_cert contains both the certificate and key, this option is not required
version_added: 2.7
use_proxy:
description:
- Boolean of whether or not to use proxy.
default: 'yes'
type: bool
version_added: 2.7
'''
EXAMPLES = '''
- hosts: localhost
connection: local
tasks:
- name: Import Grafana dashboard foo
grafana_dashboard:
grafana_url: http://grafana.company.com
grafana_api_key: "{{ grafana_api_key }}"
state: present
message: Updated by ansible
overwrite: yes
path: /path/to/dashboards/foo.json
- name: Export dashboard
grafana_dashboard:
grafana_url: http://grafana.company.com
grafana_user: "admin"
grafana_password: "{{ grafana_password }}"
org_id: 1
state: export
uid: "000000653"
path: "/path/to/dashboards/000000653.json"
'''
RETURN = '''
---
uid:
description: uid or slug of the created / deleted / exported dashboard.
returned: success
type: str
sample: 000000063
'''
import json
from ansible.module_utils.basic import AnsibleModule
from ansible.module_utils.urls import fetch_url, url_argument_spec
from ansible.module_utils.six.moves.urllib.parse import urlencode
from ansible.module_utils._text import to_native
from ansible.module_utils._text import to_text
__metaclass__ = type
class GrafanaAPIException(Exception):
pass
class GrafanaMalformedJson(Exception):
pass
class GrafanaExportException(Exception):
pass
class GrafanaDeleteException(Exception):
pass
def grafana_switch_organisation(module, grafana_url, org_id, headers):
r, info = fetch_url(module, '%s/api/user/using/%s' % (grafana_url, org_id), headers=headers, method='POST')
if info['status'] != 200:
raise GrafanaAPIException('Unable to switch to organization %s : %s' % (org_id, info))
def grafana_headers(module, data):
headers = {'content-type': 'application/json; charset=utf8'}
if 'grafana_api_key' in data and data['grafana_api_key']:
headers['Authorization'] = "Bearer %s" % data['grafana_api_key']
else:
module.params['force_basic_auth'] = True
grafana_switch_organisation(module, data['grafana_url'], data['org_id'], headers)
return headers
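# With grafana_api_key set this returns
# {'content-type': 'application/json; charset=utf8', 'Authorization': 'Bearer <key>'};
# otherwise basic auth is forced and the session is first switched to the requested org_id.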
def get_grafana_version(module, grafana_url, headers):
grafana_version = None
r, info = fetch_url(module, '%s/api/frontend/settings' % grafana_url, headers=headers, method='GET')
if info['status'] == 200:
try:
settings = json.loads(to_text(r.read()))
grafana_version = settings['buildInfo']['version'].split('.')[0]
except UnicodeError as e:
raise GrafanaAPIException('Unable to decode version string to Unicode')
except Exception as e:
raise GrafanaAPIException(e)
else:
raise GrafanaAPIException('Unable to get grafana version : %s' % info)
return int(grafana_version)
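# Only the major version is returned (e.g. a buildInfo version of '6.4.3' yields 6);
# callers use it to choose between the pre-5 slug-based API and the uid-based API.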
def grafana_folder_exists(module, grafana_url, folder_name, headers):
# the 'General' folder is a special case, its ID is always '0'
if folder_name == 'General':
return True, 0
try:
r, info = fetch_url(module, '%s/api/folders' % grafana_url, headers=headers, method='GET')
if info['status'] != 200:
raise GrafanaAPIException("Unable to query Grafana API for folders (name: %s): %d" % (folder_name, info['status']))
folders = json.loads(r.read())
for folder in folders:
if folder['title'] == folder_name:
return True, folder['id']
except Exception as e:
raise GrafanaAPIException(e)
return False, 0
def grafana_dashboard_exists(module, grafana_url, uid, headers):
dashboard_exists = False
dashboard = {}
grafana_version = get_grafana_version(module, grafana_url, headers)
if grafana_version >= 5:
uri = '%s/api/dashboards/uid/%s' % (grafana_url, uid)
else:
uri = '%s/api/dashboards/db/%s' % (grafana_url, uid)
r, info = fetch_url(module, uri, headers=headers, method='GET')
if info['status'] == 200:
dashboard_exists = True
try:
dashboard = json.loads(r.read())
except Exception as e:
raise GrafanaAPIException(e)
elif info['status'] == 404:
dashboard_exists = False
else:
raise GrafanaAPIException('Unable to get dashboard %s : %s' % (uid, info))
return dashboard_exists, dashboard
def grafana_dashboard_search(module, grafana_url, folder_id, title, headers):
# search by title
uri = '%s/api/search?%s' % (grafana_url, urlencode({
'folderIds': folder_id,
'query': title,
'type': 'dash-db'
}))
r, info = fetch_url(module, uri, headers=headers, method='GET')
if info['status'] == 200:
try:
dashboards = json.loads(r.read())
for d in dashboards:
if d['title'] == title:
return grafana_dashboard_exists(module, grafana_url, d['uid'], headers)
except Exception as e:
raise GrafanaAPIException(e)
else:
raise GrafanaAPIException('Unable to search dashboard %s : %s' % (title, info))
return False, None
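# A title match is resolved through grafana_dashboard_exists() so the caller receives
# the full dashboard JSON (including its uid) rather than the bare search result entry.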
# for comparison, we sometimes need to ignore a few keys
def grafana_dashboard_changed(payload, dashboard):
# you don't need to set the version, but '0' is incremented to '1' by Grafana's API
if payload['dashboard']['version'] == 0:
del(payload['dashboard']['version'])
del(dashboard['dashboard']['version'])
# the meta key is not part of the 'payload' ever
if 'meta' in dashboard:
del(dashboard['meta'])
# new dashboards don't require an id attribute (or, it can be 'null'), Grafana's API will generate it
if payload['dashboard']['id'] is None:
del(dashboard['dashboard']['id'])
del(payload['dashboard']['id'])
if payload == dashboard:
return True
return False
def grafana_create_dashboard(module, data):
# define data payload for grafana API
try:
with open(data['path'], 'r') as json_file:
payload = json.load(json_file)
except Exception as e:
raise GrafanaAPIException("Can't load json file %s" % to_native(e))
# Check that the dashboard JSON is nested under the 'dashboard' key
if 'dashboard' not in payload:
payload = {'dashboard': payload}
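# e.g. a dashboard exported straight from the Grafana UI, {"title": "Telegraf", ...},
# is wrapped into {"dashboard": {"title": "Telegraf", ...}} before being POSTed to
# /api/dashboards/db further down.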
# define http header
headers = grafana_headers(module, data)
grafana_version = get_grafana_version(module, data['grafana_url'], headers)
if grafana_version < 5:
if data.get('slug'):
uid = data['slug']
elif 'meta' in payload and 'slug' in payload['meta']:
uid = payload['meta']['slug']
else:
raise GrafanaMalformedJson('No slug found in json. Needed with grafana < 5')
else:
if data.get('uid'):
uid = data['uid']
elif 'uid' in payload['dashboard']:
uid = payload['dashboard']['uid']
else:
uid = None
result = {}
# test if the folder exists
if grafana_version >= 5:
folder_exists, folder_id = grafana_folder_exists(module, data['grafana_url'], data['folder'], headers)
if folder_exists is False:
result['msg'] = "Dashboard folder '%s' does not exist." % data['folder']
result['uid'] = uid
result['changed'] = False
return result
payload['folderId'] = folder_id
# test if dashboard already exists
if uid:
dashboard_exists, dashboard = grafana_dashboard_exists(
module, data['grafana_url'], uid, headers=headers)
else:
dashboard_exists, dashboard = grafana_dashboard_search(
module, data['grafana_url'], folder_id, payload['dashboard']['title'], headers=headers)
if dashboard_exists is True:
if grafana_dashboard_changed(payload, dashboard):
# update
if 'overwrite' in data and data['overwrite']:
payload['overwrite'] = True
if 'message' in data and data['message']:
payload['message'] = data['message']
r, info = fetch_url(module, '%s/api/dashboards/db' % data['grafana_url'],
data=json.dumps(payload), headers=headers, method='POST')
if info['status'] == 200:
if grafana_version >= 5:
try:
dashboard = json.loads(r.read())
uid = dashboard['uid']
except Exception as e:
raise GrafanaAPIException(e)
result['uid'] = uid
result['msg'] = "Dashboard %s updated" % payload['dashboard']['title']
result['changed'] = True
else:
body = json.loads(info['body'])
raise GrafanaAPIException('Unable to update the dashboard %s : %s (HTTP: %d)' %
(uid, body['message'], info['status']))
else:
# unchanged
result['uid'] = uid
result['msg'] = "Dashboard %s unchanged." % payload['dashboard']['title']
result['changed'] = False
else:
# create
if folder_exists is True:
payload['folderId'] = folder_id
r, info = fetch_url(module, '%s/api/dashboards/db' % data['grafana_url'],
data=json.dumps(payload), headers=headers, method='POST')
if info['status'] == 200:
result['msg'] = "Dashboard %s created" % payload['dashboard']['title']
result['changed'] = True
if grafana_version >= 5:
try:
dashboard = json.loads(r.read())
uid = dashboard['uid']
except Exception as e:
raise GrafanaAPIException(e)
result['uid'] = uid
else:
raise GrafanaAPIException('Unable to create the new dashboard %s : %s - %s.' %
(payload['dashboard']['title'], info['status'], info))
return result
def grafana_delete_dashboard(module, data):
# define http headers
headers = grafana_headers(module, data)
grafana_version = get_grafana_version(module, data['grafana_url'], headers)
if grafana_version < 5:
if data.get('slug'):
uid = data['slug']
else:
raise GrafanaMalformedJson('No slug parameter. Needed with grafana < 5')
else:
if data.get('uid'):
uid = data['uid']
else:
raise GrafanaDeleteException('No uid specified')
# test if dashboard already exists
dashboard_exists, dashboard = grafana_dashboard_exists(module, data['grafana_url'], uid, headers=headers)
result = {}
if dashboard_exists is True:
# delete
if grafana_version < 5:
r, info = fetch_url(module, '%s/api/dashboards/db/%s' % (data['grafana_url'], uid), headers=headers, method='DELETE')
else:
r, info = fetch_url(module, '%s/api/dashboards/uid/%s' % (data['grafana_url'], uid), headers=headers, method='DELETE')
if info['status'] == 200:
result['msg'] = "Dashboard %s deleted" % uid
result['changed'] = True
result['uid'] = uid
else:
raise GrafanaAPIException('Unable to update the dashboard %s : %s' % (uid, info))
else:
# dashboard does not exist, do nothing
result = {'msg': "Dashboard %s does not exist." % uid,
'changed': False,
'uid': uid}
return result
def grafana_export_dashboard(module, data):
# define http headers
headers = grafana_headers(module, data)
grafana_version = get_grafana_version(module, data['grafana_url'], headers)
if grafana_version < 5:
if data.get('slug'):
uid = data['slug']
else:
raise GrafanaMalformedJson('No slug parameter. Needed with grafana < 5')
else:
if data.get('uid'):
uid = data['uid']
else:
raise GrafanaExportException('No uid specified')
# test if dashboard already exists
dashboard_exists, dashboard = grafana_dashboard_exists(module, data['grafana_url'], uid, headers=headers)
if dashboard_exists is True:
try:
with open(data['path'], 'w') as f:
f.write(json.dumps(dashboard))
except Exception as e:
raise GrafanaExportException("Can't write json file : %s" % to_native(e))
result = {'msg': "Dashboard %s exported to %s" % (uid, data['path']),
'uid': uid,
'changed': True}
else:
result = {'msg': "Dashboard %s does not exist." % uid,
'uid': uid,
'changed': False}
return result
def main():
# use the predefined argument spec for url
argument_spec = url_argument_spec()
# remove unnecessary arguments
del argument_spec['force']
del argument_spec['force_basic_auth']
del argument_spec['http_agent']
argument_spec.update(
state=dict(choices=['present', 'absent', 'export'], default='present'),
url=dict(aliases=['grafana_url'], required=True),
url_username=dict(aliases=['grafana_user'], default='admin'),
url_password=dict(aliases=['grafana_password'], default='admin', no_log=True),
grafana_api_key=dict(type='str', no_log=True),
org_id=dict(default=1, type='int'),
folder=dict(type='str', default='General'),
uid=dict(type='str'),
slug=dict(type='str'),
path=dict(type='str'),
overwrite=dict(type='bool', default=False),
message=dict(type='str'),
)
module = AnsibleModule(
argument_spec=argument_spec,
supports_check_mode=False,
required_together=[['url_username', 'url_password', 'org_id']],
mutually_exclusive=[['grafana_user', 'grafana_api_key'], ['uid', 'slug']],
)
try:
if module.params['state'] == 'present':
result = grafana_create_dashboard(module, module.params)
elif module.params['state'] == 'absent':
result = grafana_delete_dashboard(module, module.params)
else:
result = grafana_export_dashboard(module, module.params)
except GrafanaAPIException as e:
module.fail_json(
failed=True,
msg="error : %s" % to_native(e)
)
return
except GrafanaMalformedJson as e:
module.fail_json(
failed=True,
msg="error : %s" % to_native(e)
)
return
except GrafanaDeleteException as e:
module.fail_json(
failed=True,
msg="error : Can't delete dashboard : %s" % to_native(e)
)
return
except GrafanaExportException as e:
module.fail_json(
failed=True,
msg="error : Can't export dashboard : %s" % to_native(e)
)
return
module.exit_json(
failed=False,
**result
)
return
if __name__ == '__main__':
main()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 59,617 |
grafana_dashboard: add dashboard report "send() got multiple values for keyword argument 'MESSAGE'"
|
(issue body identical to the first row above)
|
https://github.com/ansible/ansible/issues/59617
|
https://github.com/ansible/ansible/pull/60051
|
00bed0eb1c2ed22a7b56078b9e7911756182ac92
|
b6753b46a987a319ff062a8adcdcd4e0000353ed
| 2019-07-26T04:36:45Z |
python
| 2020-02-18T12:00:16Z |
test/lib/ansible_test/_data/sanity/validate-modules/validate_modules/main.py
|
# -*- coding: utf-8 -*-
#
# Copyright (C) 2015 Matt Martz <[email protected]>
# Copyright (C) 2015 Rackspace US, Inc.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import abc
import argparse
import ast
import json
import errno
import os
import re
import subprocess
import sys
import tempfile
import traceback
from collections import OrderedDict
from contextlib import contextmanager
from distutils.version import StrictVersion
from fnmatch import fnmatch
from ansible import __version__ as ansible_version
from ansible.executor.module_common import REPLACER_WINDOWS
from ansible.module_utils.common._collections_compat import Mapping
from ansible.module_utils._text import to_bytes
from ansible.plugins.loader import fragment_loader
from ansible.utils.collection_loader import AnsibleCollectionLoader
from ansible.utils.plugin_docs import BLACKLIST, add_fragments, get_docstring
from .module_args import AnsibleModuleImportError, AnsibleModuleNotInitialized, get_argument_spec
from .schema import ansible_module_kwargs_schema, doc_schema, metadata_1_1_schema, return_schema
from .utils import CaptureStd, NoArgsAnsibleModule, compare_unordered_lists, is_empty, parse_yaml
from voluptuous.humanize import humanize_error
from ansible.module_utils.six import PY3, with_metaclass
if PY3:
# Because there is no ast.TryExcept in Python 3 ast module
TRY_EXCEPT = ast.Try
# REPLACER_WINDOWS from ansible.executor.module_common is byte
# string but we need unicode for Python 3
REPLACER_WINDOWS = REPLACER_WINDOWS.decode('utf-8')
else:
TRY_EXCEPT = ast.TryExcept
BLACKLIST_DIRS = frozenset(('.git', 'test', '.github', '.idea'))
INDENT_REGEX = re.compile(r'([\t]*)')
TYPE_REGEX = re.compile(r'.*(if|or)(\s+[^"\']*|\s+)(?<!_)(?<!str\()type\([^)].*')
SYS_EXIT_REGEX = re.compile(r'[^#]*sys.exit\s*\(.*')
BLACKLIST_IMPORTS = {
'requests': {
'new_only': True,
'error': {
'code': 'use-module-utils-urls',
'msg': ('requests import found, should use '
'ansible.module_utils.urls instead')
}
},
r'boto(?:\.|$)': {
'new_only': True,
'error': {
'code': 'use-boto3',
'msg': 'boto import found, new modules should use boto3'
}
},
}
SUBPROCESS_REGEX = re.compile(r'subprocess\.Po.*')
OS_CALL_REGEX = re.compile(r'os\.call.*')
class ReporterEncoder(json.JSONEncoder):
def default(self, o):
if isinstance(o, Exception):
return str(o)
return json.JSONEncoder.default(self, o)
class Reporter:
def __init__(self):
self.files = OrderedDict()
def _ensure_default_entry(self, path):
try:
self.files[path]
except KeyError:
self.files[path] = {
'errors': [],
'warnings': [],
'traces': [],
'warning_traces': []
}
def _log(self, path, code, msg, level='error', line=0, column=0):
self._ensure_default_entry(path)
lvl_dct = self.files[path]['%ss' % level]
lvl_dct.append({
'code': code,
'msg': msg,
'line': line,
'column': column
})
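# Each file accumulates entries of the form
# {'code': ..., 'msg': ..., 'line': ..., 'column': ...} under its 'errors' or
# 'warnings' list; plain() and json() below turn these into the final report.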
def error(self, *args, **kwargs):
self._log(*args, level='error', **kwargs)
def warning(self, *args, **kwargs):
self._log(*args, level='warning', **kwargs)
def trace(self, path, tracebk):
self._ensure_default_entry(path)
self.files[path]['traces'].append(tracebk)
def warning_trace(self, path, tracebk):
self._ensure_default_entry(path)
self.files[path]['warning_traces'].append(tracebk)
@staticmethod
@contextmanager
def _output_handle(output):
if output != '-':
handle = open(output, 'w+')
else:
handle = sys.stdout
yield handle
handle.flush()
handle.close()
@staticmethod
def _filter_out_ok(reports):
temp_reports = OrderedDict()
for path, report in reports.items():
if report['errors'] or report['warnings']:
temp_reports[path] = report
return temp_reports
def plain(self, warnings=False, output='-'):
"""Print out the test results in plain format
output is ignored here for now
"""
ret = []
for path, report in Reporter._filter_out_ok(self.files).items():
traces = report['traces'][:]
if warnings and report['warnings']:
traces.extend(report['warning_traces'])
for trace in traces:
print('TRACE:')
print('\n '.join((' %s' % trace).splitlines()))
for error in report['errors']:
error['path'] = path
print('%(path)s:%(line)d:%(column)d: E%(code)s %(msg)s' % error)
ret.append(1)
if warnings:
for warning in report['warnings']:
warning['path'] = path
print('%(path)s:%(line)d:%(column)d: W%(code)s %(msg)s' % warning)
return 3 if ret else 0
def json(self, warnings=False, output='-'):
"""Print out the test results in json format
warnings is not respected in this output
"""
ret = [len(r['errors']) for r in self.files.values()]
with Reporter._output_handle(output) as handle:
print(json.dumps(Reporter._filter_out_ok(self.files), indent=4, cls=ReporterEncoder), file=handle)
return 3 if sum(ret) else 0
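# Rough usage sketch (illustrative only): a single Reporter instance is shared by all
# validators, accumulates errors/warnings keyed by file path, and is rendered once at the end:
#     reporter = Reporter()
#     reporter.error(path='lib/ansible/modules/foo.py', code='missing-main-call',
#                    msg='Did not find a call to main()')
#     sys.exit(reporter.plain(warnings=True))  # returns 3 if any errors were recorded, else 0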
class Validator(with_metaclass(abc.ABCMeta, object)):
    """Validator instances are intended to be run on a single object. If you
are scanning multiple objects for problems, you'll want to have a separate
Validator for each one."""
def __init__(self, reporter=None):
self.reporter = reporter
@abc.abstractproperty
def object_name(self):
"""Name of the object we validated"""
pass
@abc.abstractproperty
def object_path(self):
"""Path of the object we validated"""
pass
@abc.abstractmethod
def validate(self):
"""Run this method to generate the test results"""
pass
class ModuleValidator(Validator):
BLACKLIST_PATTERNS = ('.git*', '*.pyc', '*.pyo', '.*', '*.md', '*.rst', '*.txt')
BLACKLIST_FILES = frozenset(('.git', '.gitignore', '.travis.yml',
'shippable.yml',
'.gitattributes', '.gitmodules', 'COPYING',
'__init__.py', 'VERSION', 'test-docs.sh'))
BLACKLIST = BLACKLIST_FILES.union(BLACKLIST['MODULE'])
PS_DOC_BLACKLIST = frozenset((
'async_status.ps1',
'slurp.ps1',
'setup.ps1'
))
PS_ARG_VALIDATE_BLACKLIST = frozenset((
'win_dsc.ps1', # win_dsc is a dynamic arg spec, the docs won't ever match
))
WHITELIST_FUTURE_IMPORTS = frozenset(('absolute_import', 'division', 'print_function'))
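    # The standard module boilerplate
    #     from __future__ import (absolute_import, division, print_function)
    # is exactly what WHITELIST_FUTURE_IMPORTS allows; any other __future__ feature is
    # reported by _ensure_imports_below_docs() as 'illegal-future-imports'.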
def __init__(self, path, analyze_arg_spec=False, collection=None, base_branch=None, git_cache=None, reporter=None):
super(ModuleValidator, self).__init__(reporter=reporter or Reporter())
self.path = path
self.basename = os.path.basename(self.path)
self.name = os.path.splitext(self.basename)[0]
self.analyze_arg_spec = analyze_arg_spec
self.collection = collection
self.base_branch = base_branch
self.git_cache = git_cache or GitCache()
self._python_module_override = False
with open(path) as f:
self.text = f.read()
self.length = len(self.text.splitlines())
try:
self.ast = ast.parse(self.text)
except Exception:
self.ast = None
if base_branch:
self.base_module = self._get_base_file()
else:
self.base_module = None
def __enter__(self):
return self
def __exit__(self, exc_type, exc_value, traceback):
if not self.base_module:
return
try:
os.remove(self.base_module)
except Exception:
pass
@property
def object_name(self):
return self.basename
@property
def object_path(self):
return self.path
def _get_collection_meta(self):
"""Implement if we need this for version_added comparisons
"""
pass
def _python_module(self):
if self.path.endswith('.py') or self._python_module_override:
return True
return False
def _powershell_module(self):
if self.path.endswith('.ps1'):
return True
return False
def _just_docs(self):
"""Module can contain just docs and from __future__ boilerplate
"""
try:
for child in self.ast.body:
if not isinstance(child, ast.Assign):
# allowed from __future__ imports
if isinstance(child, ast.ImportFrom) and child.module == '__future__':
for future_import in child.names:
if future_import.name not in self.WHITELIST_FUTURE_IMPORTS:
break
else:
continue
return False
return True
except AttributeError:
return False
def _get_base_branch_module_path(self):
"""List all paths within lib/ansible/modules to try and match a moved module"""
return self.git_cache.base_module_paths.get(self.object_name)
def _has_alias(self):
"""Return true if the module has any aliases."""
return self.object_name in self.git_cache.head_aliased_modules
def _get_base_file(self):
# In case of module moves, look for the original location
base_path = self._get_base_branch_module_path()
command = ['git', 'show', '%s:%s' % (self.base_branch, base_path or self.path)]
p = subprocess.Popen(command, stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
stdout, stderr = p.communicate()
if int(p.returncode) != 0:
return None
t = tempfile.NamedTemporaryFile(delete=False)
t.write(stdout)
t.close()
return t.name
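    # The base-branch version of the module is materialised with `git show <branch>:<path>`
    # into a temporary file so it can be compared against the working copy; a non-zero git
    # exit status (typically a module that does not exist on the base branch, i.e. a new
    # module) yields None, which _is_new_module() relies on.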
def _is_new_module(self):
if self._has_alias():
return False
return not self.object_name.startswith('_') and bool(self.base_branch) and not bool(self.base_module)
def _check_interpreter(self, powershell=False):
if powershell:
if not self.text.startswith('#!powershell\n'):
self.reporter.error(
path=self.object_path,
code='missing-powershell-interpreter',
msg='Interpreter line is not "#!powershell"'
)
return
if not self.text.startswith('#!/usr/bin/python'):
self.reporter.error(
path=self.object_path,
code='missing-python-interpreter',
msg='Interpreter line is not "#!/usr/bin/python"',
)
def _check_type_instead_of_isinstance(self, powershell=False):
if powershell:
return
for line_no, line in enumerate(self.text.splitlines()):
typekeyword = TYPE_REGEX.match(line)
if typekeyword:
# TODO: add column
self.reporter.error(
path=self.object_path,
code='unidiomatic-typecheck',
msg=('Type comparison using type() found. '
'Use isinstance() instead'),
line=line_no + 1
)
def _check_for_sys_exit(self):
# Optimize out the happy path
if 'sys.exit' not in self.text:
return
for line_no, line in enumerate(self.text.splitlines()):
sys_exit_usage = SYS_EXIT_REGEX.match(line)
if sys_exit_usage:
# TODO: add column
self.reporter.error(
path=self.object_path,
code='use-fail-json-not-sys-exit',
msg='sys.exit() call found. Should be exit_json/fail_json',
line=line_no + 1
)
def _check_gpl3_header(self):
header = '\n'.join(self.text.split('\n')[:20])
if ('GNU General Public License' not in header or
('version 3' not in header and 'v3.0' not in header)):
self.reporter.error(
path=self.object_path,
code='missing-gplv3-license',
msg='GPLv3 license header not found in the first 20 lines of the module'
)
elif self._is_new_module():
if len([line for line in header
if 'GNU General Public License' in line]) > 1:
self.reporter.error(
path=self.object_path,
code='use-short-gplv3-license',
msg='Found old style GPLv3 license header: '
'https://docs.ansible.com/ansible/devel/dev_guide/developing_modules_documenting.html#copyright'
)
def _check_for_subprocess(self):
for child in self.ast.body:
if isinstance(child, ast.Import):
if child.names[0].name == 'subprocess':
for line_no, line in enumerate(self.text.splitlines()):
sp_match = SUBPROCESS_REGEX.search(line)
if sp_match:
self.reporter.error(
path=self.object_path,
code='use-run-command-not-popen',
msg=('subprocess.Popen call found. Should be module.run_command'),
line=(line_no + 1),
column=(sp_match.span()[0] + 1)
)
def _check_for_os_call(self):
if 'os.call' in self.text:
for line_no, line in enumerate(self.text.splitlines()):
os_call_match = OS_CALL_REGEX.search(line)
if os_call_match:
self.reporter.error(
path=self.object_path,
code='use-run-command-not-os-call',
msg=('os.call() call found. Should be module.run_command'),
line=(line_no + 1),
column=(os_call_match.span()[0] + 1)
)
def _find_blacklist_imports(self):
for child in self.ast.body:
names = []
if isinstance(child, ast.Import):
names.extend(child.names)
elif isinstance(child, TRY_EXCEPT):
bodies = child.body
for handler in child.handlers:
bodies.extend(handler.body)
for grandchild in bodies:
if isinstance(grandchild, ast.Import):
names.extend(grandchild.names)
for name in names:
# TODO: Add line/col
for blacklist_import, options in BLACKLIST_IMPORTS.items():
if re.search(blacklist_import, name.name):
new_only = options['new_only']
if self._is_new_module() and new_only:
self.reporter.error(
path=self.object_path,
**options['error']
)
elif not new_only:
self.reporter.error(
path=self.object_path,
**options['error']
)
def _find_module_utils(self, main):
linenos = []
found_basic = False
for child in self.ast.body:
if isinstance(child, (ast.Import, ast.ImportFrom)):
names = []
try:
names.append(child.module)
if child.module.endswith('.basic'):
found_basic = True
except AttributeError:
pass
names.extend([n.name for n in child.names])
if [n for n in names if n.startswith('ansible.module_utils')]:
linenos.append(child.lineno)
for name in child.names:
if ('module_utils' in getattr(child, 'module', '') and
isinstance(name, ast.alias) and
name.name == '*'):
msg = (
'module-utils-specific-import',
('module_utils imports should import specific '
'components, not "*"')
)
if self._is_new_module():
self.reporter.error(
path=self.object_path,
code=msg[0],
msg=msg[1],
line=child.lineno
)
else:
self.reporter.warning(
path=self.object_path,
code=msg[0],
msg=msg[1],
line=child.lineno
)
if (isinstance(name, ast.alias) and
name.name == 'basic'):
found_basic = True
if not found_basic:
self.reporter.warning(
path=self.object_path,
code='missing-module-utils-basic-import',
msg='Did not find "ansible.module_utils.basic" import'
)
return linenos
def _get_first_callable(self):
linenos = []
for child in self.ast.body:
if isinstance(child, (ast.FunctionDef, ast.ClassDef)):
linenos.append(child.lineno)
return min(linenos)
def _find_main_call(self, look_for="main"):
""" Ensure that the module ends with:
if __name__ == '__main__':
main()
OR, in the case of modules that are in the docs-only deprecation phase
if __name__ == '__main__':
removed_module()
"""
lineno = False
if_bodies = []
for child in self.ast.body:
if isinstance(child, ast.If):
try:
if child.test.left.id == '__name__':
if_bodies.extend(child.body)
except AttributeError:
pass
bodies = self.ast.body
bodies.extend(if_bodies)
for child in bodies:
# validate that the next to last line is 'if __name__ == "__main__"'
if child.lineno == (self.length - 1):
mainchecked = False
try:
if isinstance(child, ast.If) and \
child.test.left.id == '__name__' and \
len(child.test.ops) == 1 and \
isinstance(child.test.ops[0], ast.Eq) and \
child.test.comparators[0].s == '__main__':
mainchecked = True
except Exception:
pass
if not mainchecked:
self.reporter.error(
path=self.object_path,
code='missing-if-name-main',
msg='Next to last line should be: if __name__ == "__main__":',
line=child.lineno
)
# validate that the final line is a call to main()
if isinstance(child, ast.Expr):
if isinstance(child.value, ast.Call):
if (isinstance(child.value.func, ast.Name) and
child.value.func.id == look_for):
lineno = child.lineno
if lineno < self.length - 1:
self.reporter.error(
path=self.object_path,
code='last-line-main-call',
msg=('Call to %s() not the last line' % look_for),
line=lineno
)
if not lineno:
self.reporter.error(
path=self.object_path,
code='missing-main-call',
msg=('Did not find a call to %s()' % look_for)
)
return lineno or 0
def _find_has_import(self):
for child in self.ast.body:
found_try_except_import = False
found_has = False
if isinstance(child, TRY_EXCEPT):
bodies = child.body
for handler in child.handlers:
bodies.extend(handler.body)
for grandchild in bodies:
if isinstance(grandchild, ast.Import):
found_try_except_import = True
if isinstance(grandchild, ast.Assign):
for target in grandchild.targets:
if target.id.lower().startswith('has_'):
found_has = True
if found_try_except_import and not found_has:
# TODO: Add line/col
self.reporter.warning(
path=self.object_path,
code='try-except-missing-has',
msg='Found Try/Except block without HAS_ assignment'
)
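    # The pattern this check expects for optional third-party imports looks like:
    #     try:
    #         import foo
    #         HAS_FOO = True
    #     except ImportError:
    #         HAS_FOO = False
    # i.e. any try/except that performs an import should record availability in a HAS_* flag.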
def _ensure_imports_below_docs(self, doc_info, first_callable):
try:
min_doc_line = min(
[doc_info[key]['lineno'] for key in doc_info if doc_info[key]['lineno']]
)
except ValueError:
# We can't perform this validation, as there are no DOCs provided at all
return
max_doc_line = max(
[doc_info[key]['end_lineno'] for key in doc_info if doc_info[key]['end_lineno']]
)
import_lines = []
for child in self.ast.body:
if isinstance(child, (ast.Import, ast.ImportFrom)):
if isinstance(child, ast.ImportFrom) and child.module == '__future__':
# allowed from __future__ imports
for future_import in child.names:
if future_import.name not in self.WHITELIST_FUTURE_IMPORTS:
self.reporter.error(
path=self.object_path,
code='illegal-future-imports',
msg=('Only the following from __future__ imports are allowed: %s'
% ', '.join(self.WHITELIST_FUTURE_IMPORTS)),
line=child.lineno
)
break
                    else:  # for-else. If we didn't find a problem and break out of the loop, then this is a legal import
continue
import_lines.append(child.lineno)
if child.lineno < min_doc_line:
self.reporter.error(
path=self.object_path,
code='import-before-documentation',
msg=('Import found before documentation variables. '
'All imports must appear below '
'DOCUMENTATION/EXAMPLES/RETURN/ANSIBLE_METADATA.'),
line=child.lineno
)
break
elif isinstance(child, TRY_EXCEPT):
bodies = child.body
for handler in child.handlers:
bodies.extend(handler.body)
for grandchild in bodies:
if isinstance(grandchild, (ast.Import, ast.ImportFrom)):
import_lines.append(grandchild.lineno)
if grandchild.lineno < min_doc_line:
self.reporter.error(
path=self.object_path,
code='import-before-documentation',
msg=('Import found before documentation '
'variables. All imports must appear below '
'DOCUMENTATION/EXAMPLES/RETURN/'
'ANSIBLE_METADATA.'),
line=child.lineno
)
break
for import_line in import_lines:
if not (max_doc_line < import_line < first_callable):
msg = (
'import-placement',
('Imports should be directly below DOCUMENTATION/EXAMPLES/'
'RETURN/ANSIBLE_METADATA.')
)
if self._is_new_module():
self.reporter.error(
path=self.object_path,
code=msg[0],
msg=msg[1],
line=import_line
)
else:
self.reporter.warning(
path=self.object_path,
code=msg[0],
msg=msg[1],
line=import_line
)
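    # Taken together with the other checks in this class, this enforces the expected layout:
    #     shebang and license header
    #     ANSIBLE_METADATA / DOCUMENTATION / EXAMPLES / RETURN
    #     imports (only the __future__ boilerplate may appear earlier)
    #     module code, ending with `if __name__ == "__main__": main()`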
def _validate_ps_replacers(self):
# loop all (for/else + error)
# get module list for each
# check "shape" of each module name
module_requires = r'(?im)^#\s*requires\s+\-module(?:s?)\s*(Ansible\.ModuleUtils\..+)'
csharp_requires = r'(?im)^#\s*ansiblerequires\s+\-csharputil\s*(Ansible\..+)'
found_requires = False
for req_stmt in re.finditer(module_requires, self.text):
found_requires = True
# this will bomb on dictionary format - "don't do that"
module_list = [x.strip() for x in req_stmt.group(1).split(',')]
if len(module_list) > 1:
self.reporter.error(
path=self.object_path,
code='multiple-utils-per-requires',
msg='Ansible.ModuleUtils requirements do not support multiple modules per statement: "%s"' % req_stmt.group(0)
)
continue
module_name = module_list[0]
if module_name.lower().endswith('.psm1'):
self.reporter.error(
path=self.object_path,
code='invalid-requires-extension',
msg='Module #Requires should not end in .psm1: "%s"' % module_name
)
for req_stmt in re.finditer(csharp_requires, self.text):
found_requires = True
# this will bomb on dictionary format - "don't do that"
module_list = [x.strip() for x in req_stmt.group(1).split(',')]
if len(module_list) > 1:
self.reporter.error(
path=self.object_path,
code='multiple-csharp-utils-per-requires',
msg='Ansible C# util requirements do not support multiple utils per statement: "%s"' % req_stmt.group(0)
)
continue
module_name = module_list[0]
if module_name.lower().endswith('.cs'):
self.reporter.error(
path=self.object_path,
code='illegal-extension-cs',
msg='Module #AnsibleRequires -CSharpUtil should not end in .cs: "%s"' % module_name
)
# also accept the legacy #POWERSHELL_COMMON replacer signal
if not found_requires and REPLACER_WINDOWS not in self.text:
self.reporter.error(
path=self.object_path,
code='missing-module-utils-import-csharp-requirements',
msg='No Ansible.ModuleUtils or C# Ansible util requirements/imports found'
)
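    # For reference, requirements that satisfy the regexes above look like (one util per line):
    #     #Requires -Module Ansible.ModuleUtils.Legacy
    #     #AnsibleRequires -CSharpUtil Ansible.Basic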
def _find_ps_docs_py_file(self):
if self.object_name in self.PS_DOC_BLACKLIST:
return
py_path = self.path.replace('.ps1', '.py')
if not os.path.isfile(py_path):
self.reporter.error(
path=self.object_path,
code='missing-python-doc',
msg='Missing python documentation file'
)
return py_path
def _get_docs(self):
docs = {
'DOCUMENTATION': {
'value': None,
'lineno': 0,
'end_lineno': 0,
},
'EXAMPLES': {
'value': None,
'lineno': 0,
'end_lineno': 0,
},
'RETURN': {
'value': None,
'lineno': 0,
'end_lineno': 0,
},
'ANSIBLE_METADATA': {
'value': None,
'lineno': 0,
'end_lineno': 0,
}
}
for child in self.ast.body:
if isinstance(child, ast.Assign):
for grandchild in child.targets:
if not isinstance(grandchild, ast.Name):
continue
if grandchild.id == 'DOCUMENTATION':
docs['DOCUMENTATION']['value'] = child.value.s
docs['DOCUMENTATION']['lineno'] = child.lineno
docs['DOCUMENTATION']['end_lineno'] = (
child.lineno + len(child.value.s.splitlines())
)
elif grandchild.id == 'EXAMPLES':
docs['EXAMPLES']['value'] = child.value.s
docs['EXAMPLES']['lineno'] = child.lineno
docs['EXAMPLES']['end_lineno'] = (
child.lineno + len(child.value.s.splitlines())
)
elif grandchild.id == 'RETURN':
docs['RETURN']['value'] = child.value.s
docs['RETURN']['lineno'] = child.lineno
docs['RETURN']['end_lineno'] = (
child.lineno + len(child.value.s.splitlines())
)
elif grandchild.id == 'ANSIBLE_METADATA':
docs['ANSIBLE_METADATA']['value'] = child.value
docs['ANSIBLE_METADATA']['lineno'] = child.lineno
try:
docs['ANSIBLE_METADATA']['end_lineno'] = (
child.lineno + len(child.value.s.splitlines())
)
except AttributeError:
docs['ANSIBLE_METADATA']['end_lineno'] = (
child.value.values[-1].lineno
)
return docs
def _validate_docs_schema(self, doc, schema, name, error_code):
# TODO: Add line/col
errors = []
try:
schema(doc)
except Exception as e:
for error in e.errors:
error.data = doc
errors.extend(e.errors)
for error in errors:
path = [str(p) for p in error.path]
if isinstance(error.data, dict):
error_message = humanize_error(error.data, error)
else:
error_message = error
if path:
combined_path = '%s.%s' % (name, '.'.join(path))
else:
combined_path = name
self.reporter.error(
path=self.object_path,
code=error_code,
msg='%s: %s' % (combined_path, error_message)
)
def _validate_docs(self):
doc_info = self._get_docs()
doc = None
documentation_exists = False
examples_exist = False
returns_exist = False
# We have three ways of marking deprecated/removed files. Have to check each one
# individually and then make sure they all agree
filename_deprecated_or_removed = False
deprecated = False
removed = False
doc_deprecated = None # doc legally might not exist
if self.object_name.startswith('_') and not os.path.islink(self.object_path):
filename_deprecated_or_removed = True
# Have to check the metadata first so that we know if the module is removed or deprecated
metadata = None
if not self.collection:
if not bool(doc_info['ANSIBLE_METADATA']['value']):
self.reporter.error(
path=self.object_path,
code='missing-metadata',
msg='No ANSIBLE_METADATA provided'
)
else:
if isinstance(doc_info['ANSIBLE_METADATA']['value'], ast.Dict):
metadata = ast.literal_eval(
doc_info['ANSIBLE_METADATA']['value']
)
else:
self.reporter.error(
path=self.object_path,
code='missing-metadata-format',
msg='ANSIBLE_METADATA was not provided as a dict, YAML not supported'
)
if metadata:
self._validate_docs_schema(metadata, metadata_1_1_schema(),
'ANSIBLE_METADATA', 'invalid-metadata-type')
# We could validate these via the schema if we knew what the values are ahead of
# time. We can figure that out for deprecated but we can't for removed. Only the
# metadata has that information.
if 'removed' in metadata['status']:
removed = True
if 'deprecated' in metadata['status']:
deprecated = True
if (deprecated or removed) and len(metadata['status']) > 1:
self.reporter.error(
path=self.object_path,
code='missing-metadata-status',
msg='ANSIBLE_METADATA.status must be exactly one of "deprecated" or "removed"'
)
if not removed:
if not bool(doc_info['DOCUMENTATION']['value']):
self.reporter.error(
path=self.object_path,
code='missing-documentation',
msg='No DOCUMENTATION provided'
)
else:
documentation_exists = True
doc, errors, traces = parse_yaml(
doc_info['DOCUMENTATION']['value'],
doc_info['DOCUMENTATION']['lineno'],
self.name, 'DOCUMENTATION'
)
for error in errors:
self.reporter.error(
path=self.object_path,
code='documentation-syntax-error',
**error
)
for trace in traces:
self.reporter.trace(
path=self.object_path,
tracebk=trace
)
if not errors and not traces:
missing_fragment = False
with CaptureStd():
try:
get_docstring(self.path, fragment_loader, verbose=True)
except AssertionError:
fragment = doc['extends_documentation_fragment']
self.reporter.error(
path=self.object_path,
code='missing-doc-fragment',
msg='DOCUMENTATION fragment missing: %s' % fragment
)
missing_fragment = True
except Exception as e:
self.reporter.trace(
path=self.object_path,
tracebk=traceback.format_exc()
)
self.reporter.error(
path=self.object_path,
code='documentation-error',
msg='Unknown DOCUMENTATION error, see TRACE: %s' % e
)
if not missing_fragment:
add_fragments(doc, self.object_path, fragment_loader=fragment_loader)
if 'options' in doc and doc['options'] is None:
self.reporter.error(
path=self.object_path,
code='invalid-documentation-options',
msg='DOCUMENTATION.options must be a dictionary/hash when used',
)
if 'deprecated' in doc and doc.get('deprecated'):
doc_deprecated = True
else:
doc_deprecated = False
if os.path.islink(self.object_path):
# This module has an alias, which we can tell as it's a symlink
# Rather than checking for `module: $filename` we need to check against the true filename
self._validate_docs_schema(
doc,
doc_schema(
os.readlink(self.object_path).split('.')[0],
version_added=not bool(self.collection)
),
'DOCUMENTATION',
'invalid-documentation',
)
else:
# This is the normal case
self._validate_docs_schema(
doc,
doc_schema(
self.object_name.split('.')[0],
version_added=not bool(self.collection)
),
'DOCUMENTATION',
'invalid-documentation',
)
if not self.collection:
existing_doc = self._check_for_new_args(doc, metadata)
self._check_version_added(doc, existing_doc)
if not bool(doc_info['EXAMPLES']['value']):
self.reporter.error(
path=self.object_path,
code='missing-examples',
msg='No EXAMPLES provided'
)
else:
_doc, errors, traces = parse_yaml(doc_info['EXAMPLES']['value'],
doc_info['EXAMPLES']['lineno'],
self.name, 'EXAMPLES', load_all=True)
for error in errors:
self.reporter.error(
path=self.object_path,
code='invalid-examples',
**error
)
for trace in traces:
self.reporter.trace(
path=self.object_path,
tracebk=trace
)
if not bool(doc_info['RETURN']['value']):
if self._is_new_module():
self.reporter.error(
path=self.object_path,
code='missing-return',
msg='No RETURN provided'
)
else:
self.reporter.warning(
path=self.object_path,
code='missing-return-legacy',
msg='No RETURN provided'
)
else:
data, errors, traces = parse_yaml(doc_info['RETURN']['value'],
doc_info['RETURN']['lineno'],
self.name, 'RETURN')
self._validate_docs_schema(data, return_schema, 'RETURN', 'return-syntax-error')
for error in errors:
self.reporter.error(
path=self.object_path,
code='return-syntax-error',
**error
)
for trace in traces:
self.reporter.trace(
path=self.object_path,
tracebk=trace
)
# Check for mismatched deprecation
mismatched_deprecation = True
if not (filename_deprecated_or_removed or removed or deprecated or doc_deprecated):
mismatched_deprecation = False
else:
if (filename_deprecated_or_removed and deprecated and doc_deprecated):
mismatched_deprecation = False
if (filename_deprecated_or_removed and removed and not (documentation_exists or examples_exist or returns_exist)):
mismatched_deprecation = False
if mismatched_deprecation:
self.reporter.error(
path=self.object_path,
code='deprecation-mismatch',
msg='Module deprecation/removed must agree in Metadata, by prepending filename with'
' "_", and setting DOCUMENTATION.deprecated for deprecation or by removing all'
' documentation for removed'
)
return doc_info, doc
def _check_version_added(self, doc, existing_doc):
version_added_raw = doc.get('version_added')
try:
version_added = StrictVersion(str(doc.get('version_added', '0.0') or '0.0'))
except ValueError:
version_added = doc.get('version_added', '0.0')
if self._is_new_module() or version_added != 'historical':
self.reporter.error(
path=self.object_path,
code='module-invalid-version-added',
msg='version_added is not a valid version number: %r' % version_added
)
return
if existing_doc and str(version_added_raw) != str(existing_doc.get('version_added')):
self.reporter.error(
path=self.object_path,
code='module-incorrect-version-added',
msg='version_added should be %r. Currently %r' % (existing_doc.get('version_added'),
version_added_raw)
)
if not self._is_new_module():
return
should_be = '.'.join(ansible_version.split('.')[:2])
strict_ansible_version = StrictVersion(should_be)
if (version_added < strict_ansible_version or
strict_ansible_version < version_added):
self.reporter.error(
path=self.object_path,
code='module-incorrect-version-added',
msg='version_added should be %r. Currently %r' % (should_be, version_added_raw)
)
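    # The comparison above is coarse: only the first two components of the running
    # ansible_version are used, so, for example, a module added while 2.10 is in
    # development must document version_added: '2.10' exactly.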
def _validate_ansible_module_call(self, docs):
try:
spec, args, kwargs = get_argument_spec(self.path, self.collection)
except AnsibleModuleNotInitialized:
self.reporter.error(
path=self.object_path,
code='ansible-module-not-initialized',
msg="Execution of the module did not result in initialization of AnsibleModule",
)
return
except AnsibleModuleImportError as e:
self.reporter.error(
path=self.object_path,
code='import-error',
msg="Exception attempting to import module for argument_spec introspection, '%s'" % e
)
self.reporter.trace(
path=self.object_path,
tracebk=traceback.format_exc()
)
return
self._validate_docs_schema(kwargs, ansible_module_kwargs_schema(), 'AnsibleModule', 'invalid-ansiblemodule-schema')
self._validate_argument_spec(docs, spec, kwargs)
def _validate_argument_spec(self, docs, spec, kwargs, context=None):
if not self.analyze_arg_spec:
return
if docs is None:
docs = {}
if context is None:
context = []
try:
if not context:
add_fragments(docs, self.object_path, fragment_loader=fragment_loader)
except Exception:
# Cannot merge fragments
return
# Use this to access type checkers later
module = NoArgsAnsibleModule({})
provider_args = set()
args_from_argspec = set()
deprecated_args_from_argspec = set()
doc_options = docs.get('options', {})
if doc_options is None:
doc_options = {}
for arg, data in spec.items():
if not isinstance(data, dict):
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " must be a dictionary/hash when used"
self.reporter.error(
path=self.object_path,
code='invalid-argument-spec',
msg=msg,
)
continue
aliases = data.get('aliases', [])
if arg in aliases:
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " is specified as its own alias"
self.reporter.error(
path=self.object_path,
code='parameter-alias-self',
msg=msg
)
if len(aliases) > len(set(aliases)):
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " has at least one alias specified multiple times in aliases"
self.reporter.error(
path=self.object_path,
code='parameter-alias-repeated',
msg=msg
)
if not context and arg == 'state':
bad_states = set(['list', 'info', 'get']) & set(data.get('choices', set()))
for bad_state in bad_states:
self.reporter.error(
path=self.object_path,
code='parameter-state-invalid-choice',
msg="Argument 'state' includes the value '%s' as a choice" % bad_state)
if not data.get('removed_in_version', None):
args_from_argspec.add(arg)
args_from_argspec.update(aliases)
else:
deprecated_args_from_argspec.add(arg)
deprecated_args_from_argspec.update(aliases)
if arg == 'provider' and self.object_path.startswith('lib/ansible/modules/network/'):
if data.get('options') is not None and not isinstance(data.get('options'), Mapping):
self.reporter.error(
path=self.object_path,
code='invalid-argument-spec-options',
msg="Argument 'options' in argument_spec['provider'] must be a dictionary/hash when used",
)
elif data.get('options'):
# Record provider options from network modules, for later comparison
for provider_arg, provider_data in data.get('options', {}).items():
provider_args.add(provider_arg)
provider_args.update(provider_data.get('aliases', []))
if data.get('required') and data.get('default', object) != object:
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " is marked as required but specifies a default. Arguments with a" \
" default should not be marked as required"
self.reporter.error(
path=self.object_path,
code='no-default-for-required-parameter',
msg=msg
)
if arg in provider_args:
# Provider args are being removed from network module top level
# don't validate docs<->arg_spec checks below
continue
_type = data.get('type', 'str')
if callable(_type):
_type_checker = _type
else:
_type_checker = module._CHECK_ARGUMENT_TYPES_DISPATCHER.get(_type)
_elements = data.get('elements')
if (_type == 'list') and not _elements:
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " defines type as list but elements is not defined"
self.reporter.error(
path=self.object_path,
code='parameter-list-no-elements',
msg=msg
)
if _elements:
if not callable(_elements):
module._CHECK_ARGUMENT_TYPES_DISPATCHER.get(_elements)
if _type != 'list':
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " defines elements as %s but it is valid only when value of parameter type is list" % _elements
self.reporter.error(
path=self.object_path,
code='parameter-invalid-elements',
msg=msg
)
arg_default = None
if 'default' in data and not is_empty(data['default']):
try:
with CaptureStd():
arg_default = _type_checker(data['default'])
except (Exception, SystemExit):
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " defines default as (%r) but this is incompatible with parameter type %r" % (data['default'], _type)
self.reporter.error(
path=self.object_path,
code='incompatible-default-type',
msg=msg
)
continue
elif data.get('default') is None and _type == 'bool' and 'options' not in data:
arg_default = False
doc_options_args = []
for alias in sorted(set([arg] + list(aliases))):
if alias in doc_options:
doc_options_args.append(alias)
if len(doc_options_args) == 0:
# Undocumented arguments will be handled later (search for undocumented-parameter)
doc_options_arg = {}
else:
doc_options_arg = doc_options[doc_options_args[0]]
if len(doc_options_args) > 1:
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " with aliases %s is documented multiple times, namely as %s" % (
", ".join([("'%s'" % alias) for alias in aliases]),
", ".join([("'%s'" % alias) for alias in doc_options_args])
)
self.reporter.error(
path=self.object_path,
code='parameter-documented-multiple-times',
msg=msg
)
try:
doc_default = None
if 'default' in doc_options_arg and not is_empty(doc_options_arg['default']):
with CaptureStd():
doc_default = _type_checker(doc_options_arg['default'])
elif doc_options_arg.get('default') is None and _type == 'bool' and 'suboptions' not in doc_options_arg:
doc_default = False
except (Exception, SystemExit):
msg = "Argument '%s' in documentation" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " defines default as (%r) but this is incompatible with parameter type %r" % (doc_options_arg.get('default'), _type)
self.reporter.error(
path=self.object_path,
code='doc-default-incompatible-type',
msg=msg
)
continue
if arg_default != doc_default:
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " defines default as (%r) but documentation defines default as (%r)" % (arg_default, doc_default)
self.reporter.error(
path=self.object_path,
code='doc-default-does-not-match-spec',
msg=msg
)
doc_type = doc_options_arg.get('type')
if 'type' in data and data['type'] is not None:
if doc_type is None:
if not arg.startswith('_'): # hidden parameter, for example _raw_params
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " defines type as %r but documentation doesn't define type" % (data['type'])
self.reporter.error(
path=self.object_path,
code='parameter-type-not-in-doc',
msg=msg
)
elif data['type'] != doc_type:
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " defines type as %r but documentation defines type as %r" % (data['type'], doc_type)
self.reporter.error(
path=self.object_path,
code='doc-type-does-not-match-spec',
msg=msg
)
else:
if doc_type is None:
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " uses default type ('str') but documentation doesn't define type"
self.reporter.error(
path=self.object_path,
code='doc-missing-type',
msg=msg
)
elif doc_type != 'str':
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
                    msg += " implies type as 'str' but documentation defines as %r" % doc_type
self.reporter.error(
path=self.object_path,
code='implied-parameter-type-mismatch',
msg=msg
)
doc_choices = []
try:
for choice in doc_options_arg.get('choices', []):
try:
with CaptureStd():
doc_choices.append(_type_checker(choice))
except (Exception, SystemExit):
msg = "Argument '%s' in documentation" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " defines choices as (%r) but this is incompatible with argument type %r" % (choice, _type)
self.reporter.error(
path=self.object_path,
code='doc-choices-incompatible-type',
msg=msg
)
raise StopIteration()
except StopIteration:
continue
arg_choices = []
try:
for choice in data.get('choices', []):
try:
with CaptureStd():
arg_choices.append(_type_checker(choice))
except (Exception, SystemExit):
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " defines choices as (%r) but this is incompatible with argument type %r" % (choice, _type)
self.reporter.error(
path=self.object_path,
code='incompatible-choices',
msg=msg
)
raise StopIteration()
except StopIteration:
continue
if not compare_unordered_lists(arg_choices, doc_choices):
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " defines choices as (%r) but documentation defines choices as (%r)" % (arg_choices, doc_choices)
self.reporter.error(
path=self.object_path,
code='doc-choices-do-not-match-spec',
msg=msg
)
doc_required = doc_options_arg.get('required', False)
data_required = data.get('required', False)
if (doc_required or data_required) and not (doc_required and data_required):
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
if doc_required:
msg += " is not required, but is documented as being required"
else:
msg += " is required, but is not documented as being required"
self.reporter.error(
path=self.object_path,
code='doc-required-mismatch',
msg=msg
)
doc_elements = doc_options_arg.get('elements', None)
doc_type = doc_options_arg.get('type', 'str')
data_elements = data.get('elements', None)
if (doc_elements and not doc_type == 'list'):
                msg = "Argument '%s'" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " defines parameter elements as %s but it is valid only when value of parameter type is list" % doc_elements
self.reporter.error(
path=self.object_path,
code='doc-elements-invalid',
msg=msg
)
if (doc_elements or data_elements) and not (doc_elements == data_elements):
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
if data_elements:
msg += " specifies elements as %s," % data_elements
else:
msg += " does not specify elements,"
if doc_elements:
                    msg += " but elements is documented as being %s" % doc_elements
else:
                    msg += " but elements is not documented"
self.reporter.error(
path=self.object_path,
code='doc-elements-mismatch',
msg=msg
)
spec_suboptions = data.get('options')
doc_suboptions = doc_options_arg.get('suboptions', {})
if spec_suboptions:
if not doc_suboptions:
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " has sub-options but documentation does not define it"
self.reporter.error(
path=self.object_path,
code='missing-suboption-docs',
msg=msg
)
self._validate_argument_spec({'options': doc_suboptions}, spec_suboptions, kwargs, context=context + [arg])
for arg in args_from_argspec:
if not str(arg).isidentifier():
msg = "Argument '%s' in argument_spec" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " is not a valid python identifier"
self.reporter.error(
path=self.object_path,
code='parameter-invalid',
msg=msg
)
if docs:
args_from_docs = set()
for arg, data in doc_options.items():
args_from_docs.add(arg)
args_from_docs.update(data.get('aliases', []))
args_missing_from_docs = args_from_argspec.difference(args_from_docs)
docs_missing_from_args = args_from_docs.difference(args_from_argspec | deprecated_args_from_argspec)
for arg in args_missing_from_docs:
if arg in provider_args:
# Provider args are being removed from network module top level
# So they are likely not documented on purpose
continue
msg = "Argument '%s'" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " is listed in the argument_spec, but not documented in the module documentation"
self.reporter.error(
path=self.object_path,
code='undocumented-parameter',
msg=msg
)
for arg in docs_missing_from_args:
msg = "Argument '%s'" % arg
if context:
msg += " found in %s" % " -> ".join(context)
msg += " is listed in DOCUMENTATION.options, but not accepted by the module argument_spec"
self.reporter.error(
path=self.object_path,
code='nonexistent-parameter-documented',
msg=msg
)
def _check_for_new_args(self, doc, metadata):
if not self.base_branch or self._is_new_module():
return
with CaptureStd():
try:
existing_doc, dummy_examples, dummy_return, existing_metadata = get_docstring(self.base_module, fragment_loader, verbose=True)
existing_options = existing_doc.get('options', {}) or {}
except AssertionError:
fragment = doc['extends_documentation_fragment']
self.reporter.warning(
path=self.object_path,
code='missing-existing-doc-fragment',
msg='Pre-existing DOCUMENTATION fragment missing: %s' % fragment
)
return
except Exception as e:
self.reporter.warning_trace(
path=self.object_path,
tracebk=e
)
self.reporter.warning(
path=self.object_path,
code='unknown-doc-fragment',
msg=('Unknown pre-existing DOCUMENTATION error, see TRACE. Submodule refs may need updated')
)
return
try:
mod_version_added = StrictVersion()
mod_version_added.parse(
str(existing_doc.get('version_added', '0.0'))
)
except ValueError:
mod_version_added = StrictVersion('0.0')
if self.base_branch and 'stable-' in self.base_branch:
metadata.pop('metadata_version', None)
metadata.pop('version', None)
if metadata != existing_metadata:
self.reporter.error(
path=self.object_path,
code='metadata-changed',
msg=('ANSIBLE_METADATA cannot be changed in a point release for a stable branch')
)
options = doc.get('options', {}) or {}
should_be = '.'.join(ansible_version.split('.')[:2])
strict_ansible_version = StrictVersion(should_be)
for option, details in options.items():
try:
names = [option] + details.get('aliases', [])
except (TypeError, AttributeError):
# Reporting of this syntax error will be handled by schema validation.
continue
if any(name in existing_options for name in names):
for name in names:
existing_version = existing_options.get(name, {}).get('version_added')
if existing_version:
break
current_version = details.get('version_added')
if str(current_version) != str(existing_version):
self.reporter.error(
path=self.object_path,
code='option-incorrect-version-added',
msg=('version_added for new option (%s) should '
'be %r. Currently %r' %
(option, existing_version, current_version))
)
continue
try:
version_added = StrictVersion()
version_added.parse(
str(details.get('version_added', '0.0'))
)
except ValueError:
version_added = details.get('version_added', '0.0')
self.reporter.error(
path=self.object_path,
code='module-invalid-version-added-number',
msg=('version_added for new option (%s) '
'is not a valid version number: %r' %
(option, version_added))
)
continue
except Exception:
# If there is any other exception it should have been caught
# in schema validation, so we won't duplicate errors by
# listing it again
continue
if (strict_ansible_version != mod_version_added and
(version_added < strict_ansible_version or
strict_ansible_version < version_added)):
self.reporter.error(
path=self.object_path,
code='option-incorrect-version-added',
msg=('version_added for new option (%s) should '
'be %r. Currently %r' %
(option, should_be, version_added))
)
return existing_doc
@staticmethod
def is_blacklisted(path):
base_name = os.path.basename(path)
file_name = os.path.splitext(base_name)[0]
if file_name.startswith('_') and os.path.islink(path):
return True
if not frozenset((base_name, file_name)).isdisjoint(ModuleValidator.BLACKLIST):
return True
for pat in ModuleValidator.BLACKLIST_PATTERNS:
if fnmatch(base_name, pat):
return True
return False
def validate(self):
super(ModuleValidator, self).validate()
if not self._python_module() and not self._powershell_module():
self.reporter.error(
path=self.object_path,
code='invalid-extension',
msg=('Official Ansible modules must have a .py '
'extension for python modules or a .ps1 '
'for powershell modules')
)
self._python_module_override = True
if self._python_module() and self.ast is None:
self.reporter.error(
path=self.object_path,
code='python-syntax-error',
msg='Python SyntaxError while parsing module'
)
try:
compile(self.text, self.path, 'exec')
except Exception:
self.reporter.trace(
path=self.object_path,
tracebk=traceback.format_exc()
)
return
end_of_deprecation_should_be_removed_only = False
if self._python_module():
doc_info, docs = self._validate_docs()
# See if current version => deprecated.removed_in, ie, should be docs only
if isinstance(doc_info['ANSIBLE_METADATA']['value'], ast.Dict) and 'removed' in ast.literal_eval(doc_info['ANSIBLE_METADATA']['value'])['status']:
end_of_deprecation_should_be_removed_only = True
elif docs and 'deprecated' in docs and docs['deprecated'] is not None:
try:
removed_in = StrictVersion(str(docs.get('deprecated')['removed_in']))
except ValueError:
end_of_deprecation_should_be_removed_only = False
else:
strict_ansible_version = StrictVersion('.'.join(ansible_version.split('.')[:2]))
end_of_deprecation_should_be_removed_only = strict_ansible_version >= removed_in
if self._python_module() and not self._just_docs() and not end_of_deprecation_should_be_removed_only:
self._validate_ansible_module_call(docs)
self._check_for_sys_exit()
self._find_blacklist_imports()
main = self._find_main_call()
self._find_module_utils(main)
self._find_has_import()
first_callable = self._get_first_callable()
self._ensure_imports_below_docs(doc_info, first_callable)
self._check_for_subprocess()
self._check_for_os_call()
if self._powershell_module():
self._validate_ps_replacers()
docs_path = self._find_ps_docs_py_file()
# We can only validate PowerShell arg spec if it is using the new Ansible.Basic.AnsibleModule util
pattern = r'(?im)^#\s*ansiblerequires\s+\-csharputil\s*Ansible\.Basic'
if re.search(pattern, self.text) and self.object_name not in self.PS_ARG_VALIDATE_BLACKLIST:
with ModuleValidator(docs_path, base_branch=self.base_branch, git_cache=self.git_cache) as docs_mv:
docs = docs_mv._validate_docs()[1]
self._validate_ansible_module_call(docs)
self._check_gpl3_header()
if not self._just_docs() and not end_of_deprecation_should_be_removed_only:
self._check_interpreter(powershell=self._powershell_module())
self._check_type_instead_of_isinstance(
powershell=self._powershell_module()
)
if end_of_deprecation_should_be_removed_only:
            # Ensure that `if __name__ == '__main__':` calls `removed_module()`, which ensures that the module has no code left in it
main = self._find_main_call('removed_module')
# FIXME: Ensure that the version in the call to removed_module is less than +2.
# Otherwise it's time to remove the file (This may need to be done in another test to
# avoid breaking whenever the Ansible version bumps)
class PythonPackageValidator(Validator):
BLACKLIST_FILES = frozenset(('__pycache__',))
def __init__(self, path, reporter=None):
super(PythonPackageValidator, self).__init__(reporter=reporter or Reporter())
self.path = path
self.basename = os.path.basename(path)
@property
def object_name(self):
return self.basename
@property
def object_path(self):
return self.path
def validate(self):
super(PythonPackageValidator, self).validate()
if self.basename in self.BLACKLIST_FILES:
return
init_file = os.path.join(self.path, '__init__.py')
if not os.path.exists(init_file):
self.reporter.error(
path=self.object_path,
code='subdirectory-missing-init',
msg='Ansible module subdirectories must contain an __init__.py'
)
def setup_collection_loader():
def get_source(self, fullname):
mod = sys.modules.get(fullname)
if not mod:
mod = self.load_module(fullname)
with open(to_bytes(mod.__file__), 'rb') as mod_file:
source = mod_file.read()
return source
def get_code(self, fullname):
return compile(source=self.get_source(fullname), filename=self.get_filename(fullname), mode='exec', flags=0, dont_inherit=True)
def is_package(self, fullname):
return self.get_filename(fullname).endswith('__init__.py')
def get_filename(self, fullname):
mod = sys.modules.get(fullname) or self.load_module(fullname)
return mod.__file__
# monkeypatch collection loader to work with runpy
# remove this (and the associated code above) once implemented natively in the collection loader
AnsibleCollectionLoader.get_source = get_source
AnsibleCollectionLoader.get_code = get_code
AnsibleCollectionLoader.is_package = is_package
AnsibleCollectionLoader.get_filename = get_filename
collection_loader = AnsibleCollectionLoader()
# allow importing code from collections when testing a collection
# noinspection PyCallingNonCallable
sys.meta_path.insert(0, collection_loader)
def re_compile(value):
    """
    argparse expects type callables to raise TypeError, but re.compile raises
    re.error on an invalid pattern. This function is a shorthand that converts
    the re.error exception into a TypeError.
    """
try:
return re.compile(value)
except re.error as e:
raise TypeError(e)
def run():
parser = argparse.ArgumentParser(prog="validate-modules")
parser.add_argument('modules', nargs='+',
help='Path to module or module directory')
parser.add_argument('-w', '--warnings', help='Show warnings',
action='store_true')
parser.add_argument('--exclude', help='RegEx exclusion pattern',
type=re_compile)
parser.add_argument('--arg-spec', help='Analyze module argument spec',
action='store_true', default=False)
parser.add_argument('--base-branch', default=None,
help='Used in determining if new options were added')
parser.add_argument('--format', choices=['json', 'plain'], default='plain',
help='Output format. Default: "%(default)s"')
parser.add_argument('--output', default='-',
help='Output location, use "-" for stdout. '
'Default "%(default)s"')
parser.add_argument('--collection',
help='Specifies the path to the collection, when '
'validating files within a collection. Ensure '
'that ANSIBLE_COLLECTIONS_PATHS is set so the '
'contents of the collection can be located')
args = parser.parse_args()
args.modules[:] = [m.rstrip('/') for m in args.modules]
reporter = Reporter()
git_cache = GitCache(args.base_branch)
check_dirs = set()
if args.collection:
setup_collection_loader()
for module in args.modules:
if os.path.isfile(module):
path = module
if args.exclude and args.exclude.search(path):
continue
if ModuleValidator.is_blacklisted(path):
continue
with ModuleValidator(path, collection=args.collection, analyze_arg_spec=args.arg_spec,
base_branch=args.base_branch, git_cache=git_cache, reporter=reporter) as mv1:
mv1.validate()
check_dirs.add(os.path.dirname(path))
for root, dirs, files in os.walk(module):
basedir = root[len(module) + 1:].split('/', 1)[0]
if basedir in BLACKLIST_DIRS:
continue
for dirname in dirs:
if root == module and dirname in BLACKLIST_DIRS:
continue
path = os.path.join(root, dirname)
if args.exclude and args.exclude.search(path):
continue
check_dirs.add(path)
for filename in files:
path = os.path.join(root, filename)
if args.exclude and args.exclude.search(path):
continue
if ModuleValidator.is_blacklisted(path):
continue
with ModuleValidator(path, collection=args.collection, analyze_arg_spec=args.arg_spec,
base_branch=args.base_branch, git_cache=git_cache, reporter=reporter) as mv2:
mv2.validate()
if not args.collection:
for path in sorted(check_dirs):
pv = PythonPackageValidator(path, reporter=reporter)
pv.validate()
if args.format == 'plain':
sys.exit(reporter.plain(warnings=args.warnings, output=args.output))
else:
sys.exit(reporter.json(warnings=args.warnings, output=args.output))
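# Illustrative invocation (normally this runs via `ansible-test sanity --test validate-modules`
# rather than being called directly); the path below is only an example:
#     validate-modules --arg-spec --base-branch origin/devel --format json lib/ansible/modules/monitoring/grafana/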
class GitCache:
def __init__(self, base_branch):
self.base_branch = base_branch
if self.base_branch:
self.base_tree = self._git(['ls-tree', '-r', '--name-only', self.base_branch, 'lib/ansible/modules/'])
else:
self.base_tree = []
try:
self.head_tree = self._git(['ls-tree', '-r', '--name-only', 'HEAD', 'lib/ansible/modules/'])
except GitError as ex:
if ex.status == 128:
# fallback when there is no .git directory
self.head_tree = self._get_module_files()
else:
raise
except OSError as ex:
if ex.errno == errno.ENOENT:
# fallback when git is not installed
self.head_tree = self._get_module_files()
else:
raise
self.base_module_paths = dict((os.path.basename(p), p) for p in self.base_tree if os.path.splitext(p)[1] in ('.py', '.ps1'))
self.base_module_paths.pop('__init__.py', None)
self.head_aliased_modules = set()
for path in self.head_tree:
filename = os.path.basename(path)
if filename.startswith('_') and filename != '__init__.py':
if os.path.islink(path):
self.head_aliased_modules.add(os.path.basename(os.path.realpath(path)))
@staticmethod
def _get_module_files():
module_files = []
for (dir_path, dir_names, file_names) in os.walk('lib/ansible/modules/'):
for file_name in file_names:
module_files.append(os.path.join(dir_path, file_name))
return module_files
@staticmethod
def _git(args):
cmd = ['git'] + args
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout, stderr = p.communicate()
if p.returncode != 0:
raise GitError(stderr, p.returncode)
return stdout.decode('utf-8').splitlines()
class GitError(Exception):
def __init__(self, message, status):
super(GitError, self).__init__(message)
self.status = status
def main():
try:
run()
except KeyboardInterrupt:
pass
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 59,617 |
grafana_dashboard: add dashboard report "send() got multiple values for keyword argument 'MESSAGE'"
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
Add dashboard report "send() got multiple values for keyword argument 'MESSAGE'"
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
grafana_dashboard
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.8.3
config file = /home/namnh/workspace/ansible-setup/ansible.cfg
configured module search path = ['/home/namnh/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/namnh/.venvs/ansible/lib/python3.6/site-packages/ansible
executable location = /home/namnh/.venvs/ansible/bin/ansible
python version = 3.6.7 (default, Oct 22 2018, 11:32:17) [GCC 8.2.0]
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
DEFAULT_HOST_LIST(/home/namnh/workspace/ansible-setup/ansible.cfg) = ['/home/namnh/workspace/ansible-setup/inventory']
DEFAULT_STDOUT_CALLBACK(/home/namnh/workspace/ansible-setup/ansible.cfg) = yaml
INTERPRETER_PYTHON(/home/namnh/workspace/ansible-setup/ansible.cfg) = /usr/bin/python3
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
- hosts: localhost
tasks:
- name: Import Grafana telegraf dashboard
ignore_errors: yes
become: false
grafana_dashboard:
grafana_url: "{{ grafana_url }}"
grafana_user: "{{ grafana_admin_user }}"
grafana_password: "{{ grafana_admin_password }}"
message: telegraf
overwrite: yes
state: present
path: "{{ playbook_dir }}/configs/telegraf-system.json"
delegate_to: localhost
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
New datasource is added
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
Traceback (most recent call last):
File "/home/namnh/.ansible/tmp/ansible-tmp-1564113599.2774515-158744002373584/AnsiballZ_grafana_dashboard.py", line 114, in <module>
_ansiballz_main()
File "/home/namnh/.ansible/tmp/ansible-tmp-1564113599.2774515-158744002373584/AnsiballZ_grafana_dashboard.py", line 106, in _ansiballz_main
invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
File "/home/namnh/.ansible/tmp/ansible-tmp-1564113599.2774515-158744002373584/AnsiballZ_grafana_dashboard.py", line 49, in invoke_module
imp.load_module('__main__', mod, module, MOD_DESC)
File "/usr/lib/python3.6/imp.py", line 235, in load_module
return load_source(name, filename, file)
File "/usr/lib/python3.6/imp.py", line 170, in load_source
module = _exec(spec, sys.modules[name])
File "<frozen importlib._bootstrap>", line 618, in _exec
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/tmp/ansible_grafana_dashboard_payload_apewcyne/__main__.py", line 451, in <module>
File "/tmp/ansible_grafana_dashboard_payload_apewcyne/__main__.py", line 408, in main
File "/tmp/ansible_grafana_dashboard_payload_apewcyne/ansible_grafana_dashboard_payload.zip/ansible/module_utils/basic.py", line 691, in __init__
File "/tmp/ansible_grafana_dashboard_payload_apewcyne/ansible_grafana_dashboard_payload.zip/ansible/module_utils/basic.py", line 1946, in _log_invocation
File "/tmp/ansible_grafana_dashboard_payload_apewcyne/ansible_grafana_dashboard_payload.zip/ansible/module_utils/basic.py", line 1904, in log
TypeError: send() got multiple values for keyword argument 'MESSAGE'
msg: |-
MODULE FAILURE
See stdout/stderr for the exact error
rc: 1
```
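For context, the traceback suggests the failure comes from the module having an option named `message`: when `AnsibleModule._log_invocation()` logs the invocation and systemd journal logging is active, parameter names are upper-cased into keyword arguments for `journal.send()`, which already receives an explicit `MESSAGE=` argument. A minimal sketch of that collision (the `log_args` values here are made up for illustration; this is not Ansible's actual code):

```python
from systemd import journal  # python-systemd; only relevant when it is installed

# Parameters of the task above, as they would be logged by the module
log_args = {'message': 'telegraf', 'overwrite': 'True', 'state': 'present'}

# Upper-casing the parameter names produces a 'MESSAGE' key ...
journal_kwargs = dict((name.upper(), str(value)) for name, value in log_args.items())

# ... which collides with the explicit MESSAGE= argument below:
journal.send(MESSAGE='Invoked with %s' % ' '.join(log_args), **journal_kwargs)
# TypeError: send() got multiple values for keyword argument 'MESSAGE'
```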
|
https://github.com/ansible/ansible/issues/59617
|
https://github.com/ansible/ansible/pull/60051
|
00bed0eb1c2ed22a7b56078b9e7911756182ac92
|
b6753b46a987a319ff062a8adcdcd4e0000353ed
| 2019-07-26T04:36:45Z |
python
| 2020-02-18T12:00:16Z |
test/sanity/ignore.txt
|
contrib/inventory/abiquo.py future-import-boilerplate
contrib/inventory/abiquo.py metaclass-boilerplate
contrib/inventory/apache-libcloud.py future-import-boilerplate
contrib/inventory/apache-libcloud.py metaclass-boilerplate
contrib/inventory/apstra_aos.py future-import-boilerplate
contrib/inventory/apstra_aos.py metaclass-boilerplate
contrib/inventory/azure_rm.py future-import-boilerplate
contrib/inventory/azure_rm.py metaclass-boilerplate
contrib/inventory/brook.py future-import-boilerplate
contrib/inventory/brook.py metaclass-boilerplate
contrib/inventory/cloudforms.py future-import-boilerplate
contrib/inventory/cloudforms.py metaclass-boilerplate
contrib/inventory/cobbler.py future-import-boilerplate
contrib/inventory/cobbler.py metaclass-boilerplate
contrib/inventory/collins.py future-import-boilerplate
contrib/inventory/collins.py metaclass-boilerplate
contrib/inventory/consul_io.py future-import-boilerplate
contrib/inventory/consul_io.py metaclass-boilerplate
contrib/inventory/digital_ocean.py future-import-boilerplate
contrib/inventory/digital_ocean.py metaclass-boilerplate
contrib/inventory/ec2.py future-import-boilerplate
contrib/inventory/ec2.py metaclass-boilerplate
contrib/inventory/fleet.py future-import-boilerplate
contrib/inventory/fleet.py metaclass-boilerplate
contrib/inventory/foreman.py future-import-boilerplate
contrib/inventory/foreman.py metaclass-boilerplate
contrib/inventory/freeipa.py future-import-boilerplate
contrib/inventory/freeipa.py metaclass-boilerplate
contrib/inventory/gce.py future-import-boilerplate
contrib/inventory/gce.py metaclass-boilerplate
contrib/inventory/gce.py pylint:blacklisted-name
contrib/inventory/infoblox.py future-import-boilerplate
contrib/inventory/infoblox.py metaclass-boilerplate
contrib/inventory/jail.py future-import-boilerplate
contrib/inventory/jail.py metaclass-boilerplate
contrib/inventory/landscape.py future-import-boilerplate
contrib/inventory/landscape.py metaclass-boilerplate
contrib/inventory/libvirt_lxc.py future-import-boilerplate
contrib/inventory/libvirt_lxc.py metaclass-boilerplate
contrib/inventory/linode.py future-import-boilerplate
contrib/inventory/linode.py metaclass-boilerplate
contrib/inventory/lxc_inventory.py future-import-boilerplate
contrib/inventory/lxc_inventory.py metaclass-boilerplate
contrib/inventory/lxd.py future-import-boilerplate
contrib/inventory/lxd.py metaclass-boilerplate
contrib/inventory/mdt_dynamic_inventory.py future-import-boilerplate
contrib/inventory/mdt_dynamic_inventory.py metaclass-boilerplate
contrib/inventory/nagios_livestatus.py future-import-boilerplate
contrib/inventory/nagios_livestatus.py metaclass-boilerplate
contrib/inventory/nagios_ndo.py future-import-boilerplate
contrib/inventory/nagios_ndo.py metaclass-boilerplate
contrib/inventory/nsot.py future-import-boilerplate
contrib/inventory/nsot.py metaclass-boilerplate
contrib/inventory/openshift.py future-import-boilerplate
contrib/inventory/openshift.py metaclass-boilerplate
contrib/inventory/openstack_inventory.py future-import-boilerplate
contrib/inventory/openstack_inventory.py metaclass-boilerplate
contrib/inventory/openvz.py future-import-boilerplate
contrib/inventory/openvz.py metaclass-boilerplate
contrib/inventory/ovirt.py future-import-boilerplate
contrib/inventory/ovirt.py metaclass-boilerplate
contrib/inventory/ovirt4.py future-import-boilerplate
contrib/inventory/ovirt4.py metaclass-boilerplate
contrib/inventory/packet_net.py future-import-boilerplate
contrib/inventory/packet_net.py metaclass-boilerplate
contrib/inventory/proxmox.py future-import-boilerplate
contrib/inventory/proxmox.py metaclass-boilerplate
contrib/inventory/rackhd.py future-import-boilerplate
contrib/inventory/rackhd.py metaclass-boilerplate
contrib/inventory/rax.py future-import-boilerplate
contrib/inventory/rax.py metaclass-boilerplate
contrib/inventory/rudder.py future-import-boilerplate
contrib/inventory/rudder.py metaclass-boilerplate
contrib/inventory/scaleway.py future-import-boilerplate
contrib/inventory/scaleway.py metaclass-boilerplate
contrib/inventory/serf.py future-import-boilerplate
contrib/inventory/serf.py metaclass-boilerplate
contrib/inventory/softlayer.py future-import-boilerplate
contrib/inventory/softlayer.py metaclass-boilerplate
contrib/inventory/spacewalk.py future-import-boilerplate
contrib/inventory/spacewalk.py metaclass-boilerplate
contrib/inventory/ssh_config.py future-import-boilerplate
contrib/inventory/ssh_config.py metaclass-boilerplate
contrib/inventory/stacki.py future-import-boilerplate
contrib/inventory/stacki.py metaclass-boilerplate
contrib/inventory/vagrant.py future-import-boilerplate
contrib/inventory/vagrant.py metaclass-boilerplate
contrib/inventory/vbox.py future-import-boilerplate
contrib/inventory/vbox.py metaclass-boilerplate
contrib/inventory/vmware.py future-import-boilerplate
contrib/inventory/vmware.py metaclass-boilerplate
contrib/inventory/vmware_inventory.py future-import-boilerplate
contrib/inventory/vmware_inventory.py metaclass-boilerplate
contrib/inventory/zabbix.py future-import-boilerplate
contrib/inventory/zabbix.py metaclass-boilerplate
contrib/inventory/zone.py future-import-boilerplate
contrib/inventory/zone.py metaclass-boilerplate
contrib/vault/azure_vault.py future-import-boilerplate
contrib/vault/azure_vault.py metaclass-boilerplate
contrib/vault/vault-keyring-client.py future-import-boilerplate
contrib/vault/vault-keyring-client.py metaclass-boilerplate
contrib/vault/vault-keyring.py future-import-boilerplate
contrib/vault/vault-keyring.py metaclass-boilerplate
docs/bin/find-plugin-refs.py future-import-boilerplate
docs/bin/find-plugin-refs.py metaclass-boilerplate
docs/docsite/_extensions/pygments_lexer.py future-import-boilerplate
docs/docsite/_extensions/pygments_lexer.py metaclass-boilerplate
docs/docsite/_themes/sphinx_rtd_theme/__init__.py future-import-boilerplate
docs/docsite/_themes/sphinx_rtd_theme/__init__.py metaclass-boilerplate
docs/docsite/rst/conf.py future-import-boilerplate
docs/docsite/rst/conf.py metaclass-boilerplate
docs/docsite/rst/dev_guide/testing/sanity/no-smart-quotes.rst no-smart-quotes
examples/scripts/ConfigureRemotingForAnsible.ps1 pslint:PSCustomUseLiteralPath
examples/scripts/upgrade_to_ps3.ps1 pslint:PSCustomUseLiteralPath
examples/scripts/upgrade_to_ps3.ps1 pslint:PSUseApprovedVerbs
examples/scripts/uptime.py future-import-boilerplate
examples/scripts/uptime.py metaclass-boilerplate
hacking/build-ansible.py shebang # only run by release engineers, Python 3.6+ required
hacking/build_library/build_ansible/announce.py compile-2.6!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/announce.py compile-2.7!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/announce.py compile-3.5!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/command_plugins/dump_config.py compile-2.6!skip # docs build only, 2.7+ required
hacking/build_library/build_ansible/command_plugins/dump_keywords.py compile-2.6!skip # docs build only, 2.7+ required
hacking/build_library/build_ansible/command_plugins/generate_man.py compile-2.6!skip # docs build only, 2.7+ required
hacking/build_library/build_ansible/command_plugins/plugin_formatter.py compile-2.6!skip # docs build only, 2.7+ required
hacking/build_library/build_ansible/command_plugins/porting_guide.py compile-2.6!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/command_plugins/porting_guide.py compile-2.7!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/command_plugins/porting_guide.py compile-3.5!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/command_plugins/release_announcement.py compile-2.6!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/command_plugins/release_announcement.py compile-2.7!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/command_plugins/release_announcement.py compile-3.5!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/command_plugins/update_intersphinx.py compile-2.6!skip # release process and docs build only, 3.5+ required
hacking/build_library/build_ansible/command_plugins/update_intersphinx.py compile-2.7!skip # release process and docs build only, 3.5+ required
hacking/fix_test_syntax.py future-import-boilerplate
hacking/fix_test_syntax.py metaclass-boilerplate
hacking/get_library.py future-import-boilerplate
hacking/get_library.py metaclass-boilerplate
hacking/report.py future-import-boilerplate
hacking/report.py metaclass-boilerplate
hacking/return_skeleton_generator.py future-import-boilerplate
hacking/return_skeleton_generator.py metaclass-boilerplate
hacking/test-module.py future-import-boilerplate
hacking/test-module.py metaclass-boilerplate
hacking/tests/gen_distribution_version_testcase.py future-import-boilerplate
hacking/tests/gen_distribution_version_testcase.py metaclass-boilerplate
lib/ansible/cli/console.py pylint:blacklisted-name
lib/ansible/cli/scripts/ansible_cli_stub.py shebang
lib/ansible/cli/scripts/ansible_connection_cli_stub.py shebang
lib/ansible/compat/selectors/_selectors2.py future-import-boilerplate # ignore bundled
lib/ansible/compat/selectors/_selectors2.py metaclass-boilerplate # ignore bundled
lib/ansible/compat/selectors/_selectors2.py pylint:blacklisted-name
lib/ansible/config/base.yml no-unwanted-files
lib/ansible/config/module_defaults.yml no-unwanted-files
lib/ansible/executor/playbook_executor.py pylint:blacklisted-name
lib/ansible/executor/powershell/async_watchdog.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/executor/powershell/async_wrapper.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/executor/powershell/exec_wrapper.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/executor/task_queue_manager.py pylint:blacklisted-name
lib/ansible/module_utils/_text.py future-import-boilerplate
lib/ansible/module_utils/_text.py metaclass-boilerplate
lib/ansible/module_utils/alicloud_ecs.py future-import-boilerplate
lib/ansible/module_utils/alicloud_ecs.py metaclass-boilerplate
lib/ansible/module_utils/ansible_tower.py future-import-boilerplate
lib/ansible/module_utils/ansible_tower.py metaclass-boilerplate
lib/ansible/module_utils/api.py future-import-boilerplate
lib/ansible/module_utils/api.py metaclass-boilerplate
lib/ansible/module_utils/azure_rm_common.py future-import-boilerplate
lib/ansible/module_utils/azure_rm_common.py metaclass-boilerplate
lib/ansible/module_utils/azure_rm_common_ext.py future-import-boilerplate
lib/ansible/module_utils/azure_rm_common_ext.py metaclass-boilerplate
lib/ansible/module_utils/azure_rm_common_rest.py future-import-boilerplate
lib/ansible/module_utils/azure_rm_common_rest.py metaclass-boilerplate
lib/ansible/module_utils/basic.py metaclass-boilerplate
lib/ansible/module_utils/cloud.py future-import-boilerplate
lib/ansible/module_utils/cloud.py metaclass-boilerplate
lib/ansible/module_utils/common/network.py future-import-boilerplate
lib/ansible/module_utils/common/network.py metaclass-boilerplate
lib/ansible/module_utils/compat/ipaddress.py future-import-boilerplate
lib/ansible/module_utils/compat/ipaddress.py metaclass-boilerplate
lib/ansible/module_utils/compat/ipaddress.py no-assert
lib/ansible/module_utils/compat/ipaddress.py no-unicode-literals
lib/ansible/module_utils/connection.py future-import-boilerplate
lib/ansible/module_utils/connection.py metaclass-boilerplate
lib/ansible/module_utils/database.py future-import-boilerplate
lib/ansible/module_utils/database.py metaclass-boilerplate
lib/ansible/module_utils/digital_ocean.py future-import-boilerplate
lib/ansible/module_utils/digital_ocean.py metaclass-boilerplate
lib/ansible/module_utils/dimensiondata.py future-import-boilerplate
lib/ansible/module_utils/dimensiondata.py metaclass-boilerplate
lib/ansible/module_utils/distro/__init__.py empty-init # breaks namespacing, bundled, do not override
lib/ansible/module_utils/distro/_distro.py future-import-boilerplate # ignore bundled
lib/ansible/module_utils/distro/_distro.py metaclass-boilerplate # ignore bundled
lib/ansible/module_utils/distro/_distro.py no-assert
lib/ansible/module_utils/distro/_distro.py pep8!skip # bundled code we don't want to modify
lib/ansible/module_utils/f5_utils.py future-import-boilerplate
lib/ansible/module_utils/f5_utils.py metaclass-boilerplate
lib/ansible/module_utils/facts/__init__.py empty-init # breaks namespacing, deprecate and eventually remove
lib/ansible/module_utils/facts/network/linux.py pylint:blacklisted-name
lib/ansible/module_utils/facts/sysctl.py future-import-boilerplate
lib/ansible/module_utils/facts/sysctl.py metaclass-boilerplate
lib/ansible/module_utils/facts/system/distribution.py pylint:ansible-bad-function
lib/ansible/module_utils/facts/utils.py future-import-boilerplate
lib/ansible/module_utils/facts/utils.py metaclass-boilerplate
lib/ansible/module_utils/firewalld.py future-import-boilerplate
lib/ansible/module_utils/firewalld.py metaclass-boilerplate
lib/ansible/module_utils/gcdns.py future-import-boilerplate
lib/ansible/module_utils/gcdns.py metaclass-boilerplate
lib/ansible/module_utils/gce.py future-import-boilerplate
lib/ansible/module_utils/gce.py metaclass-boilerplate
lib/ansible/module_utils/gcp.py future-import-boilerplate
lib/ansible/module_utils/gcp.py metaclass-boilerplate
lib/ansible/module_utils/gcp_utils.py future-import-boilerplate
lib/ansible/module_utils/gcp_utils.py metaclass-boilerplate
lib/ansible/module_utils/gitlab.py future-import-boilerplate
lib/ansible/module_utils/gitlab.py metaclass-boilerplate
lib/ansible/module_utils/hwc_utils.py future-import-boilerplate
lib/ansible/module_utils/hwc_utils.py metaclass-boilerplate
lib/ansible/module_utils/infinibox.py future-import-boilerplate
lib/ansible/module_utils/infinibox.py metaclass-boilerplate
lib/ansible/module_utils/ipa.py future-import-boilerplate
lib/ansible/module_utils/ipa.py metaclass-boilerplate
lib/ansible/module_utils/ismount.py future-import-boilerplate
lib/ansible/module_utils/ismount.py metaclass-boilerplate
lib/ansible/module_utils/json_utils.py future-import-boilerplate
lib/ansible/module_utils/json_utils.py metaclass-boilerplate
lib/ansible/module_utils/k8s/common.py metaclass-boilerplate
lib/ansible/module_utils/k8s/raw.py metaclass-boilerplate
lib/ansible/module_utils/k8s/scale.py metaclass-boilerplate
lib/ansible/module_utils/known_hosts.py future-import-boilerplate
lib/ansible/module_utils/known_hosts.py metaclass-boilerplate
lib/ansible/module_utils/kubevirt.py future-import-boilerplate
lib/ansible/module_utils/kubevirt.py metaclass-boilerplate
lib/ansible/module_utils/linode.py future-import-boilerplate
lib/ansible/module_utils/linode.py metaclass-boilerplate
lib/ansible/module_utils/lxd.py future-import-boilerplate
lib/ansible/module_utils/lxd.py metaclass-boilerplate
lib/ansible/module_utils/manageiq.py future-import-boilerplate
lib/ansible/module_utils/manageiq.py metaclass-boilerplate
lib/ansible/module_utils/memset.py future-import-boilerplate
lib/ansible/module_utils/memset.py metaclass-boilerplate
lib/ansible/module_utils/mysql.py future-import-boilerplate
lib/ansible/module_utils/mysql.py metaclass-boilerplate
lib/ansible/module_utils/net_tools/netbox/netbox_utils.py future-import-boilerplate
lib/ansible/module_utils/net_tools/nios/api.py future-import-boilerplate
lib/ansible/module_utils/net_tools/nios/api.py metaclass-boilerplate
lib/ansible/module_utils/netapp.py future-import-boilerplate
lib/ansible/module_utils/netapp.py metaclass-boilerplate
lib/ansible/module_utils/netapp_elementsw_module.py future-import-boilerplate
lib/ansible/module_utils/netapp_elementsw_module.py metaclass-boilerplate
lib/ansible/module_utils/netapp_module.py future-import-boilerplate
lib/ansible/module_utils/netapp_module.py metaclass-boilerplate
lib/ansible/module_utils/network/a10/a10.py future-import-boilerplate
lib/ansible/module_utils/network/a10/a10.py metaclass-boilerplate
lib/ansible/module_utils/network/aireos/aireos.py future-import-boilerplate
lib/ansible/module_utils/network/aireos/aireos.py metaclass-boilerplate
lib/ansible/module_utils/network/aos/aos.py future-import-boilerplate
lib/ansible/module_utils/network/aos/aos.py metaclass-boilerplate
lib/ansible/module_utils/network/aruba/aruba.py future-import-boilerplate
lib/ansible/module_utils/network/aruba/aruba.py metaclass-boilerplate
lib/ansible/module_utils/network/asa/asa.py future-import-boilerplate
lib/ansible/module_utils/network/asa/asa.py metaclass-boilerplate
lib/ansible/module_utils/network/avi/ansible_utils.py future-import-boilerplate
lib/ansible/module_utils/network/avi/ansible_utils.py metaclass-boilerplate
lib/ansible/module_utils/network/avi/avi.py future-import-boilerplate
lib/ansible/module_utils/network/avi/avi.py metaclass-boilerplate
lib/ansible/module_utils/network/avi/avi_api.py future-import-boilerplate
lib/ansible/module_utils/network/avi/avi_api.py metaclass-boilerplate
lib/ansible/module_utils/network/bigswitch/bigswitch.py future-import-boilerplate
lib/ansible/module_utils/network/bigswitch/bigswitch.py metaclass-boilerplate
lib/ansible/module_utils/network/checkpoint/checkpoint.py metaclass-boilerplate
lib/ansible/module_utils/network/cloudengine/ce.py future-import-boilerplate
lib/ansible/module_utils/network/cloudengine/ce.py metaclass-boilerplate
lib/ansible/module_utils/network/cnos/cnos.py future-import-boilerplate
lib/ansible/module_utils/network/cnos/cnos.py metaclass-boilerplate
lib/ansible/module_utils/network/cnos/cnos_devicerules.py future-import-boilerplate
lib/ansible/module_utils/network/cnos/cnos_devicerules.py metaclass-boilerplate
lib/ansible/module_utils/network/cnos/cnos_errorcodes.py future-import-boilerplate
lib/ansible/module_utils/network/cnos/cnos_errorcodes.py metaclass-boilerplate
lib/ansible/module_utils/network/common/cfg/base.py future-import-boilerplate
lib/ansible/module_utils/network/common/cfg/base.py metaclass-boilerplate
lib/ansible/module_utils/network/common/config.py future-import-boilerplate
lib/ansible/module_utils/network/common/config.py metaclass-boilerplate
lib/ansible/module_utils/network/common/facts/facts.py future-import-boilerplate
lib/ansible/module_utils/network/common/facts/facts.py metaclass-boilerplate
lib/ansible/module_utils/network/common/netconf.py future-import-boilerplate
lib/ansible/module_utils/network/common/netconf.py metaclass-boilerplate
lib/ansible/module_utils/network/common/network.py future-import-boilerplate
lib/ansible/module_utils/network/common/network.py metaclass-boilerplate
lib/ansible/module_utils/network/common/parsing.py future-import-boilerplate
lib/ansible/module_utils/network/common/parsing.py metaclass-boilerplate
lib/ansible/module_utils/network/common/utils.py future-import-boilerplate
lib/ansible/module_utils/network/common/utils.py metaclass-boilerplate
lib/ansible/module_utils/network/dellos10/dellos10.py future-import-boilerplate
lib/ansible/module_utils/network/dellos10/dellos10.py metaclass-boilerplate
lib/ansible/module_utils/network/dellos6/dellos6.py future-import-boilerplate
lib/ansible/module_utils/network/dellos6/dellos6.py metaclass-boilerplate
lib/ansible/module_utils/network/dellos9/dellos9.py future-import-boilerplate
lib/ansible/module_utils/network/dellos9/dellos9.py metaclass-boilerplate
lib/ansible/module_utils/network/edgeos/edgeos.py future-import-boilerplate
lib/ansible/module_utils/network/edgeos/edgeos.py metaclass-boilerplate
lib/ansible/module_utils/network/edgeswitch/edgeswitch.py future-import-boilerplate
lib/ansible/module_utils/network/edgeswitch/edgeswitch.py metaclass-boilerplate
lib/ansible/module_utils/network/edgeswitch/edgeswitch_interface.py future-import-boilerplate
lib/ansible/module_utils/network/edgeswitch/edgeswitch_interface.py metaclass-boilerplate
lib/ansible/module_utils/network/edgeswitch/edgeswitch_interface.py pylint:duplicate-string-formatting-argument
lib/ansible/module_utils/network/enos/enos.py future-import-boilerplate
lib/ansible/module_utils/network/enos/enos.py metaclass-boilerplate
lib/ansible/module_utils/network/eos/eos.py future-import-boilerplate
lib/ansible/module_utils/network/eos/eos.py metaclass-boilerplate
lib/ansible/module_utils/network/eos/providers/cli/config/bgp/address_family.py future-import-boilerplate
lib/ansible/module_utils/network/eos/providers/cli/config/bgp/address_family.py metaclass-boilerplate
lib/ansible/module_utils/network/eos/providers/cli/config/bgp/neighbors.py future-import-boilerplate
lib/ansible/module_utils/network/eos/providers/cli/config/bgp/neighbors.py metaclass-boilerplate
lib/ansible/module_utils/network/eos/providers/cli/config/bgp/process.py future-import-boilerplate
lib/ansible/module_utils/network/eos/providers/cli/config/bgp/process.py metaclass-boilerplate
lib/ansible/module_utils/network/eos/providers/module.py future-import-boilerplate
lib/ansible/module_utils/network/eos/providers/module.py metaclass-boilerplate
lib/ansible/module_utils/network/eos/providers/providers.py future-import-boilerplate
lib/ansible/module_utils/network/eos/providers/providers.py metaclass-boilerplate
lib/ansible/module_utils/network/exos/exos.py future-import-boilerplate
lib/ansible/module_utils/network/exos/exos.py metaclass-boilerplate
lib/ansible/module_utils/network/fortimanager/common.py future-import-boilerplate
lib/ansible/module_utils/network/fortimanager/common.py metaclass-boilerplate
lib/ansible/module_utils/network/fortimanager/fortimanager.py future-import-boilerplate
lib/ansible/module_utils/network/fortimanager/fortimanager.py metaclass-boilerplate
lib/ansible/module_utils/network/fortios/fortios.py future-import-boilerplate
lib/ansible/module_utils/network/fortios/fortios.py metaclass-boilerplate
lib/ansible/module_utils/network/frr/frr.py future-import-boilerplate
lib/ansible/module_utils/network/frr/frr.py metaclass-boilerplate
lib/ansible/module_utils/network/frr/providers/cli/config/base.py future-import-boilerplate
lib/ansible/module_utils/network/frr/providers/cli/config/base.py metaclass-boilerplate
lib/ansible/module_utils/network/frr/providers/cli/config/bgp/address_family.py future-import-boilerplate
lib/ansible/module_utils/network/frr/providers/cli/config/bgp/address_family.py metaclass-boilerplate
lib/ansible/module_utils/network/frr/providers/cli/config/bgp/neighbors.py future-import-boilerplate
lib/ansible/module_utils/network/frr/providers/cli/config/bgp/neighbors.py metaclass-boilerplate
lib/ansible/module_utils/network/frr/providers/cli/config/bgp/process.py future-import-boilerplate
lib/ansible/module_utils/network/frr/providers/cli/config/bgp/process.py metaclass-boilerplate
lib/ansible/module_utils/network/frr/providers/module.py future-import-boilerplate
lib/ansible/module_utils/network/frr/providers/module.py metaclass-boilerplate
lib/ansible/module_utils/network/frr/providers/providers.py future-import-boilerplate
lib/ansible/module_utils/network/frr/providers/providers.py metaclass-boilerplate
lib/ansible/module_utils/network/ftd/common.py future-import-boilerplate
lib/ansible/module_utils/network/ftd/common.py metaclass-boilerplate
lib/ansible/module_utils/network/ftd/configuration.py future-import-boilerplate
lib/ansible/module_utils/network/ftd/configuration.py metaclass-boilerplate
lib/ansible/module_utils/network/ftd/device.py future-import-boilerplate
lib/ansible/module_utils/network/ftd/device.py metaclass-boilerplate
lib/ansible/module_utils/network/ftd/fdm_swagger_client.py future-import-boilerplate
lib/ansible/module_utils/network/ftd/fdm_swagger_client.py metaclass-boilerplate
lib/ansible/module_utils/network/ftd/operation.py future-import-boilerplate
lib/ansible/module_utils/network/ftd/operation.py metaclass-boilerplate
lib/ansible/module_utils/network/ios/ios.py future-import-boilerplate
lib/ansible/module_utils/network/ios/ios.py metaclass-boilerplate
lib/ansible/module_utils/network/ios/providers/cli/config/base.py future-import-boilerplate
lib/ansible/module_utils/network/ios/providers/cli/config/base.py metaclass-boilerplate
lib/ansible/module_utils/network/ios/providers/cli/config/bgp/address_family.py future-import-boilerplate
lib/ansible/module_utils/network/ios/providers/cli/config/bgp/address_family.py metaclass-boilerplate
lib/ansible/module_utils/network/ios/providers/cli/config/bgp/neighbors.py future-import-boilerplate
lib/ansible/module_utils/network/ios/providers/cli/config/bgp/neighbors.py metaclass-boilerplate
lib/ansible/module_utils/network/ios/providers/cli/config/bgp/process.py future-import-boilerplate
lib/ansible/module_utils/network/ios/providers/cli/config/bgp/process.py metaclass-boilerplate
lib/ansible/module_utils/network/ios/providers/module.py future-import-boilerplate
lib/ansible/module_utils/network/ios/providers/module.py metaclass-boilerplate
lib/ansible/module_utils/network/ios/providers/providers.py future-import-boilerplate
lib/ansible/module_utils/network/ios/providers/providers.py metaclass-boilerplate
lib/ansible/module_utils/network/iosxr/iosxr.py future-import-boilerplate
lib/ansible/module_utils/network/iosxr/iosxr.py metaclass-boilerplate
lib/ansible/module_utils/network/iosxr/providers/cli/config/bgp/address_family.py future-import-boilerplate
lib/ansible/module_utils/network/iosxr/providers/cli/config/bgp/address_family.py metaclass-boilerplate
lib/ansible/module_utils/network/iosxr/providers/cli/config/bgp/neighbors.py future-import-boilerplate
lib/ansible/module_utils/network/iosxr/providers/cli/config/bgp/neighbors.py metaclass-boilerplate
lib/ansible/module_utils/network/iosxr/providers/cli/config/bgp/process.py future-import-boilerplate
lib/ansible/module_utils/network/iosxr/providers/cli/config/bgp/process.py metaclass-boilerplate
lib/ansible/module_utils/network/iosxr/providers/module.py future-import-boilerplate
lib/ansible/module_utils/network/iosxr/providers/module.py metaclass-boilerplate
lib/ansible/module_utils/network/iosxr/providers/providers.py future-import-boilerplate
lib/ansible/module_utils/network/iosxr/providers/providers.py metaclass-boilerplate
lib/ansible/module_utils/network/junos/argspec/facts/facts.py future-import-boilerplate
lib/ansible/module_utils/network/junos/argspec/facts/facts.py metaclass-boilerplate
lib/ansible/module_utils/network/junos/facts/facts.py future-import-boilerplate
lib/ansible/module_utils/network/junos/facts/facts.py metaclass-boilerplate
lib/ansible/module_utils/network/junos/facts/legacy/base.py future-import-boilerplate
lib/ansible/module_utils/network/junos/facts/legacy/base.py metaclass-boilerplate
lib/ansible/module_utils/network/junos/junos.py future-import-boilerplate
lib/ansible/module_utils/network/junos/junos.py metaclass-boilerplate
lib/ansible/module_utils/network/meraki/meraki.py future-import-boilerplate
lib/ansible/module_utils/network/meraki/meraki.py metaclass-boilerplate
lib/ansible/module_utils/network/netconf/netconf.py future-import-boilerplate
lib/ansible/module_utils/network/netconf/netconf.py metaclass-boilerplate
lib/ansible/module_utils/network/netscaler/netscaler.py future-import-boilerplate
lib/ansible/module_utils/network/netscaler/netscaler.py metaclass-boilerplate
lib/ansible/module_utils/network/nos/nos.py future-import-boilerplate
lib/ansible/module_utils/network/nos/nos.py metaclass-boilerplate
lib/ansible/module_utils/network/nso/nso.py future-import-boilerplate
lib/ansible/module_utils/network/nso/nso.py metaclass-boilerplate
lib/ansible/module_utils/network/nxos/argspec/facts/facts.py future-import-boilerplate
lib/ansible/module_utils/network/nxos/argspec/facts/facts.py metaclass-boilerplate
lib/ansible/module_utils/network/nxos/facts/facts.py future-import-boilerplate
lib/ansible/module_utils/network/nxos/facts/facts.py metaclass-boilerplate
lib/ansible/module_utils/network/nxos/facts/legacy/base.py future-import-boilerplate
lib/ansible/module_utils/network/nxos/facts/legacy/base.py metaclass-boilerplate
lib/ansible/module_utils/network/nxos/nxos.py future-import-boilerplate
lib/ansible/module_utils/network/nxos/nxos.py metaclass-boilerplate
lib/ansible/module_utils/network/nxos/utils/utils.py future-import-boilerplate
lib/ansible/module_utils/network/nxos/utils/utils.py metaclass-boilerplate
lib/ansible/module_utils/network/onyx/onyx.py future-import-boilerplate
lib/ansible/module_utils/network/onyx/onyx.py metaclass-boilerplate
lib/ansible/module_utils/network/ordnance/ordnance.py future-import-boilerplate
lib/ansible/module_utils/network/ordnance/ordnance.py metaclass-boilerplate
lib/ansible/module_utils/network/restconf/restconf.py future-import-boilerplate
lib/ansible/module_utils/network/restconf/restconf.py metaclass-boilerplate
lib/ansible/module_utils/network/routeros/routeros.py future-import-boilerplate
lib/ansible/module_utils/network/routeros/routeros.py metaclass-boilerplate
lib/ansible/module_utils/network/skydive/api.py future-import-boilerplate
lib/ansible/module_utils/network/skydive/api.py metaclass-boilerplate
lib/ansible/module_utils/network/slxos/slxos.py future-import-boilerplate
lib/ansible/module_utils/network/slxos/slxos.py metaclass-boilerplate
lib/ansible/module_utils/network/sros/sros.py future-import-boilerplate
lib/ansible/module_utils/network/sros/sros.py metaclass-boilerplate
lib/ansible/module_utils/network/voss/voss.py future-import-boilerplate
lib/ansible/module_utils/network/voss/voss.py metaclass-boilerplate
lib/ansible/module_utils/network/vyos/vyos.py future-import-boilerplate
lib/ansible/module_utils/network/vyos/vyos.py metaclass-boilerplate
lib/ansible/module_utils/oneandone.py future-import-boilerplate
lib/ansible/module_utils/oneandone.py metaclass-boilerplate
lib/ansible/module_utils/oneview.py metaclass-boilerplate
lib/ansible/module_utils/opennebula.py future-import-boilerplate
lib/ansible/module_utils/opennebula.py metaclass-boilerplate
lib/ansible/module_utils/openstack.py future-import-boilerplate
lib/ansible/module_utils/openstack.py metaclass-boilerplate
lib/ansible/module_utils/oracle/oci_utils.py future-import-boilerplate
lib/ansible/module_utils/oracle/oci_utils.py metaclass-boilerplate
lib/ansible/module_utils/ovirt.py future-import-boilerplate
lib/ansible/module_utils/ovirt.py metaclass-boilerplate
lib/ansible/module_utils/parsing/convert_bool.py future-import-boilerplate
lib/ansible/module_utils/parsing/convert_bool.py metaclass-boilerplate
lib/ansible/module_utils/postgres.py future-import-boilerplate
lib/ansible/module_utils/postgres.py metaclass-boilerplate
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.ArgvParser.psm1 pslint:PSUseApprovedVerbs
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.CommandUtil.psm1 pslint:PSProvideCommentHelp # need to agree on best format for comment location
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.CommandUtil.psm1 pslint:PSUseApprovedVerbs
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.FileUtil.psm1 pslint:PSCustomUseLiteralPath
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.FileUtil.psm1 pslint:PSProvideCommentHelp
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.Legacy.psm1 pslint:PSCustomUseLiteralPath
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.Legacy.psm1 pslint:PSUseApprovedVerbs
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.LinkUtil.psm1 pslint:PSUseApprovedVerbs
lib/ansible/module_utils/pure.py future-import-boilerplate
lib/ansible/module_utils/pure.py metaclass-boilerplate
lib/ansible/module_utils/pycompat24.py future-import-boilerplate
lib/ansible/module_utils/pycompat24.py metaclass-boilerplate
lib/ansible/module_utils/pycompat24.py no-get-exception
lib/ansible/module_utils/rax.py future-import-boilerplate
lib/ansible/module_utils/rax.py metaclass-boilerplate
lib/ansible/module_utils/redhat.py future-import-boilerplate
lib/ansible/module_utils/redhat.py metaclass-boilerplate
lib/ansible/module_utils/remote_management/dellemc/dellemc_idrac.py future-import-boilerplate
lib/ansible/module_utils/remote_management/intersight.py future-import-boilerplate
lib/ansible/module_utils/remote_management/intersight.py metaclass-boilerplate
lib/ansible/module_utils/remote_management/lxca/common.py future-import-boilerplate
lib/ansible/module_utils/remote_management/lxca/common.py metaclass-boilerplate
lib/ansible/module_utils/remote_management/ucs.py future-import-boilerplate
lib/ansible/module_utils/remote_management/ucs.py metaclass-boilerplate
lib/ansible/module_utils/scaleway.py future-import-boilerplate
lib/ansible/module_utils/scaleway.py metaclass-boilerplate
lib/ansible/module_utils/service.py future-import-boilerplate
lib/ansible/module_utils/service.py metaclass-boilerplate
lib/ansible/module_utils/six/__init__.py empty-init # breaks namespacing, bundled, do not override
lib/ansible/module_utils/six/__init__.py future-import-boilerplate # ignore bundled
lib/ansible/module_utils/six/__init__.py metaclass-boilerplate # ignore bundled
lib/ansible/module_utils/six/__init__.py no-basestring
lib/ansible/module_utils/six/__init__.py no-dict-iteritems
lib/ansible/module_utils/six/__init__.py no-dict-iterkeys
lib/ansible/module_utils/six/__init__.py no-dict-itervalues
lib/ansible/module_utils/six/__init__.py replace-urlopen
lib/ansible/module_utils/splitter.py future-import-boilerplate
lib/ansible/module_utils/splitter.py metaclass-boilerplate
lib/ansible/module_utils/storage/hpe3par/hpe3par.py future-import-boilerplate
lib/ansible/module_utils/storage/hpe3par/hpe3par.py metaclass-boilerplate
lib/ansible/module_utils/univention_umc.py future-import-boilerplate
lib/ansible/module_utils/univention_umc.py metaclass-boilerplate
lib/ansible/module_utils/urls.py future-import-boilerplate
lib/ansible/module_utils/urls.py metaclass-boilerplate
lib/ansible/module_utils/urls.py pylint:blacklisted-name
lib/ansible/module_utils/urls.py replace-urlopen
lib/ansible/module_utils/vca.py future-import-boilerplate
lib/ansible/module_utils/vca.py metaclass-boilerplate
lib/ansible/module_utils/vexata.py future-import-boilerplate
lib/ansible/module_utils/vexata.py metaclass-boilerplate
lib/ansible/module_utils/yumdnf.py future-import-boilerplate
lib/ansible/module_utils/yumdnf.py metaclass-boilerplate
lib/ansible/modules/cloud/alicloud/ali_instance.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/alicloud/ali_instance.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/alicloud/ali_instance_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/alicloud/ali_instance_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/alicloud/ali_instance_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/alicloud/ali_instance_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/amazon/aws_acm_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_acm_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_batch_compute_environment.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_batch_compute_environment.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_batch_job_definition.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_batch_job_definition.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_batch_job_queue.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_batch_job_queue.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_codebuild.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_codebuild.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_codepipeline.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_codepipeline.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_config_aggregator.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_config_aggregator.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_direct_connect_virtual_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_direct_connect_virtual_interface.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_eks_cluster.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_eks_cluster.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_glue_connection.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_glue_connection.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_glue_job.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_glue_job.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_kms.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_kms.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_netapp_cvs_FileSystems.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_s3.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_s3.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_s3_cors.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_waf_condition.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_waf_condition.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_waf_rule.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_waf_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/aws_waf_web_acl.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/aws_waf_web_acl.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/cloudformation.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/cloudformation.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/cloudformation_stack_set.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/cloudformation_stack_set.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/cloudfront_distribution.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/cloudfront_distribution.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/cloudfront_invalidation.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/cloudfront_invalidation.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/cloudwatchevent_rule.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/cloudwatchevent_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/data_pipeline.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/data_pipeline.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/dynamodb_table.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/dynamodb_table.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_ami.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_ami.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_ami_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_ami_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_asg.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_asg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_customer_gateway_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_customer_gateway_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_elb.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_elb_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_elb_lb.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_eni.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_eni.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_group.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_instance.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_instance_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_launch_template.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_launch_template.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_lc.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_lc.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_lc_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_lc_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_metric_alarm.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_metric_alarm.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_placement_group_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_placement_group_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_snapshot_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_snapshot_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_tag.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/cloud/amazon/ec2_transit_gateway_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vol.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/cloud/amazon/ec2_vpc_dhcp_option.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vpc_dhcp_option.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_dhcp_option_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vpc_dhcp_option_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_endpoint.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vpc_endpoint.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_endpoint_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vpc_endpoint_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_igw_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vpc_igw_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_nacl.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_nacl_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_nat_gateway_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vpc_nat_gateway_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_net.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vpc_net.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_net_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vpc_net_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_peering_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vpc_peering_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_route_table.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vpc_route_table.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_subnet_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vpc_subnet_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_vgw_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vpc_vgw_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_vpn.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vpc_vpn.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ec2_vpc_vpn_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ec2_vpc_vpn_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ecs_attribute.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ecs_attribute.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ecs_service.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ecs_service.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ecs_service_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ecs_service_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ecs_task.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ecs_task.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/ecs_taskdefinition.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/ecs_taskdefinition.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/efs.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/efs.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/efs_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/efs_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/elasticache.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/elasticache.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/elasticache_subnet_group.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/elasticache_subnet_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/elb_application_lb.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/elb_application_lb.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/elb_application_lb_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/elb_classic_lb.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/elb_classic_lb_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/elb_instance.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/elb_network_lb.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/elb_network_lb.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/elb_target_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/elb_target_group_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/iam.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/iam_group.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/iam_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/iam_role.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/iam_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/lambda.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/lambda.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/rds.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/rds.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/rds_instance.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/rds_subnet_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/redshift.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/redshift.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/redshift_subnet_group.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/redshift_subnet_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/route53.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/route53.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/cloud/amazon/route53_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/route53_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/s3_bucket_notification.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/s3_bucket_notification.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/s3_lifecycle.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/amazon/sns_topic.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/amazon/sns_topic.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/atomic/atomic_container.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/atomic/atomic_container.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/atomic/atomic_container.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/cloud/atomic/atomic_container.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/atomic/atomic_container.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/atomic/atomic_host.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/atomic/atomic_image.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_acs.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_acs.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_aks.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_aks.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_aks.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_aks.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_aks.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_aks.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/azure/azure_rm_aks_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_aks_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_aksversion_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_appgateway.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_appgateway.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_appgateway.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_appgateway.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_applicationsecuritygroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_applicationsecuritygroup_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_applicationsecuritygroup_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_appserviceplan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_appserviceplan_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_appserviceplan_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_automationaccount_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_autoscale.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_autoscale.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_autoscale.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/azure/azure_rm_autoscale_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_autoscale_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_availabilityset.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_availabilityset_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_availabilityset_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_azurefirewall.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/azure/azure_rm_azurefirewall.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/azure/azure_rm_azurefirewall.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_azurefirewall.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_azurefirewall.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/azure/azure_rm_batchaccount.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/azure/azure_rm_batchaccount.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_cdnendpoint.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_cdnendpoint.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_cdnendpoint.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_cdnendpoint_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_cdnendpoint_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_cdnendpoint_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_cdnprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_cdnprofile_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_cdnprofile_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_containerinstance.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_containerinstance.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_containerinstance.py validate-modules:doc-type-does-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_containerinstance.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_containerinstance.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_containerinstance_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_containerinstance_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_containerregistry.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_containerregistry_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_containerregistry_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_cosmosdbaccount.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_cosmosdbaccount.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/cloud/azure/azure_rm_cosmosdbaccount.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_cosmosdbaccount.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_cosmosdbaccount.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/azure/azure_rm_cosmosdbaccount_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_cosmosdbaccount_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_deployment.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_deployment.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_deployment.py yamllint:unparsable-with-libyaml
lib/ansible/modules/cloud/azure/azure_rm_deployment_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_deployment_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_devtestlab.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_devtestlab_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_devtestlab_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_devtestlabarmtemplate_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_devtestlabartifact_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_devtestlabartifactsource.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_devtestlabartifactsource_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_devtestlabartifactsource_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_devtestlabcustomimage.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_devtestlabcustomimage_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_devtestlabcustomimage_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_devtestlabcustomimage_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_devtestlabenvironment.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_devtestlabenvironment.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_devtestlabenvironment_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_devtestlabenvironment_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_devtestlabpolicy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_devtestlabpolicy_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_devtestlabpolicy_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_devtestlabschedule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_devtestlabschedule_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_devtestlabschedule_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_devtestlabvirtualmachine.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/cloud/azure/azure_rm_devtestlabvirtualmachine.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_devtestlabvirtualmachine.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_devtestlabvirtualmachine.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/azure/azure_rm_devtestlabvirtualmachine_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_devtestlabvirtualmachine_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_devtestlabvirtualnetwork.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_dnsrecordset.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_dnsrecordset.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/azure/azure_rm_dnsrecordset.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/azure/azure_rm_dnsrecordset.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_dnsrecordset_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_dnsrecordset_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_dnszone.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_dnszone.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_dnszone_info.py validate-modules:doc-type-does-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_dnszone_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_dnszone_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_dnszone_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_functionapp.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_functionapp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_functionapp_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_functionapp_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_gallery.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/azure/azure_rm_galleryimage.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_galleryimage.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/azure/azure_rm_galleryimage.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_galleryimage_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_galleryimageversion.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_galleryimageversion.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_galleryimageversion.py validate-modules:doc-type-does-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_galleryimageversion.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/azure/azure_rm_galleryimageversion.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_galleryimageversion.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_hdinsightcluster.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_hdinsightcluster.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_hdinsightcluster_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_hdinsightcluster_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_hdinsightcluster_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_image.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_image.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_image.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_image_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_image_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_image_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_iothub.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_iothub_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_iothubconsumergroup.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_keyvault.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_keyvault.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_keyvault.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_keyvault.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_keyvault_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_keyvault_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_keyvaultkey.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_keyvaultkey_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_keyvaultkey_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_keyvaultsecret.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_loadbalancer.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_loadbalancer.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_loadbalancer.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_loadbalancer.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_loadbalancer.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_loadbalancer_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_loadbalancer_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_lock_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_loganalyticsworkspace.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_loganalyticsworkspace_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_loganalyticsworkspace_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_manageddisk.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_manageddisk_info.py validate-modules:doc-type-does-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_manageddisk_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_mariadbconfiguration.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_mariadbconfiguration_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_mariadbdatabase.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_mariadbfirewallrule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_mariadbserver.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_mariadbserver_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_monitorlogprofile.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_mysqlconfiguration.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_mysqlconfiguration_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_mysqldatabase.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_mysqlfirewallrule.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_mysqlfirewallrule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_mysqlserver.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_mysqlserver_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_networkinterface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_networkinterface.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/azure/azure_rm_networkinterface.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_networkinterface.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_networkinterface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_networkinterface_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_networkinterface_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_networkinterface_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_postgresqlconfiguration.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_postgresqlconfiguration_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_postgresqldatabase.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_postgresqlfirewallrule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_postgresqlserver.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_postgresqlserver_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_publicipaddress.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_publicipaddress.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_publicipaddress.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_publicipaddress_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_publicipaddress_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_rediscache.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_rediscache.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_rediscache.py validate-modules:doc-type-does-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_rediscache.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_rediscache_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_rediscache_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_rediscachefirewallrule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_resource.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_resource.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_resource_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_resource_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_resourcegroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_resourcegroup_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_resourcegroup_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_roleassignment.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_roleassignment_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_roledefinition.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_roledefinition.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/azure/azure_rm_roledefinition.py validate-modules:invalid-argument-spec
lib/ansible/modules/cloud/azure/azure_rm_roledefinition.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/azure/azure_rm_roledefinition.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_roledefinition.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_roledefinition_info.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/azure/azure_rm_roledefinition_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_roledefinition_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_route.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_routetable.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_routetable_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_routetable_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_securitygroup.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_securitygroup.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_securitygroup.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_securitygroup.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_securitygroup.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/azure/azure_rm_securitygroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_securitygroup.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/azure/azure_rm_securitygroup_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_securitygroup_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_servicebus.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_servicebus_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_servicebus_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_servicebus_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_servicebusqueue.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_servicebussaspolicy.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_servicebussaspolicy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_servicebustopic.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_servicebustopic.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_servicebustopicsubscription.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_snapshot.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_snapshot.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/azure/azure_rm_sqldatabase.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/azure/azure_rm_sqldatabase.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_sqldatabase_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_sqldatabase_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_sqlfirewallrule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_sqlfirewallrule_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_sqlserver.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_sqlserver_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_storageaccount.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_storageaccount.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/azure/azure_rm_storageaccount.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_storageaccount.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_storageaccount_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_storageaccount_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_storageaccount_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/azure/azure_rm_storageblob.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_subnet.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_subnet.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_subnet_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_trafficmanagerendpoint.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_trafficmanagerendpoint.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_trafficmanagerendpoint_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_trafficmanagerprofile.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_trafficmanagerprofile.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_trafficmanagerprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_trafficmanagerprofile.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/azure/azure_rm_trafficmanagerprofile_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_trafficmanagerprofile_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualmachine.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_virtualmachine.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_virtualmachine.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_virtualmachine.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualmachine_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_virtualmachine_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualmachineextension.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualmachineextension_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_virtualmachineextension_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualmachineimage_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualmachinescaleset.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_virtualmachinescaleset.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_virtualmachinescaleset.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualmachinescaleset_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_virtualmachinescaleset_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualmachinescalesetextension.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_virtualmachinescalesetextension.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualmachinescalesetextension_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualmachinescalesetinstance.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_virtualmachinescalesetinstance.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualmachinescalesetinstance_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_virtualmachinescalesetinstance_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualnetwork.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_virtualnetwork.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualnetwork_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_virtualnetwork_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualnetworkgateway.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_virtualnetworkgateway.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/azure/azure_rm_virtualnetworkgateway.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_virtualnetworkgateway.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/azure/azure_rm_virtualnetworkgateway.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_virtualnetworkgateway.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualnetworkpeering.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_virtualnetworkpeering.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_virtualnetworkpeering_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_webapp.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_webapp.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_webapp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_webapp_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/azure/azure_rm_webapp_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/azure/azure_rm_webappslot.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/azure/azure_rm_webappslot.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/azure/azure_rm_webappslot.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/centurylink/clc_aa_policy.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/centurylink/clc_aa_policy.py yamllint:unparsable-with-libyaml
lib/ansible/modules/cloud/centurylink/clc_alert_policy.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/centurylink/clc_alert_policy.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/cloud/centurylink/clc_alert_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/centurylink/clc_alert_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/centurylink/clc_alert_policy.py yamllint:unparsable-with-libyaml
lib/ansible/modules/cloud/centurylink/clc_blueprint_package.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/centurylink/clc_blueprint_package.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/cloud/centurylink/clc_blueprint_package.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/centurylink/clc_blueprint_package.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/centurylink/clc_firewall_policy.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/centurylink/clc_firewall_policy.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/centurylink/clc_firewall_policy.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/centurylink/clc_firewall_policy.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/cloud/centurylink/clc_firewall_policy.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/cloud/centurylink/clc_firewall_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/centurylink/clc_firewall_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/centurylink/clc_firewall_policy.py yamllint:unparsable-with-libyaml
lib/ansible/modules/cloud/centurylink/clc_group.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/centurylink/clc_group.py yamllint:unparsable-with-libyaml
lib/ansible/modules/cloud/centurylink/clc_loadbalancer.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/centurylink/clc_loadbalancer.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/centurylink/clc_loadbalancer.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/centurylink/clc_modify_server.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/centurylink/clc_modify_server.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/centurylink/clc_modify_server.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/centurylink/clc_publicip.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/centurylink/clc_publicip.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/centurylink/clc_publicip.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/centurylink/clc_server.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/centurylink/clc_server.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/centurylink/clc_server.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/centurylink/clc_server_snapshot.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/centurylink/clc_server_snapshot.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/cloud/centurylink/clc_server_snapshot.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/centurylink/clc_server_snapshot.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/cloudscale/cloudscale_floating_ip.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/cloudscale/cloudscale_server.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/cloudscale/cloudscale_server.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudscale/cloudscale_server_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/cloudscale/cloudscale_volume.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/cloudscale/cloudscale_volume.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_disk_offering.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_firewall.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_host.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_instance.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_ip_address.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_iso.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_loadbalancer_rule.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/cloudstack/cs_loadbalancer_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_loadbalancer_rule_member.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_network.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_network_acl_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_network_offering.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_physical_network.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_portforward.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_project.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_resourcelimit.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/cloudstack/cs_service_offering.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_storage_pool.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_template.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_vmsnapshot.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_volume.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_vpc.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_vpc_offering.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/cloudstack/cs_vpn_customer_gateway.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/digital_ocean/_digital_ocean.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/digital_ocean/_digital_ocean.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/digital_ocean/_digital_ocean.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/_digital_ocean.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/digital_ocean/digital_ocean_block_storage.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/digital_ocean/digital_ocean_block_storage.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/digital_ocean/digital_ocean_block_storage.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/digital_ocean_certificate.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/digital_ocean/digital_ocean_certificate.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/digital_ocean/digital_ocean_certificate.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/digital_ocean_certificate_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/digital_ocean_domain.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/digital_ocean/digital_ocean_domain.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/digital_ocean_domain_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/digital_ocean_droplet.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/digital_ocean/digital_ocean_droplet.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/digital_ocean/digital_ocean_droplet.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/digital_ocean/digital_ocean_droplet.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/digital_ocean_firewall_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/digital_ocean_floating_ip.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/digital_ocean/digital_ocean_floating_ip.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/digital_ocean/digital_ocean_floating_ip.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/digital_ocean_floating_ip.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/digital_ocean/digital_ocean_image_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/digital_ocean_load_balancer_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/digital_ocean_snapshot_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/digital_ocean_sshkey.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/digital_ocean/digital_ocean_sshkey.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/digital_ocean/digital_ocean_sshkey.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/digital_ocean_sshkey.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/digital_ocean/digital_ocean_tag.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/digital_ocean/digital_ocean_tag.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/digital_ocean_tag_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/digital_ocean/digital_ocean_volume_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/dimensiondata/dimensiondata_network.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/dimensiondata/dimensiondata_network.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/dimensiondata/dimensiondata_network.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/dimensiondata/dimensiondata_vlan.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/dimensiondata/dimensiondata_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/dimensiondata/dimensiondata_vlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/dimensiondata/dimensiondata_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/docker/docker_container.py use-argspec-type-path # uses colon-separated paths, can't use type=path
lib/ansible/modules/cloud/docker/docker_stack.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/_gcdns_record.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/_gcdns_record.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/_gcdns_zone.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/_gce.py pylint:blacklisted-name
lib/ansible/modules/cloud/google/_gce.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/google/_gce.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/_gce.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/google/_gce.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/_gce.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/_gce.py yamllint:unparsable-with-libyaml
lib/ansible/modules/cloud/google/_gcp_backend_service.py pylint:blacklisted-name
lib/ansible/modules/cloud/google/_gcp_backend_service.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/google/_gcp_backend_service.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/google/_gcp_backend_service.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/_gcp_backend_service.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/_gcp_backend_service.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/_gcp_backend_service.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/google/_gcp_forwarding_rule.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/google/_gcp_forwarding_rule.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/google/_gcp_forwarding_rule.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/_gcp_forwarding_rule.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/google/_gcp_forwarding_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/_gcp_forwarding_rule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/_gcp_forwarding_rule.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/google/_gcp_healthcheck.py pylint:blacklisted-name
lib/ansible/modules/cloud/google/_gcp_healthcheck.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/google/_gcp_healthcheck.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/google/_gcp_healthcheck.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/_gcp_healthcheck.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/google/_gcp_healthcheck.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/_gcp_healthcheck.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/_gcp_target_proxy.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/google/_gcp_target_proxy.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/_gcp_target_proxy.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/google/_gcp_target_proxy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/_gcp_target_proxy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/_gcp_target_proxy.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/google/_gcp_url_map.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/google/_gcp_url_map.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/google/_gcp_url_map.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/_gcp_url_map.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/_gcp_url_map.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/_gcp_url_map.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/google/_gcspanner.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/_gcspanner.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/google/gc_storage.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/google/gc_storage.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/google/gc_storage.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/gc_storage.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/gc_storage.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/google/gce_eip.py pylint:blacklisted-name
lib/ansible/modules/cloud/google/gce_eip.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/gce_eip.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gce_eip.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/gce_eip.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/google/gce_img.py pylint:blacklisted-name
lib/ansible/modules/cloud/google/gce_img.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/gce_img.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/gce_instance_template.py pylint:blacklisted-name
lib/ansible/modules/cloud/google/gce_instance_template.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/google/gce_instance_template.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/google/gce_instance_template.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/gce_instance_template.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gce_instance_template.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/gce_instance_template.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/google/gce_labels.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/google/gce_labels.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/google/gce_labels.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/gce_labels.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gce_labels.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/gce_labels.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/google/gce_lb.py pylint:blacklisted-name
lib/ansible/modules/cloud/google/gce_lb.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/google/gce_lb.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/gce_lb.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/cloud/google/gce_lb.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gce_lb.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/gce_mig.py pylint:blacklisted-name
lib/ansible/modules/cloud/google/gce_mig.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/gce_mig.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gce_mig.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/gce_mig.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/google/gce_net.py pylint:blacklisted-name
lib/ansible/modules/cloud/google/gce_net.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/google/gce_net.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/gce_net.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/cloud/google/gce_net.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gce_net.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/gce_pd.py pylint:blacklisted-name
lib/ansible/modules/cloud/google/gce_pd.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/google/gce_pd.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/gce_pd.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gce_pd.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/gce_pd.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/google/gce_snapshot.py pylint:blacklisted-name
lib/ansible/modules/cloud/google/gce_snapshot.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/google/gce_snapshot.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/gce_snapshot.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/google/gce_snapshot.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gce_snapshot.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/gce_tag.py pylint:blacklisted-name
lib/ansible/modules/cloud/google/gce_tag.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gce_tag.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/gcp_appengine_firewall_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_appengine_firewall_rule_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_bigquery_dataset.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_bigquery_dataset.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_bigquery_dataset_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_bigquery_table.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/google/gcp_bigquery_table.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_bigquery_table.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_bigquery_table_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_cloudbuild_trigger.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_cloudbuild_trigger.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_cloudbuild_trigger_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_cloudfunctions_cloud_function.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_cloudfunctions_cloud_function_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_cloudscheduler_job.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_cloudscheduler_job_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_cloudtasks_queue.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_cloudtasks_queue_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_address.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_address_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_address_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_autoscaler.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_autoscaler.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_autoscaler_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_autoscaler_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_backend_bucket.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_backend_bucket_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_backend_bucket_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_backend_service.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_backend_service.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_backend_service_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_backend_service_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_disk.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_disk.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_disk_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_disk_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_firewall.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_firewall.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_firewall_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_firewall_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_forwarding_rule.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_forwarding_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_forwarding_rule_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_forwarding_rule_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_global_address.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_global_address_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_global_address_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_global_forwarding_rule.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_global_forwarding_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_global_forwarding_rule_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_global_forwarding_rule_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_health_check.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_health_check_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_health_check_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_http_health_check.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_http_health_check_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_http_health_check_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_https_health_check.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_https_health_check_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_https_health_check_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_image.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_image.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_image_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_image_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_instance.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_instance.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_instance_group.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_instance_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_instance_group_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_instance_group_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_instance_group_manager.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_instance_group_manager.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_instance_group_manager_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_instance_group_manager_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_instance_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_instance_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_instance_template.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_instance_template.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_instance_template_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_instance_template_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_interconnect_attachment.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_interconnect_attachment.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_interconnect_attachment_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_interconnect_attachment_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_network.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_network_endpoint_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_network_endpoint_group_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_network_endpoint_group_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_network_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_network_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_node_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_node_group_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_node_group_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_node_template.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_node_template_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_node_template_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_region_backend_service.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_region_backend_service.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_region_backend_service_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_region_backend_service_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_region_disk.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_region_disk.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_region_disk_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_region_disk_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_reservation.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_reservation.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_reservation_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_reservation_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_route.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_route.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_route_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_route_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_router.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_router.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_router_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_router_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_snapshot.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_snapshot_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_snapshot_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_ssl_certificate.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_ssl_certificate_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_ssl_certificate_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_ssl_policy.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_ssl_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_ssl_policy_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_ssl_policy_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_subnetwork.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_subnetwork.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_subnetwork_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_subnetwork_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_target_http_proxy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_target_http_proxy_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_target_http_proxy_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_target_https_proxy.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_target_https_proxy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_target_https_proxy_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_target_https_proxy_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_target_instance.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_target_instance_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_target_instance_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_target_pool.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_target_pool.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_target_pool_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_target_pool_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_target_ssl_proxy.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_target_ssl_proxy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_target_ssl_proxy_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_target_ssl_proxy_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_target_tcp_proxy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_target_tcp_proxy_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_target_tcp_proxy_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_target_vpn_gateway.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_target_vpn_gateway_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_target_vpn_gateway_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_url_map.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_url_map.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_url_map_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_url_map_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_vpn_tunnel.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_vpn_tunnel.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_compute_vpn_tunnel_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_compute_vpn_tunnel_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_container_cluster.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_container_cluster.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_container_cluster_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_container_node_pool.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_container_node_pool.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_container_node_pool_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_dns_managed_zone.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_dns_managed_zone.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_dns_managed_zone_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_dns_managed_zone_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_dns_resource_record_set.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_dns_resource_record_set.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_dns_resource_record_set_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_filestore_instance.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_filestore_instance.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_filestore_instance_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_iam_role.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_iam_role.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_iam_role_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_iam_service_account.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_iam_service_account_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_iam_service_account_key.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_kms_crypto_key.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_kms_crypto_key_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_kms_key_ring.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_kms_key_ring_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_logging_metric.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_logging_metric.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_logging_metric_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_mlengine_model.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_mlengine_model.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_mlengine_model_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_mlengine_version.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_mlengine_version_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_pubsub_subscription.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_pubsub_subscription_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_pubsub_topic.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_pubsub_topic.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_pubsub_topic_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_redis_instance.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_redis_instance_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_resourcemanager_project.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_resourcemanager_project_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_runtimeconfig_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_runtimeconfig_config_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_runtimeconfig_variable.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_runtimeconfig_variable_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_serviceusage_service.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_serviceusage_service_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_sourcerepo_repository.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_sourcerepo_repository_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_spanner_database.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_spanner_database.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_spanner_database_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_spanner_instance.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_spanner_instance_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_sql_database.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_sql_database_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_sql_instance.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_sql_instance.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_sql_instance_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_sql_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_sql_user_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_storage_bucket.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/google/gcp_storage_bucket.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_storage_bucket_access_control.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_storage_object.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_tpu_node.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcp_tpu_node_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcpubsub.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/cloud/google/gcpubsub.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/google/gcpubsub.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/google/gcpubsub.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/google/gcpubsub_info.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/google/gcpubsub_info.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/google/gcpubsub_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/google/gcpubsub_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/google/gcpubsub_info.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/cloud/google/gcpubsub_info.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/hcloud/hcloud_network_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/hcloud/hcloud_server.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/hcloud/hcloud_server_network.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/heroku/heroku_collaborator.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/heroku/heroku_collaborator.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/huawei/hwc_ecs_instance.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/huawei/hwc_vpc_port.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/huawei/hwc_vpc_subnet.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/kubevirt/kubevirt_cdi_upload.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/kubevirt/kubevirt_cdi_upload.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/kubevirt/kubevirt_cdi_upload.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/kubevirt/kubevirt_preset.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/kubevirt/kubevirt_preset.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/kubevirt/kubevirt_preset.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/kubevirt/kubevirt_pvc.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/kubevirt/kubevirt_pvc.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/kubevirt/kubevirt_pvc.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/kubevirt/kubevirt_pvc.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/kubevirt/kubevirt_rs.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/kubevirt/kubevirt_rs.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/kubevirt/kubevirt_rs.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/kubevirt/kubevirt_rs.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/kubevirt/kubevirt_template.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/kubevirt/kubevirt_template.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/kubevirt/kubevirt_vm.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/kubevirt/kubevirt_vm.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/kubevirt/kubevirt_vm.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/linode/linode.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/linode/linode.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/linode/linode.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/linode/linode.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/linode/linode_v4.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/linode/linode_v4.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/lxc/lxc_container.py pylint:blacklisted-name
lib/ansible/modules/cloud/lxc/lxc_container.py use-argspec-type-path
lib/ansible/modules/cloud/lxc/lxc_container.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/lxc/lxc_container.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/lxc/lxc_container.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/lxc/lxc_container.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/lxc/lxc_container.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/lxc/lxc_container.py validate-modules:use-run-command-not-popen
lib/ansible/modules/cloud/lxd/lxd_container.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/lxd/lxd_container.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/lxd/lxd_container.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/lxd/lxd_container.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/lxd/lxd_container.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/lxd/lxd_container.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/lxd/lxd_profile.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/lxd/lxd_profile.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/lxd/lxd_profile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/memset/memset_dns_reload.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/memset/memset_memstore_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/memset/memset_server_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/memset/memset_zone.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/memset/memset_zone_domain.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/memset/memset_zone_record.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/misc/cloud_init_data_facts.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/misc/helm.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/misc/helm.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/misc/ovirt.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/misc/ovirt.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/misc/ovirt.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/misc/proxmox.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/misc/proxmox.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/misc/proxmox_kvm.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/misc/proxmox_kvm.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/misc/proxmox_kvm.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/misc/proxmox_kvm.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/misc/proxmox_template.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/misc/proxmox_template.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/misc/proxmox_template.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/cloud/misc/proxmox_template.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/misc/rhevm.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/misc/rhevm.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/misc/rhevm.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/cloud/misc/serverless.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/misc/terraform.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/misc/terraform.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/misc/terraform.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/misc/terraform.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/misc/terraform.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/misc/virt.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/misc/virt.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/misc/virt.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/misc/virt_net.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/misc/virt_net.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/misc/virt_pool.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/misc/virt_pool.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/oneandone/oneandone_firewall_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/oneandone/oneandone_firewall_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/oneandone/oneandone_firewall_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/oneandone/oneandone_load_balancer.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/oneandone/oneandone_load_balancer.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/oneandone/oneandone_load_balancer.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/oneandone/oneandone_load_balancer.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/oneandone/oneandone_load_balancer.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/oneandone/oneandone_monitoring_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/oneandone/oneandone_monitoring_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/oneandone/oneandone_monitoring_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/oneandone/oneandone_private_network.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/oneandone/oneandone_private_network.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/oneandone/oneandone_private_network.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/oneandone/oneandone_private_network.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/oneandone/oneandone_private_network.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/oneandone/oneandone_public_ip.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/oneandone/oneandone_public_ip.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/oneandone/oneandone_public_ip.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/oneandone/oneandone_public_ip.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/oneandone/oneandone_public_ip.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/oneandone/oneandone_server.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/oneandone/oneandone_server.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/oneandone/oneandone_server.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/oneandone/oneandone_server.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/oneandone/oneandone_server.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/online/_online_server_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/online/_online_server_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/online/_online_user_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/online/_online_user_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/online/online_server_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/online/online_server_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/online/online_user_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/online/online_user_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/opennebula/one_host.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/opennebula/one_host.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/opennebula/one_host.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/opennebula/one_image.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/opennebula/one_image_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/opennebula/one_image_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/opennebula/one_service.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/opennebula/one_vm.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/opennebula/one_vm.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_auth.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_client_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/openstack/os_client_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_coe_cluster.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_coe_cluster.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_coe_cluster_template.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_coe_cluster_template.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/openstack/os_coe_cluster_template.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_flavor_info.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/openstack/os_flavor_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_flavor_info.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/cloud/openstack/os_flavor_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_floating_ip.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_group.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_group_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/openstack/os_image.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/openstack/os_image.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/openstack/os_image.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_image.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_image_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_ironic.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/openstack/os_ironic.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_ironic.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/openstack/os_ironic.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/cloud/openstack/os_ironic.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/openstack/os_ironic.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_ironic.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/openstack/os_ironic_inspect.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_ironic_node.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/openstack/os_ironic_node.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/openstack/os_ironic_node.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_ironic_node.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/cloud/openstack/os_ironic_node.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_ironic_node.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/openstack/os_keypair.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_keystone_domain.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_keystone_domain_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_keystone_domain_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_keystone_endpoint.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_keystone_endpoint.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_keystone_endpoint.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/openstack/os_keystone_role.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_keystone_service.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_listener.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_listener.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_loadbalancer.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_loadbalancer.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/openstack/os_loadbalancer.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_member.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_member.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_network.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_network.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_networks_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_networks_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_nova_flavor.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_nova_flavor.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/openstack/os_nova_flavor.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_nova_host_aggregate.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_nova_host_aggregate.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/openstack/os_nova_host_aggregate.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_object.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_pool.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_port.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_port.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/openstack/os_port.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/openstack/os_port.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_port_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_port_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_project.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_project_access.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_project_access.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/openstack/os_project_access.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_project_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_project_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/openstack/os_project_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_quota.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/openstack/os_quota.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_quota.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/cloud/openstack/os_quota.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_quota.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/openstack/os_quota.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/openstack/os_recordset.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_recordset.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/openstack/os_recordset.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/openstack/os_recordset.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_router.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_router.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/openstack/os_router.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_security_group.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_security_group_rule.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_security_group_rule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_server.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/openstack/os_server.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_server.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/openstack/os_server.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/openstack/os_server.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_server.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/openstack/os_server_action.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/openstack/os_server_action.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_server_action.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/openstack/os_server_group.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_server_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/openstack/os_server_group.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_server_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_server_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_server_metadata.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_server_metadata.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_server_volume.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_stack.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/openstack/os_stack.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_stack.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/openstack/os_stack.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_subnet.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/openstack/os_subnet.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_subnet.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/openstack/os_subnet.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_subnets_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_subnets_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_user.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_user_group.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_user_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_user_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/openstack/os_user_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_user_role.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_volume.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_volume.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/openstack/os_volume.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/openstack/os_volume_snapshot.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_zone.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/openstack/os_zone.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/openstack/os_zone.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/oracle/oci_vcn.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/oracle/oci_vcn.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovh/ovh_ip_failover.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovh/ovh_ip_failover.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovh/ovh_ip_loadbalancing_backend.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovh/ovh_ip_loadbalancing_backend.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_affinity_group.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_affinity_group.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_affinity_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/ovirt/ovirt_affinity_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_affinity_group.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_affinity_label.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_affinity_label.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_affinity_label.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_affinity_label.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/cloud/ovirt/ovirt_affinity_label.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_affinity_label.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_affinity_label_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_affinity_label_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_affinity_label_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_affinity_label_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_api_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_auth.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_auth.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_auth.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/ovirt/ovirt_auth.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_auth.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/ovirt/ovirt_auth.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_auth.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/ovirt/ovirt_cluster.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_cluster.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_cluster.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/ovirt/ovirt_cluster.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_cluster.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/cloud/ovirt/ovirt_cluster.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_cluster.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_cluster.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/ovirt/ovirt_cluster_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_cluster_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_cluster_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_cluster_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_datacenter.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_datacenter.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_datacenter.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_datacenter.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/cloud/ovirt/ovirt_datacenter.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_datacenter_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_datacenter_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_datacenter_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_datacenter_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_disk.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_disk.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_disk.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/ovirt/ovirt_disk.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/ovirt/ovirt_disk.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_disk.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_disk.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_disk.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/ovirt/ovirt_disk_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_disk_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_disk_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_disk_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_event.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_event_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_external_provider.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_external_provider.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_external_provider.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/ovirt/ovirt_external_provider.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_external_provider.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/ovirt/ovirt_external_provider.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/cloud/ovirt/ovirt_external_provider.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_external_provider.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_external_provider.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/ovirt/ovirt_external_provider_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_external_provider_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_external_provider_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_external_provider_info.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/cloud/ovirt/ovirt_external_provider_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_external_provider_info.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/ovirt/ovirt_group.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_group.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_group.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_group_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_group_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_group_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_group_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_host.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_host.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_host.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_host.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/cloud/ovirt/ovirt_host.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_host.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_host_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_host_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_host_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_host_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_host_network.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_host_network.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_host_network.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_host_network.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_host_network.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_host_pm.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_host_pm.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_host_pm.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_host_pm.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/cloud/ovirt/ovirt_host_pm.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_host_pm.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_host_storage_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_host_storage_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_host_storage_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_host_storage_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_host_storage_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_instance_type.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_instance_type.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_instance_type.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_instance_type.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_job.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_job.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_job.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_job.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/ovirt/ovirt_job.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_mac_pool.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_mac_pool.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_mac_pool.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_mac_pool.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_mac_pool.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_network.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_network.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_network.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_network.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/ovirt/ovirt_network.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_network.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_network_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_network_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_network_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_network_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_nic.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_nic.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_nic.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_nic.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_nic.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_nic_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_nic_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_nic_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_nic_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_permission.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_permission.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_permission.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_permission.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_permission_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_permission_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_permission_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_permission_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_quota.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_quota.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_quota.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_quota.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_quota.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_quota_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_quota_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_quota_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_quota_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_role.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_role.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_role.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_role.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_role.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_scheduling_policy_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_scheduling_policy_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_scheduling_policy_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_scheduling_policy_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/ovirt/ovirt_scheduling_policy_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_snapshot.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_snapshot.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_snapshot.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_snapshot.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_snapshot.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_snapshot_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_snapshot_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_snapshot_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_snapshot_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_storage_connection.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_storage_connection.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_storage_connection.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_storage_connection.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_storage_connection.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_storage_domain.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_storage_domain.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_storage_domain.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_storage_domain.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_storage_domain.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_storage_domain_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_storage_domain_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_storage_domain_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_storage_domain_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_storage_template_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_storage_template_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_storage_template_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_storage_template_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_storage_template_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_storage_vm_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_storage_vm_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_storage_vm_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_storage_vm_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_storage_vm_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_tag.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_tag.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_tag.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_tag.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_tag.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_tag_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_tag_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_tag_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_tag_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_template.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_template.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_template.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_template.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_template.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_template_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_template_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_template_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_template_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_user.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_user.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_user.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_user_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_user_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_user_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_user_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_vm.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_vm.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_vm.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_vm.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_vm.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_vm_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_vm_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_vm_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_vm_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_vm_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_vmpool.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_vmpool.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_vmpool.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_vmpool.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_vmpool.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_vmpool_info.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_vmpool_info.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_vmpool_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/ovirt/ovirt_vmpool_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_vnic_profile.py future-import-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_vnic_profile.py metaclass-boilerplate
lib/ansible/modules/cloud/ovirt/ovirt_vnic_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/ovirt/ovirt_vnic_profile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/ovirt/ovirt_vnic_profile_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/packet/packet_device.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/packet/packet_device.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/packet/packet_device.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/packet/packet_ip_subnet.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/packet/packet_sshkey.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/packet/packet_sshkey.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/packet/packet_sshkey.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/packet/packet_volume_attachment.py pylint:ansible-bad-function
lib/ansible/modules/cloud/packet/packet_volume_attachment.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/podman/podman_image.py validate-modules:doc-type-does-not-match-spec
lib/ansible/modules/cloud/podman/podman_image.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/podman/podman_image.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/podman/podman_image.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/podman/podman_image_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/podman/podman_image_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/profitbricks/profitbricks.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/profitbricks/profitbricks.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/profitbricks/profitbricks.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/profitbricks/profitbricks.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/profitbricks/profitbricks.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/profitbricks/profitbricks.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/profitbricks/profitbricks.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/profitbricks/profitbricks_datacenter.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/profitbricks/profitbricks_datacenter.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/profitbricks/profitbricks_datacenter.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/profitbricks/profitbricks_datacenter.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/profitbricks/profitbricks_nic.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/profitbricks/profitbricks_nic.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/profitbricks/profitbricks_nic.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/profitbricks/profitbricks_nic.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/profitbricks/profitbricks_nic.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/profitbricks/profitbricks_volume.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/profitbricks/profitbricks_volume.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/profitbricks/profitbricks_volume.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/profitbricks/profitbricks_volume.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/profitbricks/profitbricks_volume.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/profitbricks/profitbricks_volume.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/profitbricks/profitbricks_volume_attachments.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/profitbricks/profitbricks_volume_attachments.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/profitbricks/profitbricks_volume_attachments.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/profitbricks/profitbricks_volume_attachments.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/pubnub/pubnub_blocks.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/pubnub/pubnub_blocks.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/pubnub/pubnub_blocks.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax.py use-argspec-type-path # fix needed
lib/ansible/modules/cloud/rackspace/rax.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/rackspace/rax.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/rackspace/rax_cbs.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_cbs.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_cbs.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/rackspace/rax_cbs.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_cbs_attachments.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_cbs_attachments.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_cbs_attachments.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/rackspace/rax_cbs_attachments.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_cdb.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_cdb.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_cdb.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_cdb.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/rackspace/rax_cdb.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_cdb_database.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_cdb_database.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_cdb_database.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/rackspace/rax_cdb_database.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_cdb_user.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_cdb_user.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_cdb_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/rackspace/rax_cdb_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/rackspace/rax_cdb_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_clb.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_clb.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_clb.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/rackspace/rax_clb.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_clb_nodes.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_clb_nodes.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_clb_nodes.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_clb_nodes.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/rackspace/rax_clb_ssl.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_clb_ssl.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_clb_ssl.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_dns.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_dns.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_dns.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_dns_record.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_dns_record.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_dns_record.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_facts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_facts.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_files.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_files.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_files.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_files.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/rackspace/rax_files.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/cloud/rackspace/rax_files.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_files_objects.py use-argspec-type-path
lib/ansible/modules/cloud/rackspace/rax_files_objects.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_files_objects.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_files_objects.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/cloud/rackspace/rax_files_objects.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_identity.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_identity.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_identity.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_identity.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_keypair.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_keypair.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_keypair.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_meta.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_meta.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_meta.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_mon_alarm.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_mon_alarm.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_mon_alarm.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_mon_check.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_mon_check.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_mon_check.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_mon_check.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_mon_entity.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_mon_entity.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_mon_entity.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_mon_notification.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_mon_notification.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_mon_notification.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_mon_notification_plan.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_mon_notification_plan.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_mon_notification_plan.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/rackspace/rax_mon_notification_plan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_network.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_network.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_network.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/rackspace/rax_network.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_queue.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_queue.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_queue.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_scaling_group.py use-argspec-type-path # fix needed
lib/ansible/modules/cloud/rackspace/rax_scaling_group.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_scaling_group.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_scaling_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/rackspace/rax_scaling_group.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/rackspace/rax_scaling_policy.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/rackspace/rax_scaling_policy.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/rackspace/rax_scaling_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/scaleway/_scaleway_image_facts.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/_scaleway_image_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/_scaleway_image_facts.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/_scaleway_image_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/scaleway/_scaleway_ip_facts.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/_scaleway_ip_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/_scaleway_ip_facts.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/_scaleway_ip_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/scaleway/_scaleway_organization_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/_scaleway_organization_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/scaleway/_scaleway_security_group_facts.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/_scaleway_security_group_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/_scaleway_security_group_facts.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/_scaleway_security_group_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/scaleway/_scaleway_server_facts.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/_scaleway_server_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/_scaleway_server_facts.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/_scaleway_server_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/scaleway/_scaleway_snapshot_facts.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/_scaleway_snapshot_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/_scaleway_snapshot_facts.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/_scaleway_snapshot_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/scaleway/_scaleway_volume_facts.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/_scaleway_volume_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/_scaleway_volume_facts.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/_scaleway_volume_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/scaleway/scaleway_compute.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/scaleway_compute.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_compute.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/scaleway_compute.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/scaleway/scaleway_compute.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/scaleway/scaleway_image_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/scaleway_image_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_image_info.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/scaleway_image_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/scaleway/scaleway_ip.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/scaleway_ip.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_ip.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/scaleway_ip_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/scaleway_ip_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_ip_info.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/scaleway_ip_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/scaleway/scaleway_lb.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/scaleway_lb.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_lb.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/scaleway_lb.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/scaleway/scaleway_lb.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/scaleway/scaleway_organization_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_organization_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/scaleway/scaleway_security_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_security_group.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/scaleway_security_group_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/scaleway_security_group_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_security_group_info.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/scaleway_security_group_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/scaleway/scaleway_security_group_rule.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_security_group_rule.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/scaleway_security_group_rule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/scaleway/scaleway_server_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/scaleway_server_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_server_info.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/scaleway_server_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/scaleway/scaleway_snapshot_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/scaleway_snapshot_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_snapshot_info.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/scaleway_snapshot_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/scaleway/scaleway_sshkey.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/scaleway_sshkey.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_user_data.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/scaleway_user_data.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_user_data.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/scaleway_user_data.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/scaleway/scaleway_volume.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/scaleway_volume.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_volume.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/scaleway_volume.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/scaleway/scaleway_volume_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/scaleway/scaleway_volume_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/scaleway/scaleway_volume_info.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/scaleway/scaleway_volume_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/smartos/imgadm.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/smartos/imgadm.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/cloud/smartos/smartos_image_info.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/smartos/vmadm.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/smartos/vmadm.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/smartos/vmadm.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/smartos/vmadm.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/smartos/vmadm.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/smartos/vmadm.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/softlayer/sl_vm.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/softlayer/sl_vm.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/softlayer/sl_vm.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/softlayer/sl_vm.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/softlayer/sl_vm.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/spotinst/spotinst_aws_elastigroup.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/spotinst/spotinst_aws_elastigroup.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/spotinst/spotinst_aws_elastigroup.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/spotinst/spotinst_aws_elastigroup.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/spotinst/spotinst_aws_elastigroup.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/cloud/spotinst/spotinst_aws_elastigroup.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/spotinst/spotinst_aws_elastigroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/spotinst/spotinst_aws_elastigroup.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/univention/udm_dns_record.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/univention/udm_dns_record.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/univention/udm_dns_zone.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/univention/udm_dns_zone.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/univention/udm_dns_zone.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/univention/udm_dns_zone.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/univention/udm_group.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/univention/udm_group.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/univention/udm_share.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/univention/udm_share.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/univention/udm_share.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/cloud/univention/udm_share.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/univention/udm_share.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/univention/udm_share.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/univention/udm_user.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/univention/udm_user.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/univention/udm_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/univention/udm_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/_vmware_dns_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/_vmware_drs_group_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vca_fw.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/vmware/vca_fw.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/vmware/vca_fw.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vca_fw.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vca_fw.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vca_fw.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vca_nat.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/vmware/vca_nat.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/vmware/vca_nat.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vca_nat.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vca_nat.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vca_nat.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vca_vapp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/vmware/vca_vapp.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/vmware/vca_vapp.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vca_vapp.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/vmware/vca_vapp.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vmware_category.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/vmware/vmware_category.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/vmware/vmware_cfg_backup.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vmware_cluster.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/vmware/vmware_cluster.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_cluster_drs.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/cloud/vmware/vmware_cluster_ha.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/vmware/vmware_content_library_manager.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vmware_datastore_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_deploy_ovf.py use-argspec-type-path
lib/ansible/modules/cloud/vmware/vmware_deploy_ovf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_drs_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vmware_drs_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_drs_group_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vmware_dvs_host.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/vmware/vmware_dvs_host.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vmware_dvs_host.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/vmware/vmware_dvs_host.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_dvs_host.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_dvs_host.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vmware_dvs_portgroup.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/vmware/vmware_dvs_portgroup.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/vmware/vmware_dvs_portgroup.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/vmware/vmware_dvs_portgroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_dvs_portgroup.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vmware_dvswitch.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/vmware/vmware_dvswitch.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_dvswitch.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vmware_dvswitch_lacp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_dvswitch_nioc.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/vmware/vmware_dvswitch_nioc.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/vmware/vmware_dvswitch_nioc.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/vmware/vmware_dvswitch_nioc.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vmware_dvswitch_nioc.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/vmware/vmware_dvswitch_nioc.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_dvswitch_nioc.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vmware_dvswitch_pvlans.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_dvswitch_uplink_pg.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/vmware/vmware_dvswitch_uplink_pg.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/vmware/vmware_dvswitch_uplink_pg.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/vmware/vmware_dvswitch_uplink_pg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_dvswitch_uplink_pg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_dvswitch_uplink_pg.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vmware_guest.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vmware_guest.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_guest.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_guest.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vmware_guest_boot_manager.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_guest_controller.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/vmware/vmware_guest_custom_attribute_defs.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vmware_guest_custom_attributes.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vmware_guest_custom_attributes.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/vmware/vmware_guest_custom_attributes.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_guest_custom_attributes.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_guest_custom_attributes.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vmware_guest_disk.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_guest_file_operation.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/vmware/vmware_guest_file_operation.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vmware_guest_file_operation.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/vmware/vmware_guest_file_operation.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_guest_file_operation.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vmware_guest_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_guest_network.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_guest_sendkey.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_guest_serial_port.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_guest_snapshot.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vmware_host_acceptance.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/cloud/vmware/vmware_host_datastore.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vmware_host_dns.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_host_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_host_firewall_manager.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_host_lockdown.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_host_ntp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_host_snmp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_local_role_manager.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_portgroup.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/vmware/vmware_portgroup.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/vmware/vmware_portgroup.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_portgroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_portgroup.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vmware_tag_manager.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_vcenter_settings.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/vmware/vmware_vcenter_settings.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/vmware/vmware_vcenter_settings.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_vcenter_settings.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vmware_vcenter_statistics.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/vmware/vmware_vcenter_statistics.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/vmware/vmware_vcenter_statistics.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/vmware/vmware_vcenter_statistics.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_vcenter_statistics.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vmware_vm_host_drs_rule.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware/vmware_vm_shell.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_vm_vm_drs_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vmware_vmkernel.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/vmware/vmware_vmkernel.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/cloud/vmware/vmware_vmkernel.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/vmware/vmware_vmkernel.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_vmkernel.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vmware_vspan_session.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/vmware/vmware_vspan_session.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vmware/vmware_vspan_session.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware/vmware_vswitch.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware/vsphere_copy.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/vmware_httpapi/vmware_appliance_access_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware_httpapi/vmware_appliance_access_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware_httpapi/vmware_appliance_health_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware_httpapi/vmware_appliance_health_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware_httpapi/vmware_cis_category_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware_httpapi/vmware_cis_category_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vmware_httpapi/vmware_core_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vmware_httpapi/vmware_core_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vultr/_vultr_block_storage_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/_vultr_dns_domain_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/_vultr_firewall_group_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/_vultr_network_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/_vultr_os_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/_vultr_region_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/_vultr_server_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/_vultr_ssh_key_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/_vultr_startup_script_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/_vultr_user_facts.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/vultr_block_storage.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/vultr/vultr_block_storage.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/vultr/vultr_block_storage.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vultr/vultr_dns_domain.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/vultr/vultr_dns_domain_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/vultr_dns_record.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/vultr/vultr_dns_record.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vultr/vultr_firewall_group.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/vultr/vultr_firewall_group_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/vultr_firewall_rule.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/vultr/vultr_firewall_rule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/vultr/vultr_network.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/vultr/vultr_network_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/vultr_region_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/vultr_server.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/vultr/vultr_server_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/vultr_startup_script_info.py validate-modules:return-syntax-error
lib/ansible/modules/cloud/vultr/vultr_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/webfaction/webfaction_app.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/webfaction/webfaction_db.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/webfaction/webfaction_domain.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/webfaction/webfaction_domain.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/webfaction/webfaction_domain.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/webfaction/webfaction_mailbox.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/webfaction/webfaction_site.py validate-modules:doc-missing-type
lib/ansible/modules/cloud/webfaction/webfaction_site.py validate-modules:parameter-list-no-elements
lib/ansible/modules/cloud/webfaction/webfaction_site.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/xenserver/xenserver_guest.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/cloud/xenserver/xenserver_guest.py validate-modules:doc-elements-mismatch
lib/ansible/modules/cloud/xenserver/xenserver_guest.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/xenserver/xenserver_guest.py validate-modules:missing-suboption-docs
lib/ansible/modules/cloud/xenserver/xenserver_guest.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/cloud/xenserver/xenserver_guest.py validate-modules:undocumented-parameter
lib/ansible/modules/cloud/xenserver/xenserver_guest_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/cloud/xenserver/xenserver_guest_powerstate.py validate-modules:doc-required-mismatch
lib/ansible/modules/clustering/consul/consul.py validate-modules:doc-missing-type
lib/ansible/modules/clustering/consul/consul.py validate-modules:parameter-list-no-elements
lib/ansible/modules/clustering/consul/consul.py validate-modules:undocumented-parameter
lib/ansible/modules/clustering/consul/consul_acl.py validate-modules:doc-missing-type
lib/ansible/modules/clustering/consul/consul_acl.py validate-modules:doc-required-mismatch
lib/ansible/modules/clustering/consul/consul_acl.py validate-modules:parameter-list-no-elements
lib/ansible/modules/clustering/consul/consul_kv.py validate-modules:doc-required-mismatch
lib/ansible/modules/clustering/consul/consul_kv.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/clustering/consul/consul_session.py validate-modules:parameter-list-no-elements
lib/ansible/modules/clustering/consul/consul_session.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/clustering/etcd3.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/clustering/etcd3.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/clustering/k8s/k8s.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/clustering/k8s/k8s.py validate-modules:doc-missing-type
lib/ansible/modules/clustering/k8s/k8s.py validate-modules:doc-required-mismatch
lib/ansible/modules/clustering/k8s/k8s.py validate-modules:parameter-list-no-elements
lib/ansible/modules/clustering/k8s/k8s.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/clustering/k8s/k8s.py validate-modules:return-syntax-error
lib/ansible/modules/clustering/k8s/k8s_auth.py validate-modules:doc-missing-type
lib/ansible/modules/clustering/k8s/k8s_auth.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/clustering/k8s/k8s_info.py validate-modules:doc-missing-type
lib/ansible/modules/clustering/k8s/k8s_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/clustering/k8s/k8s_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/clustering/k8s/k8s_scale.py validate-modules:doc-missing-type
lib/ansible/modules/clustering/k8s/k8s_scale.py validate-modules:doc-required-mismatch
lib/ansible/modules/clustering/k8s/k8s_scale.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/clustering/k8s/k8s_scale.py validate-modules:return-syntax-error
lib/ansible/modules/clustering/k8s/k8s_service.py validate-modules:doc-missing-type
lib/ansible/modules/clustering/k8s/k8s_service.py validate-modules:parameter-list-no-elements
lib/ansible/modules/clustering/k8s/k8s_service.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/clustering/k8s/k8s_service.py validate-modules:return-syntax-error
lib/ansible/modules/clustering/pacemaker_cluster.py validate-modules:doc-required-mismatch
lib/ansible/modules/clustering/pacemaker_cluster.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/clustering/znode.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/clustering/znode.py validate-modules:doc-missing-type
lib/ansible/modules/clustering/znode.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/commands/command.py validate-modules:doc-missing-type
lib/ansible/modules/commands/command.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/commands/command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/commands/command.py validate-modules:undocumented-parameter
lib/ansible/modules/commands/expect.py validate-modules:doc-missing-type
lib/ansible/modules/crypto/acme/acme_account_info.py validate-modules:return-syntax-error
lib/ansible/modules/crypto/acme/acme_certificate.py validate-modules:doc-elements-mismatch
lib/ansible/modules/database/aerospike/aerospike_migrations.py yamllint:unparsable-with-libyaml
lib/ansible/modules/database/influxdb/influxdb_database.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/database/influxdb/influxdb_database.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/influxdb/influxdb_query.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/database/influxdb/influxdb_query.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/influxdb/influxdb_retention_policy.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/database/influxdb/influxdb_retention_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/database/influxdb/influxdb_retention_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/influxdb/influxdb_user.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/database/influxdb/influxdb_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/database/influxdb/influxdb_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/influxdb/influxdb_write.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/database/influxdb/influxdb_write.py validate-modules:parameter-list-no-elements
lib/ansible/modules/database/influxdb/influxdb_write.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/misc/elasticsearch_plugin.py validate-modules:doc-missing-type
lib/ansible/modules/database/misc/elasticsearch_plugin.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/database/misc/elasticsearch_plugin.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/misc/kibana_plugin.py validate-modules:doc-missing-type
lib/ansible/modules/database/misc/kibana_plugin.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/database/misc/kibana_plugin.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/misc/redis.py validate-modules:doc-required-mismatch
lib/ansible/modules/database/misc/redis.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/misc/riak.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/database/misc/riak.py validate-modules:doc-missing-type
lib/ansible/modules/database/misc/riak.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/mongodb/mongodb_parameter.py use-argspec-type-path
lib/ansible/modules/database/mongodb/mongodb_parameter.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/database/mongodb/mongodb_parameter.py validate-modules:doc-missing-type
lib/ansible/modules/database/mongodb/mongodb_parameter.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/database/mongodb/mongodb_parameter.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/database/mongodb/mongodb_parameter.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/mongodb/mongodb_replicaset.py use-argspec-type-path
lib/ansible/modules/database/mongodb/mongodb_replicaset.py validate-modules:parameter-list-no-elements
lib/ansible/modules/database/mongodb/mongodb_shard.py use-argspec-type-path
lib/ansible/modules/database/mongodb/mongodb_shard.py validate-modules:doc-missing-type
lib/ansible/modules/database/mongodb/mongodb_shard.py validate-modules:doc-required-mismatch
lib/ansible/modules/database/mongodb/mongodb_shard.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/mongodb/mongodb_user.py use-argspec-type-path
lib/ansible/modules/database/mongodb/mongodb_user.py validate-modules:doc-missing-type
lib/ansible/modules/database/mongodb/mongodb_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/database/mongodb/mongodb_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/mongodb/mongodb_user.py validate-modules:undocumented-parameter
lib/ansible/modules/database/mssql/mssql_db.py validate-modules:doc-missing-type
lib/ansible/modules/database/mssql/mssql_db.py validate-modules:doc-required-mismatch
lib/ansible/modules/database/mysql/mysql_db.py validate-modules:doc-elements-mismatch
lib/ansible/modules/database/mysql/mysql_db.py validate-modules:parameter-list-no-elements
lib/ansible/modules/database/mysql/mysql_db.py validate-modules:use-run-command-not-popen
lib/ansible/modules/database/mysql/mysql_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/database/mysql/mysql_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/database/mysql/mysql_query.py validate-modules:parameter-list-no-elements
lib/ansible/modules/database/mysql/mysql_user.py validate-modules:undocumented-parameter
lib/ansible/modules/database/mysql/mysql_variables.py validate-modules:doc-required-mismatch
lib/ansible/modules/database/postgresql/postgresql_db.py use-argspec-type-path
lib/ansible/modules/database/postgresql/postgresql_db.py validate-modules:use-run-command-not-popen
lib/ansible/modules/database/postgresql/postgresql_privs.py validate-modules:parameter-documented-multiple-times
lib/ansible/modules/database/postgresql/postgresql_user.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/database/proxysql/proxysql_backend_servers.py validate-modules:doc-missing-type
lib/ansible/modules/database/proxysql/proxysql_backend_servers.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/proxysql/proxysql_backend_servers.py validate-modules:undocumented-parameter
lib/ansible/modules/database/proxysql/proxysql_global_variables.py validate-modules:doc-missing-type
lib/ansible/modules/database/proxysql/proxysql_global_variables.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/proxysql/proxysql_global_variables.py validate-modules:undocumented-parameter
lib/ansible/modules/database/proxysql/proxysql_manage_config.py validate-modules:doc-missing-type
lib/ansible/modules/database/proxysql/proxysql_manage_config.py validate-modules:undocumented-parameter
lib/ansible/modules/database/proxysql/proxysql_mysql_users.py validate-modules:doc-missing-type
lib/ansible/modules/database/proxysql/proxysql_mysql_users.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/proxysql/proxysql_mysql_users.py validate-modules:undocumented-parameter
lib/ansible/modules/database/proxysql/proxysql_query_rules.py validate-modules:doc-missing-type
lib/ansible/modules/database/proxysql/proxysql_query_rules.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/proxysql/proxysql_query_rules.py validate-modules:undocumented-parameter
lib/ansible/modules/database/proxysql/proxysql_replication_hostgroups.py validate-modules:doc-missing-type
lib/ansible/modules/database/proxysql/proxysql_replication_hostgroups.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/proxysql/proxysql_replication_hostgroups.py validate-modules:undocumented-parameter
lib/ansible/modules/database/proxysql/proxysql_scheduler.py validate-modules:doc-missing-type
lib/ansible/modules/database/proxysql/proxysql_scheduler.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/database/proxysql/proxysql_scheduler.py validate-modules:undocumented-parameter
lib/ansible/modules/database/vertica/vertica_configuration.py validate-modules:doc-missing-type
lib/ansible/modules/database/vertica/vertica_configuration.py validate-modules:doc-required-mismatch
lib/ansible/modules/database/vertica/vertica_info.py validate-modules:doc-missing-type
lib/ansible/modules/database/vertica/vertica_role.py validate-modules:doc-missing-type
lib/ansible/modules/database/vertica/vertica_role.py validate-modules:undocumented-parameter
lib/ansible/modules/database/vertica/vertica_schema.py validate-modules:doc-missing-type
lib/ansible/modules/database/vertica/vertica_schema.py validate-modules:undocumented-parameter
lib/ansible/modules/database/vertica/vertica_user.py validate-modules:doc-missing-type
lib/ansible/modules/database/vertica/vertica_user.py validate-modules:undocumented-parameter
lib/ansible/modules/files/acl.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/files/archive.py use-argspec-type-path # fix needed
lib/ansible/modules/files/archive.py validate-modules:parameter-list-no-elements
lib/ansible/modules/files/assemble.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/files/blockinfile.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/files/blockinfile.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/files/copy.py pylint:blacklisted-name
lib/ansible/modules/files/copy.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/files/copy.py validate-modules:doc-type-does-not-match-spec
lib/ansible/modules/files/copy.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/files/copy.py validate-modules:undocumented-parameter
lib/ansible/modules/files/file.py pylint:ansible-bad-function
lib/ansible/modules/files/file.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/files/file.py validate-modules:undocumented-parameter
lib/ansible/modules/files/find.py use-argspec-type-path # fix needed
lib/ansible/modules/files/find.py validate-modules:parameter-list-no-elements
lib/ansible/modules/files/find.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/files/iso_extract.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/files/iso_extract.py validate-modules:parameter-list-no-elements
lib/ansible/modules/files/lineinfile.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/files/lineinfile.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/files/lineinfile.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/files/patch.py pylint:blacklisted-name
lib/ansible/modules/files/read_csv.py validate-modules:parameter-list-no-elements
lib/ansible/modules/files/replace.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/files/stat.py validate-modules:parameter-invalid
lib/ansible/modules/files/stat.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/files/stat.py validate-modules:undocumented-parameter
lib/ansible/modules/files/synchronize.py pylint:blacklisted-name
lib/ansible/modules/files/synchronize.py use-argspec-type-path
lib/ansible/modules/files/synchronize.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/files/synchronize.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/files/synchronize.py validate-modules:parameter-list-no-elements
lib/ansible/modules/files/synchronize.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/files/synchronize.py validate-modules:undocumented-parameter
lib/ansible/modules/files/unarchive.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/files/unarchive.py validate-modules:parameter-list-no-elements
lib/ansible/modules/files/xml.py validate-modules:doc-required-mismatch
lib/ansible/modules/files/xml.py validate-modules:parameter-list-no-elements
lib/ansible/modules/identity/cyberark/cyberark_authentication.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/identity/ipa/ipa_hbacrule.py validate-modules:doc-elements-mismatch
lib/ansible/modules/identity/keycloak/keycloak_client.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/identity/keycloak/keycloak_client.py validate-modules:doc-elements-mismatch
lib/ansible/modules/identity/keycloak/keycloak_client.py validate-modules:doc-missing-type
lib/ansible/modules/identity/keycloak/keycloak_client.py validate-modules:parameter-list-no-elements
lib/ansible/modules/identity/keycloak/keycloak_client.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/identity/keycloak/keycloak_clienttemplate.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/identity/keycloak/keycloak_clienttemplate.py validate-modules:doc-elements-mismatch
lib/ansible/modules/identity/keycloak/keycloak_clienttemplate.py validate-modules:doc-missing-type
lib/ansible/modules/identity/keycloak/keycloak_clienttemplate.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/identity/onepassword_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/identity/opendj/opendj_backendprop.py validate-modules:doc-missing-type
lib/ansible/modules/identity/opendj/opendj_backendprop.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/messaging/rabbitmq/rabbitmq_binding.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/messaging/rabbitmq/rabbitmq_binding.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/messaging/rabbitmq/rabbitmq_exchange.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/messaging/rabbitmq/rabbitmq_exchange.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/messaging/rabbitmq/rabbitmq_exchange.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/messaging/rabbitmq/rabbitmq_global_parameter.py validate-modules:doc-missing-type
lib/ansible/modules/messaging/rabbitmq/rabbitmq_global_parameter.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/messaging/rabbitmq/rabbitmq_parameter.py validate-modules:doc-missing-type
lib/ansible/modules/messaging/rabbitmq/rabbitmq_plugin.py validate-modules:doc-missing-type
lib/ansible/modules/messaging/rabbitmq/rabbitmq_policy.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/messaging/rabbitmq/rabbitmq_policy.py validate-modules:doc-missing-type
lib/ansible/modules/messaging/rabbitmq/rabbitmq_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/messaging/rabbitmq/rabbitmq_queue.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/messaging/rabbitmq/rabbitmq_queue.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/messaging/rabbitmq/rabbitmq_queue.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/messaging/rabbitmq/rabbitmq_user.py validate-modules:doc-missing-type
lib/ansible/modules/messaging/rabbitmq/rabbitmq_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/messaging/rabbitmq/rabbitmq_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/messaging/rabbitmq/rabbitmq_vhost.py validate-modules:doc-missing-type
lib/ansible/modules/messaging/rabbitmq/rabbitmq_vhost_limits.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/airbrake_deployment.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/monitoring/airbrake_deployment.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/bigpanda.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/monitoring/bigpanda.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/bigpanda.py validate-modules:undocumented-parameter
lib/ansible/modules/monitoring/circonus_annotation.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/monitoring/circonus_annotation.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/circonus_annotation.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/datadog/datadog_event.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/monitoring/datadog/datadog_event.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/monitoring/datadog/datadog_event.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/datadog/datadog_event.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/datadog/datadog_event.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/datadog/datadog_monitor.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/monitoring/datadog/datadog_monitor.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/datadog/datadog_monitor.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/grafana/grafana_dashboard.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/grafana/grafana_dashboard.py validate-modules:doc-required-mismatch
lib/ansible/modules/monitoring/grafana/grafana_dashboard.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/grafana/grafana_datasource.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/monitoring/grafana/grafana_datasource.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/grafana/grafana_datasource.py validate-modules:doc-required-mismatch
lib/ansible/modules/monitoring/grafana/grafana_datasource.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/grafana/grafana_plugin.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/grafana/grafana_plugin.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/honeybadger_deployment.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/icinga2_feature.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/icinga2_host.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/monitoring/icinga2_host.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/icinga2_host.py validate-modules:doc-required-mismatch
lib/ansible/modules/monitoring/icinga2_host.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/icinga2_host.py validate-modules:undocumented-parameter
lib/ansible/modules/monitoring/librato_annotation.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/librato_annotation.py validate-modules:doc-required-mismatch
lib/ansible/modules/monitoring/librato_annotation.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/librato_annotation.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/logentries.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/monitoring/logentries.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/logentries.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/logentries.py validate-modules:undocumented-parameter
lib/ansible/modules/monitoring/logicmonitor.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/monitoring/logicmonitor.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/logicmonitor.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/monitoring/logicmonitor.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/logicmonitor.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/logicmonitor.py yamllint:unparsable-with-libyaml
lib/ansible/modules/monitoring/logicmonitor_facts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/monitoring/logicmonitor_facts.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/logicmonitor_facts.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/monitoring/logstash_plugin.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/logstash_plugin.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/monitoring/logstash_plugin.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/monit.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/monit.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/nagios.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/monitoring/nagios.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/nagios.py validate-modules:doc-required-mismatch
lib/ansible/modules/monitoring/nagios.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/monitoring/newrelic_deployment.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/pagerduty.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/monitoring/pagerduty.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/pagerduty.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/pagerduty.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/pagerduty_alert.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/pagerduty_alert.py validate-modules:doc-required-mismatch
lib/ansible/modules/monitoring/pingdom.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/monitoring/pingdom.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/rollbar_deployment.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/sensu/sensu_check.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/monitoring/sensu/sensu_check.py validate-modules:doc-required-mismatch
lib/ansible/modules/monitoring/sensu/sensu_check.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/sensu/sensu_check.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/sensu/sensu_client.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/monitoring/sensu/sensu_client.py validate-modules:doc-required-mismatch
lib/ansible/modules/monitoring/sensu/sensu_client.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/sensu/sensu_client.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/sensu/sensu_handler.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/monitoring/sensu/sensu_handler.py validate-modules:doc-required-mismatch
lib/ansible/modules/monitoring/sensu/sensu_handler.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/sensu/sensu_handler.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/sensu/sensu_silence.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/sensu/sensu_silence.py validate-modules:doc-required-mismatch
lib/ansible/modules/monitoring/sensu/sensu_silence.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/sensu/sensu_subscription.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/spectrum_device.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/spectrum_device.py validate-modules:doc-required-mismatch
lib/ansible/modules/monitoring/spectrum_device.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/stackdriver.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/stackdriver.py validate-modules:doc-required-mismatch
lib/ansible/modules/monitoring/statusio_maintenance.py pylint:blacklisted-name
lib/ansible/modules/monitoring/statusio_maintenance.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/statusio_maintenance.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/statusio_maintenance.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/uptimerobot.py validate-modules:doc-missing-type
lib/ansible/modules/monitoring/zabbix/zabbix_action.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/monitoring/zabbix/zabbix_action.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/monitoring/zabbix/zabbix_action.py validate-modules:doc-elements-mismatch
lib/ansible/modules/monitoring/zabbix/zabbix_action.py validate-modules:doc-required-mismatch
lib/ansible/modules/monitoring/zabbix/zabbix_action.py validate-modules:missing-suboption-docs
lib/ansible/modules/monitoring/zabbix/zabbix_action.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/zabbix/zabbix_action.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/monitoring/zabbix/zabbix_action.py validate-modules:undocumented-parameter
lib/ansible/modules/monitoring/zabbix/zabbix_group.py validate-modules:doc-elements-mismatch
lib/ansible/modules/monitoring/zabbix/zabbix_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/zabbix/zabbix_group_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/monitoring/zabbix/zabbix_group_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/zabbix/zabbix_host.py validate-modules:doc-elements-mismatch
lib/ansible/modules/monitoring/zabbix/zabbix_host.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/zabbix/zabbix_host_info.py validate-modules:doc-elements-mismatch
lib/ansible/modules/monitoring/zabbix/zabbix_host_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/zabbix/zabbix_maintenance.py validate-modules:doc-elements-mismatch
lib/ansible/modules/monitoring/zabbix/zabbix_maintenance.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/zabbix/zabbix_mediatype.py validate-modules:doc-elements-mismatch
lib/ansible/modules/monitoring/zabbix/zabbix_mediatype.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/zabbix/zabbix_template.py validate-modules:doc-elements-mismatch
lib/ansible/modules/monitoring/zabbix/zabbix_template.py validate-modules:parameter-list-no-elements
lib/ansible/modules/monitoring/zabbix/zabbix_user.py validate-modules:doc-elements-mismatch
lib/ansible/modules/monitoring/zabbix/zabbix_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/net_tools/basics/get_url.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/basics/uri.py pylint:blacklisted-name
lib/ansible/modules/net_tools/basics/uri.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/basics/uri.py validate-modules:parameter-list-no-elements
lib/ansible/modules/net_tools/basics/uri.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/cloudflare_dns.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/dnsimple.py validate-modules:parameter-list-no-elements
lib/ansible/modules/net_tools/dnsmadeeasy.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/dnsmadeeasy.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/dnsmadeeasy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/ip_netns.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/ipinfoio_facts.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/ipinfoio_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/ldap/ldap_entry.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/ldap/ldap_entry.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/ldap/ldap_passwd.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/ldap/ldap_passwd.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/netbox/netbox_device.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/netbox/netbox_device.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/netbox/netbox_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/netbox/netbox_ip_address.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/netbox/netbox_ip_address.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/netbox/netbox_prefix.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/netbox/netbox_prefix.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/netbox/netbox_site.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/netcup_dns.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/netcup_dns.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_a_record.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_a_record.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_a_record.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_a_record.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_a_record.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_a_record.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_aaaa_record.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_aaaa_record.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_aaaa_record.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_aaaa_record.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_aaaa_record.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_aaaa_record.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_cname_record.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_cname_record.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_cname_record.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_cname_record.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_cname_record.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_cname_record.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_dns_view.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_dns_view.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_dns_view.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_dns_view.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_dns_view.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_dns_view.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_fixed_address.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_fixed_address.py validate-modules:doc-elements-mismatch
lib/ansible/modules/net_tools/nios/nios_fixed_address.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_fixed_address.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_fixed_address.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_fixed_address.py validate-modules:parameter-alias-self
lib/ansible/modules/net_tools/nios/nios_fixed_address.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_fixed_address.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_host_record.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_host_record.py validate-modules:doc-elements-mismatch
lib/ansible/modules/net_tools/nios/nios_host_record.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_host_record.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_host_record.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_host_record.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/net_tools/nios/nios_host_record.py validate-modules:parameter-alias-self
lib/ansible/modules/net_tools/nios/nios_host_record.py validate-modules:parameter-list-no-elements
lib/ansible/modules/net_tools/nios/nios_host_record.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_host_record.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_member.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_member.py validate-modules:doc-elements-mismatch
lib/ansible/modules/net_tools/nios/nios_member.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_member.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_member.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_member.py validate-modules:parameter-list-no-elements
lib/ansible/modules/net_tools/nios/nios_member.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_member.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_mx_record.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_mx_record.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_mx_record.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_mx_record.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_mx_record.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_mx_record.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_naptr_record.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_naptr_record.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_naptr_record.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_naptr_record.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_naptr_record.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_naptr_record.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_network.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_network.py validate-modules:doc-elements-mismatch
lib/ansible/modules/net_tools/nios/nios_network.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_network.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_network.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_network.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_network.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_network_view.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_network_view.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_network_view.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_network_view.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_network_view.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_network_view.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_nsgroup.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/net_tools/nios/nios_nsgroup.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_nsgroup.py validate-modules:doc-elements-mismatch
lib/ansible/modules/net_tools/nios/nios_nsgroup.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_nsgroup.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_nsgroup.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_nsgroup.py validate-modules:missing-suboption-docs
lib/ansible/modules/net_tools/nios/nios_nsgroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_nsgroup.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_ptr_record.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_ptr_record.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_ptr_record.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_ptr_record.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_ptr_record.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_ptr_record.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_srv_record.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_srv_record.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_srv_record.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_srv_record.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_srv_record.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_srv_record.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_txt_record.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_txt_record.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_txt_record.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_txt_record.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_txt_record.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_txt_record.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nios/nios_zone.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/net_tools/nios/nios_zone.py validate-modules:doc-elements-mismatch
lib/ansible/modules/net_tools/nios/nios_zone.py validate-modules:doc-missing-type
lib/ansible/modules/net_tools/nios/nios_zone.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/nios/nios_zone.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/net_tools/nios/nios_zone.py validate-modules:parameter-alias-self
lib/ansible/modules/net_tools/nios/nios_zone.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nios/nios_zone.py validate-modules:undocumented-parameter
lib/ansible/modules/net_tools/nmcli.py validate-modules:parameter-list-no-elements
lib/ansible/modules/net_tools/nmcli.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/nsupdate.py validate-modules:parameter-list-no-elements
lib/ansible/modules/net_tools/nsupdate.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/omapi_host.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/a10/a10_server.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/a10/a10_server.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/a10/a10_server_axapi3.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/a10/a10_server_axapi3.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/a10/a10_server_axapi3.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/a10/a10_service_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/a10/a10_service_group.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/a10/a10_virtual_server.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/a10/a10_virtual_server.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/a10/a10_virtual_server.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/a10/a10_virtual_server.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/aci/aci_aaa_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_aaa_user_certificate.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_access_port_block_to_access_port.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_access_port_to_interface_policy_leaf_profile.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_access_sub_port_block_to_access_port.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_aep.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_aep_to_domain.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_ap.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_bd.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_bd_subnet.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_bd_subnet.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aci/aci_bd_to_l3out.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_config_rollback.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_config_snapshot.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_contract.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_contract_subject.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_contract_subject_to_filter.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_domain.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_domain_to_encap_pool.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_domain_to_vlan_pool.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_encap_pool.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_encap_pool_range.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_epg.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_epg_monitoring_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_epg_to_contract.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_epg_to_domain.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_fabric_node.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_fabric_scheduler.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_fabric_scheduler.py validate-modules:parameter-alias-self
lib/ansible/modules/network/aci/aci_filter.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_filter_entry.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_firmware_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_firmware_group.py validate-modules:parameter-alias-self
lib/ansible/modules/network/aci/aci_firmware_group_node.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_firmware_group_node.py validate-modules:parameter-alias-self
lib/ansible/modules/network/aci/aci_firmware_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_firmware_policy.py validate-modules:parameter-alias-self
lib/ansible/modules/network/aci/aci_firmware_source.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_interface_policy_cdp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_interface_policy_fc.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_interface_policy_l2.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_interface_policy_leaf_policy_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_interface_policy_leaf_profile.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_interface_policy_lldp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_interface_policy_mcp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_interface_policy_ospf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_interface_policy_ospf.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aci/aci_interface_policy_port_channel.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_interface_policy_port_security.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_interface_selector_to_switch_policy_leaf_profile.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_l3out.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_l3out.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aci/aci_l3out_extepg.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_l3out_extsubnet.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_l3out_extsubnet.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aci/aci_l3out_route_tag_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_maintenance_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_maintenance_group_node.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_maintenance_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_rest.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_static_binding_to_epg.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_static_binding_to_epg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aci/aci_switch_leaf_selector.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_switch_policy_leaf_profile.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_switch_policy_vpc_protection_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_taboo_contract.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_tenant.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_tenant_action_rule_profile.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_tenant_ep_retention_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_tenant_span_dst_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_tenant_span_src_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_tenant_span_src_group_to_dst_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_vlan_pool.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_vlan_pool_encap_block.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_vmm_credential.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/aci_vmm_credential.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/aci/aci_vrf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_label.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_role.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_role.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aci/mso_schema.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_schema.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aci/mso_schema_site.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_schema_site_anp_epg.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_schema_site_anp_epg_domain.py pylint:ansible-bad-function
lib/ansible/modules/network/aci/mso_schema_site_anp_epg_domain.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_schema_site_anp_epg_staticleaf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_schema_site_anp_epg_staticport.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_schema_site_anp_epg_subnet.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_schema_site_bd_l3out.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_schema_site_vrf_region.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_schema_site_vrf_region_cidr.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_schema_site_vrf_region_cidr_subnet.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_schema_template_anp_epg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aci/mso_schema_template_bd.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aci/mso_schema_template_contract_filter.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/aci/mso_schema_template_contract_filter.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aci/mso_schema_template_deploy.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_schema_template_external_epg_subnet.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aci/mso_schema_template_filter_entry.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aci/mso_site.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_site.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aci/mso_tenant.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_tenant.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aci/mso_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aci/mso_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aireos/aireos_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/aireos/aireos_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/aireos/aireos_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aireos/aireos_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aireos/aireos_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/aireos/aireos_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/aireos/aireos_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/aireos/aireos_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aireos/aireos_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aireos/aireos_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/apconos/apconos_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aruba/aruba_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/aruba/aruba_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/aruba/aruba_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aruba/aruba_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aruba/aruba_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/aruba/aruba_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/aruba/aruba_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/aruba/aruba_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/aruba/aruba_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/aruba/aruba_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/asa/asa_acl.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/asa/asa_acl.py validate-modules:doc-missing-type
lib/ansible/modules/network/asa/asa_acl.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/asa/asa_acl.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/asa/asa_acl.py validate-modules:undocumented-parameter
lib/ansible/modules/network/asa/asa_acl.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/asa/asa_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/asa/asa_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/asa/asa_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/asa/asa_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/asa/asa_command.py validate-modules:undocumented-parameter
lib/ansible/modules/network/asa/asa_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/asa/asa_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/asa/asa_config.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/network/asa/asa_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/asa/asa_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/asa/asa_config.py validate-modules:undocumented-parameter
lib/ansible/modules/network/asa/asa_config.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/asa/asa_og.py validate-modules:doc-missing-type
lib/ansible/modules/network/asa/asa_og.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/asa/asa_og.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_actiongroupconfig.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_actiongroupconfig.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_actiongroupconfig.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_actiongroupconfig.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_alertconfig.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_alertconfig.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_alertconfig.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_alertconfig.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_alertemailconfig.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_alertemailconfig.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_alertemailconfig.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_alertemailconfig.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_alertscriptconfig.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_alertscriptconfig.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_alertscriptconfig.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_alertscriptconfig.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_alertsyslogconfig.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_alertsyslogconfig.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_alertsyslogconfig.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_alertsyslogconfig.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_alertsyslogconfig.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_analyticsprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_analyticsprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_analyticsprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_analyticsprofile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_analyticsprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_api_session.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_api_session.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_api_session.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_api_session.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/avi/avi_api_session.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_api_version.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_api_version.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_api_version.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_api_version.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_applicationpersistenceprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_applicationpersistenceprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_applicationpersistenceprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_applicationpersistenceprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_applicationprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_applicationprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_applicationprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_applicationprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_authprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_authprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_authprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_authprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_autoscalelaunchconfig.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_autoscalelaunchconfig.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_autoscalelaunchconfig.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_autoscalelaunchconfig.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_backup.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_backup.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_backup.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_backup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_backupconfiguration.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_backupconfiguration.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_backupconfiguration.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_backupconfiguration.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_certificatemanagementprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_certificatemanagementprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_certificatemanagementprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_certificatemanagementprofile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_certificatemanagementprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_cloud.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_cloud.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_cloud.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_cloud.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_cloud.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_cloudconnectoruser.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_cloudconnectoruser.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_cloudconnectoruser.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_cloudconnectoruser.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_cloudproperties.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_cloudproperties.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_cloudproperties.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_cloudproperties.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_cloudproperties.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_cluster.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_cluster.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_cluster.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_cluster.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_cluster.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_clusterclouddetails.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_clusterclouddetails.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_clusterclouddetails.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_clusterclouddetails.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_controllerproperties.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_controllerproperties.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_controllerproperties.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_controllerproperties.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_controllerproperties.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_customipamdnsprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_customipamdnsprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_customipamdnsprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_customipamdnsprofile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_customipamdnsprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_dnspolicy.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_dnspolicy.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_dnspolicy.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_dnspolicy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_dnspolicy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_errorpagebody.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_errorpagebody.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_errorpagebody.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_errorpagebody.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_errorpageprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_errorpageprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_errorpageprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_errorpageprofile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_errorpageprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_gslb.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_gslb.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_gslb.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_gslb.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_gslb.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_gslbgeodbprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_gslbgeodbprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_gslbgeodbprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_gslbgeodbprofile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_gslbgeodbprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_gslbservice.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_gslbservice.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_gslbservice.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_gslbservice.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_gslbservice.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_gslbservice_patch_member.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_gslbservice_patch_member.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_gslbservice_patch_member.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_gslbservice_patch_member.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_hardwaresecuritymodulegroup.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_hardwaresecuritymodulegroup.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_hardwaresecuritymodulegroup.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_hardwaresecuritymodulegroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_healthmonitor.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_healthmonitor.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_healthmonitor.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_healthmonitor.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_httppolicyset.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_httppolicyset.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_httppolicyset.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_httppolicyset.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_ipaddrgroup.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_ipaddrgroup.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_ipaddrgroup.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_ipaddrgroup.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_ipaddrgroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_ipamdnsproviderprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_ipamdnsproviderprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_ipamdnsproviderprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_ipamdnsproviderprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_l4policyset.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_l4policyset.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_l4policyset.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_l4policyset.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_microservicegroup.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_microservicegroup.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_microservicegroup.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_microservicegroup.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_microservicegroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_network.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_network.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_network.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_network.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_network.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_networkprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_networkprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_networkprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_networkprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_networksecuritypolicy.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_networksecuritypolicy.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_networksecuritypolicy.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_networksecuritypolicy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_networksecuritypolicy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_pkiprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_pkiprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_pkiprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_pkiprofile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_pkiprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_pool.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_pool.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_pool.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_pool.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_pool.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_poolgroup.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_poolgroup.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_poolgroup.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_poolgroup.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_poolgroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_poolgroupdeploymentpolicy.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_poolgroupdeploymentpolicy.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_poolgroupdeploymentpolicy.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_poolgroupdeploymentpolicy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_poolgroupdeploymentpolicy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_prioritylabels.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_prioritylabels.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_prioritylabels.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_prioritylabels.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_prioritylabels.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_role.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_role.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_role.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_role.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_role.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_scheduler.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_scheduler.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_scheduler.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_scheduler.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_seproperties.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_seproperties.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_seproperties.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_seproperties.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_serverautoscalepolicy.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_serverautoscalepolicy.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_serverautoscalepolicy.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_serverautoscalepolicy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_serverautoscalepolicy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_serviceengine.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_serviceengine.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_serviceengine.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_serviceengine.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_serviceengine.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_serviceenginegroup.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_serviceenginegroup.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_serviceenginegroup.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_serviceenginegroup.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_serviceenginegroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_snmptrapprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_snmptrapprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_snmptrapprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_snmptrapprofile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_snmptrapprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_sslkeyandcertificate.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_sslkeyandcertificate.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_sslkeyandcertificate.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_sslkeyandcertificate.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_sslkeyandcertificate.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_sslprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_sslprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_sslprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_sslprofile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_sslprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_stringgroup.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_stringgroup.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_stringgroup.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_stringgroup.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_stringgroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_systemconfiguration.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_systemconfiguration.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_systemconfiguration.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_systemconfiguration.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_systemconfiguration.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_tenant.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_tenant.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_tenant.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_tenant.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_trafficcloneprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_trafficcloneprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_trafficcloneprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_trafficcloneprofile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_trafficcloneprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_user.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_useraccount.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_useraccount.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_useraccount.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_useraccount.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/avi/avi_useraccount.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_useraccountprofile.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_useraccountprofile.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_useraccountprofile.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_useraccountprofile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_virtualservice.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_virtualservice.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_virtualservice.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_virtualservice.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_virtualservice.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_vrfcontext.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_vrfcontext.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_vrfcontext.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_vrfcontext.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_vrfcontext.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_vsdatascriptset.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_vsdatascriptset.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_vsdatascriptset.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_vsdatascriptset.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_vsdatascriptset.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_vsvip.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_vsvip.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_vsvip.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_vsvip.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/avi/avi_vsvip.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/avi/avi_webhook.py future-import-boilerplate
lib/ansible/modules/network/avi/avi_webhook.py metaclass-boilerplate
lib/ansible/modules/network/avi/avi_webhook.py validate-modules:doc-missing-type
lib/ansible/modules/network/avi/avi_webhook.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/bigswitch/bcf_switch.py validate-modules:doc-missing-type
lib/ansible/modules/network/bigswitch/bcf_switch.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/bigswitch/bigmon_chain.py validate-modules:doc-missing-type
lib/ansible/modules/network/bigswitch/bigmon_chain.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/bigswitch/bigmon_policy.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/bigswitch/bigmon_policy.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/bigswitch/bigmon_policy.py validate-modules:doc-missing-type
lib/ansible/modules/network/bigswitch/bigmon_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/check_point/checkpoint_access_rule.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/check_point/checkpoint_access_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/checkpoint_host.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/checkpoint_object_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/check_point/checkpoint_run_script.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/checkpoint_session.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/check_point/checkpoint_task_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/check_point/cp_mgmt_access_layer.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_access_layer_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_access_role.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_access_role_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_access_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_access_rule_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_address_range.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_address_range_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_administrator.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_administrator_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_application_site.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_application_site_category.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_application_site_category_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_application_site_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_application_site_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_application_site_group_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_assign_global_assignment.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_dns_domain.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_dns_domain_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_dynamic_object.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_dynamic_object_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_exception_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_exception_group_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_global_assignment_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_group_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_group_with_exclusion.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_group_with_exclusion_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_host.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_host_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_install_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_mds_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_multicast_address_range.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_multicast_address_range_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_network.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_network_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_package.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_package_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_put_file.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_run_script.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_security_zone.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_security_zone_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_dce_rpc.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_dce_rpc_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_group_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_icmp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_icmp6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_icmp6_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_icmp_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_other.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_other_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_rpc.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_rpc_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_sctp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_sctp_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_tcp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_tcp_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_udp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_service_udp_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_session_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_simple_gateway.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_simple_gateway_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_tag.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_tag_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_threat_exception.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_threat_exception_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_threat_indicator.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_threat_indicator_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_threat_layer.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_threat_layer_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_threat_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_threat_profile_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_threat_protection_override.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_threat_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_threat_rule_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_time.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_time_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_vpn_community_meshed.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_vpn_community_meshed_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_vpn_community_star.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_vpn_community_star_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_wildcard.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/check_point/cp_mgmt_wildcard_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cli/cli_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cli/cli_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cli/cli_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/cli/cli_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cli/cli_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_aaa_server.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_aaa_server.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_aaa_server.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_aaa_server.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_aaa_server.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_aaa_server.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_aaa_server.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_aaa_server.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_aaa_server_host.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_aaa_server_host.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_aaa_server_host.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_aaa_server_host.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_aaa_server_host.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_aaa_server_host.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_aaa_server_host.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_aaa_server_host.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_acl.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_acl.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_acl.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_acl.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_acl.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_acl.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_acl.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_acl.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_acl_advance.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_acl_advance.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_acl_advance.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_acl_advance.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_acl_advance.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_acl_advance.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_acl_advance.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_acl_advance.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_acl_interface.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_acl_interface.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_acl_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_acl_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_acl_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_acl_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_acl_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_acl_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_bfd_global.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_bfd_global.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_bfd_global.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_bfd_global.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_bfd_global.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_bfd_global.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_bfd_global.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_bfd_global.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_bfd_session.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_bfd_session.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_bfd_session.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_bfd_session.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_bfd_session.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_bfd_session.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_bfd_session.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_bfd_session.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_bfd_view.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_bfd_view.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_bfd_view.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/cloudengine/ce_bfd_view.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_bfd_view.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cloudengine/ce_bfd_view.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_bfd_view.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_bgp.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_bgp.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_bgp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_bgp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_bgp.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_bgp.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_bgp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_bgp.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_bgp_af.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_bgp_af.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_bgp_af.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_bgp_af.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_bgp_af.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_bgp_af.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_bgp_af.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_bgp_af.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor_af.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor_af.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor_af.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor_af.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor_af.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor_af.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor_af.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_bgp_neighbor_af.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_command.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_command.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_command.py pylint:blacklisted-name
lib/ansible/modules/network/cloudengine/ce_command.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_command.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cloudengine/ce_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_command.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_config.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_config.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_config.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_config.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cloudengine/ce_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_config.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_dldp.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_dldp.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_dldp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_dldp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_dldp.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_dldp.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_dldp.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/cloudengine/ce_dldp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_dldp.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_dldp_interface.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_dldp_interface.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_dldp_interface.py pylint:blacklisted-name
lib/ansible/modules/network/cloudengine/ce_dldp_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_dldp_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_dldp_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_dldp_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_dldp_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_dldp_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_eth_trunk.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_eth_trunk.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_eth_trunk.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_eth_trunk.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_eth_trunk.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_eth_trunk.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_eth_trunk.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cloudengine/ce_eth_trunk.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_eth_trunk.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_evpn_bd_vni.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_evpn_bd_vni.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_evpn_bd_vni.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_evpn_bd_vni.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_evpn_bd_vni.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_evpn_bd_vni.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cloudengine/ce_evpn_bd_vni.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_evpn_bd_vni.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cloudengine/ce_evpn_bd_vni.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_evpn_bd_vni.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_evpn_bgp.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_evpn_bgp.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_evpn_bgp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_evpn_bgp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_evpn_bgp.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_evpn_bgp.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_evpn_bgp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_evpn_bgp.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_evpn_bgp_rr.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_evpn_bgp_rr.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_evpn_bgp_rr.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_evpn_bgp_rr.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_evpn_bgp_rr.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_evpn_bgp_rr.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_evpn_bgp_rr.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_evpn_bgp_rr.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_evpn_global.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_evpn_global.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_evpn_global.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_evpn_global.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_evpn_global.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_evpn_global.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_evpn_global.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_evpn_global.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_facts.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_facts.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_facts.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_facts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_facts.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_facts.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cloudengine/ce_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_facts.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_file_copy.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_file_copy.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_file_copy.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_file_copy.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_file_copy.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_file_copy.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_file_copy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_file_copy.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_info_center_debug.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_info_center_debug.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_info_center_debug.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_info_center_debug.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_info_center_debug.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_info_center_debug.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_info_center_debug.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_info_center_debug.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_info_center_global.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_info_center_global.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_info_center_global.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_info_center_global.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_info_center_global.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_info_center_global.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_info_center_global.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_info_center_global.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_info_center_log.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_info_center_log.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_info_center_log.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_info_center_log.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_info_center_log.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_info_center_log.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_info_center_log.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_info_center_log.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_info_center_trap.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_info_center_trap.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_info_center_trap.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_info_center_trap.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_info_center_trap.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_info_center_trap.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_info_center_trap.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_info_center_trap.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_interface.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_interface.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_interface_ospf.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_interface_ospf.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_interface_ospf.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_interface_ospf.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_interface_ospf.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_interface_ospf.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_interface_ospf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_interface_ospf.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_ip_interface.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_ip_interface.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_ip_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_ip_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_ip_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_ip_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_ip_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_ip_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_is_is_view.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cloudengine/ce_link_status.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_link_status.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_link_status.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_link_status.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_link_status.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_link_status.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_link_status.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_link_status.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_mlag_config.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_mlag_config.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_mlag_config.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_mlag_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_mlag_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_mlag_config.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_mlag_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_mlag_config.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_mlag_interface.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_mlag_interface.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_mlag_interface.py pylint:blacklisted-name
lib/ansible/modules/network/cloudengine/ce_mlag_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_mlag_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_mlag_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_mlag_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_mlag_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_mlag_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_mtu.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_mtu.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_mtu.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_mtu.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_mtu.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_mtu.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cloudengine/ce_mtu.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_mtu.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_mtu.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_netconf.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_netconf.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_netconf.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_netconf.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_netconf.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_netconf.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_netconf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_netconf.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_netstream_aging.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_netstream_aging.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_netstream_aging.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_netstream_aging.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_netstream_aging.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_netstream_aging.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_netstream_aging.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_netstream_aging.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_netstream_export.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_netstream_export.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_netstream_export.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_netstream_export.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_netstream_export.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_netstream_export.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_netstream_export.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_netstream_export.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_netstream_global.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_netstream_global.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_netstream_global.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_netstream_global.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_netstream_global.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_netstream_global.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_netstream_global.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_netstream_global.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_netstream_template.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_netstream_template.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_netstream_template.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_netstream_template.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_netstream_template.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_netstream_template.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_netstream_template.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_netstream_template.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_ntp.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_ntp.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_ntp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_ntp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_ntp.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_ntp.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_ntp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_ntp.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_ntp_auth.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_ntp_auth.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_ntp_auth.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_ntp_auth.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_ntp_auth.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_ntp_auth.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_ntp_auth.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_ntp_auth.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_ospf.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_ospf.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_ospf.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_ospf.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_ospf.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_ospf.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_ospf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_ospf.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_ospf_vrf.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_ospf_vrf.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_ospf_vrf.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_ospf_vrf.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_ospf_vrf.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_ospf_vrf.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_ospf_vrf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_ospf_vrf.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_reboot.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_reboot.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_reboot.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_reboot.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_reboot.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_reboot.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_reboot.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_reboot.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_rollback.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_rollback.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_rollback.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_rollback.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_rollback.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_rollback.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cloudengine/ce_rollback.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_rollback.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_rollback.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_sflow.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_sflow.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_sflow.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_sflow.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_sflow.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_sflow.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_sflow.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cloudengine/ce_sflow.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_sflow.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_snmp_community.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_snmp_community.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_snmp_community.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_snmp_community.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_snmp_community.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_snmp_community.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_snmp_community.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_snmp_community.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_snmp_contact.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_snmp_contact.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_snmp_contact.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_snmp_contact.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_snmp_contact.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_snmp_contact.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_snmp_contact.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_snmp_contact.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_snmp_location.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_snmp_location.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_snmp_location.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_snmp_location.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_snmp_location.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_snmp_location.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_snmp_location.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_snmp_location.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_snmp_target_host.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_snmp_target_host.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_snmp_target_host.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_snmp_target_host.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_snmp_target_host.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_snmp_target_host.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_snmp_target_host.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_snmp_target_host.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_snmp_traps.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_snmp_traps.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_snmp_traps.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_snmp_traps.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_snmp_traps.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_snmp_traps.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_snmp_traps.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_snmp_traps.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_snmp_user.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_snmp_user.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_snmp_user.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_snmp_user.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_snmp_user.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_snmp_user.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_snmp_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_snmp_user.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_startup.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_startup.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_startup.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_startup.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_startup.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_startup.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_startup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_startup.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_static_route.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_static_route.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_static_route.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_static_route.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_static_route.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_static_route.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_static_route.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_static_route.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_static_route_bfd.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cloudengine/ce_static_route_bfd.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cloudengine/ce_stp.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_stp.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_stp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_stp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_stp.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_stp.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_stp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_stp.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_switchport.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_switchport.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_switchport.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_switchport.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_switchport.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_switchport.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_switchport.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_switchport.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_vlan.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_vlan.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_vlan.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vlan.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_vlan.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_vlan.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_vrf.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_vrf.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_vrf.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vrf.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vrf.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_vrf.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_vrf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_vrf.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_vrf_af.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_vrf_af.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_vrf_af.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vrf_af.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vrf_af.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_vrf_af.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_vrf_af.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_vrf_af.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_vrf_interface.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_vrf_interface.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_vrf_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vrf_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vrf_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_vrf_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_vrf_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_vrf_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_vrrp.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_vrrp.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_vrrp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vrrp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vrrp.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_vrrp.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_vrrp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_vrrp.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_vxlan_arp.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_vxlan_arp.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_vxlan_arp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vxlan_arp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vxlan_arp.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_vxlan_arp.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_vxlan_arp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_vxlan_arp.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_vxlan_gateway.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_vxlan_gateway.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_vxlan_gateway.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vxlan_gateway.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vxlan_gateway.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_vxlan_gateway.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_vxlan_gateway.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_vxlan_gateway.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_vxlan_global.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_vxlan_global.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_vxlan_global.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vxlan_global.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vxlan_global.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_vxlan_global.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_vxlan_global.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_vxlan_global.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_vxlan_tunnel.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_vxlan_tunnel.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_vxlan_tunnel.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vxlan_tunnel.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vxlan_tunnel.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_vxlan_tunnel.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_vxlan_tunnel.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cloudengine/ce_vxlan_tunnel.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_vxlan_tunnel.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudengine/ce_vxlan_vap.py future-import-boilerplate
lib/ansible/modules/network/cloudengine/ce_vxlan_vap.py metaclass-boilerplate
lib/ansible/modules/network/cloudengine/ce_vxlan_vap.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vxlan_vap.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cloudengine/ce_vxlan_vap.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudengine/ce_vxlan_vap.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cloudengine/ce_vxlan_vap.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cloudengine/ce_vxlan_vap.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cloudvision/cv_server_provision.py pylint:blacklisted-name
lib/ansible/modules/network/cloudvision/cv_server_provision.py validate-modules:doc-missing-type
lib/ansible/modules/network/cloudvision/cv_server_provision.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cloudvision/cv_server_provision.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/cnos/cnos_backup.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_backup.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_backup.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_backup.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/cnos/cnos_backup.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cnos/cnos_backup.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/cnos/cnos_banner.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cnos/cnos_banner.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_banner.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_banner.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cnos/cnos_banner.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cnos/cnos_bgp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_bgp.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_bgp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_bgp.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/cnos/cnos_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cnos/cnos_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cnos/cnos_conditional_command.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_conditional_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_conditional_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_conditional_command.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/cnos/cnos_conditional_template.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_conditional_template.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_conditional_template.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_conditional_template.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/cnos/cnos_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cnos/cnos_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cnos/cnos_config.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/cnos/cnos_factory.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_factory.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_factory.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/cnos/cnos_facts.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/cnos/cnos_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cnos/cnos_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cnos/cnos_facts.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/cnos/cnos_image.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_image.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_image.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_image.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/cnos/cnos_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cnos/cnos_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/cnos/cnos_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cnos/cnos_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cnos/cnos_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cnos/cnos_l2_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_l2_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cnos/cnos_l2_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/cnos/cnos_l2_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_l2_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_l2_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cnos/cnos_l2_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cnos/cnos_l2_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cnos/cnos_l3_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_l3_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cnos/cnos_l3_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/cnos/cnos_l3_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_l3_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_l3_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cnos/cnos_l3_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cnos/cnos_l3_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cnos/cnos_linkagg.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_linkagg.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cnos/cnos_linkagg.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/cnos/cnos_linkagg.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_linkagg.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_linkagg.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cnos/cnos_linkagg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cnos/cnos_linkagg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cnos/cnos_linkagg.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cnos/cnos_lldp.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_logging.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_logging.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/cnos/cnos_logging.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_logging.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cnos/cnos_logging.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cnos/cnos_logging.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cnos/cnos_reload.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_reload.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_reload.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/cnos/cnos_rollback.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_rollback.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_rollback.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_rollback.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/cnos/cnos_rollback.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cnos/cnos_rollback.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/cnos/cnos_save.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_save.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_save.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/cnos/cnos_showrun.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_showrun.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/cnos/cnos_showrun.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/cnos/cnos_static_route.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_static_route.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/cnos/cnos_static_route.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_static_route.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_static_route.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cnos/cnos_static_route.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cnos/cnos_static_route.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cnos/cnos_system.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_system.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cnos/cnos_system.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cnos/cnos_template.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_template.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_template.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_template.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/cnos/cnos_user.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_user.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/cnos/cnos_user.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_user.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cnos/cnos_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cnos/cnos_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cnos/cnos_user.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cnos/cnos_vlag.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_vlag.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_vlag.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_vlag.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/cnos/cnos_vlan.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_vlan.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/cnos/cnos_vlan.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/cnos/cnos_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_vlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_vlan.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cnos/cnos_vlan.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cnos/cnos_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cnos/cnos_vlan.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cnos/cnos_vrf.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/cnos/cnos_vrf.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/cnos/cnos_vrf.py validate-modules:doc-missing-type
lib/ansible/modules/network/cnos/cnos_vrf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/cnos/cnos_vrf.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/cnos/cnos_vrf.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cnos/cnos_vrf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/cnos/cnos_vrf.py validate-modules:undocumented-parameter
lib/ansible/modules/network/cumulus/nclu.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/cumulus/nclu.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/dellos10/dellos10_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/dellos10/dellos10_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/dellos10/dellos10_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/dellos10/dellos10_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/dellos10/dellos10_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/dellos10/dellos10_command.py validate-modules:undocumented-parameter
lib/ansible/modules/network/dellos10/dellos10_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/dellos10/dellos10_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/dellos10/dellos10_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/dellos10/dellos10_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/dellos10/dellos10_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/dellos10/dellos10_config.py validate-modules:undocumented-parameter
lib/ansible/modules/network/dellos10/dellos10_facts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/dellos10/dellos10_facts.py validate-modules:doc-missing-type
lib/ansible/modules/network/dellos10/dellos10_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/dellos10/dellos10_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/dellos10/dellos10_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/dellos10/dellos10_facts.py validate-modules:undocumented-parameter
lib/ansible/modules/network/dellos6/dellos6_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/dellos6/dellos6_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/dellos6/dellos6_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/dellos6/dellos6_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/dellos6/dellos6_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/dellos6/dellos6_command.py validate-modules:undocumented-parameter
lib/ansible/modules/network/dellos6/dellos6_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/dellos6/dellos6_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/dellos6/dellos6_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/dellos6/dellos6_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/dellos6/dellos6_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/dellos6/dellos6_config.py validate-modules:undocumented-parameter
lib/ansible/modules/network/dellos6/dellos6_facts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/dellos6/dellos6_facts.py validate-modules:doc-missing-type
lib/ansible/modules/network/dellos6/dellos6_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/dellos6/dellos6_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/dellos6/dellos6_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/dellos6/dellos6_facts.py validate-modules:undocumented-parameter
lib/ansible/modules/network/dellos9/dellos9_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/dellos9/dellos9_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/dellos9/dellos9_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/dellos9/dellos9_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/dellos9/dellos9_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/dellos9/dellos9_command.py validate-modules:undocumented-parameter
lib/ansible/modules/network/dellos9/dellos9_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/dellos9/dellos9_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/dellos9/dellos9_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/dellos9/dellos9_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/dellos9/dellos9_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/dellos9/dellos9_config.py validate-modules:undocumented-parameter
lib/ansible/modules/network/dellos9/dellos9_facts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/dellos9/dellos9_facts.py validate-modules:doc-missing-type
lib/ansible/modules/network/dellos9/dellos9_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/dellos9/dellos9_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/dellos9/dellos9_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/dellos9/dellos9_facts.py validate-modules:undocumented-parameter
lib/ansible/modules/network/edgeos/edgeos_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/edgeos/edgeos_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/edgeos/edgeos_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/edgeos/edgeos_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/edgeos/edgeos_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/edgeos/edgeos_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/edgeos/edgeos_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/edgeos/edgeos_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/edgeswitch/edgeswitch_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/edgeswitch/edgeswitch_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/edgeswitch/edgeswitch_vlan.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/edgeswitch/edgeswitch_vlan.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/edgeswitch/edgeswitch_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/network/edgeswitch/edgeswitch_vlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/edgeswitch/edgeswitch_vlan.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/edgeswitch/edgeswitch_vlan.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/edgeswitch/edgeswitch_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/edgeswitch/edgeswitch_vlan.py validate-modules:undocumented-parameter
lib/ansible/modules/network/enos/enos_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/enos/enos_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/enos/enos_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/enos/enos_command.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/enos/enos_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/enos/enos_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/enos/enos_command.py validate-modules:undocumented-parameter
lib/ansible/modules/network/enos/enos_command.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/enos/enos_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/enos/enos_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/enos/enos_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/enos/enos_config.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/enos/enos_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/enos/enos_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/enos/enos_config.py validate-modules:undocumented-parameter
lib/ansible/modules/network/enos/enos_facts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/enos/enos_facts.py validate-modules:doc-missing-type
lib/ansible/modules/network/enos/enos_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/enos/enos_facts.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/enos/enos_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/enos/enos_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/enos/enos_facts.py validate-modules:undocumented-parameter
lib/ansible/modules/network/enos/enos_facts.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/eos/_eos_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/eos/_eos_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/eos/_eos_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/_eos_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/_eos_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/eos/_eos_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/eos/_eos_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/eos/_eos_l2_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/eos/_eos_l2_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/eos/_eos_l2_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/_eos_l2_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/_eos_l2_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/eos/_eos_l2_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/eos/_eos_l2_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/eos/_eos_l3_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/eos/_eos_l3_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/eos/_eos_l3_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/_eos_l3_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/_eos_l3_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/eos/_eos_l3_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/eos/_eos_l3_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/eos/_eos_linkagg.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/eos/_eos_linkagg.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/eos/_eos_linkagg.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/_eos_linkagg.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/_eos_linkagg.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/eos/_eos_linkagg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/eos/_eos_linkagg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/eos/_eos_linkagg.py validate-modules:undocumented-parameter
lib/ansible/modules/network/eos/_eos_vlan.py future-import-boilerplate
lib/ansible/modules/network/eos/_eos_vlan.py metaclass-boilerplate
lib/ansible/modules/network/eos/_eos_vlan.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/eos/_eos_vlan.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/eos/_eos_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/_eos_vlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/_eos_vlan.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/eos/_eos_vlan.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/eos/_eos_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/eos/_eos_vlan.py validate-modules:undocumented-parameter
lib/ansible/modules/network/eos/eos_banner.py future-import-boilerplate
lib/ansible/modules/network/eos/eos_banner.py metaclass-boilerplate
lib/ansible/modules/network/eos/eos_banner.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/eos_banner.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/eos_bgp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/eos/eos_bgp.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/eos/eos_bgp.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/eos_bgp.py validate-modules:doc-type-does-not-match-spec
lib/ansible/modules/network/eos/eos_bgp.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/eos/eos_bgp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/eos/eos_command.py future-import-boilerplate
lib/ansible/modules/network/eos/eos_command.py metaclass-boilerplate
lib/ansible/modules/network/eos/eos_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/eos_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/eos_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/eos/eos_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/eos/eos_config.py future-import-boilerplate
lib/ansible/modules/network/eos/eos_config.py metaclass-boilerplate
lib/ansible/modules/network/eos/eos_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/eos_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/eos_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/eos/eos_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/eos/eos_eapi.py future-import-boilerplate
lib/ansible/modules/network/eos/eos_eapi.py metaclass-boilerplate
lib/ansible/modules/network/eos/eos_eapi.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/eos/eos_eapi.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/eos_eapi.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/eos_eapi.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/eos/eos_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/eos_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/eos/eos_interfaces.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/eos/eos_l2_interfaces.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/eos/eos_lldp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/eos/eos_lldp.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/eos_lldp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/eos_logging.py future-import-boilerplate
lib/ansible/modules/network/eos/eos_logging.py metaclass-boilerplate
lib/ansible/modules/network/eos/eos_logging.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/eos/eos_logging.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/eos/eos_logging.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/eos_logging.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/eos_logging.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/eos/eos_logging.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/eos/eos_logging.py validate-modules:undocumented-parameter
lib/ansible/modules/network/eos/eos_static_route.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/eos/eos_static_route.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/eos/eos_static_route.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/eos_static_route.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/eos_static_route.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/eos/eos_static_route.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/eos/eos_static_route.py validate-modules:undocumented-parameter
lib/ansible/modules/network/eos/eos_system.py future-import-boilerplate
lib/ansible/modules/network/eos/eos_system.py metaclass-boilerplate
lib/ansible/modules/network/eos/eos_system.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/eos_system.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/eos_system.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/eos/eos_system.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/eos/eos_user.py future-import-boilerplate
lib/ansible/modules/network/eos/eos_user.py metaclass-boilerplate
lib/ansible/modules/network/eos/eos_user.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/eos/eos_user.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/eos/eos_user.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/eos_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/eos_user.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/eos/eos_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/eos/eos_user.py validate-modules:undocumented-parameter
lib/ansible/modules/network/eos/eos_vrf.py future-import-boilerplate
lib/ansible/modules/network/eos/eos_vrf.py metaclass-boilerplate
lib/ansible/modules/network/eos/eos_vrf.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/eos/eos_vrf.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/eos/eos_vrf.py validate-modules:doc-missing-type
lib/ansible/modules/network/eos/eos_vrf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/eos/eos_vrf.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/eos/eos_vrf.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/eos/eos_vrf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/eos/eos_vrf.py validate-modules:undocumented-parameter
lib/ansible/modules/network/eric_eccli/eric_eccli_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/exos/exos_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/exos/exos_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/exos/exos_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/exos/exos_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/exos/exos_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/exos/exos_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/exos/exos_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/exos/exos_l2_interfaces.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/_bigip_asm_policy.py validate-modules:doc-missing-type
lib/ansible/modules/network/f5/_bigip_asm_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/_bigip_asm_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/f5/_bigip_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/_bigip_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/f5/_bigip_gtm_facts.py validate-modules:doc-missing-type
lib/ansible/modules/network/f5/_bigip_gtm_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/_bigip_gtm_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/_bigip_gtm_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/f5/bigip_apm_acl.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/f5/bigip_apm_acl.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_apm_network_access.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_apm_network_access.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_apm_policy_fetch.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_apm_policy_import.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_appsvcs_extension.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_asm_dos_application.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/f5/bigip_asm_dos_application.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_asm_dos_application.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_asm_policy_fetch.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_asm_policy_import.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_asm_policy_manage.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_asm_policy_server_technology.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_asm_policy_signature_set.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_cli_alias.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_cli_script.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_configsync_action.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_data_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_data_group.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/f5/bigip_data_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_device_auth.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_auth_ldap.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_auth_ldap.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_device_certificate.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_connectivity.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_connectivity.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_device_dns.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_dns.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_device_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_group_member.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_ha_group.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/f5/bigip_device_ha_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_httpd.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_httpd.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_device_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_device_info.py validate-modules:return-syntax-error
lib/ansible/modules/network/f5/bigip_device_license.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_ntp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_ntp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_device_sshd.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_sshd.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_device_syslog.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_traffic_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_device_traffic_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_device_trust.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_dns_cache_resolver.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/f5/bigip_dns_cache_resolver.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_dns_nameserver.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_dns_resolver.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_dns_zone.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_dns_zone.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_file_copy.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_firewall_address_list.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/f5/bigip_firewall_address_list.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/f5/bigip_firewall_address_list.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_firewall_address_list.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_firewall_dos_profile.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_firewall_dos_vector.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_firewall_global_rules.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_firewall_log_profile.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_firewall_log_profile_network.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_firewall_log_profile_network.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/network/f5/bigip_firewall_log_profile_network.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_firewall_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_firewall_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_firewall_port_list.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_firewall_port_list.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_firewall_rule.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/f5/bigip_firewall_rule.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_firewall_rule_list.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_firewall_rule_list.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_firewall_schedule.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_firewall_schedule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_gtm_datacenter.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_global.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_monitor_bigip.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_monitor_external.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_monitor_firepass.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_monitor_http.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_monitor_https.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_monitor_tcp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_monitor_tcp_half_open.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_pool.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_pool.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_gtm_pool_member.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/f5/bigip_gtm_pool_member.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/f5/bigip_gtm_pool_member.py validate-modules:doc-missing-type
lib/ansible/modules/network/f5/bigip_gtm_pool_member.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_pool_member.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/f5/bigip_gtm_pool_member.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/f5/bigip_gtm_pool_member.py validate-modules:undocumented-parameter
lib/ansible/modules/network/f5/bigip_gtm_server.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_server.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_gtm_topology_record.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_topology_region.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/f5/bigip_gtm_topology_region.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_virtual_server.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_virtual_server.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_gtm_wide_ip.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_gtm_wide_ip.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_hostname.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_iapp_service.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_iapp_service.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_iapp_template.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_ike_peer.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_ike_peer.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_imish_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_imish_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_imish_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/f5/bigip_imish_config.py validate-modules:undocumented-parameter
lib/ansible/modules/network/f5/bigip_ipsec_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_irule.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_log_destination.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_log_destination.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/f5/bigip_log_publisher.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_log_publisher.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_lx_package.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_management_route.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_message_routing_peer.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_message_routing_protocol.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_message_routing_route.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_message_routing_route.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_message_routing_router.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_message_routing_router.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_message_routing_transport_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_message_routing_transport_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_monitor_dns.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_monitor_external.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_monitor_gateway_icmp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_monitor_http.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_monitor_https.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_monitor_ldap.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_monitor_snmp_dca.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_monitor_tcp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_monitor_tcp_echo.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_monitor_tcp_half_open.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_monitor_udp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_node.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_node.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_partition.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_password_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_policy_rule.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/f5/bigip_policy_rule.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_policy_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_pool.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/f5/bigip_pool.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/f5/bigip_pool.py validate-modules:doc-missing-type
lib/ansible/modules/network/f5/bigip_pool.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_pool.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/f5/bigip_pool.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_pool.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/f5/bigip_pool.py validate-modules:undocumented-parameter
lib/ansible/modules/network/f5/bigip_pool_member.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/f5/bigip_pool_member.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/f5/bigip_pool_member.py validate-modules:doc-missing-type
lib/ansible/modules/network/f5/bigip_pool_member.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_pool_member.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/f5/bigip_pool_member.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_pool_member.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/f5/bigip_pool_member.py validate-modules:undocumented-parameter
lib/ansible/modules/network/f5/bigip_profile_analytics.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_profile_analytics.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_profile_client_ssl.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_profile_client_ssl.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_profile_dns.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_profile_fastl4.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_profile_http.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_profile_http.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_profile_http2.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_profile_http2.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/f5/bigip_profile_http2.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_profile_http_compression.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_profile_oneconnect.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_profile_persistence_cookie.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_profile_persistence_src_addr.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_profile_server_ssl.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_profile_tcp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_profile_udp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_provision.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_qkview.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_qkview.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_remote_role.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_remote_syslog.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_remote_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_routedomain.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_routedomain.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_selfip.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_selfip.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_service_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_smtp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_snat_pool.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_snat_pool.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_snat_translation.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_snmp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_snmp_community.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_snmp_trap.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_software_image.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_software_install.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_software_update.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_ssl_certificate.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_ssl_key.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_ssl_ocsp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_static_route.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_sys_daemon_log_tmm.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_sys_db.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_sys_global.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_timer_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_timer_policy.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/f5/bigip_timer_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_traffic_selector.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_trunk.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_trunk.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_tunnel.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_tunnel.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/f5/bigip_ucs.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_ucs_fetch.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_ucs_fetch.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/f5/bigip_ucs_fetch.py validate-modules:undocumented-parameter
lib/ansible/modules/network/f5/bigip_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_vcmp_guest.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_vcmp_guest.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_virtual_address.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_virtual_server.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_virtual_server.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_vlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigip_vlan.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigip_wait.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigiq_application_fasthttp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigiq_application_fasthttp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigiq_application_fastl4_tcp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigiq_application_fastl4_tcp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigiq_application_fastl4_udp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigiq_application_fastl4_udp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigiq_application_http.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigiq_application_http.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigiq_application_https_offload.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigiq_application_https_offload.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/f5/bigiq_application_https_offload.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigiq_application_https_waf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigiq_application_https_waf.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/f5/bigiq_application_https_waf.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigiq_device_discovery.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigiq_device_discovery.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigiq_device_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigiq_device_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/f5/bigiq_device_info.py validate-modules:return-syntax-error
lib/ansible/modules/network/f5/bigiq_regkey_license.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigiq_regkey_license_assignment.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigiq_regkey_pool.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigiq_utility_license.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/f5/bigiq_utility_license_assignment.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/fortianalyzer/faz_device.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/fortimanager/fmgr_device.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/fortimanager/fmgr_device.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_device_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_device_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/fortimanager/fmgr_device_group.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_device_provision_template.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/fortimanager/fmgr_device_provision_template.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_fwobj_address.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_fwobj_ippool.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortimanager/fmgr_fwobj_ippool.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_fwobj_ippool6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortimanager/fmgr_fwobj_ippool6.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_fwobj_service.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_fwobj_vip.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortimanager/fmgr_fwobj_vip.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_fwpol_ipv4.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortimanager/fmgr_fwpol_ipv4.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_fwpol_package.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/fortimanager/fmgr_fwpol_package.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_ha.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_provisioning.py validate-modules:doc-missing-type
lib/ansible/modules/network/fortimanager/fmgr_provisioning.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/fortimanager/fmgr_provisioning.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_query.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortimanager/fmgr_query.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_script.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/fortimanager/fmgr_script.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/fortimanager/fmgr_script.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_secprof_appctrl.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortimanager/fmgr_secprof_appctrl.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_secprof_av.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortimanager/fmgr_secprof_av.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_secprof_dns.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_secprof_ips.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortimanager/fmgr_secprof_ips.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_secprof_profile_group.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_secprof_proxy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortimanager/fmgr_secprof_proxy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_secprof_spam.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortimanager/fmgr_secprof_spam.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_secprof_ssl_ssh.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortimanager/fmgr_secprof_ssl_ssh.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_secprof_voip.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_secprof_waf.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortimanager/fmgr_secprof_waf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_secprof_wanopt.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortimanager/fmgr_secprof_web.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortimanager/fmgr_secprof_web.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortios/fortios_address.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/fortios/fortios_address.py validate-modules:doc-missing-type
lib/ansible/modules/network/fortios/fortios_antivirus_quarantine.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/fortios/fortios_application_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_application_list.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_application_name.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_authentication_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_authentication_scheme.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortios/fortios_dlp_filepattern.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_dlp_sensor.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_dnsfilter_domain_filter.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_dnsfilter_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_endpoint_control_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_DoS_policy.py validate-modules:parameter-invalid
lib/ansible/modules/network/fortios/fortios_firewall_DoS_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_DoS_policy6.py validate-modules:parameter-invalid
lib/ansible/modules/network/fortios/fortios_firewall_DoS_policy6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_address.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_address6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_address6_template.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_addrgrp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_addrgrp6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_auth_portal.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_central_snat_map.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_identity_based_route.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_interface_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_interface_policy6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_internet_service.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_internet_service_custom.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_internet_service_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_local_in_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_local_in_policy6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_multicast_address.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_multicast_address6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_multicast_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_multicast_policy6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_policy.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/fortios/fortios_firewall_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_policy46.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_policy6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_policy64.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_proxy_address.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_proxy_addrgrp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_proxy_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_schedule_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_service_custom.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_service_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_shaping_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_shaping_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_sniffer.py validate-modules:parameter-invalid
lib/ansible/modules/network/fortios/fortios_firewall_sniffer.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_ssl_ssh_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_ttl_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_vip.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_vip46.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_vip6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_vip64.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_vipgrp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_vipgrp46.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_vipgrp6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_vipgrp64.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_firewall_wildcard_fqdn_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_ips_decoder.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_ips_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_ips_sensor.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_ipv4_policy.py validate-modules:doc-missing-type
lib/ansible/modules/network/fortios/fortios_ipv4_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_ipv4_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortios/fortios_log_setting.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_log_syslogd2_setting.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_log_syslogd3_setting.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_log_syslogd4_setting.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_log_syslogd_override_setting.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_log_syslogd_setting.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_log_threat_weight.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_report_chart.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/fortios/fortios_report_chart.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_report_dataset.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_report_layout.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_access_list.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_access_list6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_aspath_list.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_bfd.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_bfd6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_bgp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_community_list.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_isis.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_key_chain.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_multicast.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_multicast6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_multicast_flow.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_ospf.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_ospf6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_prefix_list.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_prefix_list6.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_rip.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_ripng.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_router_route_map.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_spamfilter_bwl.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_spamfilter_bword.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_spamfilter_dnsbl.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_spamfilter_iptrust.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_spamfilter_mheader.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_spamfilter_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_ssh_filter_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_switch_controller_global.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_switch_controller_lldp_profile.py validate-modules:parameter-invalid
lib/ansible/modules/network/fortios/fortios_switch_controller_lldp_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_switch_controller_managed_switch.py validate-modules:parameter-invalid
lib/ansible/modules/network/fortios/fortios_switch_controller_managed_switch.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_switch_controller_qos_ip_dscp_map.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_switch_controller_qos_queue_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_switch_controller_quarantine.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_switch_controller_security_policy_802_1X.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_switch_controller_switch_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_switch_controller_vlan.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_admin.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_alarm.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_api_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_automation_action.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_automation_destination.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_automation_stitch.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_central_management.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_cluster_sync.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_csf.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_ddns.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_dhcp6_server.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_dhcp_server.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/fortios/fortios_system_dhcp_server.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_dns.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_dns_database.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_geoip_override.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_global.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/fortios/fortios_system_global.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_ha.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_interface.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_link_monitor.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_mobile_tunnel.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_nat64.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_nd_proxy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_ntp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_object_tagging.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_replacemsg_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_sdn_connector.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_session_ttl.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_settings.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_snmp_community.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_snmp_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_switch_interface.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_vdom_exception.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_virtual_wan_link.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_virtual_wire_pair.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_vxlan.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_system_zone.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_user_device.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_user_device_access_list.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_user_device_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_user_fsso_polling.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_user_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_user_peergrp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_user_quarantine.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_user_radius.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_user_security_exempt_list.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_user_setting.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_voip_profile.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/fortios/fortios_vpn_ipsec_concentrator.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_vpn_ipsec_manualkey.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/fortios/fortios_vpn_ipsec_manualkey_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/fortios/fortios_vpn_ipsec_phase1.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_vpn_ipsec_phase1_interface.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_vpn_ipsec_phase2_interface.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_vpn_ssl_settings.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_vpn_ssl_web_host_check_software.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_vpn_ssl_web_portal.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_vpn_ssl_web_user_bookmark.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_vpn_ssl_web_user_group_bookmark.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_waf_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wanopt_cache_service.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wanopt_content_delivery_network_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_web_proxy_explicit.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_web_proxy_forward_server_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_web_proxy_global.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_web_proxy_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_webfilter.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/fortios/fortios_webfilter.py validate-modules:doc-choices-incompatible-type
lib/ansible/modules/network/fortios/fortios_webfilter.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/fortios/fortios_webfilter.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/fortios/fortios_webfilter.py validate-modules:parameter-invalid
lib/ansible/modules/network/fortios/fortios_webfilter.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_webfilter.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/fortios/fortios_webfilter_content.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_webfilter_content_header.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_webfilter_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_webfilter_urlfilter.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_bonjour_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_hotspot20_anqp_3gpp_cellular.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_hotspot20_anqp_nai_realm.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_hotspot20_anqp_roaming_consortium.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_hotspot20_anqp_venue_name.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_hotspot20_h2qp_operator_name.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_hotspot20_h2qp_osu_provider.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_hotspot20_hs_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_hotspot20_icon.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_hotspot20_qos_map.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_inter_controller.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_qos_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_setting.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/fortios/fortios_wireless_controller_timers.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_vap.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_vap_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_wtp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/fortios/fortios_wireless_controller_wtp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_wtp_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/fortios/fortios_wireless_controller_wtp_profile.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/fortios/fortios_wireless_controller_wtp_profile.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/frr/frr_bgp.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/frr/frr_bgp.py validate-modules:doc-missing-type
lib/ansible/modules/network/frr/frr_bgp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/frr/frr_bgp.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/frr/frr_bgp.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/frr/frr_bgp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/frr/frr_bgp.py validate-modules:undocumented-parameter
lib/ansible/modules/network/frr/frr_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/frr/frr_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/icx/icx_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/icx/icx_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/icx/icx_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/icx/icx_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/icx/icx_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/icx/icx_l3_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/icx/icx_l3_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/icx/icx_linkagg.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/icx/icx_linkagg.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/icx/icx_linkagg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/icx/icx_lldp.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/icx/icx_lldp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/icx/icx_logging.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/icx/icx_logging.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/icx/icx_static_route.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/icx/icx_static_route.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/icx/icx_system.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/icx/icx_system.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/icx/icx_user.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/icx/icx_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/icx/icx_vlan.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/icx/icx_vlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/icx/icx_vlan.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/illumos/dladm_etherstub.py pylint:blacklisted-name
lib/ansible/modules/network/illumos/dladm_etherstub.py validate-modules:doc-missing-type
lib/ansible/modules/network/illumos/dladm_iptun.py pylint:blacklisted-name
lib/ansible/modules/network/illumos/dladm_iptun.py validate-modules:doc-missing-type
lib/ansible/modules/network/illumos/dladm_iptun.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/illumos/dladm_linkprop.py pylint:blacklisted-name
lib/ansible/modules/network/illumos/dladm_linkprop.py validate-modules:doc-missing-type
lib/ansible/modules/network/illumos/dladm_linkprop.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/network/illumos/dladm_linkprop.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/illumos/dladm_vlan.py pylint:blacklisted-name
lib/ansible/modules/network/illumos/dladm_vlan.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/illumos/dladm_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/network/illumos/dladm_vlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/illumos/dladm_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/illumos/dladm_vnic.py pylint:blacklisted-name
lib/ansible/modules/network/illumos/dladm_vnic.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/illumos/dladm_vnic.py validate-modules:doc-missing-type
lib/ansible/modules/network/illumos/flowadm.py pylint:blacklisted-name
lib/ansible/modules/network/illumos/flowadm.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/illumos/flowadm.py validate-modules:doc-missing-type
lib/ansible/modules/network/illumos/ipadm_addr.py pylint:blacklisted-name
lib/ansible/modules/network/illumos/ipadm_addr.py validate-modules:doc-missing-type
lib/ansible/modules/network/illumos/ipadm_addr.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/illumos/ipadm_addrprop.py pylint:blacklisted-name
lib/ansible/modules/network/illumos/ipadm_addrprop.py validate-modules:doc-missing-type
lib/ansible/modules/network/illumos/ipadm_addrprop.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/network/illumos/ipadm_if.py pylint:blacklisted-name
lib/ansible/modules/network/illumos/ipadm_if.py validate-modules:doc-missing-type
lib/ansible/modules/network/illumos/ipadm_ifprop.py pylint:blacklisted-name
lib/ansible/modules/network/illumos/ipadm_ifprop.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/illumos/ipadm_ifprop.py validate-modules:doc-missing-type
lib/ansible/modules/network/illumos/ipadm_ifprop.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/network/illumos/ipadm_prop.py pylint:blacklisted-name
lib/ansible/modules/network/illumos/ipadm_prop.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/illumos/ipadm_prop.py validate-modules:doc-missing-type
lib/ansible/modules/network/ingate/ig_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/ingate/ig_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ingate/ig_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ingate/ig_config.py validate-modules:return-syntax-error
lib/ansible/modules/network/ingate/ig_unit_information.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ingate/ig_unit_information.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/_ios_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/ios/_ios_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/_ios_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/ios/_ios_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/_ios_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/_ios_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/ios/_ios_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/_ios_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/ios/_ios_l2_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/ios/_ios_l2_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/_ios_l2_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/ios/_ios_l2_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/_ios_l2_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/_ios_l2_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/ios/_ios_l2_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/_ios_l2_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/ios/_ios_l3_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/ios/_ios_l3_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/_ios_l3_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/ios/_ios_l3_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/_ios_l3_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/_ios_l3_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/ios/_ios_l3_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/_ios_l3_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/ios/_ios_vlan.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/ios/_ios_vlan.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/_ios_vlan.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/ios/_ios_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/_ios_vlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/_ios_vlan.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/ios/_ios_vlan.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ios/_ios_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/_ios_vlan.py validate-modules:undocumented-parameter
lib/ansible/modules/network/ios/ios_banner.py future-import-boilerplate
lib/ansible/modules/network/ios/ios_banner.py metaclass-boilerplate
lib/ansible/modules/network/ios/ios_banner.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/ios_banner.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/ios_banner.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/ios_bgp.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/ios/ios_bgp.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/ios_bgp.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/ios/ios_bgp.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/ios/ios_bgp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/ios_command.py future-import-boilerplate
lib/ansible/modules/network/ios/ios_command.py metaclass-boilerplate
lib/ansible/modules/network/ios/ios_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/ios_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/ios_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/ios_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ios/ios_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/ios_config.py future-import-boilerplate
lib/ansible/modules/network/ios/ios_config.py metaclass-boilerplate
lib/ansible/modules/network/ios/ios_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/ios_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/ios_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/ios_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ios/ios_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/ios_facts.py future-import-boilerplate
lib/ansible/modules/network/ios/ios_facts.py metaclass-boilerplate
lib/ansible/modules/network/ios/ios_facts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/ios_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/ios_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ios/ios_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/ios_interfaces.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/ios/ios_l2_interfaces.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ios/ios_l3_interfaces.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/ios/ios_l3_interfaces.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/ios/ios_l3_interfaces.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ios/ios_lag_interfaces.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/ios/ios_lag_interfaces.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/ios_linkagg.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/ios/ios_linkagg.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/ios_linkagg.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/ios/ios_linkagg.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/ios_linkagg.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/ios_linkagg.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/ios/ios_linkagg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ios/ios_linkagg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/ios_linkagg.py validate-modules:undocumented-parameter
lib/ansible/modules/network/ios/ios_lldp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/ios/ios_lldp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/ios_lldp.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/ios_lldp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/ios_logging.py future-import-boilerplate
lib/ansible/modules/network/ios/ios_logging.py metaclass-boilerplate
lib/ansible/modules/network/ios/ios_logging.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/ios/ios_logging.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/ios_logging.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/ios/ios_logging.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/ios_logging.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/ios_logging.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/ios/ios_logging.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/ios_logging.py validate-modules:undocumented-parameter
lib/ansible/modules/network/ios/ios_ntp.py future-import-boilerplate
lib/ansible/modules/network/ios/ios_ntp.py metaclass-boilerplate
lib/ansible/modules/network/ios/ios_ntp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/ios_ntp.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/ios_ntp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/ios_ping.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/ios_ping.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/ios_ping.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/ios_static_route.py future-import-boilerplate
lib/ansible/modules/network/ios/ios_static_route.py metaclass-boilerplate
lib/ansible/modules/network/ios/ios_static_route.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/ios/ios_static_route.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/ios_static_route.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/ios/ios_static_route.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/ios_static_route.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/ios_static_route.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/ios/ios_static_route.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/ios_static_route.py validate-modules:undocumented-parameter
lib/ansible/modules/network/ios/ios_system.py future-import-boilerplate
lib/ansible/modules/network/ios/ios_system.py metaclass-boilerplate
lib/ansible/modules/network/ios/ios_system.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/ios_system.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/ios_system.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/ios_system.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ios/ios_system.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/ios_user.py future-import-boilerplate
lib/ansible/modules/network/ios/ios_user.py metaclass-boilerplate
lib/ansible/modules/network/ios/ios_user.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/ios/ios_user.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/ios_user.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/ios/ios_user.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/ios_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/ios_user.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/ios/ios_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ios/ios_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ios/ios_user.py validate-modules:undocumented-parameter
lib/ansible/modules/network/ios/ios_vrf.py future-import-boilerplate
lib/ansible/modules/network/ios/ios_vrf.py metaclass-boilerplate
lib/ansible/modules/network/ios/ios_vrf.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ios/ios_vrf.py validate-modules:doc-missing-type
lib/ansible/modules/network/ios/ios_vrf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ios/ios_vrf.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ios/ios_vrf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/iosxr/_iosxr_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/iosxr/_iosxr_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/iosxr/_iosxr_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/iosxr/_iosxr_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/iosxr/_iosxr_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/iosxr/_iosxr_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/iosxr/_iosxr_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/iosxr/_iosxr_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/iosxr/iosxr_banner.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_banner.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_banner.py validate-modules:doc-missing-type
lib/ansible/modules/network/iosxr/iosxr_banner.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/iosxr/iosxr_banner.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/iosxr/iosxr_banner.py validate-modules:undocumented-parameter
lib/ansible/modules/network/iosxr/iosxr_bgp.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/iosxr/iosxr_bgp.py validate-modules:doc-missing-type
lib/ansible/modules/network/iosxr/iosxr_bgp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/iosxr/iosxr_bgp.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/iosxr/iosxr_bgp.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/iosxr/iosxr_bgp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/iosxr/iosxr_bgp.py validate-modules:undocumented-parameter
lib/ansible/modules/network/iosxr/iosxr_command.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/iosxr/iosxr_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/iosxr/iosxr_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/iosxr/iosxr_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/iosxr/iosxr_command.py validate-modules:undocumented-parameter
lib/ansible/modules/network/iosxr/iosxr_config.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/iosxr/iosxr_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/iosxr/iosxr_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/iosxr/iosxr_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/iosxr/iosxr_config.py validate-modules:undocumented-parameter
lib/ansible/modules/network/iosxr/iosxr_facts.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_facts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/iosxr/iosxr_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/iosxr/iosxr_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/iosxr/iosxr_facts.py validate-modules:undocumented-parameter
lib/ansible/modules/network/iosxr/iosxr_l2_interfaces.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/iosxr/iosxr_l2_interfaces.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/iosxr/iosxr_l3_interfaces.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/iosxr/iosxr_l3_interfaces.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/iosxr/iosxr_lacp_interfaces.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/iosxr/iosxr_logging.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_logging.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_logging.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/iosxr/iosxr_logging.py validate-modules:doc-missing-type
lib/ansible/modules/network/iosxr/iosxr_logging.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/iosxr/iosxr_logging.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/iosxr/iosxr_logging.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/iosxr/iosxr_logging.py validate-modules:undocumented-parameter
lib/ansible/modules/network/iosxr/iosxr_netconf.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_netconf.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_netconf.py validate-modules:doc-missing-type
lib/ansible/modules/network/iosxr/iosxr_netconf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/iosxr/iosxr_netconf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/iosxr/iosxr_netconf.py validate-modules:undocumented-parameter
lib/ansible/modules/network/iosxr/iosxr_system.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_system.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_system.py validate-modules:doc-missing-type
lib/ansible/modules/network/iosxr/iosxr_system.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/iosxr/iosxr_system.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/iosxr/iosxr_system.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/iosxr/iosxr_system.py validate-modules:undocumented-parameter
lib/ansible/modules/network/iosxr/iosxr_user.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_user.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/iosxr/iosxr_user.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/iosxr/iosxr_user.py validate-modules:doc-missing-type
lib/ansible/modules/network/iosxr/iosxr_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/iosxr/iosxr_user.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/iosxr/iosxr_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/iosxr/iosxr_user.py validate-modules:undocumented-parameter
lib/ansible/modules/network/ironware/ironware_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ironware/ironware_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/ironware/ironware_command.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/ironware/ironware_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ironware/ironware_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ironware/ironware_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ironware/ironware_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/ironware/ironware_config.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/ironware/ironware_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ironware/ironware_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ironware/ironware_facts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ironware/ironware_facts.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/ironware/ironware_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ironware/ironware_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/itential/iap_start_workflow.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/itential/iap_token.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/_junos_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/_junos_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/_junos_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/junos/_junos_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/_junos_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/_junos_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/junos/_junos_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/_junos_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/_junos_l2_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/_junos_l2_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/_junos_l2_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/junos/_junos_l2_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/_junos_l2_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/_junos_l2_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/junos/_junos_l2_interface.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/junos/_junos_l2_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/_junos_l2_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/_junos_l3_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/_junos_l3_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/_junos_l3_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/junos/_junos_l3_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/_junos_l3_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/_junos_l3_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/junos/_junos_l3_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/_junos_l3_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/_junos_linkagg.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/_junos_linkagg.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/_junos_linkagg.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/junos/_junos_linkagg.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/_junos_linkagg.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/_junos_linkagg.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/junos/_junos_linkagg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/junos/_junos_linkagg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/_junos_linkagg.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/_junos_lldp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/_junos_lldp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/_junos_lldp.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/_junos_lldp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/_junos_lldp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/_junos_lldp.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/_junos_lldp_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/_junos_lldp_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/_junos_lldp_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/_junos_lldp_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/_junos_lldp_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/_junos_static_route.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/_junos_static_route.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/_junos_static_route.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/junos/_junos_static_route.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/_junos_static_route.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/_junos_static_route.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/junos/_junos_static_route.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/_junos_static_route.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/_junos_vlan.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/_junos_vlan.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/_junos_vlan.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/junos/_junos_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/_junos_vlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/_junos_vlan.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/junos/_junos_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/_junos_vlan.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/junos_banner.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/junos_banner.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/junos_banner.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/junos_banner.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/junos_banner.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/junos_command.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/junos_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/junos_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/junos_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/junos_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/junos/junos_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/junos_command.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/junos_config.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/junos_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/junos_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/junos_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/junos_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/junos/junos_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/junos_config.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/junos_facts.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/junos_facts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/junos_facts.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/junos_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/junos_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/junos/junos_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/junos_facts.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/junos_interfaces.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/junos/junos_interfaces.py validate-modules:doc-type-does-not-match-spec
lib/ansible/modules/network/junos/junos_l2_interfaces.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/junos/junos_lag_interfaces.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/junos/junos_lag_interfaces.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/junos_logging.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/junos_logging.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/junos_logging.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/junos/junos_logging.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/junos_logging.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/junos_logging.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/junos/junos_logging.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/junos_logging.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/junos_netconf.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/junos_netconf.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/junos_netconf.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/junos_netconf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/junos_netconf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/junos_netconf.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/junos_package.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/junos_package.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/junos_package.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/junos_package.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/junos_package.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/junos_package.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/junos_ping.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/junos_ping.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/junos_ping.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/junos_ping.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/junos_ping.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/junos_ping.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/junos_rpc.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/junos_rpc.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/junos_rpc.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/junos_rpc.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/junos_rpc.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/junos_rpc.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/junos_scp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/junos_scp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/junos_scp.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/junos_scp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/junos_scp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/junos/junos_scp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/junos_scp.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/junos_system.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/junos_system.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/junos_system.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/junos_system.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/junos_system.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/junos/junos_system.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/junos_system.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/junos_user.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/junos_user.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/junos_user.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/junos/junos_user.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/junos_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/junos_user.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/junos/junos_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/junos_user.py validate-modules:undocumented-parameter
lib/ansible/modules/network/junos/junos_vlans.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/junos_vrf.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/junos/junos_vrf.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/junos/junos_vrf.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/junos/junos_vrf.py validate-modules:doc-missing-type
lib/ansible/modules/network/junos/junos_vrf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/junos/junos_vrf.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/junos/junos_vrf.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/junos/junos_vrf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/junos/junos_vrf.py validate-modules:undocumented-parameter
lib/ansible/modules/network/meraki/meraki_admin.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/meraki/meraki_config_template.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/meraki/meraki_content_filtering.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/meraki/meraki_firewalled_services.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/meraki/meraki_malware.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/meraki/meraki_malware.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/meraki/meraki_mr_l3_firewall.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/meraki/meraki_mr_l3_firewall.py validate-modules:doc-type-does-not-match-spec
lib/ansible/modules/network/meraki/meraki_mx_l3_firewall.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/meraki/meraki_mx_l3_firewall.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/meraki/meraki_mx_l7_firewall.py pylint:ansible-bad-function
lib/ansible/modules/network/meraki/meraki_mx_l7_firewall.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/meraki/meraki_mx_l7_firewall.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/meraki/meraki_mx_l7_firewall.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/meraki/meraki_mx_l7_firewall.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/meraki/meraki_nat.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/meraki/meraki_nat.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/meraki/meraki_nat.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/meraki/meraki_network.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/meraki/meraki_network.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/meraki/meraki_organization.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/meraki/meraki_snmp.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/meraki/meraki_snmp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/meraki/meraki_ssid.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/meraki/meraki_ssid.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/meraki/meraki_ssid.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/meraki/meraki_static_route.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/meraki/meraki_switchport.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/meraki/meraki_switchport.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/meraki/meraki_switchport.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/meraki/meraki_syslog.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/meraki/meraki_syslog.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/meraki/meraki_vlan.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/meraki/meraki_vlan.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/meraki/meraki_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/meraki/meraki_vlan.py validate-modules:undocumented-parameter
lib/ansible/modules/network/netact/netact_cm_command.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/netact/netact_cm_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netconf/netconf_config.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/netconf/netconf_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/netconf/netconf_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/netconf/netconf_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netconf/netconf_get.py validate-modules:doc-missing-type
lib/ansible/modules/network/netconf/netconf_get.py validate-modules:return-syntax-error
lib/ansible/modules/network/netconf/netconf_rpc.py validate-modules:doc-missing-type
lib/ansible/modules/network/netconf/netconf_rpc.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/netconf/netconf_rpc.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netconf/netconf_rpc.py validate-modules:return-syntax-error
lib/ansible/modules/network/netscaler/netscaler_cs_action.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/netscaler/netscaler_cs_action.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netscaler/netscaler_cs_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netscaler/netscaler_cs_vserver.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/netscaler/netscaler_cs_vserver.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/netscaler/netscaler_cs_vserver.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/netscaler/netscaler_cs_vserver.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netscaler/netscaler_cs_vserver.py validate-modules:undocumented-parameter
lib/ansible/modules/network/netscaler/netscaler_gslb_service.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/netscaler/netscaler_gslb_service.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netscaler/netscaler_gslb_site.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netscaler/netscaler_gslb_vserver.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/netscaler/netscaler_gslb_vserver.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netscaler/netscaler_gslb_vserver.py validate-modules:undocumented-parameter
lib/ansible/modules/network/netscaler/netscaler_lb_monitor.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/netscaler/netscaler_lb_monitor.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/netscaler/netscaler_lb_monitor.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/netscaler/netscaler_lb_monitor.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netscaler/netscaler_lb_vserver.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/netscaler/netscaler_lb_vserver.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/netscaler/netscaler_lb_vserver.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netscaler/netscaler_nitro_request.py pylint:ansible-bad-function
lib/ansible/modules/network/netscaler/netscaler_nitro_request.py validate-modules:doc-missing-type
lib/ansible/modules/network/netscaler/netscaler_nitro_request.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/netscaler/netscaler_nitro_request.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/netscaler/netscaler_nitro_request.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netscaler/netscaler_save_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/netscaler/netscaler_save_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netscaler/netscaler_server.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/netscaler/netscaler_server.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netscaler/netscaler_service.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/netscaler/netscaler_service.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/netscaler/netscaler_service.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netscaler/netscaler_servicegroup.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/netscaler/netscaler_servicegroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netscaler/netscaler_ssl_certkey.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/_pn_cluster.py future-import-boilerplate
lib/ansible/modules/network/netvisor/_pn_cluster.py metaclass-boilerplate
lib/ansible/modules/network/netvisor/_pn_cluster.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/_pn_ospf.py future-import-boilerplate
lib/ansible/modules/network/netvisor/_pn_ospf.py metaclass-boilerplate
lib/ansible/modules/network/netvisor/_pn_ospf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/netvisor/_pn_ospf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/_pn_ospfarea.py future-import-boilerplate
lib/ansible/modules/network/netvisor/_pn_ospfarea.py metaclass-boilerplate
lib/ansible/modules/network/netvisor/_pn_ospfarea.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/_pn_show.py future-import-boilerplate
lib/ansible/modules/network/netvisor/_pn_show.py metaclass-boilerplate
lib/ansible/modules/network/netvisor/_pn_show.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/netvisor/_pn_show.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/_pn_trunk.py future-import-boilerplate
lib/ansible/modules/network/netvisor/_pn_trunk.py metaclass-boilerplate
lib/ansible/modules/network/netvisor/_pn_trunk.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/_pn_vlag.py future-import-boilerplate
lib/ansible/modules/network/netvisor/_pn_vlag.py metaclass-boilerplate
lib/ansible/modules/network/netvisor/_pn_vlag.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/_pn_vlan.py future-import-boilerplate
lib/ansible/modules/network/netvisor/_pn_vlan.py metaclass-boilerplate
lib/ansible/modules/network/netvisor/_pn_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/_pn_vrouter.py future-import-boilerplate
lib/ansible/modules/network/netvisor/_pn_vrouter.py metaclass-boilerplate
lib/ansible/modules/network/netvisor/_pn_vrouter.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/_pn_vrouterbgp.py future-import-boilerplate
lib/ansible/modules/network/netvisor/_pn_vrouterbgp.py metaclass-boilerplate
lib/ansible/modules/network/netvisor/_pn_vrouterbgp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/_pn_vrouterif.py future-import-boilerplate
lib/ansible/modules/network/netvisor/_pn_vrouterif.py metaclass-boilerplate
lib/ansible/modules/network/netvisor/_pn_vrouterif.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/netvisor/_pn_vrouterif.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/_pn_vrouterlbif.py future-import-boilerplate
lib/ansible/modules/network/netvisor/_pn_vrouterlbif.py metaclass-boilerplate
lib/ansible/modules/network/netvisor/_pn_vrouterlbif.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/netvisor/_pn_vrouterlbif.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/pn_access_list.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_access_list.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/pn_access_list_ip.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_access_list_ip.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/pn_admin_service.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_admin_session_timeout.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_admin_syslog.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_connection_stats_settings.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_cpu_class.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_cpu_class.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/pn_cpu_mgmt_class.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_dhcp_filter.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_dscp_map.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_dscp_map.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/pn_dscp_map_pri_map.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_fabric_local.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_fabric_local.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/pn_igmp_snooping.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_igmp_snooping.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/pn_ipv6security_raguard.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_ipv6security_raguard_port.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_ipv6security_raguard_vlan.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_log_audit_exception.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/netvisor/pn_log_audit_exception.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_port_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/pn_port_cos_bw.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_port_cos_rate_setting.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_prefix_list.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_prefix_list_network.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_role.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/netvisor/pn_role.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_snmp_community.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_snmp_community.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/pn_snmp_trap_sink.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_snmp_vacm.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_stp.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_stp_port.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_switch_setup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/pn_user.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_vflow_table_profile.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_vrouter_bgp.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_vrouter_bgp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/netvisor/pn_vrouter_bgp_network.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_vrouter_interface_ip.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_vrouter_loopback_interface.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_vrouter_ospf.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_vrouter_ospf6.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_vrouter_packet_relay.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_vrouter_pim_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/netvisor/pn_vrouter_pim_config.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/netvisor/pn_vtep.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/nos/nos_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/nos/nos_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nos/nos_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nos/nos_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/nos/nos_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nos/nos_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nos/nos_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nos/nos_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nso/nso_action.py validate-modules:doc-missing-type
lib/ansible/modules/network/nso/nso_action.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nso/nso_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nso/nso_config.py validate-modules:return-syntax-error
lib/ansible/modules/network/nso/nso_query.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nso/nso_query.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nso/nso_show.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nso/nso_verify.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nuage/nuage_vspk.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nuage/nuage_vspk.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/nuage/nuage_vspk.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nuage/nuage_vspk.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nuage/nuage_vspk.py validate-modules:undocumented-parameter
lib/ansible/modules/network/nxos/_nxos_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/_nxos_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/_nxos_interface.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/_nxos_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/nxos/_nxos_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/_nxos_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/_nxos_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/nxos/_nxos_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/_nxos_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/nxos/_nxos_l2_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/_nxos_l2_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/_nxos_l2_interface.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/_nxos_l2_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/nxos/_nxos_l2_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/_nxos_l2_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/_nxos_l2_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/nxos/_nxos_l2_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/_nxos_l2_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/nxos/_nxos_l3_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/_nxos_l3_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/_nxos_l3_interface.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/_nxos_l3_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/nxos/_nxos_l3_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/_nxos_l3_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/_nxos_l3_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/nxos/_nxos_l3_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/_nxos_l3_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/nxos/_nxos_linkagg.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/_nxos_linkagg.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/_nxos_linkagg.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/_nxos_linkagg.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/nxos/_nxos_linkagg.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/_nxos_linkagg.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/_nxos_linkagg.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/nxos/_nxos_linkagg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/_nxos_linkagg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/_nxos_linkagg.py validate-modules:undocumented-parameter
lib/ansible/modules/network/nxos/_nxos_vlan.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/_nxos_vlan.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/_nxos_vlan.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/_nxos_vlan.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/nxos/_nxos_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/_nxos_vlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/_nxos_vlan.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/nxos/_nxos_vlan.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/_nxos_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/_nxos_vlan.py validate-modules:undocumented-parameter
lib/ansible/modules/network/nxos/nxos_aaa_server.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_aaa_server.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_aaa_server.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_aaa_server.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_aaa_server.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_aaa_server.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_aaa_server.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_aaa_server.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_aaa_server_host.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_aaa_server_host.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_aaa_server_host.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_aaa_server_host.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_aaa_server_host.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_aaa_server_host.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_aaa_server_host.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_acl.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_acl.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_acl.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_acl.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_acl.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_acl.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_acl.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_acl.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_acl_interface.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_acl_interface.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_acl_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_acl_interface.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_acl_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_acl_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_acl_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_banner.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_banner.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_banner.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_banner.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_banner.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_banner.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_bfd_global.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_bfd_global.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_bfd_global.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_bfd_global.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_bfd_global.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_bgp.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_bgp.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_bgp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_bgp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_bgp.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_bgp.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_bgp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_bgp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_bgp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_bgp_af.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_bgp_af.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_bgp_af.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_bgp_af.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_bgp_af.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_bgp_af.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_bgp_af.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_bgp_af.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_bgp_neighbor.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_bgp_neighbor.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_bgp_neighbor.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_bgp_neighbor.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_bgp_neighbor.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_bgp_neighbor.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_bgp_neighbor.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_bgp_neighbor_af.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_bgp_neighbor_af.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_bgp_neighbor_af.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_bgp_neighbor_af.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_bgp_neighbor_af.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_bgp_neighbor_af.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_bgp_neighbor_af.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_bgp_neighbor_af.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_bgp_neighbor_af.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_command.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_command.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_config.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_config.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_config.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_evpn_global.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_evpn_global.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_evpn_global.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_evpn_global.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_evpn_global.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_evpn_vni.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_evpn_vni.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_evpn_vni.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_evpn_vni.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_evpn_vni.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_evpn_vni.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_evpn_vni.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_evpn_vni.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_facts.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_facts.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_facts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_facts.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_feature.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_feature.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_feature.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_feature.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_feature.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_feature.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_feature.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_gir.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_gir.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_gir.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_gir.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_gir.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_gir.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_gir.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_gir.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_gir_profile_management.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_gir_profile_management.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_gir_profile_management.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_gir_profile_management.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_gir_profile_management.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_gir_profile_management.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_gir_profile_management.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_gir_profile_management.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_hsrp.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_hsrp.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_hsrp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_hsrp.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_hsrp.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_hsrp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_hsrp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_igmp.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_igmp.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_igmp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_igmp.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_igmp.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_igmp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_igmp_interface.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_igmp_interface.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_igmp_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_igmp_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_igmp_interface.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_igmp_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_igmp_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_igmp_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_igmp_snooping.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_igmp_snooping.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_igmp_snooping.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_igmp_snooping.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_igmp_snooping.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_igmp_snooping.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_igmp_snooping.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_install_os.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_install_os.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_install_os.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_install_os.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_install_os.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_install_os.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_interface_ospf.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_interface_ospf.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_interface_ospf.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_interface_ospf.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_interface_ospf.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_interface_ospf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_interface_ospf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_lag_interfaces.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_lag_interfaces.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/nxos/nxos_lldp.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_lldp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_lldp.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_lldp.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_lldp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_logging.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_logging.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_logging.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_logging.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_logging.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_logging.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_logging.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_logging.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_ntp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_ntp.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_ntp.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_ntp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_ntp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_ntp_auth.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_ntp_auth.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_ntp_auth.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_ntp_auth.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_ntp_auth.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_ntp_auth.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_ntp_auth.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_ntp_options.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_ntp_options.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_ntp_options.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_ntp_options.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_ntp_options.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_ntp_options.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_ntp_options.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_nxapi.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_nxapi.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_nxapi.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_nxapi.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_nxapi.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_nxapi.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_nxapi.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_nxapi.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_ospf.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_ospf.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_ospf.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_ospf.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_ospf.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_ospf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_ospf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_ospf_vrf.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_ospf_vrf.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_ospf_vrf.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_ospf_vrf.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_ospf_vrf.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_ospf_vrf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_ospf_vrf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_overlay_global.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_overlay_global.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_overlay_global.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_overlay_global.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_overlay_global.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_overlay_global.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_pim.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_pim.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_pim.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_pim.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_pim.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_pim.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_pim.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_pim_interface.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_pim_interface.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_pim_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_pim_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_pim_interface.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_pim_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_pim_rp_address.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_pim_rp_address.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_pim_rp_address.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_pim_rp_address.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_pim_rp_address.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_pim_rp_address.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_pim_rp_address.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_pim_rp_address.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_ping.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_ping.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_ping.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_ping.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_ping.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_ping.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_ping.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_reboot.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_reboot.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_reboot.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_reboot.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_reboot.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_rollback.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_rollback.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_rollback.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_rollback.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_rollback.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_rollback.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_rpm.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_rpm.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_rpm.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_rpm.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_rpm.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_rpm.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/nxos/nxos_rpm.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_rpm.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_rpm.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/nxos/nxos_rpm.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_rpm.py validate-modules:undocumented-parameter
lib/ansible/modules/network/nxos/nxos_smu.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_smu.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_smu.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_smu.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_smu.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_smu.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_snapshot.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_snapshot.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_snapshot.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_snapshot.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_snapshot.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_snapshot.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_snapshot.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_snmp_community.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_snmp_community.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_snmp_community.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_snmp_community.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_snmp_community.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_snmp_community.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_snmp_community.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_snmp_contact.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_snmp_contact.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_snmp_contact.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_snmp_contact.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_snmp_contact.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_snmp_contact.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_snmp_contact.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_snmp_host.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_snmp_host.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_snmp_host.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_snmp_host.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_snmp_host.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_snmp_host.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_snmp_host.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_snmp_location.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_snmp_location.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_snmp_location.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_snmp_location.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_snmp_location.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_snmp_location.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_snmp_location.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_snmp_traps.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_snmp_traps.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_snmp_traps.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_snmp_traps.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_snmp_traps.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_snmp_traps.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_snmp_user.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_snmp_user.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_snmp_user.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_snmp_user.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_snmp_user.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_snmp_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_snmp_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_static_route.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_static_route.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_static_route.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_static_route.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_static_route.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_static_route.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/nxos/nxos_static_route.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_static_route.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_static_route.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/nxos/nxos_static_route.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_static_route.py validate-modules:undocumented-parameter
lib/ansible/modules/network/nxos/nxos_system.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_system.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_system.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_system.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_system.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_system.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_system.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_system.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_telemetry.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_udld.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_udld.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_udld.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_udld.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_udld.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_udld.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_udld.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_udld_interface.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_udld_interface.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_udld_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_udld_interface.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_udld_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_udld_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_udld_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_user.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_user.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_user.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_user.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_user.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_user.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/nxos/nxos_user.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_user.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/nxos/nxos_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_user.py validate-modules:undocumented-parameter
lib/ansible/modules/network/nxos/nxos_vlans.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/nxos/nxos_vpc.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_vpc.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_vpc.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_vpc.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_vpc.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_vpc.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_vpc.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_vpc_interface.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_vpc_interface.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_vpc_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_vpc_interface.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_vpc_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_vpc_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_vpc_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_vrf.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_vrf.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_vrf.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/nxos/nxos_vrf.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_vrf.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_vrf.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/nxos/nxos_vrf.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_vrf.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/nxos/nxos_vrf.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_vrf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_vrf.py validate-modules:undocumented-parameter
lib/ansible/modules/network/nxos/nxos_vrf_af.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_vrf_af.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_vrf_af.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_vrf_af.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_vrf_af.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_vrf_af.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_vrf_interface.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_vrf_interface.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_vrf_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_vrf_interface.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_vrf_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_vrf_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_vrf_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_vrrp.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_vrrp.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_vrrp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_vrrp.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_vrrp.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_vrrp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_vrrp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_vtp_domain.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_vtp_domain.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_vtp_domain.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_vtp_domain.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_vtp_domain.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_vtp_domain.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_vtp_password.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_vtp_password.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_vtp_password.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_vtp_password.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_vtp_password.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_vtp_password.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_vtp_password.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_vtp_version.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_vtp_version.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_vtp_version.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_vtp_version.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_vtp_version.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_vtp_version.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_vxlan_vtep.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_vxlan_vtep.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_vxlan_vtep.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_vxlan_vtep.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_vxlan_vtep.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_vxlan_vtep.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_vxlan_vtep.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/nxos_vxlan_vtep_vni.py future-import-boilerplate
lib/ansible/modules/network/nxos/nxos_vxlan_vtep_vni.py metaclass-boilerplate
lib/ansible/modules/network/nxos/nxos_vxlan_vtep_vni.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/nxos/nxos_vxlan_vtep_vni.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/network/nxos/nxos_vxlan_vtep_vni.py validate-modules:doc-missing-type
lib/ansible/modules/network/nxos/nxos_vxlan_vtep_vni.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/nxos/nxos_vxlan_vtep_vni.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/nxos/nxos_vxlan_vtep_vni.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/nxos/storage/nxos_devicealias.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/nxos/storage/nxos_vsan.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/nxos/storage/nxos_zone_zoneset.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_bgp.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_bgp.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_bgp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_buffer_pool.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_buffer_pool.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_command.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/onyx/onyx_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/onyx/onyx_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_config.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/onyx/onyx_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/onyx/onyx_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/onyx/onyx_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_igmp.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_igmp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_igmp_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_igmp_vlan.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_igmp_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_igmp_vlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/onyx/onyx_igmp_vlan.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/onyx/onyx_igmp_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/onyx/onyx_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/onyx/onyx_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/onyx/onyx_interface.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/onyx/onyx_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/onyx/onyx_l2_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/onyx/onyx_l2_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_l2_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_l2_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/onyx/onyx_l2_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/onyx/onyx_l2_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_l2_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/onyx/onyx_l3_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/onyx/onyx_l3_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_l3_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_l3_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/onyx/onyx_l3_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/onyx/onyx_l3_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_l3_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/onyx/onyx_linkagg.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/onyx/onyx_linkagg.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/onyx/onyx_linkagg.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_linkagg.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_linkagg.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/onyx/onyx_linkagg.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/onyx/onyx_linkagg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/onyx/onyx_linkagg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_linkagg.py validate-modules:undocumented-parameter
lib/ansible/modules/network/onyx/onyx_lldp.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_lldp_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/onyx/onyx_lldp_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_lldp_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_lldp_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/onyx/onyx_lldp_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/onyx/onyx_lldp_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_lldp_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/onyx/onyx_magp.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_magp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_mlag_ipl.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_mlag_vip.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/onyx/onyx_mlag_vip.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_mlag_vip.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_ntp.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_ntp_servers_peers.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_ospf.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_ospf.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_ospf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_pfc_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/onyx/onyx_pfc_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_pfc_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_pfc_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/onyx/onyx_pfc_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/onyx/onyx_pfc_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_pfc_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/onyx/onyx_protocol.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_ptp_global.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_ptp_global.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_ptp_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_ptp_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_qos.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_qos.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/onyx/onyx_qos.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_snmp.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_snmp_hosts.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_snmp_users.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_syslog_remote.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_traffic_class.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_traffic_class.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/onyx/onyx_traffic_class.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_vlan.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/onyx/onyx_vlan.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/network/onyx/onyx_vlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/onyx/onyx_vlan.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/onyx/onyx_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_vlan.py validate-modules:undocumented-parameter
lib/ansible/modules/network/onyx/onyx_vxlan.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/onyx/onyx_vxlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/onyx/onyx_vxlan.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/onyx/onyx_vxlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/onyx/onyx_vxlan.py validate-modules:undocumented-parameter
lib/ansible/modules/network/opx/opx_cps.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/opx/opx_cps.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ordnance/ordnance_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ordnance/ordnance_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/ordnance/ordnance_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ordnance/ordnance_config.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/ordnance/ordnance_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ordnance/ordnance_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ordnance/ordnance_config.py validate-modules:undocumented-parameter
lib/ansible/modules/network/ordnance/ordnance_config.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/ordnance/ordnance_facts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/ordnance/ordnance_facts.py validate-modules:doc-missing-type
lib/ansible/modules/network/ordnance/ordnance_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/ordnance/ordnance_facts.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/ordnance/ordnance_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/ordnance/ordnance_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ordnance/ordnance_facts.py validate-modules:undocumented-parameter
lib/ansible/modules/network/ordnance/ordnance_facts.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/ovs/openvswitch_bridge.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/ovs/openvswitch_bridge.py validate-modules:doc-missing-type
lib/ansible/modules/network/ovs/openvswitch_bridge.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ovs/openvswitch_db.py validate-modules:doc-missing-type
lib/ansible/modules/network/ovs/openvswitch_db.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/ovs/openvswitch_port.py validate-modules:doc-missing-type
lib/ansible/modules/network/ovs/openvswitch_port.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/panos/_panos_admin.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_admin.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_admin.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_admin.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/panos/_panos_admpwd.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_admpwd.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_admpwd.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_cert_gen_ssh.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_cert_gen_ssh.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_cert_gen_ssh.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_cert_gen_ssh.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/panos/_panos_check.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_check.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_check.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/panos/_panos_commit.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_commit.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_commit.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_commit.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/panos/_panos_commit.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/panos/_panos_commit.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/panos/_panos_dag.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_dag.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_dag.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_dag_tags.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_dag_tags.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_dag_tags.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_dag_tags.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/panos/_panos_dag_tags.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/panos/_panos_dag_tags.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/panos/_panos_import.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_import.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_import.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_interface.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_interface.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_lic.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_lic.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_lic.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_lic.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/panos/_panos_loadcfg.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_loadcfg.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_loadcfg.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_match_rule.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_match_rule.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_match_rule.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_match_rule.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/panos/_panos_match_rule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/panos/_panos_mgtconfig.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_mgtconfig.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_mgtconfig.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_nat_rule.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_nat_rule.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/panos/_panos_nat_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/panos/_panos_nat_rule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/panos/_panos_object.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_object.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_object.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_object.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/panos/_panos_object.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/panos/_panos_object.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/panos/_panos_op.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_op.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_op.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_op.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/panos/_panos_pg.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_pg.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_pg.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_query_rules.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_query_rules.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_query_rules.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_query_rules.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/panos/_panos_restart.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_restart.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_restart.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/panos/_panos_sag.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_sag.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_sag.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_sag.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/panos/_panos_sag.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/panos/_panos_security_rule.py validate-modules:doc-missing-type
lib/ansible/modules/network/panos/_panos_security_rule.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/panos/_panos_security_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/panos/_panos_security_rule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/panos/_panos_set.py future-import-boilerplate
lib/ansible/modules/network/panos/_panos_set.py metaclass-boilerplate
lib/ansible/modules/network/panos/_panos_set.py validate-modules:doc-missing-type
lib/ansible/modules/network/radware/vdirect_commit.py validate-modules:doc-missing-type
lib/ansible/modules/network/radware/vdirect_commit.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/radware/vdirect_commit.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/radware/vdirect_file.py validate-modules:doc-missing-type
lib/ansible/modules/network/radware/vdirect_file.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/radware/vdirect_runnable.py validate-modules:doc-missing-type
lib/ansible/modules/network/radware/vdirect_runnable.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/restconf/restconf_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/restconf/restconf_get.py validate-modules:doc-missing-type
lib/ansible/modules/network/routeros/routeros_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/routeros/routeros_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/routeros/routeros_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/routeros/routeros_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/routeros/routeros_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/skydive/skydive_capture.py validate-modules:doc-missing-type
lib/ansible/modules/network/skydive/skydive_capture.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/skydive/skydive_capture.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/skydive/skydive_capture.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/skydive/skydive_capture.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/skydive/skydive_capture.py validate-modules:undocumented-parameter
lib/ansible/modules/network/skydive/skydive_edge.py validate-modules:doc-missing-type
lib/ansible/modules/network/skydive/skydive_edge.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/skydive/skydive_edge.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/skydive/skydive_edge.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/skydive/skydive_edge.py validate-modules:undocumented-parameter
lib/ansible/modules/network/skydive/skydive_node.py validate-modules:doc-missing-type
lib/ansible/modules/network/skydive/skydive_node.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/skydive/skydive_node.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/network/skydive/skydive_node.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/skydive/skydive_node.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/skydive/skydive_node.py validate-modules:undocumented-parameter
lib/ansible/modules/network/slxos/slxos_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/slxos/slxos_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/slxos/slxos_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/slxos/slxos_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/slxos/slxos_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/slxos/slxos_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/slxos/slxos_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/slxos/slxos_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/slxos/slxos_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/slxos/slxos_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/slxos/slxos_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/slxos/slxos_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/slxos/slxos_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/slxos/slxos_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/slxos/slxos_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/slxos/slxos_l2_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/slxos/slxos_l2_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/slxos/slxos_l2_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/slxos/slxos_l2_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/slxos/slxos_l2_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/slxos/slxos_l2_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/slxos/slxos_l2_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/slxos/slxos_l3_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/slxos/slxos_l3_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/slxos/slxos_l3_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/slxos/slxos_l3_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/slxos/slxos_l3_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/slxos/slxos_l3_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/slxos/slxos_l3_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/slxos/slxos_linkagg.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/slxos/slxos_linkagg.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/slxos/slxos_linkagg.py validate-modules:doc-missing-type
lib/ansible/modules/network/slxos/slxos_linkagg.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/slxos/slxos_linkagg.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/slxos/slxos_linkagg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/slxos/slxos_linkagg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/slxos/slxos_linkagg.py validate-modules:undocumented-parameter
lib/ansible/modules/network/slxos/slxos_lldp.py validate-modules:doc-missing-type
lib/ansible/modules/network/slxos/slxos_vlan.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/slxos/slxos_vlan.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/slxos/slxos_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/network/slxos/slxos_vlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/slxos/slxos_vlan.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/slxos/slxos_vlan.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/slxos/slxos_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/slxos/slxos_vlan.py validate-modules:undocumented-parameter
lib/ansible/modules/network/sros/sros_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/sros/sros_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/sros/sros_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/sros/sros_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/sros/sros_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/sros/sros_command.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/sros/sros_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/sros/sros_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/sros/sros_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/sros/sros_config.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/network/sros/sros_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/sros/sros_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/sros/sros_config.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/sros/sros_rollback.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/sros/sros_rollback.py validate-modules:doc-missing-type
lib/ansible/modules/network/sros/sros_rollback.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/sros/sros_rollback.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/sros/sros_rollback.py yamllint:unparsable-with-libyaml
lib/ansible/modules/network/voss/voss_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/voss/voss_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/voss/voss_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/voss/voss_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/voss/voss_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/voss/voss_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/voss/voss_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/voss/voss_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/vyos/_vyos_interface.py future-import-boilerplate
lib/ansible/modules/network/vyos/_vyos_interface.py metaclass-boilerplate
lib/ansible/modules/network/vyos/_vyos_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/vyos/_vyos_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/_vyos_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/vyos/_vyos_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/vyos/_vyos_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/_vyos_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/vyos/_vyos_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/vyos/_vyos_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/vyos/_vyos_l3_interface.py future-import-boilerplate
lib/ansible/modules/network/vyos/_vyos_l3_interface.py metaclass-boilerplate
lib/ansible/modules/network/vyos/_vyos_l3_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/vyos/_vyos_l3_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/_vyos_l3_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/vyos/_vyos_l3_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/vyos/_vyos_l3_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/_vyos_l3_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/vyos/_vyos_l3_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/vyos/_vyos_l3_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/vyos/_vyos_linkagg.py future-import-boilerplate
lib/ansible/modules/network/vyos/_vyos_linkagg.py metaclass-boilerplate
lib/ansible/modules/network/vyos/_vyos_linkagg.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/vyos/_vyos_linkagg.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/_vyos_linkagg.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/vyos/_vyos_linkagg.py validate-modules:doc-missing-type
lib/ansible/modules/network/vyos/_vyos_linkagg.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/_vyos_linkagg.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/vyos/_vyos_linkagg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/vyos/_vyos_linkagg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/vyos/_vyos_linkagg.py validate-modules:undocumented-parameter
lib/ansible/modules/network/vyos/_vyos_lldp.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/_vyos_lldp.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/_vyos_lldp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/vyos/_vyos_lldp_interface.py future-import-boilerplate
lib/ansible/modules/network/vyos/_vyos_lldp_interface.py metaclass-boilerplate
lib/ansible/modules/network/vyos/_vyos_lldp_interface.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/vyos/_vyos_lldp_interface.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/_vyos_lldp_interface.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/vyos/_vyos_lldp_interface.py validate-modules:doc-missing-type
lib/ansible/modules/network/vyos/_vyos_lldp_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/_vyos_lldp_interface.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/vyos/_vyos_lldp_interface.py validate-modules:undocumented-parameter
lib/ansible/modules/network/vyos/vyos_banner.py future-import-boilerplate
lib/ansible/modules/network/vyos/vyos_banner.py metaclass-boilerplate
lib/ansible/modules/network/vyos/vyos_banner.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/vyos_banner.py validate-modules:doc-missing-type
lib/ansible/modules/network/vyos/vyos_banner.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/vyos_command.py future-import-boilerplate
lib/ansible/modules/network/vyos/vyos_command.py metaclass-boilerplate
lib/ansible/modules/network/vyos/vyos_command.py pylint:blacklisted-name
lib/ansible/modules/network/vyos/vyos_command.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/vyos_command.py validate-modules:doc-missing-type
lib/ansible/modules/network/vyos/vyos_command.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/vyos_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/vyos/vyos_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/vyos/vyos_config.py future-import-boilerplate
lib/ansible/modules/network/vyos/vyos_config.py metaclass-boilerplate
lib/ansible/modules/network/vyos/vyos_config.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/vyos_config.py validate-modules:doc-missing-type
lib/ansible/modules/network/vyos/vyos_config.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/vyos_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/vyos/vyos_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/vyos/vyos_facts.py future-import-boilerplate
lib/ansible/modules/network/vyos/vyos_facts.py metaclass-boilerplate
lib/ansible/modules/network/vyos/vyos_facts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/vyos_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/vyos_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/vyos/vyos_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/vyos/vyos_interfaces.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/vyos/vyos_lag_interfaces.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/vyos/vyos_lag_interfaces.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/vyos/vyos_lldp_global.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/vyos/vyos_lldp_interfaces.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/vyos/vyos_lldp_interfaces.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/vyos_logging.py future-import-boilerplate
lib/ansible/modules/network/vyos/vyos_logging.py metaclass-boilerplate
lib/ansible/modules/network/vyos/vyos_logging.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/vyos/vyos_logging.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/vyos_logging.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/vyos/vyos_logging.py validate-modules:doc-missing-type
lib/ansible/modules/network/vyos/vyos_logging.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/vyos_logging.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/vyos/vyos_logging.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/vyos/vyos_logging.py validate-modules:undocumented-parameter
lib/ansible/modules/network/vyos/vyos_ping.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/vyos_ping.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/vyos_ping.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/vyos/vyos_static_route.py future-import-boilerplate
lib/ansible/modules/network/vyos/vyos_static_route.py metaclass-boilerplate
lib/ansible/modules/network/vyos/vyos_static_route.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/vyos/vyos_static_route.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/vyos_static_route.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/vyos/vyos_static_route.py validate-modules:doc-missing-type
lib/ansible/modules/network/vyos/vyos_static_route.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/vyos_static_route.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/vyos/vyos_static_route.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/vyos/vyos_static_route.py validate-modules:undocumented-parameter
lib/ansible/modules/network/vyos/vyos_system.py future-import-boilerplate
lib/ansible/modules/network/vyos/vyos_system.py metaclass-boilerplate
lib/ansible/modules/network/vyos/vyos_system.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/vyos_system.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/vyos_system.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/vyos/vyos_system.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/vyos/vyos_user.py future-import-boilerplate
lib/ansible/modules/network/vyos/vyos_user.py metaclass-boilerplate
lib/ansible/modules/network/vyos/vyos_user.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/vyos/vyos_user.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/vyos_user.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/vyos/vyos_user.py validate-modules:doc-missing-type
lib/ansible/modules/network/vyos/vyos_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/vyos_user.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/vyos/vyos_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/vyos/vyos_user.py validate-modules:undocumented-parameter
lib/ansible/modules/network/vyos/vyos_vlan.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/network/vyos/vyos_vlan.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/network/vyos/vyos_vlan.py validate-modules:doc-elements-mismatch
lib/ansible/modules/network/vyos/vyos_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/network/vyos/vyos_vlan.py validate-modules:doc-required-mismatch
lib/ansible/modules/network/vyos/vyos_vlan.py validate-modules:missing-suboption-docs
lib/ansible/modules/network/vyos/vyos_vlan.py validate-modules:parameter-list-no-elements
lib/ansible/modules/network/vyos/vyos_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/network/vyos/vyos_vlan.py validate-modules:undocumented-parameter
lib/ansible/modules/notification/bearychat.py validate-modules:parameter-list-no-elements
lib/ansible/modules/notification/bearychat.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/campfire.py validate-modules:doc-missing-type
lib/ansible/modules/notification/catapult.py validate-modules:doc-missing-type
lib/ansible/modules/notification/catapult.py validate-modules:parameter-list-no-elements
lib/ansible/modules/notification/catapult.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/cisco_spark.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/notification/cisco_spark.py validate-modules:doc-missing-type
lib/ansible/modules/notification/cisco_spark.py validate-modules:undocumented-parameter
lib/ansible/modules/notification/flowdock.py validate-modules:doc-missing-type
lib/ansible/modules/notification/grove.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/hipchat.py validate-modules:doc-missing-type
lib/ansible/modules/notification/hipchat.py validate-modules:undocumented-parameter
lib/ansible/modules/notification/irc.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/notification/irc.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/notification/irc.py validate-modules:doc-missing-type
lib/ansible/modules/notification/irc.py validate-modules:doc-required-mismatch
lib/ansible/modules/notification/irc.py validate-modules:parameter-list-no-elements
lib/ansible/modules/notification/irc.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/irc.py validate-modules:undocumented-parameter
lib/ansible/modules/notification/jabber.py validate-modules:doc-missing-type
lib/ansible/modules/notification/jabber.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/logentries_msg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/mail.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/notification/mail.py validate-modules:parameter-list-no-elements
lib/ansible/modules/notification/mail.py validate-modules:undocumented-parameter
lib/ansible/modules/notification/matrix.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/mattermost.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/mqtt.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/notification/mqtt.py validate-modules:doc-missing-type
lib/ansible/modules/notification/mqtt.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/nexmo.py validate-modules:doc-missing-type
lib/ansible/modules/notification/nexmo.py validate-modules:parameter-list-no-elements
lib/ansible/modules/notification/nexmo.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/office_365_connector_card.py validate-modules:doc-missing-type
lib/ansible/modules/notification/office_365_connector_card.py validate-modules:parameter-list-no-elements
lib/ansible/modules/notification/office_365_connector_card.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/pushbullet.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/pushbullet.py validate-modules:undocumented-parameter
lib/ansible/modules/notification/pushover.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/notification/pushover.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/notification/pushover.py validate-modules:doc-missing-type
lib/ansible/modules/notification/pushover.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/rabbitmq_publish.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/rocketchat.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/notification/rocketchat.py validate-modules:parameter-list-no-elements
lib/ansible/modules/notification/rocketchat.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/say.py validate-modules:doc-missing-type
lib/ansible/modules/notification/sendgrid.py validate-modules:doc-missing-type
lib/ansible/modules/notification/sendgrid.py validate-modules:doc-required-mismatch
lib/ansible/modules/notification/sendgrid.py validate-modules:parameter-list-no-elements
lib/ansible/modules/notification/sendgrid.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/sendgrid.py validate-modules:undocumented-parameter
lib/ansible/modules/notification/slack.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/notification/slack.py validate-modules:parameter-list-no-elements
lib/ansible/modules/notification/slack.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/snow_record_find.py validate-modules:parameter-list-no-elements
lib/ansible/modules/notification/syslogger.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/telegram.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/twilio.py validate-modules:doc-missing-type
lib/ansible/modules/notification/twilio.py validate-modules:parameter-list-no-elements
lib/ansible/modules/notification/twilio.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/notification/typetalk.py validate-modules:doc-missing-type
lib/ansible/modules/notification/typetalk.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/language/bower.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/language/bower.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/language/bundler.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/language/bundler.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/language/bundler.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/language/bundler.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/language/composer.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/language/composer.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/language/cpanm.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/language/cpanm.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/language/easy_install.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/language/easy_install.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/language/easy_install.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/language/gem.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/language/maven_artifact.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/language/maven_artifact.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/language/maven_artifact.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/language/pear.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/language/pear.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/language/pear.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/language/pear.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/language/pip.py pylint:blacklisted-name
lib/ansible/modules/packaging/language/pip.py validate-modules:doc-elements-mismatch
lib/ansible/modules/packaging/language/pip.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/packaging/language/pip_package_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/language/yarn.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/language/yarn.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/apk.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/apk.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/apk.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/apk.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/apt.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/os/apt.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/apt.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/apt.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/apt_key.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/apt_key.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/apt_repo.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/apt_repository.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/os/apt_repository.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/apt_repository.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/apt_repository.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/apt_rpm.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/apt_rpm.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/os/apt_rpm.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/apt_rpm.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/apt_rpm.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/dnf.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/dnf.py validate-modules:doc-required-mismatch
lib/ansible/modules/packaging/os/dnf.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/dnf.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/dnf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/dpkg_selections.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/dpkg_selections.py validate-modules:doc-required-mismatch
lib/ansible/modules/packaging/os/flatpak.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/flatpak.py validate-modules:use-run-command-not-popen
lib/ansible/modules/packaging/os/flatpak_remote.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/flatpak_remote.py validate-modules:use-run-command-not-popen
lib/ansible/modules/packaging/os/homebrew.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/homebrew.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/homebrew.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/homebrew.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/homebrew.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/homebrew_cask.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/homebrew_cask.py validate-modules:doc-required-mismatch
lib/ansible/modules/packaging/os/homebrew_cask.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/homebrew_cask.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/homebrew_tap.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/homebrew_tap.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/homebrew_tap.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/installp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/layman.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/layman.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/macports.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/macports.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/macports.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/openbsd_pkg.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/openbsd_pkg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/openbsd_pkg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/opkg.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/opkg.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/os/opkg.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/opkg.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/opkg.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/package_facts.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/package_facts.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/package_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/package_facts.py validate-modules:return-syntax-error
lib/ansible/modules/packaging/os/pacman.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/pacman.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/pacman.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/pkg5.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/pkg5.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/pkg5.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/pkg5_publisher.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/pkg5_publisher.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/pkg5_publisher.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/pkgin.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/pkgin.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/pkgin.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/pkgin.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/pkgng.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/pkgng.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/pkgng.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/pkgng.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/pkgutil.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/portage.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/portage.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/portage.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/portinstall.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/portinstall.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/pulp_repo.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/os/pulp_repo.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/pulp_repo.py validate-modules:doc-required-mismatch
lib/ansible/modules/packaging/os/pulp_repo.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/redhat_subscription.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/redhat_subscription.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/redhat_subscription.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/redhat_subscription.py validate-modules:return-syntax-error
lib/ansible/modules/packaging/os/rhn_channel.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/rhn_channel.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/rhn_channel.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/rhn_register.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/rhsm_release.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/rhsm_repository.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/os/rhsm_repository.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/rhsm_repository.py validate-modules:doc-required-mismatch
lib/ansible/modules/packaging/os/rhsm_repository.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/rhsm_repository.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/rpm_key.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/slackpkg.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/slackpkg.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/os/slackpkg.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/slackpkg.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/slackpkg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/slackpkg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/slackpkg.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/snap.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/snap.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/sorcery.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/sorcery.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/sorcery.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/svr4pkg.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/swdepot.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/swdepot.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/swupd.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/urpmi.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/urpmi.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/os/urpmi.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/urpmi.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/urpmi.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/urpmi.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/xbps.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/xbps.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/xbps.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/xbps.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/xbps.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/xbps.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/yum.py pylint:blacklisted-name
lib/ansible/modules/packaging/os/yum.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/os/yum.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/yum.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/yum.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/yum.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/yum.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/yum_repository.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/os/yum_repository.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/yum_repository.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/yum_repository.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/yum_repository.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/zypper.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/zypper.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/zypper.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/zypper.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/zypper_repository.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/zypper_repository.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/cobbler/cobbler_sync.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/cobbler/cobbler_sync.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/cobbler/cobbler_system.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/cobbler/cobbler_system.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/cpm/cpm_plugconfig.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/cpm/cpm_plugconfig.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/cpm/cpm_plugconfig.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/cpm/cpm_plugcontrol.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/cpm/cpm_plugcontrol.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/cpm/cpm_plugcontrol.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/cpm/cpm_serial_port_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/cpm/cpm_serial_port_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/cpm/cpm_serial_port_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/cpm/cpm_serial_port_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/cpm/cpm_user.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/cpm/cpm_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/dellemc/idrac_server_config_profile.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/dellemc/idrac_server_config_profile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/dellemc/ome_device_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/foreman/_foreman.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/foreman/_katello.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/foreman/_katello.py yamllint:unparsable-with-libyaml
lib/ansible/modules/remote_management/hpilo/hpilo_boot.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/remote_management/hpilo/hpilo_boot.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/hpilo/hpilo_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/hpilo/hponcfg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/imc/imc_rest.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/intersight/intersight_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/intersight/intersight_rest_api.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ipmi/ipmi_boot.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/remote_management/ipmi/ipmi_boot.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/ipmi/ipmi_boot.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ipmi/ipmi_power.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/remote_management/ipmi/ipmi_power.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/ipmi/ipmi_power.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/lxca/lxca_cmms.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/lxca/lxca_nodes.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/manageiq/manageiq_alert_profiles.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/manageiq/manageiq_alert_profiles.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_alert_profiles.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_alert_profiles.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/manageiq/manageiq_alert_profiles.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/manageiq/manageiq_alerts.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/manageiq/manageiq_alerts.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_alerts.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_alerts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/manageiq/manageiq_group.py validate-modules:doc-elements-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_group.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/manageiq/manageiq_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_group.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_group.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/manageiq/manageiq_policies.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_policies.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_policies.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/remote_management/manageiq/manageiq_policies.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/manageiq/manageiq_policies.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/remote_management/manageiq/manageiq_policies.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/manageiq/manageiq_provider.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/remote_management/manageiq/manageiq_provider.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/remote_management/manageiq/manageiq_provider.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/manageiq/manageiq_provider.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_provider.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_provider.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/remote_management/manageiq/manageiq_provider.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/manageiq/manageiq_provider.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/manageiq/manageiq_tags.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_tags.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_tags.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/remote_management/manageiq/manageiq_tags.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/manageiq/manageiq_tags.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/remote_management/manageiq/manageiq_tags.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/manageiq/manageiq_tenant.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/manageiq/manageiq_tenant.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_tenant.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_tenant.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/manageiq/manageiq_user.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/manageiq/manageiq_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_user.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/remote_management/manageiq/manageiq_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_datacenter_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/oneview/oneview_datacenter_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_datacenter_info.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/oneview/oneview_enclosure_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/oneview/oneview_enclosure_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_enclosure_info.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/oneview/oneview_ethernet_network.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_ethernet_network.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/oneview/oneview_ethernet_network_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/oneview/oneview_ethernet_network_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_ethernet_network_info.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/oneview/oneview_fc_network.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/oneview/oneview_fc_network.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/oneview/oneview_fc_network.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_fc_network.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/oneview/oneview_fc_network_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_fc_network_info.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/oneview/oneview_fcoe_network.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/oneview/oneview_fcoe_network.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_fcoe_network.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/oneview/oneview_fcoe_network_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_fcoe_network_info.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/oneview/oneview_logical_interconnect_group.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/oneview/oneview_logical_interconnect_group.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_logical_interconnect_group.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/oneview/oneview_logical_interconnect_group_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_logical_interconnect_group_info.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/oneview/oneview_network_set.py validate-modules:doc-missing-type
lib/ansible/modules/remote_management/oneview/oneview_network_set.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_network_set.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/oneview/oneview_network_set_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/oneview/oneview_network_set_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_network_set_info.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/oneview/oneview_san_manager.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_san_manager.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/oneview/oneview_san_manager_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/oneview/oneview_san_manager_info.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/redfish/idrac_redfish_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/redfish/idrac_redfish_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/redfish/idrac_redfish_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/redfish/redfish_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/redfish/redfish_config.py validate-modules:doc-elements-mismatch
lib/ansible/modules/remote_management/redfish/redfish_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/redfish/redfish_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/stacki/stacki_host.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/remote_management/stacki/stacki_host.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/remote_management/stacki/stacki_host.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/remote_management/stacki/stacki_host.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/stacki/stacki_host.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/ucs/ucs_disk_group_policy.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/remote_management/ucs/ucs_disk_group_policy.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/remote_management/ucs/ucs_disk_group_policy.py validate-modules:doc-elements-mismatch
lib/ansible/modules/remote_management/ucs/ucs_disk_group_policy.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/ucs/ucs_disk_group_policy.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/remote_management/ucs/ucs_disk_group_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_ip_pool.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_lan_connectivity.py validate-modules:doc-elements-mismatch
lib/ansible/modules/remote_management/ucs/ucs_lan_connectivity.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_mac_pool.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_managed_objects.py validate-modules:doc-elements-mismatch
lib/ansible/modules/remote_management/ucs/ucs_managed_objects.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/ucs/ucs_managed_objects.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_managed_objects.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/ucs/ucs_ntp_server.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_san_connectivity.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/ucs/ucs_san_connectivity.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/remote_management/ucs/ucs_san_connectivity.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/ucs/ucs_san_connectivity.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_san_connectivity.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/ucs/ucs_service_profile_template.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_storage_profile.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/remote_management/ucs/ucs_storage_profile.py validate-modules:doc-elements-mismatch
lib/ansible/modules/remote_management/ucs/ucs_storage_profile.py validate-modules:doc-type-does-not-match-spec
lib/ansible/modules/remote_management/ucs/ucs_storage_profile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_timezone.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_uuid_pool.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_vhba_template.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/ucs/ucs_vhba_template.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/remote_management/ucs/ucs_vhba_template.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/ucs/ucs_vhba_template.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_vhba_template.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/ucs/ucs_vlans.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/ucs/ucs_vlans.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_vnic_template.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/remote_management/ucs/ucs_vnic_template.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/ucs/ucs_vnic_template.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_vsans.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/ucs/ucs_vsans.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/ucs/ucs_vsans.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_vsans.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/ucs/ucs_wwn_pool.py validate-modules:doc-required-mismatch
lib/ansible/modules/remote_management/ucs/ucs_wwn_pool.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/remote_management/ucs/ucs_wwn_pool.py validate-modules:parameter-list-no-elements
lib/ansible/modules/remote_management/ucs/ucs_wwn_pool.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/remote_management/ucs/ucs_wwn_pool.py validate-modules:undocumented-parameter
lib/ansible/modules/remote_management/wakeonlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/source_control/bzr.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/source_control/git.py pylint:blacklisted-name
lib/ansible/modules/source_control/git.py use-argspec-type-path
lib/ansible/modules/source_control/git.py validate-modules:doc-missing-type
lib/ansible/modules/source_control/git.py validate-modules:doc-required-mismatch
lib/ansible/modules/source_control/git.py validate-modules:parameter-list-no-elements
lib/ansible/modules/source_control/git.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/source_control/git_config.py validate-modules:doc-missing-type
lib/ansible/modules/source_control/git_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/source_control/github/_github_hooks.py validate-modules:doc-missing-type
lib/ansible/modules/source_control/github/github_deploy_key.py validate-modules:doc-missing-type
lib/ansible/modules/source_control/github/github_deploy_key.py validate-modules:parameter-invalid
lib/ansible/modules/source_control/github/github_deploy_key.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/source_control/github/github_issue.py validate-modules:doc-missing-type
lib/ansible/modules/source_control/github/github_issue.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/source_control/github/github_key.py validate-modules:doc-missing-type
lib/ansible/modules/source_control/github/github_release.py validate-modules:doc-missing-type
lib/ansible/modules/source_control/github/github_release.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/source_control/github/github_webhook.py validate-modules:doc-elements-mismatch
lib/ansible/modules/source_control/github/github_webhook.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/source_control/github/github_webhook_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/source_control/gitlab/gitlab_deploy_key.py validate-modules:doc-required-mismatch
lib/ansible/modules/source_control/gitlab/gitlab_hook.py validate-modules:doc-required-mismatch
lib/ansible/modules/source_control/gitlab/gitlab_runner.py validate-modules:doc-required-mismatch
lib/ansible/modules/source_control/gitlab/gitlab_runner.py validate-modules:parameter-list-no-elements
lib/ansible/modules/source_control/hg.py validate-modules:doc-required-mismatch
lib/ansible/modules/source_control/hg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/source_control/subversion.py validate-modules:doc-required-mismatch
lib/ansible/modules/source_control/subversion.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/source_control/subversion.py validate-modules:undocumented-parameter
lib/ansible/modules/storage/emc/emc_vnx_sg_member.py validate-modules:doc-missing-type
lib/ansible/modules/storage/emc/emc_vnx_sg_member.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/glusterfs/gluster_heal_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/glusterfs/gluster_peer.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/glusterfs/gluster_peer.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/glusterfs/gluster_peer.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/glusterfs/gluster_volume.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/glusterfs/gluster_volume.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/ibm/ibm_sa_domain.py validate-modules:doc-missing-type
lib/ansible/modules/storage/ibm/ibm_sa_domain.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/ibm/ibm_sa_host.py validate-modules:doc-missing-type
lib/ansible/modules/storage/ibm/ibm_sa_host.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/ibm/ibm_sa_host_ports.py validate-modules:doc-missing-type
lib/ansible/modules/storage/ibm/ibm_sa_host_ports.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/ibm/ibm_sa_pool.py validate-modules:doc-missing-type
lib/ansible/modules/storage/ibm/ibm_sa_pool.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/ibm/ibm_sa_vol.py validate-modules:doc-missing-type
lib/ansible/modules/storage/ibm/ibm_sa_vol.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/ibm/ibm_sa_vol_map.py validate-modules:doc-missing-type
lib/ansible/modules/storage/ibm/ibm_sa_vol_map.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/infinidat/infini_export.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/storage/infinidat/infini_export.py validate-modules:doc-missing-type
lib/ansible/modules/storage/infinidat/infini_export.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/storage/infinidat/infini_export.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/infinidat/infini_export.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/infinidat/infini_export_client.py validate-modules:doc-missing-type
lib/ansible/modules/storage/infinidat/infini_export_client.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/storage/infinidat/infini_fs.py validate-modules:doc-missing-type
lib/ansible/modules/storage/infinidat/infini_host.py validate-modules:doc-missing-type
lib/ansible/modules/storage/infinidat/infini_host.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/infinidat/infini_host.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/infinidat/infini_pool.py validate-modules:doc-missing-type
lib/ansible/modules/storage/infinidat/infini_vol.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/_na_cdot_aggregate.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/_na_cdot_aggregate.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/_na_cdot_license.py validate-modules:incompatible-default-type
lib/ansible/modules/storage/netapp/_na_cdot_license.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/_na_cdot_lun.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/_na_cdot_lun.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/_na_cdot_qtree.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/_na_cdot_qtree.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/_na_cdot_svm.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/_na_cdot_svm.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/_na_cdot_user.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/_na_cdot_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/_na_cdot_user_role.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/_na_cdot_user_role.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/_na_cdot_volume.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/_na_cdot_volume.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/storage/netapp/_na_cdot_volume.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/_na_cdot_volume.py validate-modules:undocumented-parameter
lib/ansible/modules/storage/netapp/_na_ontap_gather_facts.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/_na_ontap_gather_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/_na_ontap_gather_facts.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/storage/netapp/_na_ontap_gather_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/_sf_account_manager.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/_sf_account_manager.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/_sf_check_connections.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/_sf_snapshot_schedule_manager.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/_sf_snapshot_schedule_manager.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/_sf_snapshot_schedule_manager.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/_sf_volume_access_group_manager.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/_sf_volume_access_group_manager.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/_sf_volume_access_group_manager.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/_sf_volume_manager.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/_sf_volume_manager.py validate-modules:parameter-invalid
lib/ansible/modules/storage/netapp/_sf_volume_manager.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/_sf_volume_manager.py validate-modules:undocumented-parameter
lib/ansible/modules/storage/netapp/na_elementsw_access_group.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_elementsw_access_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_elementsw_access_group.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_account.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_elementsw_account.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_admin_users.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_elementsw_admin_users.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_elementsw_admin_users.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_backup.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_elementsw_backup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_check_connections.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_cluster.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_cluster_config.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_elementsw_cluster_config.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_cluster_pair.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_elementsw_cluster_pair.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_cluster_snmp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_drive.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_elementsw_drive.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_initiators.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_elementsw_initiators.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/na_elementsw_initiators.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_elementsw_initiators.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_initiators.py validate-modules:undocumented-parameter
lib/ansible/modules/storage/netapp/na_elementsw_ldap.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_network_interfaces.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_elementsw_network_interfaces.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_node.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_elementsw_node.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_elementsw_node.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_snapshot.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_elementsw_snapshot.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_snapshot_restore.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_snapshot_schedule.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_elementsw_snapshot_schedule.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/na_elementsw_snapshot_schedule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_elementsw_snapshot_schedule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_elementsw_vlan.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_elementsw_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_volume.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_elementsw_volume.py validate-modules:parameter-invalid
lib/ansible/modules/storage/netapp/na_elementsw_volume.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_volume_clone.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_elementsw_volume_clone.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_elementsw_volume_pair.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_elementsw_volume_pair.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_aggregate.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_aggregate.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_aggregate.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_autosupport.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_autosupport.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_autosupport.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_broadcast_domain.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_broadcast_domain.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_broadcast_domain.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_broadcast_domain_ports.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_broadcast_domain_ports.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/na_ontap_broadcast_domain_ports.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_broadcast_domain_ports.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_cg_snapshot.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_cg_snapshot.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_cg_snapshot.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_cifs.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_cifs.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_cifs_acl.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_cifs_server.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_cifs_server.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_cluster.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_cluster.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_cluster_ha.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_cluster_peer.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_cluster_peer.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_command.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_disks.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_dns.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_dns.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_dns.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_export_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_export_policy_rule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_export_policy_rule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_fcp.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_fcp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_firewall_policy.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_firewall_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_firewall_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_firmware_upgrade.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/na_ontap_flexcache.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_flexcache.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_igroup.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_igroup.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_igroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_igroup_initiator.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_igroup_initiator.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_igroup_initiator.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_info.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/storage/netapp/na_ontap_interface.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_interface.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_ipspace.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_ipspace.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_iscsi.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_iscsi.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_job_schedule.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_job_schedule.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_job_schedule.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_kerberos_realm.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/storage/netapp/na_ontap_ldap_client.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/storage/netapp/na_ontap_ldap_client.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_license.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_license.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_license.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_lun.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_lun.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_lun_copy.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_lun_copy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_lun_map.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_lun_map.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_motd.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_ndmp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_net_ifgrp.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_net_ifgrp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_net_ifgrp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_net_port.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_net_port.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_net_port.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_net_routes.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_net_routes.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_net_subnet.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_net_subnet.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/na_ontap_net_subnet.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_net_vlan.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_net_vlan.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_nfs.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_nfs.py validate-modules:parameter-invalid
lib/ansible/modules/storage/netapp/na_ontap_nfs.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_node.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_ntp.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_ntp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_nvme.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_nvme_namespace.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/na_ontap_nvme_namespace.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_nvme_subsystem.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_nvme_subsystem.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_ports.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_portset.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_portset.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_portset.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_qos_policy_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/na_ontap_qos_policy_group.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_qtree.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_qtree.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/na_ontap_security_key_manager.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_security_key_manager.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_service_processor_network.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_service_processor_network.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_snapmirror.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_snapshot.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_snapshot.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_snapshot_policy.py validate-modules:doc-elements-mismatch
lib/ansible/modules/storage/netapp/na_ontap_snapshot_policy.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_snapshot_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_snmp.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_software_update.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_software_update.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_svm.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_svm.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_svm.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_svm_options.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_ucadapter.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_ucadapter.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_unix_group.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_unix_group.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_unix_group.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_unix_user.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_unix_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_user.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_user_role.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_user_role.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_volume.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_volume.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_volume.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_volume_clone.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_vscan_on_access_policy.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_vscan_on_access_policy.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_vscan_on_access_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_vscan_on_demand_task.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_vscan_on_demand_task.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_vscan_on_demand_task.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_vscan_scanner_pool.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/na_ontap_vscan_scanner_pool.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_vscan_scanner_pool.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/na_ontap_vserver_peer.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/na_ontap_vserver_peer.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/na_ontap_vserver_peer.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_alerts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/netapp_e_alerts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_amg.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/netapp_e_amg.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_amg.py validate-modules:undocumented-parameter
lib/ansible/modules/storage/netapp/netapp_e_amg_role.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/netapp_e_amg_role.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/netapp_e_amg_role.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_amg_role.py validate-modules:undocumented-parameter
lib/ansible/modules/storage/netapp/netapp_e_amg_sync.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/netapp_e_amg_sync.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_asup.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/netapp_e_asup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_auditlog.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_auth.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/netapp_e_auth.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/netapp_e_auth.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_drive_firmware.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/netapp_e_facts.py validate-modules:return-syntax-error
lib/ansible/modules/storage/netapp/netapp_e_flashcache.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/storage/netapp/netapp_e_flashcache.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/netapp_e_flashcache.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/netapp_e_flashcache.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/netapp_e_flashcache.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_flashcache.py validate-modules:undocumented-parameter
lib/ansible/modules/storage/netapp/netapp_e_global.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_host.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/netapp_e_host.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_hostgroup.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/netapp_e_hostgroup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_iscsi_interface.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/netapp_e_iscsi_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_iscsi_target.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_ldap.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/netapp_e_ldap.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/netapp_e_ldap.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_lun_mapping.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/netapp_e_lun_mapping.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_mgmt_interface.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_snapshot_group.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/storage/netapp/netapp_e_snapshot_group.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/netapp_e_snapshot_group.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/netapp_e_snapshot_group.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_snapshot_group.py validate-modules:undocumented-parameter
lib/ansible/modules/storage/netapp/netapp_e_snapshot_images.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/netapp_e_snapshot_images.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/netapp_e_snapshot_images.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_snapshot_images.py validate-modules:undocumented-parameter
lib/ansible/modules/storage/netapp/netapp_e_snapshot_volume.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/storage/netapp/netapp_e_snapshot_volume.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/storage/netapp/netapp_e_snapshot_volume.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/netapp_e_snapshot_volume.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_storage_system.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/storage/netapp/netapp_e_storage_system.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/netapp_e_storage_system.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/netapp_e_storage_system.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/netapp_e_storage_system.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_storage_system.py validate-modules:undocumented-parameter
lib/ansible/modules/storage/netapp/netapp_e_storagepool.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/netapp_e_storagepool.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_syslog.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/netapp_e_syslog.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/netapp/netapp_e_syslog.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_volume.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/storage/netapp/netapp_e_volume.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/storage/netapp/netapp_e_volume.py validate-modules:doc-missing-type
lib/ansible/modules/storage/netapp/netapp_e_volume.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/netapp_e_volume.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_volume_copy.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/storage/netapp/netapp_e_volume_copy.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/storage/netapp/netapp_e_volume_copy.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/netapp/netapp_e_volume_copy.py validate-modules:implied-parameter-type-mismatch
lib/ansible/modules/storage/netapp/netapp_e_volume_copy.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/storage/netapp/netapp_e_volume_copy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/netapp/netapp_e_volume_copy.py validate-modules:undocumented-parameter
lib/ansible/modules/storage/purestorage/_purefa_facts.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/_purefa_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/purestorage/_purefa_facts.py validate-modules:return-syntax-error
lib/ansible/modules/storage/purestorage/_purefb_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/purestorage/_purefb_facts.py validate-modules:return-syntax-error
lib/ansible/modules/storage/purestorage/purefa_alert.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_arrayname.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_banner.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_connect.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_dns.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_dns.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/purestorage/purefa_ds.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_ds.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/purestorage/purefa_dsrole.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_dsrole.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/purestorage/purefa_hg.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_hg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/purestorage/purefa_host.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_host.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/purestorage/purefa_info.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/purestorage/purefa_info.py validate-modules:return-syntax-error
lib/ansible/modules/storage/purestorage/purefa_ntp.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_ntp.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/purestorage/purefa_offload.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_pg.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_pg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/purestorage/purefa_pgsnap.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_pgsnap.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/purestorage/purefa_phonehome.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_ra.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_smtp.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_snap.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_snmp.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_syslog.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_vg.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefa_volume.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefb_ds.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefb_ds.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/purestorage/purefb_dsrole.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefb_fs.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/storage/purestorage/purefb_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/purestorage/purefb_info.py validate-modules:return-syntax-error
lib/ansible/modules/storage/purestorage/purefb_s3acc.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/purestorage/purefb_s3user.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/zfs/zfs.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/zfs/zfs_delegate_admin.py validate-modules:doc-required-mismatch
lib/ansible/modules/storage/zfs/zfs_delegate_admin.py validate-modules:parameter-list-no-elements
lib/ansible/modules/storage/zfs/zfs_delegate_admin.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/zfs/zfs_facts.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/storage/zfs/zfs_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/storage/zfs/zpool_facts.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/storage/zfs/zpool_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/aix_devices.py validate-modules:doc-required-mismatch
lib/ansible/modules/system/aix_filesystem.py validate-modules:doc-required-mismatch
lib/ansible/modules/system/aix_filesystem.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/aix_inittab.py validate-modules:doc-required-mismatch
lib/ansible/modules/system/aix_lvg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/aix_lvol.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/alternatives.py pylint:blacklisted-name
lib/ansible/modules/system/at.py validate-modules:doc-required-mismatch
lib/ansible/modules/system/authorized_key.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/awall.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/beadm.py pylint:blacklisted-name
lib/ansible/modules/system/cronvar.py pylint:blacklisted-name
lib/ansible/modules/system/dconf.py pylint:blacklisted-name
lib/ansible/modules/system/dconf.py validate-modules:doc-missing-type
lib/ansible/modules/system/dconf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/filesystem.py pylint:blacklisted-name
lib/ansible/modules/system/filesystem.py validate-modules:doc-missing-type
lib/ansible/modules/system/gconftool2.py pylint:blacklisted-name
lib/ansible/modules/system/gconftool2.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/system/gconftool2.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/getent.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/hostname.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/system/hostname.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/interfaces_file.py pylint:blacklisted-name
lib/ansible/modules/system/interfaces_file.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/iptables.py pylint:blacklisted-name
lib/ansible/modules/system/iptables.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/java_cert.py pylint:blacklisted-name
lib/ansible/modules/system/java_keystore.py validate-modules:doc-missing-type
lib/ansible/modules/system/java_keystore.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/java_keystore.py validate-modules:undocumented-parameter
lib/ansible/modules/system/kernel_blacklist.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/known_hosts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/system/known_hosts.py validate-modules:doc-missing-type
lib/ansible/modules/system/known_hosts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/lbu.py validate-modules:doc-elements-mismatch
lib/ansible/modules/system/locale_gen.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/lvg.py pylint:blacklisted-name
lib/ansible/modules/system/lvg.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/lvol.py pylint:blacklisted-name
lib/ansible/modules/system/lvol.py validate-modules:doc-required-mismatch
lib/ansible/modules/system/lvol.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/mksysb.py validate-modules:doc-missing-type
lib/ansible/modules/system/modprobe.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/nosh.py validate-modules:doc-missing-type
lib/ansible/modules/system/nosh.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/nosh.py validate-modules:return-syntax-error
lib/ansible/modules/system/openwrt_init.py validate-modules:doc-missing-type
lib/ansible/modules/system/openwrt_init.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/osx_defaults.py validate-modules:doc-required-mismatch
lib/ansible/modules/system/osx_defaults.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/system/pam_limits.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/pamd.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/parted.py pylint:blacklisted-name
lib/ansible/modules/system/parted.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/parted.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/system/puppet.py use-argspec-type-path
lib/ansible/modules/system/puppet.py validate-modules:parameter-invalid
lib/ansible/modules/system/puppet.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/puppet.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/puppet.py validate-modules:undocumented-parameter
lib/ansible/modules/system/python_requirements_info.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/python_requirements_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/runit.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/system/runit.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/runit.py validate-modules:undocumented-parameter
lib/ansible/modules/system/seboolean.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/sefcontext.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/system/selinux.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/system/selinux.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/selogin.py validate-modules:doc-required-mismatch
lib/ansible/modules/system/selogin.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/seport.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/service.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/system/service.py validate-modules:use-run-command-not-popen
lib/ansible/modules/system/setup.py validate-modules:doc-missing-type
lib/ansible/modules/system/setup.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/setup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/solaris_zone.py validate-modules:doc-required-mismatch
lib/ansible/modules/system/sysctl.py validate-modules:doc-missing-type
lib/ansible/modules/system/sysctl.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/syspatch.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/systemd.py validate-modules:parameter-invalid
lib/ansible/modules/system/systemd.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/systemd.py validate-modules:return-syntax-error
lib/ansible/modules/system/sysvinit.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/sysvinit.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/sysvinit.py validate-modules:return-syntax-error
lib/ansible/modules/system/timezone.py pylint:blacklisted-name
lib/ansible/modules/system/user.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/system/user.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/system/user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/user.py validate-modules:use-run-command-not-popen
lib/ansible/modules/system/vdo.py validate-modules:doc-required-mismatch
lib/ansible/modules/system/xfconf.py validate-modules:parameter-state-invalid-choice
lib/ansible/modules/system/xfconf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/utilities/logic/async_status.py use-argspec-type-path
lib/ansible/modules/utilities/logic/async_status.py validate-modules!skip
lib/ansible/modules/utilities/logic/async_wrapper.py ansible-doc!skip # not an actual module
lib/ansible/modules/utilities/logic/async_wrapper.py pylint:ansible-bad-function
lib/ansible/modules/utilities/logic/async_wrapper.py use-argspec-type-path
lib/ansible/modules/utilities/logic/wait_for.py validate-modules:parameter-list-no-elements
lib/ansible/modules/web_infrastructure/_nginx_status_facts.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/_nginx_status_facts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/ansible_tower/tower_credential.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/web_infrastructure/ansible_tower/tower_credential.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/web_infrastructure/ansible_tower/tower_credential_type.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_credential_type.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/web_infrastructure/ansible_tower/tower_credential_type.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/ansible_tower/tower_group.py use-argspec-type-path
lib/ansible/modules/web_infrastructure/ansible_tower/tower_group.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/web_infrastructure/ansible_tower/tower_group.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_host.py use-argspec-type-path
lib/ansible/modules/web_infrastructure/ansible_tower/tower_host.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_inventory.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_inventory_source.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_inventory_source.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/web_infrastructure/ansible_tower/tower_inventory_source.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/ansible_tower/tower_job_cancel.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/ansible_tower/tower_job_launch.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_job_launch.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/web_infrastructure/ansible_tower/tower_job_launch.py validate-modules:parameter-list-no-elements
lib/ansible/modules/web_infrastructure/ansible_tower/tower_job_launch.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/ansible_tower/tower_job_list.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_job_list.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/ansible_tower/tower_job_template.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_job_template.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/ansible_tower/tower_job_template.py validate-modules:undocumented-parameter
lib/ansible/modules/web_infrastructure/ansible_tower/tower_job_wait.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/ansible_tower/tower_label.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_notification.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_notification.py validate-modules:parameter-list-no-elements
lib/ansible/modules/web_infrastructure/ansible_tower/tower_notification.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/ansible_tower/tower_organization.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_project.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_project.py validate-modules:doc-required-mismatch
lib/ansible/modules/web_infrastructure/ansible_tower/tower_project.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/ansible_tower/tower_receive.py validate-modules:parameter-list-no-elements
lib/ansible/modules/web_infrastructure/ansible_tower/tower_receive.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/ansible_tower/tower_role.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_role.py validate-modules:doc-required-mismatch
lib/ansible/modules/web_infrastructure/ansible_tower/tower_send.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_send.py validate-modules:parameter-list-no-elements
lib/ansible/modules/web_infrastructure/ansible_tower/tower_send.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/ansible_tower/tower_settings.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_team.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_team.py validate-modules:undocumented-parameter
lib/ansible/modules/web_infrastructure/ansible_tower/tower_user.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_workflow_launch.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ansible_tower/tower_workflow_launch.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/ansible_tower/tower_workflow_template.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/apache2_mod_proxy.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/web_infrastructure/apache2_mod_proxy.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/web_infrastructure/apache2_mod_proxy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/apache2_module.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/apache2_module.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/deploy_helper.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/deploy_helper.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/deploy_helper.py validate-modules:undocumented-parameter
lib/ansible/modules/web_infrastructure/django_manage.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/web_infrastructure/django_manage.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/django_manage.py validate-modules:no-default-for-required-parameter
lib/ansible/modules/web_infrastructure/django_manage.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/django_manage.py validate-modules:undocumented-parameter
lib/ansible/modules/web_infrastructure/ejabberd_user.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/ejabberd_user.py validate-modules:doc-required-mismatch
lib/ansible/modules/web_infrastructure/ejabberd_user.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/gunicorn.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/gunicorn.py validate-modules:undocumented-parameter
lib/ansible/modules/web_infrastructure/htpasswd.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/web_infrastructure/htpasswd.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/jenkins_job.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/jenkins_job_info.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/jenkins_plugin.py use-argspec-type-path
lib/ansible/modules/web_infrastructure/jenkins_plugin.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/jenkins_plugin.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/jenkins_plugin.py validate-modules:undocumented-parameter
lib/ansible/modules/web_infrastructure/jenkins_script.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/jira.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/jira.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/jira.py validate-modules:undocumented-parameter
lib/ansible/modules/web_infrastructure/rundeck_acl_policy.py pylint:blacklisted-name
lib/ansible/modules/web_infrastructure/rundeck_acl_policy.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/rundeck_project.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/sophos_utm/utm_aaa_group.py validate-modules:doc-elements-mismatch
lib/ansible/modules/web_infrastructure/sophos_utm/utm_aaa_group_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/sophos_utm/utm_ca_host_key_cert.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/sophos_utm/utm_ca_host_key_cert_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/sophos_utm/utm_dns_host.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/sophos_utm/utm_network_interface_address.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/sophos_utm/utm_network_interface_address_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/sophos_utm/utm_proxy_auth_profile.py validate-modules:doc-elements-mismatch
lib/ansible/modules/web_infrastructure/sophos_utm/utm_proxy_auth_profile.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/sophos_utm/utm_proxy_exception.py validate-modules:doc-elements-mismatch
lib/ansible/modules/web_infrastructure/sophos_utm/utm_proxy_exception.py validate-modules:return-syntax-error
lib/ansible/modules/web_infrastructure/sophos_utm/utm_proxy_frontend.py validate-modules:doc-elements-mismatch
lib/ansible/modules/web_infrastructure/sophos_utm/utm_proxy_frontend.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/sophos_utm/utm_proxy_frontend_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/sophos_utm/utm_proxy_location.py validate-modules:doc-elements-mismatch
lib/ansible/modules/web_infrastructure/sophos_utm/utm_proxy_location.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/sophos_utm/utm_proxy_location_info.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/supervisorctl.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/supervisorctl.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/web_infrastructure/taiga_issue.py validate-modules:doc-missing-type
lib/ansible/modules/web_infrastructure/taiga_issue.py validate-modules:parameter-list-no-elements
lib/ansible/modules/web_infrastructure/taiga_issue.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/windows/async_status.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/setup.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_acl_inheritance.ps1 pslint:PSAvoidTrailingWhitespace
lib/ansible/modules/windows/win_audit_rule.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_certificate_store.ps1 validate-modules:parameter-type-not-in-doc
lib/ansible/modules/windows/win_chocolatey.ps1 validate-modules:doc-elements-mismatch
lib/ansible/modules/windows/win_chocolatey_config.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_chocolatey_facts.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_chocolatey_source.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_copy.ps1 pslint:PSUseApprovedVerbs
lib/ansible/modules/windows/win_credential.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_credential.ps1 validate-modules:doc-elements-mismatch
lib/ansible/modules/windows/win_credential.ps1 validate-modules:parameter-type-not-in-doc
lib/ansible/modules/windows/win_defrag.ps1 validate-modules:parameter-list-no-elements
lib/ansible/modules/windows/win_dns_client.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_dns_record.ps1 validate-modules:doc-elements-mismatch
lib/ansible/modules/windows/win_domain.ps1 pslint:PSAvoidUsingEmptyCatchBlock # Keep
lib/ansible/modules/windows/win_domain.ps1 pslint:PSUseApprovedVerbs
lib/ansible/modules/windows/win_domain_controller.ps1 pslint:PSAvoidGlobalVars # New PR
lib/ansible/modules/windows/win_domain_controller.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_domain_controller.ps1 pslint:PSUseApprovedVerbs
lib/ansible/modules/windows/win_domain_membership.ps1 pslint:PSAvoidGlobalVars # New PR
lib/ansible/modules/windows/win_domain_membership.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_domain_membership.ps1 pslint:PSUseApprovedVerbs
lib/ansible/modules/windows/win_dotnet_ngen.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_dsc.ps1 pslint:PSAvoidUsingEmptyCatchBlock # Keep
lib/ansible/modules/windows/win_dsc.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_eventlog.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_feature.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_file_version.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_find.ps1 pslint:PSAvoidUsingEmptyCatchBlock # Keep
lib/ansible/modules/windows/win_find.ps1 validate-modules:doc-elements-mismatch
lib/ansible/modules/windows/win_firewall_rule.ps1 pslint:PSUseApprovedVerbs
lib/ansible/modules/windows/win_hosts.ps1 validate-modules:doc-elements-mismatch
lib/ansible/modules/windows/win_hotfix.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_hotfix.ps1 pslint:PSUseApprovedVerbs
lib/ansible/modules/windows/win_http_proxy.ps1 validate-modules:parameter-list-no-elements
lib/ansible/modules/windows/win_http_proxy.ps1 validate-modules:parameter-type-not-in-doc
lib/ansible/modules/windows/win_iis_virtualdirectory.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_iis_webapplication.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_iis_webapppool.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_iis_webbinding.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_iis_webbinding.ps1 pslint:PSUseApprovedVerbs
lib/ansible/modules/windows/win_iis_website.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_inet_proxy.ps1 validate-modules:parameter-list-no-elements
lib/ansible/modules/windows/win_inet_proxy.ps1 validate-modules:parameter-type-not-in-doc
lib/ansible/modules/windows/win_lineinfile.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_mapped_drive.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_netbios.ps1 validate-modules:parameter-list-no-elements
lib/ansible/modules/windows/win_optional_feature.ps1 validate-modules:parameter-list-no-elements
lib/ansible/modules/windows/win_pagefile.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_pagefile.ps1 pslint:PSUseDeclaredVarsMoreThanAssignments # New PR - bug test_path should be testPath
lib/ansible/modules/windows/win_pagefile.ps1 pslint:PSUseSupportsShouldProcess
lib/ansible/modules/windows/win_pester.ps1 validate-modules:doc-elements-mismatch
lib/ansible/modules/windows/win_product_facts.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_psexec.ps1 validate-modules:parameter-list-no-elements
lib/ansible/modules/windows/win_psexec.ps1 validate-modules:parameter-type-not-in-doc
lib/ansible/modules/windows/win_rabbitmq_plugin.ps1 pslint:PSAvoidUsingInvokeExpression
lib/ansible/modules/windows/win_rabbitmq_plugin.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_rds_cap.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_rds_rap.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_rds_settings.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_regedit.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_region.ps1 pslint:PSAvoidUsingEmptyCatchBlock # Keep
lib/ansible/modules/windows/win_region.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_regmerge.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_robocopy.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_say.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_security_policy.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_security_policy.ps1 pslint:PSUseApprovedVerbs
lib/ansible/modules/windows/win_share.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_shell.ps1 pslint:PSUseApprovedVerbs
lib/ansible/modules/windows/win_shortcut.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_snmp.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_unzip.ps1 pslint:PSUseApprovedVerbs
lib/ansible/modules/windows/win_updates.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_uri.ps1 pslint:PSAvoidUsingEmptyCatchBlock # Keep
lib/ansible/modules/windows/win_user_profile.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_user_profile.ps1 validate-modules:parameter-type-not-in-doc
lib/ansible/modules/windows/win_wait_for.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/modules/windows/win_wait_for_process.ps1 validate-modules:parameter-list-no-elements
lib/ansible/modules/windows/win_webpicmd.ps1 pslint:PSAvoidUsingInvokeExpression
lib/ansible/modules/windows/win_xml.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/parsing/vault/__init__.py pylint:blacklisted-name
lib/ansible/playbook/base.py pylint:blacklisted-name
lib/ansible/playbook/collectionsearch.py required-and-default-attributes # https://github.com/ansible/ansible/issues/61460
lib/ansible/playbook/helpers.py pylint:blacklisted-name
lib/ansible/playbook/role/__init__.py pylint:blacklisted-name
lib/ansible/plugins/action/aireos.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/aruba.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/asa.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/bigip.py action-plugin-docs # undocumented action plugin to fix, existed before sanity test was added
lib/ansible/plugins/action/bigiq.py action-plugin-docs # undocumented action plugin to fix, existed before sanity test was added
lib/ansible/plugins/action/ce.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/ce_template.py action-plugin-docs # undocumented action plugin to fix, existed before sanity test was added
lib/ansible/plugins/action/cnos.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/dellos10.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/dellos6.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/dellos9.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/enos.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/eos.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/exos.py action-plugin-docs # undocumented action plugin to fix
lib/ansible/plugins/action/ios.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/iosxr.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/ironware.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/junos.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/net_base.py action-plugin-docs # base class for other net_* action plugins which have a matching module
lib/ansible/plugins/action/netconf.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/network.py action-plugin-docs # base class for network action plugins
lib/ansible/plugins/action/normal.py action-plugin-docs # default action plugin for modules without a dedicated action plugin
lib/ansible/plugins/action/nxos.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/slxos.py action-plugin-docs # undocumented action plugin to fix
lib/ansible/plugins/action/sros.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/action/voss.py action-plugin-docs # undocumented action plugin to fix
lib/ansible/plugins/action/vyos.py action-plugin-docs # base class for deprecated network platform modules using `connection: local`
lib/ansible/plugins/cache/base.py ansible-doc!skip # not a plugin, but a stub for backwards compatibility
lib/ansible/plugins/callback/hipchat.py pylint:blacklisted-name
lib/ansible/plugins/connection/lxc.py pylint:blacklisted-name
lib/ansible/plugins/connection/vmware_tools.py yamllint:unparsable-with-libyaml
lib/ansible/plugins/doc_fragments/a10.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/a10.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/aireos.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/aireos.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/alicloud.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/alicloud.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/aruba.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/aruba.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/asa.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/asa.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/auth_basic.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/auth_basic.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/avi.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/avi.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/aws.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/aws.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/aws_credentials.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/aws_credentials.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/aws_region.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/aws_region.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/azure.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/azure.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/azure_tags.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/azure_tags.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/backup.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/backup.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/ce.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/ce.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/cnos.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/cnos.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/constructed.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/constructed.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/decrypt.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/decrypt.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/default_callback.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/default_callback.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/dellos10.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/dellos10.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/dellos6.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/dellos6.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/dellos9.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/dellos9.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/digital_ocean.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/digital_ocean.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/dimensiondata.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/dimensiondata.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/dimensiondata_wait.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/dimensiondata_wait.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/ec2.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/ec2.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/emc.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/emc.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/enos.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/enos.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/eos.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/eos.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/f5.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/f5.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/files.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/files.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/fortios.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/fortios.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/gcp.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/gcp.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/hcloud.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/hcloud.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/hetzner.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/hetzner.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/hpe3par.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/hpe3par.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/hwc.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/hwc.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/infinibox.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/infinibox.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/influxdb.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/influxdb.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/ingate.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/ingate.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/intersight.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/intersight.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/inventory_cache.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/inventory_cache.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/ios.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/ios.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/iosxr.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/iosxr.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/ipa.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/ipa.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/ironware.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/ironware.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/junos.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/junos.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/k8s_auth_options.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/k8s_auth_options.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/k8s_name_options.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/k8s_name_options.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/k8s_resource_options.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/k8s_resource_options.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/k8s_scale_options.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/k8s_scale_options.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/k8s_state_options.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/k8s_state_options.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/keycloak.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/keycloak.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/kubevirt_common_options.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/kubevirt_common_options.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/kubevirt_vm_options.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/kubevirt_vm_options.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/ldap.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/ldap.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/lxca_common.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/lxca_common.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/manageiq.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/manageiq.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/meraki.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/meraki.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/mysql.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/mysql.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/netapp.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/netapp.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/netconf.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/netconf.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/netscaler.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/netscaler.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/network_agnostic.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/network_agnostic.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/nios.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/nios.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/nso.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/nso.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/nxos.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/nxos.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/oneview.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/oneview.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/online.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/online.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/onyx.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/onyx.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/opennebula.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/opennebula.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/openstack.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/openstack.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/openswitch.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/openswitch.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/oracle.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/oracle.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/oracle_creatable_resource.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/oracle_creatable_resource.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/oracle_display_name_option.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/oracle_display_name_option.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/oracle_name_option.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/oracle_name_option.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/oracle_tags.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/oracle_tags.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/oracle_wait_options.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/oracle_wait_options.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/ovirt.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/ovirt.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/ovirt_info.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/ovirt_info.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/panos.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/panos.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/postgres.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/postgres.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/proxysql.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/proxysql.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/purestorage.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/purestorage.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/rabbitmq.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/rabbitmq.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/rackspace.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/rackspace.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/return_common.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/return_common.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/scaleway.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/scaleway.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/shell_common.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/shell_common.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/shell_windows.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/shell_windows.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/skydive.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/skydive.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/sros.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/sros.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/tower.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/tower.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/ucs.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/ucs.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/url.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/url.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/utm.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/utm.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/validate.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/validate.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/vca.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/vca.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/vexata.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/vexata.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/vmware.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/vmware.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/vmware_rest_client.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/vmware_rest_client.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/vultr.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/vultr.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/vyos.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/vyos.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/xenserver.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/xenserver.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/zabbix.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/zabbix.py metaclass-boilerplate
lib/ansible/plugins/lookup/sequence.py pylint:blacklisted-name
lib/ansible/plugins/strategy/__init__.py pylint:blacklisted-name
lib/ansible/plugins/strategy/linear.py pylint:blacklisted-name
lib/ansible/vars/hostvars.py pylint:blacklisted-name
setup.py future-import-boilerplate
setup.py metaclass-boilerplate
test/integration/targets/ansible-runner/files/adhoc_example1.py future-import-boilerplate
test/integration/targets/ansible-runner/files/adhoc_example1.py metaclass-boilerplate
test/integration/targets/ansible-runner/files/playbook_example1.py future-import-boilerplate
test/integration/targets/ansible-runner/files/playbook_example1.py metaclass-boilerplate
test/integration/targets/async/library/async_test.py future-import-boilerplate
test/integration/targets/async/library/async_test.py metaclass-boilerplate
test/integration/targets/async_fail/library/async_test.py future-import-boilerplate
test/integration/targets/async_fail/library/async_test.py metaclass-boilerplate
test/integration/targets/aws_lambda/files/mini_lambda.py future-import-boilerplate
test/integration/targets/aws_lambda/files/mini_lambda.py metaclass-boilerplate
test/integration/targets/collections_plugin_namespace/collection_root/ansible_collections/my_ns/my_col/plugins/lookup/lookup_no_future_boilerplate.py future-import-boilerplate
test/integration/targets/collections_relative_imports/collection_root/ansible_collections/my_ns/my_col/plugins/module_utils/my_util2.py pylint:relative-beyond-top-level
test/integration/targets/collections_relative_imports/collection_root/ansible_collections/my_ns/my_col/plugins/module_utils/my_util3.py pylint:relative-beyond-top-level
test/integration/targets/collections_relative_imports/collection_root/ansible_collections/my_ns/my_col/plugins/modules/my_module.py pylint:relative-beyond-top-level
test/integration/targets/expect/files/test_command.py future-import-boilerplate
test/integration/targets/expect/files/test_command.py metaclass-boilerplate
test/integration/targets/get_url/files/testserver.py future-import-boilerplate
test/integration/targets/get_url/files/testserver.py metaclass-boilerplate
test/integration/targets/group/files/gidget.py future-import-boilerplate
test/integration/targets/group/files/gidget.py metaclass-boilerplate
test/integration/targets/ignore_unreachable/fake_connectors/bad_exec.py future-import-boilerplate
test/integration/targets/ignore_unreachable/fake_connectors/bad_exec.py metaclass-boilerplate
test/integration/targets/ignore_unreachable/fake_connectors/bad_put_file.py future-import-boilerplate
test/integration/targets/ignore_unreachable/fake_connectors/bad_put_file.py metaclass-boilerplate
test/integration/targets/inventory_kubevirt/inventory_diff.py future-import-boilerplate
test/integration/targets/inventory_kubevirt/inventory_diff.py metaclass-boilerplate
test/integration/targets/inventory_kubevirt/server.py future-import-boilerplate
test/integration/targets/inventory_kubevirt/server.py metaclass-boilerplate
test/integration/targets/jinja2_native_types/filter_plugins/native_plugins.py future-import-boilerplate
test/integration/targets/jinja2_native_types/filter_plugins/native_plugins.py metaclass-boilerplate
test/integration/targets/lambda_policy/files/mini_http_lambda.py future-import-boilerplate
test/integration/targets/lambda_policy/files/mini_http_lambda.py metaclass-boilerplate
test/integration/targets/lookup_ini/lookup-8859-15.ini no-smart-quotes
test/integration/targets/module_precedence/lib_with_extension/ping.py future-import-boilerplate
test/integration/targets/module_precedence/lib_with_extension/ping.py metaclass-boilerplate
test/integration/targets/module_precedence/multiple_roles/bar/library/ping.py future-import-boilerplate
test/integration/targets/module_precedence/multiple_roles/bar/library/ping.py metaclass-boilerplate
test/integration/targets/module_precedence/multiple_roles/foo/library/ping.py future-import-boilerplate
test/integration/targets/module_precedence/multiple_roles/foo/library/ping.py metaclass-boilerplate
test/integration/targets/module_precedence/roles_with_extension/foo/library/ping.py future-import-boilerplate
test/integration/targets/module_precedence/roles_with_extension/foo/library/ping.py metaclass-boilerplate
test/integration/targets/module_utils/library/test.py future-import-boilerplate
test/integration/targets/module_utils/library/test.py metaclass-boilerplate
test/integration/targets/module_utils/library/test_env_override.py future-import-boilerplate
test/integration/targets/module_utils/library/test_env_override.py metaclass-boilerplate
test/integration/targets/module_utils/library/test_failure.py future-import-boilerplate
test/integration/targets/module_utils/library/test_failure.py metaclass-boilerplate
test/integration/targets/module_utils/library/test_override.py future-import-boilerplate
test/integration/targets/module_utils/library/test_override.py metaclass-boilerplate
test/integration/targets/module_utils/module_utils/bar0/foo.py pylint:blacklisted-name
test/integration/targets/module_utils/module_utils/foo.py pylint:blacklisted-name
test/integration/targets/module_utils/module_utils/sub/bar/__init__.py pylint:blacklisted-name
test/integration/targets/module_utils/module_utils/sub/bar/bar.py pylint:blacklisted-name
test/integration/targets/module_utils/module_utils/yak/zebra/foo.py pylint:blacklisted-name
test/integration/targets/old_style_modules_posix/library/helloworld.sh shebang
test/integration/targets/pause/test-pause.py future-import-boilerplate
test/integration/targets/pause/test-pause.py metaclass-boilerplate
test/integration/targets/pip/files/ansible_test_pip_chdir/__init__.py future-import-boilerplate
test/integration/targets/pip/files/ansible_test_pip_chdir/__init__.py metaclass-boilerplate
test/integration/targets/pip/files/setup.py future-import-boilerplate
test/integration/targets/pip/files/setup.py metaclass-boilerplate
test/integration/targets/run_modules/library/test.py future-import-boilerplate
test/integration/targets/run_modules/library/test.py metaclass-boilerplate
test/integration/targets/s3_bucket_notification/files/mini_lambda.py future-import-boilerplate
test/integration/targets/s3_bucket_notification/files/mini_lambda.py metaclass-boilerplate
test/integration/targets/script/files/no_shebang.py future-import-boilerplate
test/integration/targets/script/files/no_shebang.py metaclass-boilerplate
test/integration/targets/service/files/ansible_test_service.py future-import-boilerplate
test/integration/targets/service/files/ansible_test_service.py metaclass-boilerplate
test/integration/targets/setup_rpm_repo/files/create-repo.py future-import-boilerplate
test/integration/targets/setup_rpm_repo/files/create-repo.py metaclass-boilerplate
test/integration/targets/sns_topic/files/sns_topic_lambda/sns_topic_lambda.py future-import-boilerplate
test/integration/targets/sns_topic/files/sns_topic_lambda/sns_topic_lambda.py metaclass-boilerplate
test/integration/targets/supervisorctl/files/sendProcessStdin.py future-import-boilerplate
test/integration/targets/supervisorctl/files/sendProcessStdin.py metaclass-boilerplate
test/integration/targets/template/files/encoding_1252_utf-8.expected no-smart-quotes
test/integration/targets/template/files/encoding_1252_windows-1252.expected no-smart-quotes
test/integration/targets/template/files/foo.dos.txt line-endings
test/integration/targets/template/role_filter/filter_plugins/myplugin.py future-import-boilerplate
test/integration/targets/template/role_filter/filter_plugins/myplugin.py metaclass-boilerplate
test/integration/targets/template/templates/encoding_1252.j2 no-smart-quotes
test/integration/targets/test_infra/library/test.py future-import-boilerplate
test/integration/targets/test_infra/library/test.py metaclass-boilerplate
test/integration/targets/unicode/unicode.yml no-smart-quotes
test/integration/targets/uri/files/testserver.py future-import-boilerplate
test/integration/targets/uri/files/testserver.py metaclass-boilerplate
test/integration/targets/var_precedence/ansible-var-precedence-check.py future-import-boilerplate
test/integration/targets/var_precedence/ansible-var-precedence-check.py metaclass-boilerplate
test/integration/targets/vars_prompt/test-vars_prompt.py future-import-boilerplate
test/integration/targets/vars_prompt/test-vars_prompt.py metaclass-boilerplate
test/integration/targets/vault/test-vault-client.py future-import-boilerplate
test/integration/targets/vault/test-vault-client.py metaclass-boilerplate
test/integration/targets/wait_for/files/testserver.py future-import-boilerplate
test/integration/targets/wait_for/files/testserver.py metaclass-boilerplate
test/integration/targets/want_json_modules_posix/library/helloworld.py future-import-boilerplate
test/integration/targets/want_json_modules_posix/library/helloworld.py metaclass-boilerplate
test/integration/targets/win_audit_rule/library/test_get_audit_rule.ps1 pslint:PSCustomUseLiteralPath
test/integration/targets/win_chocolatey/files/tools/chocolateyUninstall.ps1 pslint:PSCustomUseLiteralPath
test/integration/targets/win_chocolatey_source/library/choco_source.ps1 pslint:PSCustomUseLiteralPath
test/integration/targets/win_csharp_utils/library/ansible_basic_tests.ps1 pslint:PSCustomUseLiteralPath
test/integration/targets/win_csharp_utils/library/ansible_basic_tests.ps1 pslint:PSUseDeclaredVarsMoreThanAssignments # test setup requires vars to be set globally and not referenced in the same scope
test/integration/targets/win_csharp_utils/library/ansible_become_tests.ps1 pslint:PSCustomUseLiteralPath
test/integration/targets/win_dsc/files/xTestDsc/1.0.0/DSCResources/ANSIBLE_xSetReboot/ANSIBLE_xSetReboot.psm1 pslint!skip
test/integration/targets/win_dsc/files/xTestDsc/1.0.0/DSCResources/ANSIBLE_xTestResource/ANSIBLE_xTestResource.psm1 pslint!skip
test/integration/targets/win_dsc/files/xTestDsc/1.0.0/xTestDsc.psd1 pslint!skip
test/integration/targets/win_dsc/files/xTestDsc/1.0.1/DSCResources/ANSIBLE_xTestResource/ANSIBLE_xTestResource.psm1 pslint!skip
test/integration/targets/win_dsc/files/xTestDsc/1.0.1/xTestDsc.psd1 pslint!skip
test/integration/targets/win_exec_wrapper/library/test_fail.ps1 pslint:PSCustomUseLiteralPath
test/integration/targets/win_iis_webbinding/library/test_get_webbindings.ps1 pslint:PSUseApprovedVerbs
test/integration/targets/win_module_utils/library/argv_parser_test.ps1 pslint:PSUseApprovedVerbs
test/integration/targets/win_module_utils/library/backup_file_test.ps1 pslint:PSCustomUseLiteralPath
test/integration/targets/win_module_utils/library/command_util_test.ps1 pslint:PSCustomUseLiteralPath
test/integration/targets/win_module_utils/library/legacy_only_new_way_win_line_ending.ps1 line-endings
test/integration/targets/win_module_utils/library/legacy_only_old_way_win_line_ending.ps1 line-endings
test/integration/targets/win_ping/library/win_ping_syntax_error.ps1 pslint!skip
test/integration/targets/win_psmodule/files/module/template.psd1 pslint!skip
test/integration/targets/win_psmodule/files/module/template.psm1 pslint!skip
test/integration/targets/win_psmodule/files/setup_modules.ps1 pslint:PSCustomUseLiteralPath
test/integration/targets/win_reboot/templates/post_reboot.ps1 pslint:PSCustomUseLiteralPath
test/integration/targets/win_regmerge/templates/win_line_ending.j2 line-endings
test/integration/targets/win_script/files/test_script.ps1 pslint:PSAvoidUsingWriteHost # Keep
test/integration/targets/win_script/files/test_script_creates_file.ps1 pslint:PSAvoidUsingCmdletAliases
test/integration/targets/win_script/files/test_script_removes_file.ps1 pslint:PSCustomUseLiteralPath
test/integration/targets/win_script/files/test_script_with_args.ps1 pslint:PSAvoidUsingWriteHost # Keep
test/integration/targets/win_script/files/test_script_with_splatting.ps1 pslint:PSAvoidUsingWriteHost # Keep
test/integration/targets/win_stat/library/test_symlink_file.ps1 pslint:PSCustomUseLiteralPath
test/integration/targets/win_template/files/foo.dos.txt line-endings
test/integration/targets/win_user_right/library/test_get_right.ps1 pslint:PSCustomUseLiteralPath
test/legacy/cleanup_gce.py future-import-boilerplate
test/legacy/cleanup_gce.py metaclass-boilerplate
test/legacy/cleanup_gce.py pylint:blacklisted-name
test/legacy/cleanup_rax.py future-import-boilerplate
test/legacy/cleanup_rax.py metaclass-boilerplate
test/legacy/consul_running.py future-import-boilerplate
test/legacy/consul_running.py metaclass-boilerplate
test/legacy/gce_credentials.py future-import-boilerplate
test/legacy/gce_credentials.py metaclass-boilerplate
test/legacy/gce_credentials.py pylint:blacklisted-name
test/legacy/setup_gce.py future-import-boilerplate
test/legacy/setup_gce.py metaclass-boilerplate
test/lib/ansible_test/_data/requirements/constraints.txt test-constraints
test/lib/ansible_test/_data/requirements/integration.cloud.azure.txt test-constraints
test/lib/ansible_test/_data/sanity/pylint/plugins/string_format.py use-compat-six
test/lib/ansible_test/_data/setup/ConfigureRemotingForAnsible.ps1 pslint:PSCustomUseLiteralPath
test/lib/ansible_test/_data/setup/windows-httptester.ps1 pslint:PSCustomUseLiteralPath
test/units/config/manager/test_find_ini_config_file.py future-import-boilerplate
test/units/contrib/inventory/test_vmware_inventory.py future-import-boilerplate
test/units/contrib/inventory/test_vmware_inventory.py metaclass-boilerplate
test/units/contrib/inventory/test_vmware_inventory.py pylint:blacklisted-name
test/units/executor/test_play_iterator.py pylint:blacklisted-name
test/units/mock/path.py future-import-boilerplate
test/units/mock/path.py metaclass-boilerplate
test/units/mock/yaml_helper.py future-import-boilerplate
test/units/mock/yaml_helper.py metaclass-boilerplate
test/units/module_utils/aws/test_aws_module.py metaclass-boilerplate
test/units/module_utils/basic/test__symbolic_mode_to_octal.py future-import-boilerplate
test/units/module_utils/basic/test_deprecate_warn.py future-import-boilerplate
test/units/module_utils/basic/test_deprecate_warn.py metaclass-boilerplate
test/units/module_utils/basic/test_deprecate_warn.py pylint:ansible-deprecated-no-version
test/units/module_utils/basic/test_exit_json.py future-import-boilerplate
test/units/module_utils/basic/test_get_file_attributes.py future-import-boilerplate
test/units/module_utils/basic/test_heuristic_log_sanitize.py future-import-boilerplate
test/units/module_utils/basic/test_run_command.py future-import-boilerplate
test/units/module_utils/basic/test_run_command.py pylint:blacklisted-name
test/units/module_utils/basic/test_safe_eval.py future-import-boilerplate
test/units/module_utils/basic/test_tmpdir.py future-import-boilerplate
test/units/module_utils/cloud/test_backoff.py future-import-boilerplate
test/units/module_utils/cloud/test_backoff.py metaclass-boilerplate
test/units/module_utils/common/test_dict_transformations.py future-import-boilerplate
test/units/module_utils/common/test_dict_transformations.py metaclass-boilerplate
test/units/module_utils/conftest.py future-import-boilerplate
test/units/module_utils/conftest.py metaclass-boilerplate
test/units/module_utils/facts/base.py future-import-boilerplate
test/units/module_utils/facts/hardware/test_sunos_get_uptime_facts.py future-import-boilerplate
test/units/module_utils/facts/hardware/test_sunos_get_uptime_facts.py metaclass-boilerplate
test/units/module_utils/facts/network/test_generic_bsd.py future-import-boilerplate
test/units/module_utils/facts/other/test_facter.py future-import-boilerplate
test/units/module_utils/facts/other/test_ohai.py future-import-boilerplate
test/units/module_utils/facts/system/test_lsb.py future-import-boilerplate
test/units/module_utils/facts/test_ansible_collector.py future-import-boilerplate
test/units/module_utils/facts/test_collector.py future-import-boilerplate
test/units/module_utils/facts/test_collectors.py future-import-boilerplate
test/units/module_utils/facts/test_facts.py future-import-boilerplate
test/units/module_utils/facts/test_timeout.py future-import-boilerplate
test/units/module_utils/facts/test_utils.py future-import-boilerplate
test/units/module_utils/gcp/test_auth.py future-import-boilerplate
test/units/module_utils/gcp/test_auth.py metaclass-boilerplate
test/units/module_utils/gcp/test_gcp_utils.py future-import-boilerplate
test/units/module_utils/gcp/test_gcp_utils.py metaclass-boilerplate
test/units/module_utils/gcp/test_utils.py future-import-boilerplate
test/units/module_utils/gcp/test_utils.py metaclass-boilerplate
test/units/module_utils/hwc/test_dict_comparison.py future-import-boilerplate
test/units/module_utils/hwc/test_dict_comparison.py metaclass-boilerplate
test/units/module_utils/hwc/test_hwc_utils.py future-import-boilerplate
test/units/module_utils/hwc/test_hwc_utils.py metaclass-boilerplate
test/units/module_utils/json_utils/test_filter_non_json_lines.py future-import-boilerplate
test/units/module_utils/net_tools/test_netbox.py future-import-boilerplate
test/units/module_utils/net_tools/test_netbox.py metaclass-boilerplate
test/units/module_utils/network/avi/test_avi_api_utils.py future-import-boilerplate
test/units/module_utils/network/avi/test_avi_api_utils.py metaclass-boilerplate
test/units/module_utils/network/ftd/test_common.py future-import-boilerplate
test/units/module_utils/network/ftd/test_common.py metaclass-boilerplate
test/units/module_utils/network/ftd/test_configuration.py future-import-boilerplate
test/units/module_utils/network/ftd/test_configuration.py metaclass-boilerplate
test/units/module_utils/network/ftd/test_device.py future-import-boilerplate
test/units/module_utils/network/ftd/test_device.py metaclass-boilerplate
test/units/module_utils/network/ftd/test_fdm_swagger_parser.py future-import-boilerplate
test/units/module_utils/network/ftd/test_fdm_swagger_parser.py metaclass-boilerplate
test/units/module_utils/network/ftd/test_fdm_swagger_validator.py future-import-boilerplate
test/units/module_utils/network/ftd/test_fdm_swagger_validator.py metaclass-boilerplate
test/units/module_utils/network/ftd/test_fdm_swagger_with_real_data.py future-import-boilerplate
test/units/module_utils/network/ftd/test_fdm_swagger_with_real_data.py metaclass-boilerplate
test/units/module_utils/network/ftd/test_upsert_functionality.py future-import-boilerplate
test/units/module_utils/network/ftd/test_upsert_functionality.py metaclass-boilerplate
test/units/module_utils/network/nso/test_nso.py metaclass-boilerplate
test/units/module_utils/parsing/test_convert_bool.py future-import-boilerplate
test/units/module_utils/postgresql/test_postgres.py future-import-boilerplate
test/units/module_utils/postgresql/test_postgres.py metaclass-boilerplate
test/units/module_utils/remote_management/dellemc/test_ome.py future-import-boilerplate
test/units/module_utils/remote_management/dellemc/test_ome.py metaclass-boilerplate
test/units/module_utils/test_database.py future-import-boilerplate
test/units/module_utils/test_database.py metaclass-boilerplate
test/units/module_utils/test_distro.py future-import-boilerplate
test/units/module_utils/test_distro.py metaclass-boilerplate
test/units/module_utils/test_hetzner.py future-import-boilerplate
test/units/module_utils/test_hetzner.py metaclass-boilerplate
test/units/module_utils/test_kubevirt.py future-import-boilerplate
test/units/module_utils/test_kubevirt.py metaclass-boilerplate
test/units/module_utils/test_netapp.py future-import-boilerplate
test/units/module_utils/test_text.py future-import-boilerplate
test/units/module_utils/test_utm_utils.py future-import-boilerplate
test/units/module_utils/test_utm_utils.py metaclass-boilerplate
test/units/module_utils/urls/test_Request.py replace-urlopen
test/units/module_utils/urls/test_fetch_url.py replace-urlopen
test/units/module_utils/xenserver/FakeAnsibleModule.py future-import-boilerplate
test/units/module_utils/xenserver/FakeAnsibleModule.py metaclass-boilerplate
test/units/module_utils/xenserver/FakeXenAPI.py future-import-boilerplate
test/units/module_utils/xenserver/FakeXenAPI.py metaclass-boilerplate
test/units/modules/cloud/google/test_gce_tag.py future-import-boilerplate
test/units/modules/cloud/google/test_gce_tag.py metaclass-boilerplate
test/units/modules/cloud/google/test_gcp_forwarding_rule.py future-import-boilerplate
test/units/modules/cloud/google/test_gcp_forwarding_rule.py metaclass-boilerplate
test/units/modules/cloud/google/test_gcp_url_map.py future-import-boilerplate
test/units/modules/cloud/google/test_gcp_url_map.py metaclass-boilerplate
test/units/modules/cloud/kubevirt/test_kubevirt_rs.py future-import-boilerplate
test/units/modules/cloud/kubevirt/test_kubevirt_rs.py metaclass-boilerplate
test/units/modules/cloud/kubevirt/test_kubevirt_vm.py future-import-boilerplate
test/units/modules/cloud/kubevirt/test_kubevirt_vm.py metaclass-boilerplate
test/units/modules/cloud/linode/conftest.py future-import-boilerplate
test/units/modules/cloud/linode/conftest.py metaclass-boilerplate
test/units/modules/cloud/linode/test_linode.py metaclass-boilerplate
test/units/modules/cloud/linode_v4/conftest.py future-import-boilerplate
test/units/modules/cloud/linode_v4/conftest.py metaclass-boilerplate
test/units/modules/cloud/linode_v4/test_linode_v4.py metaclass-boilerplate
test/units/modules/cloud/misc/test_terraform.py future-import-boilerplate
test/units/modules/cloud/misc/test_terraform.py metaclass-boilerplate
test/units/modules/cloud/misc/virt_net/conftest.py future-import-boilerplate
test/units/modules/cloud/misc/virt_net/conftest.py metaclass-boilerplate
test/units/modules/cloud/misc/virt_net/test_virt_net.py future-import-boilerplate
test/units/modules/cloud/misc/virt_net/test_virt_net.py metaclass-boilerplate
test/units/modules/cloud/openstack/test_os_server.py future-import-boilerplate
test/units/modules/cloud/openstack/test_os_server.py metaclass-boilerplate
test/units/modules/cloud/xenserver/FakeAnsibleModule.py future-import-boilerplate
test/units/modules/cloud/xenserver/FakeAnsibleModule.py metaclass-boilerplate
test/units/modules/cloud/xenserver/FakeXenAPI.py future-import-boilerplate
test/units/modules/cloud/xenserver/FakeXenAPI.py metaclass-boilerplate
test/units/modules/conftest.py future-import-boilerplate
test/units/modules/conftest.py metaclass-boilerplate
test/units/modules/files/test_copy.py future-import-boilerplate
test/units/modules/messaging/rabbitmq/test_rabbimq_user.py future-import-boilerplate
test/units/modules/messaging/rabbitmq/test_rabbimq_user.py metaclass-boilerplate
test/units/modules/monitoring/test_circonus_annotation.py future-import-boilerplate
test/units/modules/monitoring/test_circonus_annotation.py metaclass-boilerplate
test/units/modules/monitoring/test_icinga2_feature.py future-import-boilerplate
test/units/modules/monitoring/test_icinga2_feature.py metaclass-boilerplate
test/units/modules/monitoring/test_pagerduty.py future-import-boilerplate
test/units/modules/monitoring/test_pagerduty.py metaclass-boilerplate
test/units/modules/monitoring/test_pagerduty_alert.py future-import-boilerplate
test/units/modules/monitoring/test_pagerduty_alert.py metaclass-boilerplate
test/units/modules/net_tools/test_nmcli.py future-import-boilerplate
test/units/modules/net_tools/test_nmcli.py metaclass-boilerplate
test/units/modules/network/avi/test_avi_user.py future-import-boilerplate
test/units/modules/network/avi/test_avi_user.py metaclass-boilerplate
test/units/modules/network/check_point/test_checkpoint_access_rule.py future-import-boilerplate
test/units/modules/network/check_point/test_checkpoint_access_rule.py metaclass-boilerplate
test/units/modules/network/check_point/test_checkpoint_host.py future-import-boilerplate
test/units/modules/network/check_point/test_checkpoint_host.py metaclass-boilerplate
test/units/modules/network/check_point/test_checkpoint_session.py future-import-boilerplate
test/units/modules/network/check_point/test_checkpoint_session.py metaclass-boilerplate
test/units/modules/network/check_point/test_checkpoint_task_facts.py future-import-boilerplate
test/units/modules/network/check_point/test_checkpoint_task_facts.py metaclass-boilerplate
test/units/modules/network/cloudvision/test_cv_server_provision.py future-import-boilerplate
test/units/modules/network/cloudvision/test_cv_server_provision.py metaclass-boilerplate
test/units/modules/network/cumulus/test_nclu.py future-import-boilerplate
test/units/modules/network/cumulus/test_nclu.py metaclass-boilerplate
test/units/modules/network/ftd/test_ftd_configuration.py future-import-boilerplate
test/units/modules/network/ftd/test_ftd_configuration.py metaclass-boilerplate
test/units/modules/network/ftd/test_ftd_file_download.py future-import-boilerplate
test/units/modules/network/ftd/test_ftd_file_download.py metaclass-boilerplate
test/units/modules/network/ftd/test_ftd_file_upload.py future-import-boilerplate
test/units/modules/network/ftd/test_ftd_file_upload.py metaclass-boilerplate
test/units/modules/network/ftd/test_ftd_install.py future-import-boilerplate
test/units/modules/network/ftd/test_ftd_install.py metaclass-boilerplate
test/units/modules/network/netscaler/netscaler_module.py future-import-boilerplate
test/units/modules/network/netscaler/netscaler_module.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_cs_action.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_cs_action.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_cs_policy.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_cs_policy.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_cs_vserver.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_cs_vserver.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_gslb_service.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_gslb_service.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_gslb_site.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_gslb_site.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_gslb_vserver.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_gslb_vserver.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_lb_monitor.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_lb_monitor.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_lb_vserver.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_lb_vserver.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_module_utils.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_module_utils.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_nitro_request.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_nitro_request.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_save_config.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_save_config.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_server.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_server.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_service.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_service.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_servicegroup.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_servicegroup.py metaclass-boilerplate
test/units/modules/network/netscaler/test_netscaler_ssl_certkey.py future-import-boilerplate
test/units/modules/network/netscaler/test_netscaler_ssl_certkey.py metaclass-boilerplate
test/units/modules/network/nso/nso_module.py metaclass-boilerplate
test/units/modules/network/nso/test_nso_action.py metaclass-boilerplate
test/units/modules/network/nso/test_nso_config.py metaclass-boilerplate
test/units/modules/network/nso/test_nso_query.py metaclass-boilerplate
test/units/modules/network/nso/test_nso_show.py metaclass-boilerplate
test/units/modules/network/nso/test_nso_verify.py metaclass-boilerplate
test/units/modules/network/nuage/nuage_module.py future-import-boilerplate
test/units/modules/network/nuage/nuage_module.py metaclass-boilerplate
test/units/modules/network/nuage/test_nuage_vspk.py future-import-boilerplate
test/units/modules/network/nuage/test_nuage_vspk.py metaclass-boilerplate
test/units/modules/network/nxos/test_nxos_acl_interface.py metaclass-boilerplate
test/units/modules/network/radware/test_vdirect_commit.py future-import-boilerplate
test/units/modules/network/radware/test_vdirect_commit.py metaclass-boilerplate
test/units/modules/network/radware/test_vdirect_file.py future-import-boilerplate
test/units/modules/network/radware/test_vdirect_file.py metaclass-boilerplate
test/units/modules/network/radware/test_vdirect_runnable.py future-import-boilerplate
test/units/modules/network/radware/test_vdirect_runnable.py metaclass-boilerplate
test/units/modules/notification/test_slack.py future-import-boilerplate
test/units/modules/notification/test_slack.py metaclass-boilerplate
test/units/modules/packaging/language/test_gem.py future-import-boilerplate
test/units/modules/packaging/language/test_gem.py metaclass-boilerplate
test/units/modules/packaging/language/test_pip.py future-import-boilerplate
test/units/modules/packaging/language/test_pip.py metaclass-boilerplate
test/units/modules/packaging/os/conftest.py future-import-boilerplate
test/units/modules/packaging/os/conftest.py metaclass-boilerplate
test/units/modules/packaging/os/test_apk.py future-import-boilerplate
test/units/modules/packaging/os/test_apk.py metaclass-boilerplate
test/units/modules/packaging/os/test_apt.py future-import-boilerplate
test/units/modules/packaging/os/test_apt.py metaclass-boilerplate
test/units/modules/packaging/os/test_apt.py pylint:blacklisted-name
test/units/modules/packaging/os/test_rhn_channel.py future-import-boilerplate
test/units/modules/packaging/os/test_rhn_channel.py metaclass-boilerplate
test/units/modules/packaging/os/test_rhn_register.py future-import-boilerplate
test/units/modules/packaging/os/test_rhn_register.py metaclass-boilerplate
test/units/modules/packaging/os/test_yum.py future-import-boilerplate
test/units/modules/packaging/os/test_yum.py metaclass-boilerplate
test/units/modules/remote_management/dellemc/test_ome_device_info.py future-import-boilerplate
test/units/modules/remote_management/dellemc/test_ome_device_info.py metaclass-boilerplate
test/units/modules/remote_management/lxca/test_lxca_cmms.py future-import-boilerplate
test/units/modules/remote_management/lxca/test_lxca_cmms.py metaclass-boilerplate
test/units/modules/remote_management/lxca/test_lxca_nodes.py future-import-boilerplate
test/units/modules/remote_management/lxca/test_lxca_nodes.py metaclass-boilerplate
test/units/modules/remote_management/oneview/conftest.py future-import-boilerplate
test/units/modules/remote_management/oneview/conftest.py metaclass-boilerplate
test/units/modules/remote_management/oneview/hpe_test_utils.py future-import-boilerplate
test/units/modules/remote_management/oneview/hpe_test_utils.py metaclass-boilerplate
test/units/modules/remote_management/oneview/oneview_module_loader.py future-import-boilerplate
test/units/modules/remote_management/oneview/oneview_module_loader.py metaclass-boilerplate
test/units/modules/remote_management/oneview/test_oneview_datacenter_info.py future-import-boilerplate
test/units/modules/remote_management/oneview/test_oneview_datacenter_info.py metaclass-boilerplate
test/units/modules/remote_management/oneview/test_oneview_enclosure_info.py future-import-boilerplate
test/units/modules/remote_management/oneview/test_oneview_enclosure_info.py metaclass-boilerplate
test/units/modules/remote_management/oneview/test_oneview_ethernet_network.py future-import-boilerplate
test/units/modules/remote_management/oneview/test_oneview_ethernet_network.py metaclass-boilerplate
test/units/modules/remote_management/oneview/test_oneview_ethernet_network_info.py future-import-boilerplate
test/units/modules/remote_management/oneview/test_oneview_ethernet_network_info.py metaclass-boilerplate
test/units/modules/remote_management/oneview/test_oneview_fc_network.py future-import-boilerplate
test/units/modules/remote_management/oneview/test_oneview_fc_network.py metaclass-boilerplate
test/units/modules/remote_management/oneview/test_oneview_fc_network_info.py future-import-boilerplate
test/units/modules/remote_management/oneview/test_oneview_fc_network_info.py metaclass-boilerplate
test/units/modules/remote_management/oneview/test_oneview_fcoe_network.py future-import-boilerplate
test/units/modules/remote_management/oneview/test_oneview_fcoe_network.py metaclass-boilerplate
test/units/modules/remote_management/oneview/test_oneview_fcoe_network_info.py future-import-boilerplate
test/units/modules/remote_management/oneview/test_oneview_fcoe_network_info.py metaclass-boilerplate
test/units/modules/remote_management/oneview/test_oneview_logical_interconnect_group.py future-import-boilerplate
test/units/modules/remote_management/oneview/test_oneview_logical_interconnect_group.py metaclass-boilerplate
test/units/modules/remote_management/oneview/test_oneview_logical_interconnect_group_info.py future-import-boilerplate
test/units/modules/remote_management/oneview/test_oneview_logical_interconnect_group_info.py metaclass-boilerplate
test/units/modules/remote_management/oneview/test_oneview_network_set.py future-import-boilerplate
test/units/modules/remote_management/oneview/test_oneview_network_set.py metaclass-boilerplate
test/units/modules/remote_management/oneview/test_oneview_network_set_info.py future-import-boilerplate
test/units/modules/remote_management/oneview/test_oneview_network_set_info.py metaclass-boilerplate
test/units/modules/remote_management/oneview/test_oneview_san_manager.py future-import-boilerplate
test/units/modules/remote_management/oneview/test_oneview_san_manager.py metaclass-boilerplate
test/units/modules/remote_management/oneview/test_oneview_san_manager_info.py future-import-boilerplate
test/units/modules/remote_management/oneview/test_oneview_san_manager_info.py metaclass-boilerplate
test/units/modules/source_control/bitbucket/test_bitbucket_access_key.py future-import-boilerplate
test/units/modules/source_control/bitbucket/test_bitbucket_access_key.py metaclass-boilerplate
test/units/modules/source_control/bitbucket/test_bitbucket_pipeline_key_pair.py future-import-boilerplate
test/units/modules/source_control/bitbucket/test_bitbucket_pipeline_key_pair.py metaclass-boilerplate
test/units/modules/source_control/bitbucket/test_bitbucket_pipeline_known_host.py future-import-boilerplate
test/units/modules/source_control/bitbucket/test_bitbucket_pipeline_known_host.py metaclass-boilerplate
test/units/modules/source_control/bitbucket/test_bitbucket_pipeline_variable.py future-import-boilerplate
test/units/modules/source_control/bitbucket/test_bitbucket_pipeline_variable.py metaclass-boilerplate
test/units/modules/source_control/gitlab/gitlab.py future-import-boilerplate
test/units/modules/source_control/gitlab/gitlab.py metaclass-boilerplate
test/units/modules/source_control/gitlab/test_gitlab_deploy_key.py future-import-boilerplate
test/units/modules/source_control/gitlab/test_gitlab_deploy_key.py metaclass-boilerplate
test/units/modules/source_control/gitlab/test_gitlab_group.py future-import-boilerplate
test/units/modules/source_control/gitlab/test_gitlab_group.py metaclass-boilerplate
test/units/modules/source_control/gitlab/test_gitlab_hook.py future-import-boilerplate
test/units/modules/source_control/gitlab/test_gitlab_hook.py metaclass-boilerplate
test/units/modules/source_control/gitlab/test_gitlab_project.py future-import-boilerplate
test/units/modules/source_control/gitlab/test_gitlab_project.py metaclass-boilerplate
test/units/modules/source_control/gitlab/test_gitlab_runner.py future-import-boilerplate
test/units/modules/source_control/gitlab/test_gitlab_runner.py metaclass-boilerplate
test/units/modules/source_control/gitlab/test_gitlab_user.py future-import-boilerplate
test/units/modules/source_control/gitlab/test_gitlab_user.py metaclass-boilerplate
test/units/modules/storage/hpe3par/test_ss_3par_cpg.py future-import-boilerplate
test/units/modules/storage/hpe3par/test_ss_3par_cpg.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_elementsw_cluster_config.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_elementsw_cluster_config.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_elementsw_cluster_snmp.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_elementsw_cluster_snmp.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_elementsw_initiators.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_elementsw_initiators.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_aggregate.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_aggregate.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_autosupport.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_autosupport.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_broadcast_domain.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_broadcast_domain.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_cifs.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_cifs.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_cifs_server.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_cifs_server.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_cluster.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_cluster.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_cluster_peer.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_cluster_peer.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_command.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_command.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_export_policy_rule.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_export_policy_rule.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_firewall_policy.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_firewall_policy.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_flexcache.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_flexcache.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_igroup.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_igroup.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_igroup_initiator.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_igroup_initiator.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_info.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_info.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_interface.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_interface.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_ipspace.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_ipspace.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_job_schedule.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_job_schedule.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_lun_copy.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_lun_copy.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_lun_map.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_lun_map.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_motd.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_motd.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_net_ifgrp.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_net_ifgrp.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_net_port.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_net_port.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_net_routes.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_net_routes.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_net_subnet.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_net_subnet.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_nfs.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_nfs.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_nvme.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_nvme.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_nvme_namespace.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_nvme_namespace.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_nvme_subsystem.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_nvme_subsystem.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_portset.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_portset.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_qos_policy_group.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_qos_policy_group.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_quotas.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_quotas.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_security_key_manager.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_security_key_manager.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_service_processor_network.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_service_processor_network.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_snapmirror.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_snapmirror.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_snapshot.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_snapshot.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_snapshot_policy.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_snapshot_policy.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_software_update.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_software_update.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_svm.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_svm.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_ucadapter.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_ucadapter.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_unix_group.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_unix_group.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_unix_user.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_unix_user.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_user.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_user.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_user_role.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_user_role.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_volume.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_volume.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_volume_clone.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_volume_clone.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_vscan.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_vscan.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_vscan_on_access_policy.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_vscan_on_access_policy.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_vscan_on_demand_task.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_vscan_on_demand_task.py metaclass-boilerplate
test/units/modules/storage/netapp/test_na_ontap_vscan_scanner_pool.py future-import-boilerplate
test/units/modules/storage/netapp/test_na_ontap_vscan_scanner_pool.py metaclass-boilerplate
test/units/modules/storage/netapp/test_netapp.py metaclass-boilerplate
test/units/modules/storage/netapp/test_netapp_e_alerts.py future-import-boilerplate
test/units/modules/storage/netapp/test_netapp_e_asup.py future-import-boilerplate
test/units/modules/storage/netapp/test_netapp_e_auditlog.py future-import-boilerplate
test/units/modules/storage/netapp/test_netapp_e_global.py future-import-boilerplate
test/units/modules/storage/netapp/test_netapp_e_host.py future-import-boilerplate
test/units/modules/storage/netapp/test_netapp_e_iscsi_interface.py future-import-boilerplate
test/units/modules/storage/netapp/test_netapp_e_iscsi_target.py future-import-boilerplate
test/units/modules/storage/netapp/test_netapp_e_ldap.py future-import-boilerplate
test/units/modules/storage/netapp/test_netapp_e_mgmt_interface.py future-import-boilerplate
test/units/modules/storage/netapp/test_netapp_e_syslog.py future-import-boilerplate
test/units/modules/system/interfaces_file/test_interfaces_file.py future-import-boilerplate
test/units/modules/system/interfaces_file/test_interfaces_file.py metaclass-boilerplate
test/units/modules/system/interfaces_file/test_interfaces_file.py pylint:blacklisted-name
test/units/modules/system/test_iptables.py future-import-boilerplate
test/units/modules/system/test_iptables.py metaclass-boilerplate
test/units/modules/system/test_java_keystore.py future-import-boilerplate
test/units/modules/system/test_java_keystore.py metaclass-boilerplate
test/units/modules/system/test_known_hosts.py future-import-boilerplate
test/units/modules/system/test_known_hosts.py metaclass-boilerplate
test/units/modules/system/test_known_hosts.py pylint:ansible-bad-function
test/units/modules/system/test_linux_mountinfo.py future-import-boilerplate
test/units/modules/system/test_linux_mountinfo.py metaclass-boilerplate
test/units/modules/system/test_pamd.py metaclass-boilerplate
test/units/modules/system/test_parted.py future-import-boilerplate
test/units/modules/system/test_systemd.py future-import-boilerplate
test/units/modules/system/test_systemd.py metaclass-boilerplate
test/units/modules/system/test_ufw.py future-import-boilerplate
test/units/modules/system/test_ufw.py metaclass-boilerplate
test/units/modules/utils.py future-import-boilerplate
test/units/modules/utils.py metaclass-boilerplate
test/units/modules/web_infrastructure/test_apache2_module.py future-import-boilerplate
test/units/modules/web_infrastructure/test_apache2_module.py metaclass-boilerplate
test/units/modules/web_infrastructure/test_jenkins_plugin.py future-import-boilerplate
test/units/modules/web_infrastructure/test_jenkins_plugin.py metaclass-boilerplate
test/units/parsing/utils/test_addresses.py future-import-boilerplate
test/units/parsing/utils/test_addresses.py metaclass-boilerplate
test/units/parsing/vault/test_vault.py pylint:blacklisted-name
test/units/playbook/role/test_role.py pylint:blacklisted-name
test/units/playbook/test_attribute.py future-import-boilerplate
test/units/playbook/test_attribute.py metaclass-boilerplate
test/units/playbook/test_conditional.py future-import-boilerplate
test/units/playbook/test_conditional.py metaclass-boilerplate
test/units/plugins/action/test_synchronize.py future-import-boilerplate
test/units/plugins/action/test_synchronize.py metaclass-boilerplate
test/units/plugins/httpapi/test_ftd.py future-import-boilerplate
test/units/plugins/httpapi/test_ftd.py metaclass-boilerplate
test/units/plugins/inventory/test_constructed.py future-import-boilerplate
test/units/plugins/inventory/test_constructed.py metaclass-boilerplate
test/units/plugins/inventory/test_group.py future-import-boilerplate
test/units/plugins/inventory/test_group.py metaclass-boilerplate
test/units/plugins/inventory/test_host.py future-import-boilerplate
test/units/plugins/inventory/test_host.py metaclass-boilerplate
test/units/plugins/loader_fixtures/import_fixture.py future-import-boilerplate
test/units/plugins/shell/test_cmd.py future-import-boilerplate
test/units/plugins/shell/test_cmd.py metaclass-boilerplate
test/units/plugins/shell/test_powershell.py future-import-boilerplate
test/units/plugins/shell/test_powershell.py metaclass-boilerplate
test/units/plugins/test_plugins.py pylint:blacklisted-name
test/units/template/test_templar.py pylint:blacklisted-name
test/units/test_constants.py future-import-boilerplate
test/units/test_context.py future-import-boilerplate
test/units/utils/fixtures/collections/ansible_collections/my_namespace/my_collection/plugins/action/my_action.py future-import-boilerplate
test/units/utils/fixtures/collections/ansible_collections/my_namespace/my_collection/plugins/action/my_action.py metaclass-boilerplate
test/units/utils/fixtures/collections/ansible_collections/my_namespace/my_collection/plugins/module_utils/my_other_util.py future-import-boilerplate
test/units/utils/fixtures/collections/ansible_collections/my_namespace/my_collection/plugins/module_utils/my_other_util.py metaclass-boilerplate
test/units/utils/fixtures/collections/ansible_collections/my_namespace/my_collection/plugins/module_utils/my_util.py future-import-boilerplate
test/units/utils/fixtures/collections/ansible_collections/my_namespace/my_collection/plugins/module_utils/my_util.py metaclass-boilerplate
test/units/utils/kubevirt_fixtures.py future-import-boilerplate
test/units/utils/kubevirt_fixtures.py metaclass-boilerplate
test/units/utils/test_cleanup_tmp_file.py future-import-boilerplate
test/units/utils/test_encrypt.py future-import-boilerplate
test/units/utils/test_encrypt.py metaclass-boilerplate
test/units/utils/test_helpers.py future-import-boilerplate
test/units/utils/test_helpers.py metaclass-boilerplate
test/units/utils/test_shlex.py future-import-boilerplate
test/units/utils/test_shlex.py metaclass-boilerplate
test/utils/shippable/check_matrix.py replace-urlopen
test/utils/shippable/timing.py shebang
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,278 |
win_credential unable to use the wildcard character
|
##### SUMMARY
When using the win_credential module, credential names that use a wildcard domain suffix (e.g. `*.domain.com`) fail with "The parameter is incorrect"
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
win_credential
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```
ansible 2.9.4
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/home/zinkj/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.17 (default, Nov 7 2019, 10:07:09) [GCC 7.4.0]
```
##### CONFIGURATION
(Empty)
##### OS / ENVIRONMENT
Target W10 Enterprise 1909
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```
- name: Set Credential Manager for Domain
become: yes
vars:
ansible_become_pass: '{{ dev_services_password }}'
ansible_become_user: '{{ dev_services_user }}'
win_credential:
name: '*.domain.com'
type: domain_password
username: '{{ dev_services_user }}'
secret: '{{ dev_services_password }}'
persistence: enterprise
state: present
```
##### EXPECTED RESULTS
Windows Credential Manager entry added for '*.domain.com'
##### ACTUAL RESULTS
```
null: TASK [bootstrap : Set Credential Manager for Domain] ***************************
null: task path: /mnt/c/regfarm/framework/revitfarm/setup/node/ansible/roles/bootstrap/tasks/bootstrap_tasks.yml:1
null: Monday 10 February 2020 10:01:08 -0500 (0:00:11.313) 0:00:11.393 *******
null: Monday 10 February 2020 10:01:08 -0500 (0:00:11.313) 0:00:11.392 *******
null: Using module file /usr/lib/python2.7/dist-packages/ansible/modules/windows/win_credential.ps1
null: Pipelining is enabled.
null: <127.0.0.1> ESTABLISH SSH CONNECTION FOR USER: <sensitive>
null: <127.0.0.1> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o Port=58011 -o 'IdentityFile="/tmp/ansible-key214012297"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey
-o PasswordAuthentication=no -o 'User="<sensitive>"' -o ConnectTimeout=1800 -o ControlPath=/home/zinkj/.ansible/cp/5138928487 127.0.0.1 'chcp.com 65001 > $null ; PowerShell -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -EncodedCommand UABvAHcAZQByAFMAaABlAGwAbAAgAC0ATgBvAFAAcgBvAGYAaQBsAGUAIAAtAE4AbwBuAEkAbgB0AGUAcgBhAGMAdABpAHYAZQAgAC0ARQB4AGUAYwB1AHQAaQBvAG4AUABvAGwAaQBjAHkAIABVAG4AcgBlAHMAdAByAGkAYwB0AGUAZAAgAC0ARQBuAGMAbwBkAGUAZABDAG8AbQBtAGEAbgBkACAASgBnAEIAagBBAEcAZwBBAFkAdwBCAHcAQQBDADQAQQBZAHcAQgB2AEEARwAwAEEASQBBAEEAMgBBAEQAVQBBAE0AQQBBAHcAQQBEAEUAQQBJAEEAQQArAEEAQwBBAEEASgBBAEIAdQBBAEgAVQBBAGIAQQBCAHMAQQBBAG8AQQBKAEEAQgBsAEEASABnAEEAWgBRAEIAagBBAEYAOABBAGQAdwBCAHkAQQBHAEUAQQBjAEEAQgB3AEEARwBVAEEAYwBnAEIAZgBBAEgATQBBAGQAQQBCAHkAQQBDAEEAQQBQAFEAQQBnAEEAQwBRAEEAYQBRAEIAdQBBAEgAQQBBAGQAUQBCADAAQQBDAEEAQQBmAEEAQQBnAEEARQA4AEEAZABRAEIAMABBAEMAMABBAFUAdwBCADAAQQBIAEkAQQBhAFEAQgB1AEEARwBjAEEAQwBnAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBBAGcAQQBEADAAQQBJAEEAQQBrAEEARwBVAEEAZQBBAEIAbABBAEcATQBBAFgAdwBCADMAQQBIAEkAQQBZAFEAQgB3AEEASABBAEEAWgBRAEIAeQBBAEYAOABBAGMAdwBCADAAQQBIAEkAQQBMAGcAQgBUAEEASABBAEEAYgBBAEIAcABBAEgAUQBBAEsAQQBCAEEAQQBDAGcAQQBJAGcAQgBnAEEARABBAEEAWQBBAEEAdwBBAEcAQQBBAE0AQQBCAGcAQQBEAEEAQQBJAGcAQQBwAEEAQwB3AEEASQBBAEEAeQBBAEMAdwBBAEkAQQBCAGIAQQBGAE0AQQBkAEEAQgB5AEEARwBrAEEAYgBnAEIAbgBBAEYATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBQAEEASABBAEEAZABBAEIAcABBAEcAOABBAGIAZwBCAHoAQQBGADAAQQBPAGcAQQA2AEEARgBJAEEAWgBRAEIAdABBAEcAOABBAGQAZwBCAGwAQQBFAFUAQQBiAFEAQgB3AEEASABRAEEAZQBRAEIARgBBAEcANABBAGQAQQBCAHkAQQBHAGsAQQBaAFEAQgB6AEEAQwBrAEEAQwBnAEIASgBBAEcAWQBBAEkAQQBBAG8AQQBDADAAQQBiAGcAQgB2AEEASABRAEEASQBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBBAHUAQQBFAHcAQQBaAFEAQgB1AEEARwBjAEEAZABBAEIAbwBBAEMAQQBBAEwAUQBCAGwAQQBIAEUAQQBJAEEAQQB5AEEAQwBrAEEASQBBAEIANwBBAEMAQQBBAGQAQQBCAG8AQQBIAEkAQQBiAHcAQgAzAEEAQwBBAEEASQBnAEIAcABBAEcANABBAGQAZwBCAGgAQQBHAHcAQQBhAFEAQgBrAEEAQwBBAEEAYwBBAEIAaABBAEgAawBBAGIAQQBCAHYAQQBHAEUAQQBaAEEAQQBpAEEAQwBBAEEAZgBRAEEASwBBAEYATQBBAFoAUQBCADAAQQBDADAAQQBWAGcAQgBoAEEASABJAEEAYQBRAEIAaABBAEcASQBBAGIAQQBCAGwAQQBDAEEAQQBMAFEAQgBPAEEARwBFAEEAYgBRAEIAbABBAEMAQQBBAGEAZwBCAHoAQQBHADgAQQBiAGcAQgBmAEEASABJAEEAWQBRAEIAMwBBAEMAQQBBAEwAUQBCAFcAQQBHAEUAQQBiAEEAQgAxAEEARwBVAEEASQBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBCAGIAQQBEAEUAQQBYAFEAQQBLAEEAQwBRAEEAWgBRAEIANABBAEcAVQBBAFkAdwBCAGYAQQBIAGMAQQBjAGcAQgBoAEEASABBAEEAYwBBAEIAbABBAEgASQBBAEkAQQBBADkAQQBDAEEAQQBXAHcAQgBUAEEARwBNAEEAYwBnAEIAcABBAEgAQQBBAGQAQQBCAEMAQQBHAHcAQQBiAHcAQgBqAEEARwBzAEEAWABRAEEANgBBAEQAbwBBAFEAdwBCAHkAQQBHAFUAQQBZAFEAQgAwAEEARwBVAEEASwBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBCAGIAQQBEAEEAQQBYAFEAQQBwAEEAQQBvAEEASgBnAEEAawBBAEcAVQBBAGUAQQBCAGwAQQBHAE0AQQBYAHcAQgAzAEEASABJAEEAWQBRAEIAdwBBAEgAQQBBAFoAUQBCAHkAQQBBAD0APQA='
null: <127.0.0.1> (1, '{"exception":"Exception calling \\"Write\\" with \\"1\\" argument(s): \\"CredWriteW(*) failed - The parameter is incorrect (Win32 Error Code 87: 0x00000057)\\"\\nAt line:602 char:13\\r\\n+ $new_credential.Write($false)\\r\\n+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\\n + CategoryInfo : NotSpecified: (:) [], ParentContainsErrorRecordException\\n + FullyQualifiedErrorId : Win32Exception\\r\\n\\r\\nScriptStackTrace:\\r\\nat \\u003cScriptBlock\\u003e, \\u003cNo file\\u003e: line 602\\r\\n","msg":"Unhandled exception while executing module: Exception calling \\"Write\\" with \\"1\\" argument(s): \\"CredWriteW(*) failed - The parameter is incorrect (Win32 Error Code 87: 0x00000057)\\"","failed":true}\r\n', 'OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\r\ndebug1: Reading configuration data
/etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 19: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 104\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\n#< CLIXML\r\n<Objs
Version="1.1.0.1" xmlns="http://schemas.microsoft.com/powershell/2004/04"><Obj S="progress" RefId="0"><TN RefId="0"><T>System.Management.Automation.PSCustomObject</T><T>System.Object</T></TN><MS><I64 N="SourceId">1</I64><PR N="Record"><AV>Preparing modules for first use.</AV><AI>0</AI><Nil /><PI>-1</PI><PC>-1</PC><T>Completed</T><SR>-1</SR><SD> </SD></PR></MS></Obj><S S="Error">#< CLIXML_x000D__x000A_<Objs Version="1.1.0.1" xmlns="http://schemas.microsoft.com/powershell/2004/04"><Obj S="progress" RefId="0"><TN RefId="0"><T>System.Management.Automation.PSCustomObject</T><T>System.Object</T></TN><MS><I64 N="SourceId">1</I64><PR N="Record"><AV>Preparing modules for first use.</AV><AI>0</AI><Nil /><PI>-1</PI><PC>-1</PC><T>Completed</T><SR>-1</SR><SD> </SD></PR></MS></Obj></Objs>_x000D__x000A_</S></Objs>debug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\n')
```
##### ADDITIONAL INFO
Verified on the local machine that the following command works as expected:
```
cmdkey /add:* /user:foo /pass:bar
```
Working around the issue by using win_command via cmdkey, as shown above:
```
- name: Set Credential Manager for Domain
become: yes
vars:
ansible_become_pass: '{{ dev_services_password }}'
ansible_become_user: '{{ dev_services_user }}'
win_command: cmdkey /add:*.domain.com /user:{{ dev_services_domain }}\{{ dev_services_user }} /pass:{{ dev_services_password }}
```
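One caveat with the cmdkey workaround above is that the secret is passed on the command line, so it shows up in task output and verbose logs. A minimal hardening sketch (reusing the same variables from this issue; not part of the original report) adds `no_log` so Ansible suppresses the task result:
```
- name: Set Credential Manager for Domain (cmdkey workaround, output suppressed)
  become: yes
  no_log: true
  vars:
    ansible_become_pass: '{{ dev_services_password }}'
    ansible_become_user: '{{ dev_services_user }}'
  win_command: cmdkey /add:*.domain.com /user:{{ dev_services_domain }}\{{ dev_services_user }} /pass:{{ dev_services_password }}
```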
|
https://github.com/ansible/ansible/issues/67278
|
https://github.com/ansible/ansible/pull/67549
|
650c3c5df32af3f5faf345ce0fdfc49febf83636
|
d7059881a264dcadc16ddbc264f96aa9cbf8020c
| 2020-02-10T15:12:22Z |
python
| 2020-02-18T21:43:04Z |
changelogs/fragments/win_credential-wildcard.yaml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,278 |
win_credential unable to use the wildcard character
|
##### SUMMARY
When using the win_credential module, credential names that use a wildcard domain suffix (e.g. `*.domain.com`) fail with "The parameter is incorrect"
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
win_credential
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```
ansible 2.9.4
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/home/zinkj/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.17 (default, Nov 7 2019, 10:07:09) [GCC 7.4.0]
```
##### CONFIGURATION
(Empty)
##### OS / ENVIRONMENT
Target W10 Enterprise 1909
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```
- name: Set Credential Manager for Domain
become: yes
vars:
ansible_become_pass: '{{ dev_services_password }}'
ansible_become_user: '{{ dev_services_user }}'
win_credential:
name: '*.domain.com'
type: domain_password
username: '{{ dev_services_user }}'
secret: '{{ dev_services_password }}'
persistence: enterprise
state: present
```
##### EXPECTED RESULTS
Windows Credential Manager entry added for '*.domain.com'
##### ACTUAL RESULTS
```
null: TASK [bootstrap : Set Credential Manager for Domain] ***************************
null: task path: /mnt/c/regfarm/framework/revitfarm/setup/node/ansible/roles/bootstrap/tasks/bootstrap_tasks.yml:1
null: Monday 10 February 2020 10:01:08 -0500 (0:00:11.313) 0:00:11.393 *******
null: Monday 10 February 2020 10:01:08 -0500 (0:00:11.313) 0:00:11.392 *******
null: Using module file /usr/lib/python2.7/dist-packages/ansible/modules/windows/win_credential.ps1
null: Pipelining is enabled.
null: <127.0.0.1> ESTABLISH SSH CONNECTION FOR USER: <sensitive>
null: <127.0.0.1> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o Port=58011 -o 'IdentityFile="/tmp/ansible-key214012297"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey
-o PasswordAuthentication=no -o 'User="<sensitive>"' -o ConnectTimeout=1800 -o ControlPath=/home/zinkj/.ansible/cp/5138928487 127.0.0.1 'chcp.com 65001 > $null ; PowerShell -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -EncodedCommand UABvAHcAZQByAFMAaABlAGwAbAAgAC0ATgBvAFAAcgBvAGYAaQBsAGUAIAAtAE4AbwBuAEkAbgB0AGUAcgBhAGMAdABpAHYAZQAgAC0ARQB4AGUAYwB1AHQAaQBvAG4AUABvAGwAaQBjAHkAIABVAG4AcgBlAHMAdAByAGkAYwB0AGUAZAAgAC0ARQBuAGMAbwBkAGUAZABDAG8AbQBtAGEAbgBkACAASgBnAEIAagBBAEcAZwBBAFkAdwBCAHcAQQBDADQAQQBZAHcAQgB2AEEARwAwAEEASQBBAEEAMgBBAEQAVQBBAE0AQQBBAHcAQQBEAEUAQQBJAEEAQQArAEEAQwBBAEEASgBBAEIAdQBBAEgAVQBBAGIAQQBCAHMAQQBBAG8AQQBKAEEAQgBsAEEASABnAEEAWgBRAEIAagBBAEYAOABBAGQAdwBCAHkAQQBHAEUAQQBjAEEAQgB3AEEARwBVAEEAYwBnAEIAZgBBAEgATQBBAGQAQQBCAHkAQQBDAEEAQQBQAFEAQQBnAEEAQwBRAEEAYQBRAEIAdQBBAEgAQQBBAGQAUQBCADAAQQBDAEEAQQBmAEEAQQBnAEEARQA4AEEAZABRAEIAMABBAEMAMABBAFUAdwBCADAAQQBIAEkAQQBhAFEAQgB1AEEARwBjAEEAQwBnAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBBAGcAQQBEADAAQQBJAEEAQQBrAEEARwBVAEEAZQBBAEIAbABBAEcATQBBAFgAdwBCADMAQQBIAEkAQQBZAFEAQgB3AEEASABBAEEAWgBRAEIAeQBBAEYAOABBAGMAdwBCADAAQQBIAEkAQQBMAGcAQgBUAEEASABBAEEAYgBBAEIAcABBAEgAUQBBAEsAQQBCAEEAQQBDAGcAQQBJAGcAQgBnAEEARABBAEEAWQBBAEEAdwBBAEcAQQBBAE0AQQBCAGcAQQBEAEEAQQBJAGcAQQBwAEEAQwB3AEEASQBBAEEAeQBBAEMAdwBBAEkAQQBCAGIAQQBGAE0AQQBkAEEAQgB5AEEARwBrAEEAYgBnAEIAbgBBAEYATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBQAEEASABBAEEAZABBAEIAcABBAEcAOABBAGIAZwBCAHoAQQBGADAAQQBPAGcAQQA2AEEARgBJAEEAWgBRAEIAdABBAEcAOABBAGQAZwBCAGwAQQBFAFUAQQBiAFEAQgB3AEEASABRAEEAZQBRAEIARgBBAEcANABBAGQAQQBCAHkAQQBHAGsAQQBaAFEAQgB6AEEAQwBrAEEAQwBnAEIASgBBAEcAWQBBAEkAQQBBAG8AQQBDADAAQQBiAGcAQgB2AEEASABRAEEASQBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBBAHUAQQBFAHcAQQBaAFEAQgB1AEEARwBjAEEAZABBAEIAbwBBAEMAQQBBAEwAUQBCAGwAQQBIAEUAQQBJAEEAQQB5AEEAQwBrAEEASQBBAEIANwBBAEMAQQBBAGQAQQBCAG8AQQBIAEkAQQBiAHcAQgAzAEEAQwBBAEEASQBnAEIAcABBAEcANABBAGQAZwBCAGgAQQBHAHcAQQBhAFEAQgBrAEEAQwBBAEEAYwBBAEIAaABBAEgAawBBAGIAQQBCAHYAQQBHAEUAQQBaAEEAQQBpAEEAQwBBAEEAZgBRAEEASwBBAEYATQBBAFoAUQBCADAAQQBDADAAQQBWAGcAQgBoAEEASABJAEEAYQBRAEIAaABBAEcASQBBAGIAQQBCAGwAQQBDAEEAQQBMAFEAQgBPAEEARwBFAEEAYgBRAEIAbABBAEMAQQBBAGEAZwBCAHoAQQBHADgAQQBiAGcAQgBmAEEASABJAEEAWQBRAEIAMwBBAEMAQQBBAEwAUQBCAFcAQQBHAEUAQQBiAEEAQgAxAEEARwBVAEEASQBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBCAGIAQQBEAEUAQQBYAFEAQQBLAEEAQwBRAEEAWgBRAEIANABBAEcAVQBBAFkAdwBCAGYAQQBIAGMAQQBjAGcAQgBoAEEASABBAEEAYwBBAEIAbABBAEgASQBBAEkAQQBBADkAQQBDAEEAQQBXAHcAQgBUAEEARwBNAEEAYwBnAEIAcABBAEgAQQBBAGQAQQBCAEMAQQBHAHcAQQBiAHcAQgBqAEEARwBzAEEAWABRAEEANgBBAEQAbwBBAFEAdwBCAHkAQQBHAFUAQQBZAFEAQgAwAEEARwBVAEEASwBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBCAGIAQQBEAEEAQQBYAFEAQQBwAEEAQQBvAEEASgBnAEEAawBBAEcAVQBBAGUAQQBCAGwAQQBHAE0AQQBYAHcAQgAzAEEASABJAEEAWQBRAEIAdwBBAEgAQQBBAFoAUQBCAHkAQQBBAD0APQA='
null: <127.0.0.1> (1, '{"exception":"Exception calling \\"Write\\" with \\"1\\" argument(s): \\"CredWriteW(*) failed - The parameter is incorrect (Win32 Error Code 87: 0x00000057)\\"\\nAt line:602 char:13\\r\\n+ $new_credential.Write($false)\\r\\n+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\\n + CategoryInfo : NotSpecified: (:) [], ParentContainsErrorRecordException\\n + FullyQualifiedErrorId : Win32Exception\\r\\n\\r\\nScriptStackTrace:\\r\\nat \\u003cScriptBlock\\u003e, \\u003cNo file\\u003e: line 602\\r\\n","msg":"Unhandled exception while executing module: Exception calling \\"Write\\" with \\"1\\" argument(s): \\"CredWriteW(*) failed - The parameter is incorrect (Win32 Error Code 87: 0x00000057)\\"","failed":true}\r\n', 'OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\r\ndebug1: Reading configuration data
/etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 19: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 104\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\n#< CLIXML\r\n<Objs
Version="1.1.0.1" xmlns="http://schemas.microsoft.com/powershell/2004/04"><Obj S="progress" RefId="0"><TN RefId="0"><T>System.Management.Automation.PSCustomObject</T><T>System.Object</T></TN><MS><I64 N="SourceId">1</I64><PR N="Record"><AV>Preparing modules for first use.</AV><AI>0</AI><Nil /><PI>-1</PI><PC>-1</PC><T>Completed</T><SR>-1</SR><SD> </SD></PR></MS></Obj><S S="Error">#< CLIXML_x000D__x000A_<Objs Version="1.1.0.1" xmlns="http://schemas.microsoft.com/powershell/2004/04"><Obj S="progress" RefId="0"><TN RefId="0"><T>System.Management.Automation.PSCustomObject</T><T>System.Object</T></TN><MS><I64 N="SourceId">1</I64><PR N="Record"><AV>Preparing modules for first use.</AV><AI>0</AI><Nil /><PI>-1</PI><PC>-1</PC><T>Completed</T><SR>-1</SR><SD> </SD></PR></MS></Obj></Objs>_x000D__x000A_</S></Objs>debug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\n')
```
##### ADDITIONAL INFO
Verified on the local machine that the following command works as expected:
```
cmdkey /add:* /user:foo /pass:bar
```
Working around the issue by using win_command via cmdkey, as shown above:
```
- name: Set Credential Manager for Domain
become: yes
vars:
ansible_become_pass: '{{ dev_services_password }}'
ansible_become_user: '{{ dev_services_user }}'
win_command: cmdkey /add:*.domain.com /user:{{ dev_services_domain }}\{{ dev_services_user }} /pass:{{ dev_services_password }}
```
|
https://github.com/ansible/ansible/issues/67278
|
https://github.com/ansible/ansible/pull/67549
|
650c3c5df32af3f5faf345ce0fdfc49febf83636
|
d7059881a264dcadc16ddbc264f96aa9cbf8020c
| 2020-02-10T15:12:22Z |
python
| 2020-02-18T21:43:04Z |
lib/ansible/modules/windows/win_credential.ps1
|
#!powershell
# Copyright: (c) 2018, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
#AnsibleRequires -CSharpUtil Ansible.Basic
#Requires -Module Ansible.ModuleUtils.AddType
$spec = @{
options = @{
alias = @{ type = "str" }
attributes = @{
type = "list"
elements = "dict"
options = @{
name = @{ type = "str"; required = $true }
data = @{ type = "str" }
data_format = @{ type = "str"; default = "text"; choices = @("base64", "text") }
}
}
comment = @{ type = "str" }
name = @{ type = "str"; required = $true }
persistence = @{ type = "str"; default = "local"; choices = @("enterprise", "local") }
secret = @{ type = "str"; no_log = $true }
secret_format = @{ type = "str"; default = "text"; choices = @("base64", "text") }
state = @{ type = "str"; default = "present"; choices = @("absent", "present") }
type = @{
type = "str"
required = $true
choices = @("domain_password", "domain_certificate", "generic_password", "generic_certificate")
}
update_secret = @{ type = "str"; default = "always"; choices = @("always", "on_create") }
username = @{ type = "str" }
}
required_if = @(
,@("state", "present", @("username"))
)
supports_check_mode = $true
}
$module = [Ansible.Basic.AnsibleModule]::Create($args, $spec)
$alias = $module.Params.alias
$attributes = $module.Params.attributes
$comment = $module.Params.comment
$name = $module.Params.name
$persistence = $module.Params.persistence
$secret = $module.Params.secret
$secret_format = $module.Params.secret_format
$state = $module.Params.state
$type = $module.Params.type
$update_secret = $module.Params.update_secret
$username = $module.Params.username
$module.Diff.before = ""
$module.Diff.after = ""
Add-CSharpType -AnsibleModule $module -References @'
using Microsoft.Win32.SafeHandles;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.ConstrainedExecution;
using System.Runtime.InteropServices;
using System.Text;
namespace Ansible.CredentialManager
{
internal class NativeHelpers
{
[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
public class CREDENTIAL
{
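// Managed mirror of the native CREDENTIAL structure consumed by CredReadW/CredWriteW; the explicit
// operator below copies the unmanaged secret blob, attributes and certificate username into a managed Credential.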
public CredentialFlags Flags;
public CredentialType Type;
[MarshalAs(UnmanagedType.LPWStr)] public string TargetName;
[MarshalAs(UnmanagedType.LPWStr)] public string Comment;
public FILETIME LastWritten;
public UInt32 CredentialBlobSize;
public IntPtr CredentialBlob;
public CredentialPersist Persist;
public UInt32 AttributeCount;
public IntPtr Attributes;
[MarshalAs(UnmanagedType.LPWStr)] public string TargetAlias;
[MarshalAs(UnmanagedType.LPWStr)] public string UserName;
public static explicit operator Credential(CREDENTIAL v)
{
byte[] secret = new byte[(int)v.CredentialBlobSize];
if (v.CredentialBlob != IntPtr.Zero)
Marshal.Copy(v.CredentialBlob, secret, 0, secret.Length);
List<CredentialAttribute> attributes = new List<CredentialAttribute>();
if (v.AttributeCount > 0)
{
CREDENTIAL_ATTRIBUTE[] rawAttributes = new CREDENTIAL_ATTRIBUTE[v.AttributeCount];
Credential.PtrToStructureArray(rawAttributes, v.Attributes);
attributes = rawAttributes.Select(x => (CredentialAttribute)x).ToList();
}
string userName = v.UserName;
if (v.Type == CredentialType.DomainCertificate || v.Type == CredentialType.GenericCertificate)
userName = Credential.UnmarshalCertificateCredential(userName);
return new Credential
{
Type = v.Type,
TargetName = v.TargetName,
Comment = v.Comment,
LastWritten = (DateTimeOffset)v.LastWritten,
Secret = secret,
Persist = v.Persist,
Attributes = attributes,
TargetAlias = v.TargetAlias,
UserName = userName,
Loaded = true,
};
}
}
[StructLayout(LayoutKind.Sequential)]
public struct CREDENTIAL_ATTRIBUTE
{
[MarshalAs(UnmanagedType.LPWStr)] public string Keyword;
public UInt32 Flags; // Set to 0 and is reserved
public UInt32 ValueSize;
public IntPtr Value;
public static explicit operator CredentialAttribute(CREDENTIAL_ATTRIBUTE v)
{
byte[] value = new byte[v.ValueSize];
Marshal.Copy(v.Value, value, 0, (int)v.ValueSize);
return new CredentialAttribute
{
Keyword = v.Keyword,
Flags = v.Flags,
Value = value,
};
}
}
[StructLayout(LayoutKind.Sequential)]
public struct FILETIME
{
internal UInt32 dwLowDateTime;
internal UInt32 dwHighDateTime;
public static implicit operator long(FILETIME v) { return ((long)v.dwHighDateTime << 32) + v.dwLowDateTime; }
public static explicit operator DateTimeOffset(FILETIME v) { return DateTimeOffset.FromFileTime(v); }
public static explicit operator FILETIME(DateTimeOffset v)
{
return new FILETIME()
{
dwLowDateTime = (UInt32)v.ToFileTime(),
dwHighDateTime = (UInt32)(v.ToFileTime() >> 32),
};
}
}
[Flags]
public enum CredentialCreateFlags : uint
{
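// Maps to the native CRED_PRESERVE_CREDENTIAL_BLOB flag accepted by CredWriteW.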
PreserveCredentialBlob = 1,
}
[Flags]
public enum CredentialFlags
{
None = 0,
PromptNow = 2,
UsernameTarget = 4,
}
public enum CredMarshalType : uint
{
CertCredential = 1,
UsernameTargetCredential,
BinaryBlobCredential,
UsernameForPackedCredential,
BinaryBlobForSystem,
}
}
internal class NativeMethods
{
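// P/Invoke declarations for the advapi32.dll Credential Manager API used by this module.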
[DllImport("advapi32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
public static extern bool CredDeleteW(
[MarshalAs(UnmanagedType.LPWStr)] string TargetName,
CredentialType Type,
UInt32 Flags);
[DllImport("advapi32.dll")]
public static extern void CredFree(
IntPtr Buffer);
[DllImport("advapi32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
public static extern bool CredMarshalCredentialW(
NativeHelpers.CredMarshalType CredType,
SafeMemoryBuffer Credential,
out SafeCredentialBuffer MarshaledCredential);
[DllImport("advapi32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
public static extern bool CredReadW(
[MarshalAs(UnmanagedType.LPWStr)] string TargetName,
CredentialType Type,
UInt32 Flags,
out SafeCredentialBuffer Credential);
[DllImport("advapi32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
public static extern bool CredUnmarshalCredentialW(
[MarshalAs(UnmanagedType.LPWStr)] string MarshaledCredential,
out NativeHelpers.CredMarshalType CredType,
out SafeCredentialBuffer Credential);
[DllImport("advapi32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
public static extern bool CredWriteW(
NativeHelpers.CREDENTIAL Credential,
NativeHelpers.CredentialCreateFlags Flags);
}
internal class SafeCredentialBuffer : SafeHandleZeroOrMinusOneIsInvalid
{
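// Safe handle for buffers returned by CredReadW/CredMarshalCredentialW; released via CredFree.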
public SafeCredentialBuffer() : base(true) { }
[ReliabilityContract(Consistency.WillNotCorruptState, Cer.MayFail)]
protected override bool ReleaseHandle()
{
NativeMethods.CredFree(handle);
return true;
}
}
internal class SafeMemoryBuffer : SafeHandleZeroOrMinusOneIsInvalid
{
public SafeMemoryBuffer() : base(true) { }
public SafeMemoryBuffer(int cb) : base(true)
{
base.SetHandle(Marshal.AllocHGlobal(cb));
}
public SafeMemoryBuffer(IntPtr handle) : base(true)
{
base.SetHandle(handle);
}
[ReliabilityContract(Consistency.WillNotCorruptState, Cer.MayFail)]
protected override bool ReleaseHandle()
{
Marshal.FreeHGlobal(handle);
return true;
}
}
public class Win32Exception : System.ComponentModel.Win32Exception
{
private string _exception_msg;
public Win32Exception(string message) : this(Marshal.GetLastWin32Error(), message) { }
public Win32Exception(int errorCode, string message) : base(errorCode)
{
_exception_msg = String.Format("{0} - {1} (Win32 Error Code {2}: 0x{3})", message, base.Message, errorCode, errorCode.ToString("X8"));
}
public override string Message { get { return _exception_msg; } }
public static explicit operator Win32Exception(string message) { return new Win32Exception(message); }
}
public enum CredentialPersist
{
Session = 1,
LocalMachine = 2,
Enterprise = 3,
}
public enum CredentialType
{
Generic = 1,
DomainPassword = 2,
DomainCertificate = 3,
DomainVisiblePassword = 4,
GenericCertificate = 5,
DomainExtended = 6,
Maximum = 7,
MaximumEx = 1007,
}
public class CredentialAttribute
{
public string Keyword;
public UInt32 Flags;
public byte[] Value;
}
public class Credential
{
public CredentialType Type;
public string TargetName;
public string Comment;
public DateTimeOffset LastWritten;
public byte[] Secret;
public CredentialPersist Persist;
public List<CredentialAttribute> Attributes = new List<CredentialAttribute>();
public string TargetAlias;
public string UserName;
// Used to track whether the credential has been loaded into the store or not
public bool Loaded { get; internal set; }
public void Delete()
{
if (!Loaded)
return;
if (!NativeMethods.CredDeleteW(TargetName, Type, 0))
throw new Win32Exception(String.Format("CredDeleteW({0}) failed", TargetName));
Loaded = false;
}
public void Write(bool preserveExisting)
{
string userName = UserName;
// Convert the certificate thumbprint to the string expected
if (Type == CredentialType.DomainCertificate || Type == CredentialType.GenericCertificate)
userName = Credential.MarshalCertificateCredential(userName);
NativeHelpers.CREDENTIAL credential = new NativeHelpers.CREDENTIAL
{
Flags = NativeHelpers.CredentialFlags.None,
Type = Type,
TargetName = TargetName,
Comment = Comment,
LastWritten = new NativeHelpers.FILETIME(),
CredentialBlobSize = (UInt32)(Secret == null ? 0 : Secret.Length),
CredentialBlob = IntPtr.Zero, // Must be allocated and freed outside of this to ensure no memory leaks
Persist = Persist,
AttributeCount = (UInt32)(Attributes.Count),
Attributes = IntPtr.Zero, // Attributes must be allocated and freed outside of this to ensure no memory leaks
TargetAlias = TargetAlias,
UserName = userName,
};
using (SafeMemoryBuffer credentialBlob = new SafeMemoryBuffer((int)credential.CredentialBlobSize))
{
if (Secret != null)
Marshal.Copy(Secret, 0, credentialBlob.DangerousGetHandle(), Secret.Length);
credential.CredentialBlob = credentialBlob.DangerousGetHandle();
// Store the CREDENTIAL_ATTRIBUTE value in a safe memory buffer and make sure we dispose in all cases
List<SafeMemoryBuffer> attributeBuffers = new List<SafeMemoryBuffer>();
try
{
int attributeLength = Attributes.Sum(a => Marshal.SizeOf(typeof(NativeHelpers.CREDENTIAL_ATTRIBUTE)));
byte[] attributeBytes = new byte[attributeLength];
int offset = 0;
foreach (CredentialAttribute attribute in Attributes)
{
SafeMemoryBuffer attributeBuffer = new SafeMemoryBuffer(attribute.Value == null ? 0 : attribute.Value.Length);
attributeBuffers.Add(attributeBuffer);
if (attribute.Value != null)
Marshal.Copy(attribute.Value, 0, attributeBuffer.DangerousGetHandle(), attribute.Value.Length);
NativeHelpers.CREDENTIAL_ATTRIBUTE credentialAttribute = new NativeHelpers.CREDENTIAL_ATTRIBUTE
{
Keyword = attribute.Keyword,
Flags = attribute.Flags,
ValueSize = (UInt32)(attribute.Value == null ? 0 : attribute.Value.Length),
Value = attributeBuffer.DangerousGetHandle(),
};
int attributeStructLength = Marshal.SizeOf(typeof(NativeHelpers.CREDENTIAL_ATTRIBUTE));
byte[] attrBytes = new byte[attributeStructLength];
using (SafeMemoryBuffer tempBuffer = new SafeMemoryBuffer(attributeStructLength))
{
Marshal.StructureToPtr(credentialAttribute, tempBuffer.DangerousGetHandle(), false);
Marshal.Copy(tempBuffer.DangerousGetHandle(), attrBytes, 0, attributeStructLength);
}
Buffer.BlockCopy(attrBytes, 0, attributeBytes, offset, attributeStructLength);
offset += attributeStructLength;
}
using (SafeMemoryBuffer attributes = new SafeMemoryBuffer(attributeBytes.Length))
{
if (attributeBytes.Length != 0)
Marshal.Copy(attributeBytes, 0, attributes.DangerousGetHandle(), attributeBytes.Length);
credential.Attributes = attributes.DangerousGetHandle();
NativeHelpers.CredentialCreateFlags createFlags = 0;
if (preserveExisting)
createFlags |= NativeHelpers.CredentialCreateFlags.PreserveCredentialBlob;
if (!NativeMethods.CredWriteW(credential, createFlags))
throw new Win32Exception(String.Format("CredWriteW({0}) failed", TargetName));
}
}
finally
{
foreach (SafeMemoryBuffer attributeBuffer in attributeBuffers)
attributeBuffer.Dispose();
}
}
Loaded = true;
}
public static Credential GetCredential(string target, CredentialType type)
{
SafeCredentialBuffer buffer;
if (!NativeMethods.CredReadW(target, type, 0, out buffer))
{
int lastErr = Marshal.GetLastWin32Error();
// Not running with Become so cannot manage the user's credentials
if (lastErr == 0x00000520) // ERROR_NO_SUCH_LOGON_SESSION
throw new InvalidOperationException("Failed to access the user's credential store, run the module with become");
else if (lastErr == 0x00000490) // ERROR_NOT_FOUND
return null;
throw new Win32Exception(lastErr, "CredReadW() failed");
}
using (buffer)
{
NativeHelpers.CREDENTIAL credential = (NativeHelpers.CREDENTIAL)Marshal.PtrToStructure(
buffer.DangerousGetHandle(), typeof(NativeHelpers.CREDENTIAL));
return (Credential)credential;
}
}
public static string MarshalCertificateCredential(string thumbprint)
{
// CredWriteW requires the UserName field to be the value of CredMarshalCredentialW() when writing a
// certificate auth. This converts the UserName property to the format required.
// While CERT_CREDENTIAL_INFO is the correct structure, we manually marshal the data in order to
// support different cert hash lengths in the future.
// https://docs.microsoft.com/en-us/windows/desktop/api/wincred/ns-wincred-_cert_credential_info
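// Worked example (editor's illustration): a 40 character SHA-1 thumbprint produces a 24 byte
// buffer - 4 bytes for cbSize (set to 24) followed by the 20 raw hash bytes decoded from the
// hex string below.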
int hexLength = thumbprint.Length;
byte[] credInfo = new byte[sizeof(UInt32) + (hexLength / 2)];
// First field is cbSize which is a UInt32 value denoting the size of the total structure
Array.Copy(BitConverter.GetBytes((UInt32)credInfo.Length), credInfo, sizeof(UInt32));
// Now copy the byte representation of the thumbprint to the rest of the struct bytes
for (int i = 0; i < hexLength; i += 2)
credInfo[sizeof(UInt32) + (i / 2)] = Convert.ToByte(thumbprint.Substring(i, 2), 16);
IntPtr pCredInfo = Marshal.AllocHGlobal(credInfo.Length);
Marshal.Copy(credInfo, 0, pCredInfo, credInfo.Length);
SafeMemoryBuffer pCredential = new SafeMemoryBuffer(pCredInfo);
NativeHelpers.CredMarshalType marshalType = NativeHelpers.CredMarshalType.CertCredential;
using (pCredential)
{
SafeCredentialBuffer marshaledCredential;
if (!NativeMethods.CredMarshalCredentialW(marshalType, pCredential, out marshaledCredential))
throw new Win32Exception("CredMarshalCredentialW() failed");
using (marshaledCredential)
return Marshal.PtrToStringUni(marshaledCredential.DangerousGetHandle());
}
}
public static string UnmarshalCertificateCredential(string value)
{
NativeHelpers.CredMarshalType credType;
SafeCredentialBuffer pCredInfo;
if (!NativeMethods.CredUnmarshalCredentialW(value, out credType, out pCredInfo))
throw new Win32Exception("CredUnmarshalCredentialW() failed");
using (pCredInfo)
{
if (credType != NativeHelpers.CredMarshalType.CertCredential)
throw new InvalidOperationException(String.Format("Expected unmarshalled cred type of CertCredential, received {0}", credType));
byte[] structSizeBytes = new byte[sizeof(UInt32)];
Marshal.Copy(pCredInfo.DangerousGetHandle(), structSizeBytes, 0, sizeof(UInt32));
UInt32 structSize = BitConverter.ToUInt32(structSizeBytes, 0);
byte[] certInfoBytes = new byte[structSize];
Marshal.Copy(pCredInfo.DangerousGetHandle(), certInfoBytes, 0, certInfoBytes.Length);
StringBuilder hex = new StringBuilder((certInfoBytes.Length - sizeof(UInt32)) * 2);
for (int i = 4; i < certInfoBytes.Length; i++)
hex.AppendFormat("{0:x2}", certInfoBytes[i]);
return hex.ToString().ToUpperInvariant();
}
}
internal static void PtrToStructureArray<T>(T[] array, IntPtr ptr)
{
IntPtr ptrOffset = ptr;
for (int i = 0; i < array.Length; i++, ptrOffset = IntPtr.Add(ptrOffset, Marshal.SizeOf(typeof(T))))
array[i] = (T)Marshal.PtrToStructure(ptrOffset, typeof(T));
}
}
}
'@
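# Editor's sketch: the C# source above is compiled into the session elsewhere in the module (the
# loading line sits outside this excerpt). Once compiled, the type is used roughly like the snippet
# below, where the target name 'server01' is a hypothetical placeholder:
#   $cred = [Ansible.CredentialManager.Credential]::GetCredential('server01', [Ansible.CredentialManager.CredentialType]::DomainPassword)
#   if ($null -ne $cred) { $cred.UserName }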
Function ConvertTo-CredentialAttribute {
param($Attributes)
$converted_attributes = [System.Collections.Generic.List`1[Ansible.CredentialManager.CredentialAttribute]]@()
foreach ($attribute in $Attributes) {
$new_attribute = New-Object -TypeName Ansible.CredentialManager.CredentialAttribute
$new_attribute.Keyword = $attribute.name
if ($null -ne $attribute.data) {
if ($attribute.data_format -eq "base64") {
$new_attribute.Value = [System.Convert]::FromBase64String($attribute.data)
} else {
$new_attribute.Value = [System.Text.Encoding]::UTF8.GetBytes($attribute.data)
}
}
$converted_attributes.Add($new_attribute) > $null
}
return ,$converted_attributes
}
Function Get-DiffInfo {
param($AnsibleCredential)
$diff = @{
alias = $AnsibleCredential.TargetAlias
attributes = [System.Collections.ArrayList]@()
comment = $AnsibleCredential.Comment
name = $AnsibleCredential.TargetName
persistence = $AnsibleCredential.Persist.ToString()
type = $AnsibleCredential.Type.ToString()
username = $AnsibleCredential.UserName
}
foreach ($attribute in $AnsibleCredential.Attributes) {
$attribute_info = @{
name = $attribute.Keyword
data = $null
}
if ($null -ne $attribute.Value) {
$attribute_info.data = [System.Convert]::ToBase64String($attribute.Value)
}
$diff.attributes.Add($attribute_info) > $null
}
return ,$diff
}
# If the username is a certificate thumbprint, verify it's a valid cert in the CurrentUser/Personal store
if ($null -ne $username -and $type -in @("domain_certificate", "generic_certificate")) {
# Ensure the thumbprint is upper case with no spaces or hyphens
$username = $username.ToUpperInvariant().Replace(" ", "").Replace("-", "")
$certificate = Get-Item -Path Cert:\CurrentUser\My\$username -ErrorAction SilentlyContinue
if ($null -eq $certificate) {
$module.FailJson("Failed to find certificate with the thumbprint $username in the CurrentUser\My store")
}
}
# Convert the input secret to a byte array
if ($null -ne $secret) {
if ($secret_format -eq "base64") {
$secret = [System.Convert]::FromBase64String($secret)
} else {
$secret = [System.Text.Encoding]::Unicode.GetBytes($secret)
}
}
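# Editor's note: the non-base64 path above stores the secret as UTF-16-LE bytes
# ([System.Text.Encoding]::Unicode), which is also what the integration tests compare against
# via b64encode(encoding="utf-16-le").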
$persistence = switch ($persistence) {
"local" { [Ansible.CredentialManager.CredentialPersist]::LocalMachine }
"enterprise" { [Ansible.CredentialManager.CredentialPersist]::Enterprise }
}
$type = switch ($type) {
"domain_password" { [Ansible.CredentialManager.CredentialType]::DomainPassword }
"domain_certificate" { [Ansible.CredentialManager.CredentialType]::DomainCertificate }
"generic_password" { [Ansible.CredentialManager.CredentialType]::Generic }
"generic_certificate" { [Ansible.CredentialManager.CredentialType]::GenericCertificate }
}
$existing_credential = [Ansible.CredentialManager.Credential]::GetCredential($name, $type)
if ($null -ne $existing_credential) {
$module.Diff.before = Get-DiffInfo -AnsibleCredential $existing_credential
}
if ($state -eq "absent") {
if ($null -ne $existing_credential) {
if (-not $module.CheckMode) {
$existing_credential.Delete()
}
$module.Result.changed = $true
}
} else {
if ($null -eq $existing_credential) {
$new_credential = New-Object -TypeName Ansible.CredentialManager.Credential
$new_credential.Type = $type
$new_credential.TargetName = $name
$new_credential.Comment = $comment
$new_credential.Secret = $secret
$new_credential.Persist = $persistence
$new_credential.TargetAlias = $alias
$new_credential.UserName = $username
if ($null -ne $attributes) {
$new_credential.Attributes = ConvertTo-CredentialAttribute -Attributes $attributes
}
if (-not $module.CheckMode) {
$new_credential.Write($false)
}
$module.Result.changed = $true
} else {
$changed = $false
$preserve_blob = $false
# make sure we do case comparison for the comment
if ($existing_credential.Comment -cne $comment) {
$existing_credential.Comment = $comment
$changed = $true
}
if ($existing_credential.Persist -ne $persistence) {
$existing_credential.Persist = $persistence
$changed = $true
}
if ($existing_credential.TargetAlias -ne $alias) {
$existing_credential.TargetAlias = $alias
$changed = $true
}
if ($existing_credential.UserName -ne $username) {
$existing_credential.UserName = $username
$changed = $true
}
if ($null -ne $attributes) {
$attribute_changed = $false
$new_attributes = ConvertTo-CredentialAttribute -Attributes $attributes
if ($new_attributes.Count -ne $existing_credential.Attributes.Count) {
$attribute_changed = $true
} else {
for ($i = 0; $i -lt $new_attributes.Count; $i++) {
$new_keyword = $new_attributes[$i].Keyword
$new_value = $new_attributes[$i].Value
if ($null -eq $new_value) {
$new_value = ""
} else {
$new_value = [System.Convert]::ToBase64String($new_value)
}
$existing_keyword = $existing_credential.Attributes[$i].Keyword
$existing_value = $existing_credential.Attributes[$i].Value
if ($null -eq $existing_value) {
$existing_value = ""
} else {
$existing_value = [System.Convert]::ToBase64String($existing_value)
}
if (($new_keyword -cne $existing_keyword) -or ($new_value -ne $existing_value)) {
$attribute_changed = $true
break
}
}
}
if ($attribute_changed) {
$existing_credential.Attributes = $new_attributes
$changed = $true
}
}
if ($null -eq $secret) {
# If we haven't explicitly set a secret, tell Windows to preserve the existing blob
$preserve_blob = $true
$existing_credential.Secret = $null
} elseif ($update_secret -eq "always") {
# We should only set the password if we can't read the existing one or it doesn't match our secret
if ($existing_credential.Secret.Length -eq 0) {
# We cannot read the secret so we don't know if it's the configured secret
$existing_credential.Secret = $secret
$changed = $true
} else {
# We can read the secret so compare with our input
$input_secret_b64 = [System.Convert]::ToBase64String($secret)
$actual_secret_b64 = [System.Convert]::ToBase64String($existing_credential.Secret)
if ($input_secret_b64 -ne $actual_secret_b64) {
$existing_credential.Secret = $secret
$changed = $true
}
}
}
if ($changed -and -not $module.CheckMode) {
$existing_credential.Write($preserve_blob)
}
$module.Result.changed = $changed
}
if ($module.CheckMode) {
# We cannot reliably get the credential in check mode, set it based on the input
$module.Diff.after = @{
alias = $alias
attributes = $attributes
comment = $comment
name = $name
persistence = $persistence.ToString()
type = $type.ToString()
username = $username
}
} else {
# Get a new copy of the credential and use that to set the after diff
$new_credential = [Ansible.CredentialManager.Credential]::GetCredential($name, $type)
$module.Diff.after = Get-DiffInfo -AnsibleCredential $new_credential
}
}
$module.ExitJson()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,278 |
win_credential unable to use the wildcard character
|
##### SUMMARY
When using the win_credential module, credential names that use a wildcard domain suffix (for example *.domain.com) fail with "The parameter is incorrect"
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
win_credential
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```
ansible 2.9.4
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/home/zinkj/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.17 (default, Nov 7 2019, 10:07:09) [GCC 7.4.0]
```
##### CONFIGURATION
(Empty)
##### OS / ENVIRONMENT
Target W10 Enterprise 1909
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```
- name: Set Credential Manager for Domain
become: yes
vars:
ansible_become_pass: '{{ dev_services_password }}'
ansible_become_user: '{{ dev_services_user }}'
win_credential:
name: '*.domain.com'
type: domain_password
username: '{{ dev_services_user }}'
secret: '{{ dev_services_password }}'
persistence: enterprise
state: present
```
##### EXPECTED RESULTS
Windows Credential Manager entry added for '*.domain.com'
##### ACTUAL RESULTS
```
null: TASK [bootstrap : Set Credential Manager for Domain] ***************************
null: task path: /mnt/c/regfarm/framework/revitfarm/setup/node/ansible/roles/bootstrap/tasks/bootstrap_tasks.yml:1
null: Monday 10 February 2020 10:01:08 -0500 (0:00:11.313) 0:00:11.393 *******
null: Monday 10 February 2020 10:01:08 -0500 (0:00:11.313) 0:00:11.392 *******
null: Using module file /usr/lib/python2.7/dist-packages/ansible/modules/windows/win_credential.ps1
null: Pipelining is enabled.
null: <127.0.0.1> ESTABLISH SSH CONNECTION FOR USER: <sensitive>
null: <127.0.0.1> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o Port=58011 -o 'IdentityFile="/tmp/ansible-key214012297"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey
-o PasswordAuthentication=no -o 'User="<sensitive>"' -o ConnectTimeout=1800 -o ControlPath=/home/zinkj/.ansible/cp/5138928487 127.0.0.1 'chcp.com 65001 > $null ; PowerShell -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -EncodedCommand UABvAHcAZQByAFMAaABlAGwAbAAgAC0ATgBvAFAAcgBvAGYAaQBsAGUAIAAtAE4AbwBuAEkAbgB0AGUAcgBhAGMAdABpAHYAZQAgAC0ARQB4AGUAYwB1AHQAaQBvAG4AUABvAGwAaQBjAHkAIABVAG4AcgBlAHMAdAByAGkAYwB0AGUAZAAgAC0ARQBuAGMAbwBkAGUAZABDAG8AbQBtAGEAbgBkACAASgBnAEIAagBBAEcAZwBBAFkAdwBCAHcAQQBDADQAQQBZAHcAQgB2AEEARwAwAEEASQBBAEEAMgBBAEQAVQBBAE0AQQBBAHcAQQBEAEUAQQBJAEEAQQArAEEAQwBBAEEASgBBAEIAdQBBAEgAVQBBAGIAQQBCAHMAQQBBAG8AQQBKAEEAQgBsAEEASABnAEEAWgBRAEIAagBBAEYAOABBAGQAdwBCAHkAQQBHAEUAQQBjAEEAQgB3AEEARwBVAEEAYwBnAEIAZgBBAEgATQBBAGQAQQBCAHkAQQBDAEEAQQBQAFEAQQBnAEEAQwBRAEEAYQBRAEIAdQBBAEgAQQBBAGQAUQBCADAAQQBDAEEAQQBmAEEAQQBnAEEARQA4AEEAZABRAEIAMABBAEMAMABBAFUAdwBCADAAQQBIAEkAQQBhAFEAQgB1AEEARwBjAEEAQwBnAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBBAGcAQQBEADAAQQBJAEEAQQBrAEEARwBVAEEAZQBBAEIAbABBAEcATQBBAFgAdwBCADMAQQBIAEkAQQBZAFEAQgB3AEEASABBAEEAWgBRAEIAeQBBAEYAOABBAGMAdwBCADAAQQBIAEkAQQBMAGcAQgBUAEEASABBAEEAYgBBAEIAcABBAEgAUQBBAEsAQQBCAEEAQQBDAGcAQQBJAGcAQgBnAEEARABBAEEAWQBBAEEAdwBBAEcAQQBBAE0AQQBCAGcAQQBEAEEAQQBJAGcAQQBwAEEAQwB3AEEASQBBAEEAeQBBAEMAdwBBAEkAQQBCAGIAQQBGAE0AQQBkAEEAQgB5AEEARwBrAEEAYgBnAEIAbgBBAEYATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBQAEEASABBAEEAZABBAEIAcABBAEcAOABBAGIAZwBCAHoAQQBGADAAQQBPAGcAQQA2AEEARgBJAEEAWgBRAEIAdABBAEcAOABBAGQAZwBCAGwAQQBFAFUAQQBiAFEAQgB3AEEASABRAEEAZQBRAEIARgBBAEcANABBAGQAQQBCAHkAQQBHAGsAQQBaAFEAQgB6AEEAQwBrAEEAQwBnAEIASgBBAEcAWQBBAEkAQQBBAG8AQQBDADAAQQBiAGcAQgB2AEEASABRAEEASQBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBBAHUAQQBFAHcAQQBaAFEAQgB1AEEARwBjAEEAZABBAEIAbwBBAEMAQQBBAEwAUQBCAGwAQQBIAEUAQQBJAEEAQQB5AEEAQwBrAEEASQBBAEIANwBBAEMAQQBBAGQAQQBCAG8AQQBIAEkAQQBiAHcAQgAzAEEAQwBBAEEASQBnAEIAcABBAEcANABBAGQAZwBCAGgAQQBHAHcAQQBhAFEAQgBrAEEAQwBBAEEAYwBBAEIAaABBAEgAawBBAGIAQQBCAHYAQQBHAEUAQQBaAEEAQQBpAEEAQwBBAEEAZgBRAEEASwBBAEYATQBBAFoAUQBCADAAQQBDADAAQQBWAGcAQgBoAEEASABJAEEAYQBRAEIAaABBAEcASQBBAGIAQQBCAGwAQQBDAEEAQQBMAFEAQgBPAEEARwBFAEEAYgBRAEIAbABBAEMAQQBBAGEAZwBCAHoAQQBHADgAQQBiAGcAQgBmAEEASABJAEEAWQBRAEIAMwBBAEMAQQBBAEwAUQBCAFcAQQBHAEUAQQBiAEEAQgAxAEEARwBVAEEASQBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBCAGIAQQBEAEUAQQBYAFEAQQBLAEEAQwBRAEEAWgBRAEIANABBAEcAVQBBAFkAdwBCAGYAQQBIAGMAQQBjAGcAQgBoAEEASABBAEEAYwBBAEIAbABBAEgASQBBAEkAQQBBADkAQQBDAEEAQQBXAHcAQgBUAEEARwBNAEEAYwBnAEIAcABBAEgAQQBBAGQAQQBCAEMAQQBHAHcAQQBiAHcAQgBqAEEARwBzAEEAWABRAEEANgBBAEQAbwBBAFEAdwBCAHkAQQBHAFUAQQBZAFEAQgAwAEEARwBVAEEASwBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBCAGIAQQBEAEEAQQBYAFEAQQBwAEEAQQBvAEEASgBnAEEAawBBAEcAVQBBAGUAQQBCAGwAQQBHAE0AQQBYAHcAQgAzAEEASABJAEEAWQBRAEIAdwBBAEgAQQBBAFoAUQBCAHkAQQBBAD0APQA='
null: <127.0.0.1> (1, '{"exception":"Exception calling \\"Write\\" with \\"1\\" argument(s): \\"CredWriteW(*) failed - The parameter is incorrect (Win32 Error Code 87: 0x00000057)\\"\\nAt line:602 char:13\\r\\n+ $new_credential.Write($false)\\r\\n+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\\n + CategoryInfo : NotSpecified: (:) [], ParentContainsErrorRecordException\\n + FullyQualifiedErrorId : Win32Exception\\r\\n\\r\\nScriptStackTrace:\\r\\nat \\u003cScriptBlock\\u003e, \\u003cNo file\\u003e: line 602\\r\\n","msg":"Unhandled exception while executing module: Exception calling \\"Write\\" with \\"1\\" argument(s): \\"CredWriteW(*) failed - The parameter is incorrect (Win32 Error Code 87: 0x00000057)\\"","failed":true}\r\n', 'OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\r\ndebug1: Reading configuration data
/etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 19: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 104\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\n#< CLIXML\r\n<Objs
Version="1.1.0.1" xmlns="http://schemas.microsoft.com/powershell/2004/04"><Obj S="progress" RefId="0"><TN RefId="0"><T>System.Management.Automation.PSCustomObject</T><T>System.Object</T></TN><MS><I64 N="SourceId">1</I64><PR N="Record"><AV>Preparing modules for first use.</AV><AI>0</AI><Nil /><PI>-1</PI><PC>-1</PC><T>Completed</T><SR>-1</SR><SD> </SD></PR></MS></Obj><S S="Error">#< CLIXML_x000D__x000A_<Objs Version="1.1.0.1" xmlns="http://schemas.microsoft.com/powershell/2004/04"><Obj S="progress" RefId="0"><TN RefId="0"><T>System.Management.Automation.PSCustomObject</T><T>System.Object</T></TN><MS><I64 N="SourceId">1</I64><PR N="Record"><AV>Preparing modules for first use.</AV><AI>0</AI><Nil /><PI>-1</PI><PC>-1</PC><T>Completed</T><SR>-1</SR><SD> </SD></PR></MS></Obj></Objs>_x000D__x000A_</S></Objs>debug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\n')
```
##### ADDITIONAL INFO
Verified on the local machine that the following command works as expected:
cmdkey /add:* /user:foo /pass:bar
Working around the issue by using win_command to run cmdkey, as above:
```
- name: Set Credential Manager for Domain
become: yes
vars:
ansible_become_pass: '{{ dev_services_password }}'
ansible_become_user: '{{ dev_services_user }}'
win_command: cmdkey /add:*.domain.com /user:{{ dev_services_domain }}\{{ dev_services_user }} /pass:{{ dev_services_password }}
```
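A hedged editorial addition to the workaround above: because cmdkey receives the password on its command line, it may be worth marking the task with no_log so the secret is kept out of task output and logs.
```
- name: Set Credential Manager for Domain
  become: yes
  no_log: yes
  vars:
    ansible_become_pass: '{{ dev_services_password }}'
    ansible_become_user: '{{ dev_services_user }}'
  win_command: cmdkey /add:*.domain.com /user:{{ dev_services_domain }}\{{ dev_services_user }} /pass:{{ dev_services_password }}
```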
|
https://github.com/ansible/ansible/issues/67278
|
https://github.com/ansible/ansible/pull/67549
|
650c3c5df32af3f5faf345ce0fdfc49febf83636
|
d7059881a264dcadc16ddbc264f96aa9cbf8020c
| 2020-02-10T15:12:22Z |
python
| 2020-02-18T21:43:04Z |
test/integration/targets/win_credential/tasks/tests.yml
|
---
- name: fail to run the module without become
win_credential:
name: '{{ test_hostname }}'
type: domain_password
username: DOMAIN\username
secret: password
state: present
register: fail_no_become
failed_when: '"Failed to access the user''s credential store, run the module with become" not in fail_no_become.msg'
- name: create domain user credential (check mode)
win_credential:
name: '{{ test_hostname }}'
type: domain_password
username: DOMAIN\username
secret: password
state: present
register: domain_user_check
check_mode: True
vars: &become_vars
ansible_become: True
ansible_become_method: runas
ansible_become_user: '{{ ansible_user }}'
ansible_become_pass: '{{ ansible_password }}'
- name: get result of create domain user credential (check mode)
test_cred_facts:
name: '{{ test_hostname }}'
type: domain_password
register: domain_user_actual_check
vars: *become_vars
- name: assert create domain user credential (check mode)
assert:
that:
- domain_user_check is changed
- not domain_user_actual_check.exists
- name: create domain user credential
win_credential:
name: '{{ test_hostname }}'
type: domain_password
username: DOMAIN\username
secret: password
state: present
register: domain_user
vars: *become_vars
- name: get result of create domain user credential
test_cred_facts:
name: '{{ test_hostname }}'
type: domain_password
register: domain_user_actual
vars: *become_vars
- name: assert create domain user credential
assert:
that:
- domain_user is changed
- domain_user_actual.exists
- domain_user_actual.alias == None
- domain_user_actual.attributes == []
- domain_user_actual.comment == None
- domain_user_actual.name == test_hostname
- domain_user_actual.persistence == "LocalMachine"
- domain_user_actual.secret == ""
- domain_user_actual.type == "DomainPassword"
- domain_user_actual.username == "DOMAIN\\username"
- name: create domain user credential again always update
win_credential:
name: '{{ test_hostname }}'
type: domain_password
username: DOMAIN\username
secret: password
state: present
register: domain_user_again_always
vars: *become_vars
- name: create domain user credential again on_create
win_credential:
name: '{{ test_hostname }}'
type: domain_password
username: DOMAIN\username
secret: password
update_secret: on_create
state: present
register: domain_user_again_on_create
vars: *become_vars
- name: assert create domain user credential again
assert:
that:
- domain_user_again_always is changed
- not domain_user_again_on_create is changed
- name: update credential (check mode)
win_credential:
name: '{{ test_hostname }}'
type: domain_password
username: DOMAIN\username2
alias: ansible
attributes:
- name: attribute 1
data: attribute 1 value
- name: attribute 2
data: '{{ "attribute 2 value" | b64encode }}'
data_format: base64
comment: Credential comment
persistence: enterprise
state: present
register: update_cred_check
check_mode: True
vars: *become_vars
- name: get result of update credential (check mode)
test_cred_facts:
name: '{{ test_hostname }}'
type: domain_password
register: update_cred_actual_check
vars: *become_vars
- name: assert update credential (check mode)
assert:
that:
- update_cred_check is changed
- update_cred_actual_check.exists
- update_cred_actual_check.alias == None
- update_cred_actual_check.attributes == []
- update_cred_actual_check.comment == None
- update_cred_actual_check.name == test_hostname
- update_cred_actual_check.persistence == "LocalMachine"
- update_cred_actual_check.secret == ""
- update_cred_actual_check.type == "DomainPassword"
- update_cred_actual_check.username == "DOMAIN\\username"
- name: update credential
win_credential:
name: '{{ test_hostname }}'
type: domain_password
username: DOMAIN\username2
alias: ansible
attributes:
- name: attribute 1
data: attribute 1 value
- name: attribute 2
data: '{{ "attribute 2 value" | b64encode }}'
data_format: base64
comment: Credential comment
persistence: enterprise
state: present
register: update_cred
vars: *become_vars
- name: get result of update credential
test_cred_facts:
name: '{{ test_hostname }}'
type: domain_password
register: update_cred_actual
vars: *become_vars
- name: assert update credential
assert:
that:
- update_cred is changed
- update_cred_actual.exists
- update_cred_actual.alias == "ansible"
- update_cred_actual.attributes|count == 2
- update_cred_actual.attributes[0].name == "attribute 1"
- update_cred_actual.attributes[0].data == "attribute 1 value"|b64encode
- update_cred_actual.attributes[1].name == "attribute 2"
- update_cred_actual.attributes[1].data == "attribute 2 value"|b64encode
- update_cred_actual.comment == "Credential comment"
- update_cred_actual.name == test_hostname
- update_cred_actual.persistence == "Enterprise"
- update_cred_actual.secret == ""
- update_cred_actual.type == "DomainPassword"
- update_cred_actual.username == "DOMAIN\\username2"
- name: update credential again
win_credential:
name: '{{ test_hostname }}'
type: domain_password
username: DOMAIN\username2
alias: ansible
attributes:
- name: attribute 1
data: attribute 1 value
- name: attribute 2
data: '{{ "attribute 2 value" | b64encode }}'
data_format: base64
comment: Credential comment
persistence: enterprise
state: present
register: update_cred_again
vars: *become_vars
- name: assert update credential again
assert:
that:
- not update_cred_again is changed
- name: add new attribute
win_credential:
name: '{{ test_hostname }}'
type: domain_password
username: DOMAIN\username2
alias: ansible
attributes:
- name: attribute 1
data: attribute 1 value
- name: attribute 2
data: '{{ "attribute 2 value" | b64encode }}'
data_format: base64
- name: attribute 3
data: attribute 3 value
comment: Credential comment
persistence: enterprise
state: present
register: add_attribute
vars: *become_vars
- name: get result of add new attribute
test_cred_facts:
name: '{{ test_hostname }}'
type: domain_password
register: add_attribute_actual
vars: *become_vars
- name: assert add new attribute
assert:
that:
- add_attribute is changed
- add_attribute_actual.attributes|count == 3
- add_attribute_actual.attributes[0].name == "attribute 1"
- add_attribute_actual.attributes[0].data == "attribute 1 value"|b64encode
- add_attribute_actual.attributes[1].name == "attribute 2"
- add_attribute_actual.attributes[1].data == "attribute 2 value"|b64encode
- add_attribute_actual.attributes[2].name == "attribute 3"
- add_attribute_actual.attributes[2].data == "attribute 3 value"|b64encode
- name: remove attribute
win_credential:
name: '{{ test_hostname }}'
type: domain_password
username: DOMAIN\username2
alias: ansible
attributes:
- name: attribute 1
data: attribute 1 value
- name: attribute 2
data: '{{ "attribute 2 value" | b64encode }}'
data_format: base64
comment: Credential comment
persistence: enterprise
state: present
register: remove_attribute
vars: *become_vars
- name: get result of remove attribute
test_cred_facts:
name: '{{ test_hostname }}'
type: domain_password
register: remove_attribute_actual
vars: *become_vars
- name: assert remove attribute
assert:
that:
- remove_attribute is changed
- remove_attribute_actual.attributes|count == 2
- remove_attribute_actual.attributes[0].name == "attribute 1"
- remove_attribute_actual.attributes[0].data == "attribute 1 value"|b64encode
- remove_attribute_actual.attributes[1].name == "attribute 2"
- remove_attribute_actual.attributes[1].data == "attribute 2 value"|b64encode
- name: edit attribute
win_credential:
name: '{{ test_hostname }}'
type: domain_password
username: DOMAIN\username2
alias: ansible
attributes:
- name: attribute 1
data: attribute 1 value new
- name: attribute 2
data: '{{ "attribute 2 value" | b64encode }}'
data_format: base64
comment: Credential comment
persistence: enterprise
state: present
register: edit_attribute
vars: *become_vars
- name: get result of edit attribute
test_cred_facts:
name: '{{ test_hostname }}'
type: domain_password
register: edit_attribute_actual
vars: *become_vars
- name: assert edit attribute
assert:
that:
- edit_attribute is changed
- edit_attribute_actual.attributes|count == 2
- edit_attribute_actual.attributes[0].name == "attribute 1"
- edit_attribute_actual.attributes[0].data == "attribute 1 value new"|b64encode
- edit_attribute_actual.attributes[1].name == "attribute 2"
- edit_attribute_actual.attributes[1].data == "attribute 2 value"|b64encode
- name: remove credential (check mode)
win_credential:
name: '{{ test_hostname }}'
type: domain_password
state: absent
register: remove_cred_check
check_mode: True
vars: *become_vars
- name: get result of remove credential (check mode)
test_cred_facts:
name: '{{ test_hostname }}'
type: domain_password
register: remove_cred_actual_check
vars: *become_vars
- name: assert remove credential (check mode)
assert:
that:
- remove_cred_check is changed
- remove_cred_actual_check.exists
- name: remove credential
win_credential:
name: '{{ test_hostname }}'
type: domain_password
state: absent
register: remove_cred
vars: *become_vars
- name: get result of remove credential
test_cred_facts:
name: '{{ test_hostname }}'
type: domain_password
register: remove_cred_actual
vars: *become_vars
- name: assert remove credential
assert:
that:
- remove_cred is changed
- not remove_cred_actual.exists
- name: remove credential again
win_credential:
name: '{{ test_hostname }}'
type: domain_password
state: absent
register: remove_cred_again
vars: *become_vars
- name: assert remove credential again
assert:
that:
- not remove_cred_again is changed
- name: create generic password (check mode)
win_credential:
name: '{{ test_hostname }}'
type: generic_password
persistence: enterprise
username: genericuser
secret: genericpass
state: present
register: generic_password_check
check_mode: True
vars: *become_vars
- name: get result of create generic password (check mode)
test_cred_facts:
name: '{{ test_hostname }}'
type: generic_password
register: generic_password_actual_check
vars: *become_vars
- name: assert result of create generic password (check mode)
assert:
that:
- generic_password_check is changed
- not generic_password_actual_check.exists
- name: create generic password
win_credential:
name: '{{ test_hostname }}'
type: generic_password
persistence: enterprise
username: genericuser
secret: genericpass
state: present
register: generic_password
vars: *become_vars
- name: get result of create generic password
test_cred_facts:
name: '{{ test_hostname }}'
type: generic_password
register: generic_password_actual
vars: *become_vars
- name: set encoded password result
set_fact:
encoded_pass: '{{ "genericpass" | string | b64encode(encoding="utf-16-le") }}'
- name: assert create generic password
assert:
that:
- generic_password is changed
- generic_password_actual.exists
- generic_password_actual.alias == None
- generic_password_actual.attributes == []
- generic_password_actual.comment == None
- generic_password_actual.name == test_hostname
- generic_password_actual.persistence == "Enterprise"
- generic_password_actual.secret == encoded_pass
- generic_password_actual.type == "Generic"
- generic_password_actual.username == "genericuser"
- name: create generic password again
win_credential:
name: '{{ test_hostname }}'
type: generic_password
persistence: enterprise
username: genericuser
secret: genericpass
state: present
register: generic_password_again
vars: *become_vars
- name: assert create generic password again
assert:
that:
- not generic_password_again is changed
- name: fail to create certificate cred with invalid thumbprint
win_credential:
name: '{{ test_hostname }}'
type: domain_certificate
username: 00112233445566778899AABBCCDDEEFF00112233
state: present
register: fail_invalid_cert
failed_when: fail_invalid_cert.msg != "Failed to find certificate with the thumbprint 00112233445566778899AABBCCDDEEFF00112233 in the CurrentUser\\My store"
vars: *become_vars
- name: create domain certificate cred (check mode)
win_credential:
name: '{{ test_hostname }}'
type: domain_certificate
username: '{{ cert_thumbprint }}'
state: present
register: domain_cert_check
check_mode: True
vars: *become_vars
- name: get result of create domain certificate cred (check mode)
test_cred_facts:
name: '{{ test_hostname }}'
type: domain_certificate
register: domain_cert_actual_check
vars: *become_vars
- name: assert create domain certificate cred (check mode)
assert:
that:
- domain_cert_check is changed
- not domain_cert_actual_check.exists
- name: create domain certificate cred
win_credential:
name: '{{ test_hostname }}'
type: domain_certificate
username: '{{ cert_thumbprint }}'
state: present
register: domain_cert
vars: *become_vars
- name: get result of create domain certificate cred
test_cred_facts:
name: '{{ test_hostname }}'
type: domain_certificate
register: domain_cert_actual
vars: *become_vars
- name: assert create domain certificate cred
assert:
that:
- domain_cert is changed
- domain_cert_actual.exists
- domain_cert_actual.alias == None
- domain_cert_actual.attributes == []
- domain_cert_actual.comment == None
- domain_cert_actual.name == test_hostname
- domain_cert_actual.persistence == "LocalMachine"
- domain_cert_actual.secret == ""
- domain_cert_actual.type == "DomainCertificate"
- domain_cert_actual.username == cert_thumbprint
- name: create domain certificate cred again
win_credential:
name: '{{ test_hostname }}'
type: domain_certificate
username: '{{ cert_thumbprint }}'
state: present
register: domain_cert_again
vars: *become_vars
- name: assert create domain certificate cred again
assert:
that:
- not domain_cert_again is changed
- name: create generic certificate cred (check mode)
win_credential:
name: '{{ test_hostname }}'
type: generic_certificate
username: '{{ cert_thumbprint }}'
secret: '{{ "pin code" | b64encode }}'
secret_format: base64
state: present
register: generic_cert_check
check_mode: True
vars: *become_vars
- name: get result of create generic certificate cred (check mode)
test_cred_facts:
name: '{{ test_hostname }}'
type: generic_certificate
register: generic_cert_actual_check
vars: *become_vars
- name: assert create generic certificate cred (check mode)
assert:
that:
- generic_cert_check is changed
- not generic_cert_actual_check.exists
- name: create generic certificate cred
win_credential:
name: '{{ test_hostname }}'
type: generic_certificate
username: '{{ cert_thumbprint }}'
secret: '{{ "pin code" | b64encode }}'
secret_format: base64
state: present
register: generic_cert
vars: *become_vars
- name: get result of create generic certificate cred
test_cred_facts:
name: '{{ test_hostname }}'
type: generic_certificate
register: generic_cert_actual
vars: *become_vars
- name: assert create generic certificate cred
assert:
that:
- generic_cert is changed
- generic_cert_actual.exists
- generic_cert_actual.alias == None
- generic_cert_actual.attributes == []
- generic_cert_actual.comment == None
- generic_cert_actual.name == test_hostname
- generic_cert_actual.persistence == "LocalMachine"
- generic_cert_actual.secret == "pin code" | b64encode
- generic_cert_actual.type == "GenericCertificate"
- generic_cert_actual.username == cert_thumbprint
- name: create generic certificate cred again
win_credential:
name: '{{ test_hostname }}'
type: generic_certificate
username: '{{ cert_thumbprint }}'
state: present
register: generic_cert_again
vars: *become_vars
- name: assert create generic certificate cred again
assert:
that:
- not generic_cert_again is changed
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,088 |
zabbix_template dump - omit date field
|
##### SUMMARY
What do you think about adding a module option to omit the date field from a template dump? Currently, if you export a template and copy the content to a file, the task always reports changed=True because the date field of the export is different every time. If we omit the field, it would report the correct change state.
Would add a PR if others like the idea.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
zabbix_template
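Judging by the linked changelog fragment (67302-zabbix_template_info-add-omit_date-field.yml), the feature landed as an omit_date option on the zabbix_template_info module rather than on zabbix_template itself. A minimal sketch of how a date-free dump might look (connection values are placeholders and the surrounding parameters follow the zabbix_template_info documentation as I recall it, so treat them as assumptions):
```
- name: Dump template without the volatile date field
  zabbix_template_info:
    server_url: http://127.0.0.1
    login_user: username
    login_password: password
    template_name: Template
    format: json
    omit_date: yes
  register: template_dump
```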
|
https://github.com/ansible/ansible/issues/67088
|
https://github.com/ansible/ansible/pull/67302
|
cceb517aff5775cf8acf06453998a1120f66efd6
|
8aec05847334449ecff64de8888569ad5ce2e239
| 2020-02-04T14:58:27Z |
python
| 2020-02-19T06:56:40Z |
changelogs/fragments/67302-zabbix_template_info-add-omit_date-field.yml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,088 |
zabbix_template dump - omit date field
|
##### SUMMARY
What do you think about adding a module option to omit the date field from a template dump? Currently, if you export a template and copy the content to a file, the task always reports changed=True because the date field of the export is different every time. If we omit the field, it would report the correct change state.
Would add a PR if others like the idea.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
zabbix_template
|
https://github.com/ansible/ansible/issues/67088
|
https://github.com/ansible/ansible/pull/67302
|
cceb517aff5775cf8acf06453998a1120f66efd6
|
8aec05847334449ecff64de8888569ad5ce2e239
| 2020-02-04T14:58:27Z |
python
| 2020-02-19T06:56:40Z |
lib/ansible/modules/monitoring/zabbix/zabbix_template.py
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
# (c) 2017, sookido
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
ANSIBLE_METADATA = {'metadata_version': '1.1',
'status': ['preview'],
'supported_by': 'community'}
DOCUMENTATION = r'''
---
module: zabbix_template
short_description: Create/update/delete/dump Zabbix template
description:
- This module allows you to create, modify, delete and dump Zabbix templates.
- Multiple templates can be created or modified at once if passing JSON or XML to module.
version_added: "2.5"
author:
- "sookido (@sookido)"
- "Logan Vig (@logan2211)"
- "Dusan Matejka (@D3DeFi)"
requirements:
- "python >= 2.6"
- "zabbix-api >= 0.5.4"
options:
template_name:
description:
- Name of Zabbix template.
- Required when I(template_json) or I(template_xml) are not used.
- Mutually exclusive with I(template_json) and I(template_xml).
required: false
type: str
template_json:
description:
- JSON dump of templates to import.
- Multiple templates can be imported this way.
- Mutually exclusive with I(template_name) and I(template_xml).
required: false
type: json
template_xml:
description:
- XML dump of templates to import.
- Multiple templates can be imported this way.
- You are advised to pass XML structure matching the structure used by your version of Zabbix server.
- Custom XML structure can be imported as long as it is valid, but may not yield consistent idempotent
results on subsequent runs.
- Mutually exclusive with I(template_name) and I(template_json).
required: false
version_added: '2.9'
type: str
template_groups:
description:
- List of host groups to add template to when template is created.
- Replaces the current host groups the template belongs to if the template is already present.
- Required when creating a new template with C(state=present) and I(template_name) is used.
Not required when updating an existing template.
required: false
type: list
elements: str
link_templates:
description:
- List of template names to be linked to the template.
- Templates that are not specified and are linked to the existing template will be only unlinked and not
cleared from the template.
required: false
type: list
elements: str
clear_templates:
description:
- List of template names to be unlinked and cleared from the template.
- This option is ignored if template is being created for the first time.
required: false
type: list
elements: str
macros:
description:
- List of user macros to create for the template.
- Macros that are not specified and are present on the existing template will be replaced.
- See examples on how to pass macros.
required: false
type: list
elements: dict
suboptions:
name:
description:
- Name of the macro.
- Must be specified in {$NAME} format.
type: str
value:
description:
- Value of the macro.
type: str
dump_format:
description:
- Format to use when dumping template with C(state=dump).
- This option is deprecated and will eventually be removed in 2.14.
required: false
choices: [json, xml]
default: "json"
version_added: '2.9'
type: str
state:
description:
- Required state of the template.
- On C(state=present) template will be created/imported or updated depending if it is already present.
- On C(state=dump) template content will get dumped into required format specified in I(dump_format).
- On C(state=absent) template will be deleted.
- The C(state=dump) is deprecated and will eventually be removed in 2.14. The M(zabbix_template_info) module should be used instead.
required: false
choices: [present, absent, dump]
default: "present"
type: str
extends_documentation_fragment:
- zabbix
'''
EXAMPLES = r'''
---
- name: Create a new Zabbix template linked to groups, macros and templates
local_action:
module: zabbix_template
server_url: http://127.0.0.1
login_user: username
login_password: password
template_name: ExampleHost
template_groups:
- Role
- Role2
link_templates:
- Example template1
- Example template2
macros:
- macro: '{$EXAMPLE_MACRO1}'
value: 30000
- macro: '{$EXAMPLE_MACRO2}'
value: 3
- macro: '{$EXAMPLE_MACRO3}'
value: 'Example'
state: present
- name: Unlink and clear templates from the existing Zabbix template
local_action:
module: zabbix_template
server_url: http://127.0.0.1
login_user: username
login_password: password
template_name: ExampleHost
clear_templates:
- Example template3
- Example template4
state: present
- name: Import Zabbix templates from JSON
local_action:
module: zabbix_template
server_url: http://127.0.0.1
login_user: username
login_password: password
template_json: "{{ lookup('file', 'zabbix_apache2.json') }}"
state: present
- name: Import Zabbix templates from XML
local_action:
module: zabbix_template
server_url: http://127.0.0.1
login_user: username
login_password: password
template_xml: "{{ lookup('file', 'zabbix_apache2.xml') }}"
state: present
- name: Import Zabbix template from Ansible dict variable
zabbix_template:
login_user: username
login_password: password
server_url: http://127.0.0.1
template_json:
zabbix_export:
version: '3.2'
templates:
- name: Template for Testing
description: 'Testing template import'
template: Test Template
groups:
- name: Templates
applications:
- name: Test Application
state: present
- name: Configure macros on the existing Zabbix template
local_action:
module: zabbix_template
server_url: http://127.0.0.1
login_user: username
login_password: password
template_name: Template
macros:
- macro: '{$TEST_MACRO}'
value: 'Example'
state: present
- name: Delete Zabbix template
local_action:
module: zabbix_template
server_url: http://127.0.0.1
login_user: username
login_password: password
template_name: Template
state: absent
- name: Dump Zabbix template as JSON
local_action:
module: zabbix_template
server_url: http://127.0.0.1
login_user: username
login_password: password
template_name: Template
state: dump
register: template_dump
- name: Dump Zabbix template as XML
local_action:
module: zabbix_template
server_url: http://127.0.0.1
login_user: username
login_password: password
template_name: Template
dump_format: xml
state: dump
register: template_dump
'''
RETURN = r'''
---
template_json:
description: The JSON dump of the template
returned: when state is dump
type: str
sample: {
"zabbix_export":{
"date":"2017-11-29T16:37:24Z",
"templates":[{
"templates":[],
"description":"",
"httptests":[],
"screens":[],
"applications":[],
"discovery_rules":[],
"groups":[{"name":"Templates"}],
"name":"Test Template",
"items":[],
"macros":[],
"template":"test"
}],
"version":"3.2",
"groups":[{
"name":"Templates"
}]
}
}
template_xml:
description: dump of the template in XML representation
returned: when state is dump and dump_format is xml
type: str
sample: |-
<?xml version="1.0" ?>
<zabbix_export>
<version>4.2</version>
<date>2019-07-12T13:37:26Z</date>
<groups>
<group>
<name>Templates</name>
</group>
</groups>
<templates>
<template>
<template>test</template>
<name>Test Template</name>
<description/>
<groups>
<group>
<name>Templates</name>
</group>
</groups>
<applications/>
<items/>
<discovery_rules/>
<httptests/>
<macros/>
<templates/>
<screens/>
<tags/>
</template>
</templates>
</zabbix_export>
'''
import atexit
import json
import traceback
import xml.etree.ElementTree as ET
from distutils.version import LooseVersion
from ansible.module_utils.basic import AnsibleModule, missing_required_lib
from ansible.module_utils._text import to_native
try:
from zabbix_api import ZabbixAPI, ZabbixAPIException
HAS_ZABBIX_API = True
except ImportError:
ZBX_IMP_ERR = traceback.format_exc()
HAS_ZABBIX_API = False
class Template(object):
def __init__(self, module, zbx):
self._module = module
self._zapi = zbx
# check if host group exists
def check_host_group_exist(self, group_names):
for group_name in group_names:
result = self._zapi.hostgroup.get({'filter': {'name': group_name}})
if not result:
self._module.fail_json(msg="Hostgroup not found: %s" %
group_name)
return True
# get group ids by group names
def get_group_ids_by_group_names(self, group_names):
group_ids = []
if group_names is None or len(group_names) == 0:
return group_ids
if self.check_host_group_exist(group_names):
group_list = self._zapi.hostgroup.get(
{'output': 'extend',
'filter': {'name': group_names}})
for group in group_list:
group_id = group['groupid']
group_ids.append({'groupid': group_id})
return group_ids
def get_template_ids(self, template_list):
template_ids = []
if template_list is None or len(template_list) == 0:
return template_ids
for template in template_list:
template_list = self._zapi.template.get(
{'output': 'extend',
'filter': {'host': template}})
if len(template_list) < 1:
continue
else:
template_id = template_list[0]['templateid']
template_ids.append(template_id)
return template_ids
def add_template(self, template_name, group_ids, link_template_ids, macros):
if self._module.check_mode:
self._module.exit_json(changed=True)
self._zapi.template.create({'host': template_name, 'groups': group_ids, 'templates': link_template_ids,
'macros': macros})
def check_template_changed(self, template_ids, template_groups, link_templates, clear_templates,
template_macros, template_content, template_type):
"""Compares template parameters to already existing values if any are found.
template_json - JSON structures are compared as deep sorted dictionaries,
template_xml - XML structures are compared as strings, but filtered and formatted first,
If neither of the above is used, all the other arguments are compared to their existing counterparts
retrieved from Zabbix API."""
changed = False
# Compare filtered and formatted XMLs strings for any changes. It is expected that provided
# XML has same structure as Zabbix uses (e.g. it was optimally exported via Zabbix GUI or API)
if template_content is not None and template_type == 'xml':
existing_template = self.dump_template(template_ids, template_type='xml')
if self.filter_xml_template(template_content) != self.filter_xml_template(existing_template):
changed = True
return changed
existing_template = self.dump_template(template_ids, template_type='json')
# Compare JSON objects as deep sorted python dictionaries
if template_content is not None and template_type == 'json':
parsed_template_json = self.load_json_template(template_content)
if self.diff_template(parsed_template_json, existing_template):
changed = True
return changed
# If neither template_json nor template_xml was used, the user provided all parameters via module options
if template_groups is not None:
existing_groups = [g['name'] for g in existing_template['zabbix_export']['groups']]
if set(template_groups) != set(existing_groups):
changed = True
if 'templates' not in existing_template['zabbix_export']['templates'][0]:
existing_template['zabbix_export']['templates'][0]['templates'] = []
# Check if any new templates would be linked or any existing would be unlinked
exist_child_templates = [t['name'] for t in existing_template['zabbix_export']['templates'][0]['templates']]
if link_templates is not None:
if set(link_templates) != set(exist_child_templates):
changed = True
else:
if set([]) != set(exist_child_templates):
changed = True
# Mark that there will be changes when at least one existing template will be unlinked
if clear_templates is not None:
for t in clear_templates:
if t in exist_child_templates:
changed = True
break
if 'macros' not in existing_template['zabbix_export']['templates'][0]:
existing_template['zabbix_export']['templates'][0]['macros'] = []
if template_macros is not None:
existing_macros = existing_template['zabbix_export']['templates'][0]['macros']
if template_macros != existing_macros:
changed = True
return changed
def update_template(self, template_ids, group_ids, link_template_ids, clear_template_ids, template_macros):
template_changes = {}
if group_ids is not None:
template_changes.update({'groups': group_ids})
if link_template_ids is not None:
template_changes.update({'templates': link_template_ids})
else:
template_changes.update({'templates': []})
if clear_template_ids is not None:
template_changes.update({'templates_clear': clear_template_ids})
if template_macros is not None:
template_changes.update({'macros': template_macros})
if template_changes:
# If we got here we know that only one template was provided via template_name
template_changes.update({'templateid': template_ids[0]})
self._zapi.template.update(template_changes)
def delete_template(self, templateids):
if self._module.check_mode:
self._module.exit_json(changed=True)
self._zapi.template.delete(templateids)
def ordered_json(self, obj):
# Deep sort json dicts for comparison
if isinstance(obj, dict):
return sorted((k, self.ordered_json(v)) for k, v in obj.items())
if isinstance(obj, list):
return sorted(self.ordered_json(x) for x in obj)
else:
return obj
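# Illustration (editor's note): ordered_json({'b': 1, 'a': [2, 1]}) and
# ordered_json({'a': [1, 2], 'b': 1}) both normalise to [('a', [1, 2]), ('b', 1)],
# so neither key order nor list order can produce a spurious diff in diff_template().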
def dump_template(self, template_ids, template_type='json'):
if self._module.check_mode:
self._module.exit_json(changed=True)
try:
dump = self._zapi.configuration.export({'format': template_type, 'options': {'templates': template_ids}})
if template_type == 'xml':
return str(ET.tostring(ET.fromstring(dump.encode('utf-8')), encoding='utf-8').decode('utf-8'))
else:
return self.load_json_template(dump)
except ZabbixAPIException as e:
self._module.fail_json(msg='Unable to export template: %s' % e)
def diff_template(self, template_json_a, template_json_b):
# Compare 2 zabbix templates and return True if they differ.
template_json_a = self.filter_template(template_json_a)
template_json_b = self.filter_template(template_json_b)
if self.ordered_json(template_json_a) == self.ordered_json(template_json_b):
return False
return True
def filter_template(self, template_json):
# Filter the template json to contain only the keys we will update
keep_keys = set(['graphs', 'templates', 'triggers', 'value_maps'])
unwanted_keys = set(template_json['zabbix_export']) - keep_keys
for unwanted_key in unwanted_keys:
del template_json['zabbix_export'][unwanted_key]
# Versions older than 2.4 do not support description field within template
desc_not_supported = False
if LooseVersion(self._zapi.api_version()).version[:2] < LooseVersion('2.4').version:
desc_not_supported = True
# Filter empty attributes from template object to allow accurate comparison
for template in template_json['zabbix_export']['templates']:
for key in list(template.keys()):
if not template[key] or (key == 'description' and desc_not_supported):
template.pop(key)
return template_json
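# Illustration (editor's note): given an export like
#   {'zabbix_export': {'date': '...', 'version': '4.2', 'groups': [...], 'templates': [...]}}
# only 'graphs', 'templates', 'triggers' and 'value_maps' survive the filtering above,
# so volatile fields such as 'date' or 'version' never trigger a change on their own.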
def filter_xml_template(self, template_xml):
"""Filters out keys from XML template that may wary between exports (e.g date or version) and
keys that are not imported via this module.
It is advised that provided XML template exactly matches XML structure used by Zabbix"""
# Strip last new line and convert string to ElementTree
parsed_xml_root = self.load_xml_template(template_xml.strip())
keep_keys = ['graphs', 'templates', 'triggers', 'value_maps']
# Remove unwanted XML nodes
for node in list(parsed_xml_root):
if node.tag not in keep_keys:
parsed_xml_root.remove(node)
# Filter empty attributes from template objects to allow accurate comparison
for template in list(parsed_xml_root.find('templates')):
for element in list(template):
if element.text is None and len(list(element)) == 0:
template.remove(element)
# Filter new lines and indentation
xml_root_text = list(line.strip() for line in ET.tostring(parsed_xml_root, encoding='utf8', method='xml').decode().split('\n'))
return ''.join(xml_root_text)
def load_json_template(self, template_json):
try:
return json.loads(template_json)
except ValueError as e:
self._module.fail_json(msg='Invalid JSON provided', details=to_native(e), exception=traceback.format_exc())
def load_xml_template(self, template_xml):
try:
return ET.fromstring(template_xml)
except ET.ParseError as e:
self._module.fail_json(msg='Invalid XML provided', details=to_native(e), exception=traceback.format_exc())
def import_template(self, template_content, template_type='json'):
# rules schema latest version
update_rules = {
'applications': {
'createMissing': True,
'deleteMissing': True
},
'discoveryRules': {
'createMissing': True,
'updateExisting': True,
'deleteMissing': True
},
'graphs': {
'createMissing': True,
'updateExisting': True,
'deleteMissing': True
},
'groups': {
'createMissing': True
},
'httptests': {
'createMissing': True,
'updateExisting': True,
'deleteMissing': True
},
'items': {
'createMissing': True,
'updateExisting': True,
'deleteMissing': True
},
'templates': {
'createMissing': True,
'updateExisting': True
},
'templateLinkage': {
'createMissing': True
},
'templateScreens': {
'createMissing': True,
'updateExisting': True,
'deleteMissing': True
},
'triggers': {
'createMissing': True,
'updateExisting': True,
'deleteMissing': True
},
'valueMaps': {
'createMissing': True,
'updateExisting': True
}
}
try:
# old api version support here
api_version = self._zapi.api_version()
# updateExisting for application removed from zabbix api after 3.2
if LooseVersion(api_version).version[:2] <= LooseVersion('3.2').version:
update_rules['applications']['updateExisting'] = True
# templateLinkage.deleteMissing is only available in the 4.0 branch from 4.0.16 onwards, and in 4.4.4 and later
# it is not available in the 4.2 branch or in 4.0 releases older than 4.0.16
if LooseVersion(api_version).version[:2] == LooseVersion('4.0').version and \
LooseVersion(api_version).version[:3] >= LooseVersion('4.0.16').version:
update_rules['templateLinkage']['deleteMissing'] = True
if LooseVersion(api_version).version[:3] >= LooseVersion('4.4.4').version:
update_rules['templateLinkage']['deleteMissing'] = True
import_data = {'format': template_type, 'source': template_content, 'rules': update_rules}
self._zapi.configuration.import_(import_data)
except ZabbixAPIException as e:
self._module.fail_json(msg='Unable to import template', details=to_native(e),
exception=traceback.format_exc())
def main():
module = AnsibleModule(
argument_spec=dict(
server_url=dict(type='str', required=True, aliases=['url']),
login_user=dict(type='str', required=True),
login_password=dict(type='str', required=True, no_log=True),
http_login_user=dict(type='str', required=False, default=None),
http_login_password=dict(type='str', required=False, default=None, no_log=True),
validate_certs=dict(type='bool', required=False, default=True),
template_name=dict(type='str', required=False),
template_json=dict(type='json', required=False),
template_xml=dict(type='str', required=False),
template_groups=dict(type='list', required=False),
link_templates=dict(type='list', required=False),
clear_templates=dict(type='list', required=False),
macros=dict(type='list', required=False),
dump_format=dict(type='str', required=False, default='json', choices=['json', 'xml']),
state=dict(type='str', default="present", choices=['present', 'absent', 'dump']),
timeout=dict(type='int', default=10)
),
required_one_of=[
['template_name', 'template_json', 'template_xml']
],
mutually_exclusive=[
['template_name', 'template_json', 'template_xml']
],
required_if=[
['state', 'absent', ['template_name']],
['state', 'dump', ['template_name']]
],
supports_check_mode=True
)
if not HAS_ZABBIX_API:
module.fail_json(msg=missing_required_lib('zabbix-api', url='https://pypi.org/project/zabbix-api/'), exception=ZBX_IMP_ERR)
server_url = module.params['server_url']
login_user = module.params['login_user']
login_password = module.params['login_password']
http_login_user = module.params['http_login_user']
http_login_password = module.params['http_login_password']
validate_certs = module.params['validate_certs']
template_name = module.params['template_name']
template_json = module.params['template_json']
template_xml = module.params['template_xml']
template_groups = module.params['template_groups']
link_templates = module.params['link_templates']
clear_templates = module.params['clear_templates']
template_macros = module.params['macros']
dump_format = module.params['dump_format']
state = module.params['state']
timeout = module.params['timeout']
zbx = None
try:
zbx = ZabbixAPI(server_url, timeout=timeout, user=http_login_user, passwd=http_login_password,
validate_certs=validate_certs)
zbx.login(login_user, login_password)
atexit.register(zbx.logout)
except ZabbixAPIException as e:
module.fail_json(msg="Failed to connect to Zabbix server: %s" % e)
template = Template(module, zbx)
# Identify template names for IDs retrieval
# Template names are expected to reside in ['zabbix_export']['templates'][*]['template'] for both data types
template_content, template_type = None, None
if template_json is not None:
template_type = 'json'
template_content = template_json
json_parsed = template.load_json_template(template_content)
template_names = list(t['template'] for t in json_parsed['zabbix_export']['templates'])
elif template_xml is not None:
template_type = 'xml'
template_content = template_xml
xml_parsed = template.load_xml_template(template_content)
template_names = list(t.find('template').text for t in list(xml_parsed.find('templates')))
else:
template_names = [template_name]
template_ids = template.get_template_ids(template_names)
if state == "absent":
if not template_ids:
module.exit_json(changed=False, msg="Template not found. No changed: %s" % template_name)
template.delete_template(template_ids)
module.exit_json(changed=True, result="Successfully deleted template %s" % template_name)
elif state == "dump":
module.deprecate("The 'dump' state has been deprecated and will be removed, use 'zabbix_template_info' module instead.", version='2.14')
if not template_ids:
module.fail_json(msg='Template not found: %s' % template_name)
if dump_format == 'json':
module.exit_json(changed=False, template_json=template.dump_template(template_ids, template_type='json'))
elif dump_format == 'xml':
module.exit_json(changed=False, template_xml=template.dump_template(template_ids, template_type='xml'))
elif state == "present":
# Load all subelements for template that were provided by user
group_ids = None
if template_groups is not None:
group_ids = template.get_group_ids_by_group_names(template_groups)
link_template_ids = None
if link_templates is not None:
link_template_ids = template.get_template_ids(link_templates)
clear_template_ids = None
if clear_templates is not None:
clear_template_ids = template.get_template_ids(clear_templates)
if template_macros is not None:
# Zabbix configuration.export does not differentiate python types (numbers are returned as strings)
for macroitem in template_macros:
for key in macroitem:
macroitem[key] = str(macroitem[key])
if not template_ids:
# Assume new templates are being added when no IDs were found
if template_content is not None:
template.import_template(template_content, template_type)
module.exit_json(changed=True, result="Template import successful")
else:
if group_ids is None:
module.fail_json(msg='template_groups are required when creating a new Zabbix template')
template.add_template(template_name, group_ids, link_template_ids, template_macros)
module.exit_json(changed=True, result="Successfully added template: %s" % template_name)
else:
changed = template.check_template_changed(template_ids, template_groups, link_templates, clear_templates,
template_macros, template_content, template_type)
if module.check_mode:
module.exit_json(changed=changed)
if changed:
if template_type is not None:
template.import_template(template_content, template_type)
else:
template.update_template(template_ids, group_ids, link_template_ids, clear_template_ids,
template_macros)
module.exit_json(changed=changed, result="Template successfully updated")
if __name__ == '__main__':
main()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,088 |
zabbix_template dump - omit date field
|
##### SUMMARY
What do you think about adding a module option to omit the date field in a template dump? Currently, if you export a template and copy the content to a file, the task is always changed=True because the date field of the export is always different. If the field were omitted, the module would report the correct change state.
I would open a PR if others like the idea.
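For illustration only (not part of the original report), here is a minimal sketch of what such an option could boil down to, assuming the JSON layout returned by Zabbix `configuration.export`; the file names are placeholders:
```python
# Hypothetical helper: drop the volatile 'date' key before comparing two exports,
# so a dump only reports a change when the template content really differs.
import json

def strip_export_date(template_json):
    template_json.get('zabbix_export', {}).pop('date', None)
    return template_json

with open('previous_dump.json') as f_old, open('current_dump.json') as f_new:
    old_export = strip_export_date(json.load(f_old))
    new_export = strip_export_date(json.load(f_new))

print(old_export == new_export)  # True when only the export date differed
```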
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
zabbix_template
|
https://github.com/ansible/ansible/issues/67088
|
https://github.com/ansible/ansible/pull/67302
|
cceb517aff5775cf8acf06453998a1120f66efd6
|
8aec05847334449ecff64de8888569ad5ce2e239
| 2020-02-04T14:58:27Z |
python
| 2020-02-19T06:56:40Z |
lib/ansible/modules/monitoring/zabbix/zabbix_template_info.py
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright: (c) 2019, sky-joker
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
ANSIBLE_METADATA = {'metadata_version': '1.1',
'status': ['preview'],
'supported_by': 'community'}
DOCUMENTATION = '''
module: zabbix_template_info
short_description: Gather information about Zabbix template
author:
- sky-joker (@sky-joker)
version_added: '2.10'
description:
- This module allows you to search for a Zabbix template.
requirements:
- "python >= 2.6"
- "zabbix-api >= 0.5.4"
options:
template_name:
description:
- Name of the template in Zabbix.
required: true
type: str
format:
description:
- Format to use when dumping template.
choices: ['json', 'xml']
default: json
type: str
extends_documentation_fragment:
- zabbix
'''
EXAMPLES = '''
- name: Get Zabbix template as JSON
zabbix_template_info:
server_url: "http://zabbix.example.com/zabbix/"
login_user: admin
login_password: secret
template_name: Template
format: json
register: template_json
- name: Get Zabbix template as XML
zabbix_template_info:
server_url: "http://zabbix.example.com/zabbix/"
login_user: admin
login_password: secret
template_name: Template
format: xml
register: template_json
'''
RETURN = '''
---
template_json:
description: The JSON of the template
returned: when format is json
type: str
sample: {
"zabbix_export": {
"version": "4.0",
"date": "2019-10-27T14:17:24Z",
"groups": [
{
"name": "Templates"
}
],
"templates": [
{
"template": "Test Template",
"name": "Template for Testing",
"description": "Testing template import",
"groups": [
{
"name": "Templates"
}
],
"applications": [
{
"name": "Test Application"
}
],
"items": [],
"discovery_rules": [],
"httptests": [],
"macros": [],
"templates": [],
"screens": []
}
]
}
}
template_xml:
description: The XML of the template
returned: when format is xml
type: str
sample: >-
<zabbix_export>
<version>4.0</version>
<date>2019-10-27T14:49:57Z</date>
<groups>
<group>
<name>Templates</name>
</group>
</groups>
<templates>
<template>
<template>Test Template</template>
<name>Template for Testing</name>
<description>Testing template import</description>
<groups>
<group>
<name>Templates</name>
</group>
</groups>
<applications>
<application>
<name>Test Application</name>
</application>
</applications>
<items />
<discovery_rules />
<httptests />
<macros />
<templates />
<screens />
</template>
</templates>
</zabbix_export>
'''
import atexit
import traceback
import json
import xml.etree.ElementTree as ET
try:
from zabbix_api import ZabbixAPI
from zabbix_api import Already_Exists
from zabbix_api import ZabbixAPIException
HAS_ZABBIX_API = True
except ImportError:
ZBX_IMP_ERR = traceback.format_exc()
HAS_ZABBIX_API = False
from ansible.module_utils.basic import AnsibleModule, missing_required_lib
from ansible.module_utils._text import to_native
class TemplateInfo(object):
def __init__(self, module, zbx):
self._module = module
self._zapi = zbx
def get_template_id(self, template_name):
template_id = []
try:
template_list = self._zapi.template.get(
{
'output': 'extend',
'filter': {
'host': template_name
}
}
)
except ZabbixAPIException as e:
self._module.fail_json(msg='Failed to get template: %s' % e)
if template_list:
template_id.append(template_list[0]['templateid'])
return template_id
def load_json_template(self, template_json):
try:
return json.loads(template_json)
except ValueError as e:
self._module.fail_json(msg='Invalid JSON provided', details=to_native(e), exception=traceback.format_exc())
def dump_template(self, template_id, template_type='json'):
try:
dump = self._zapi.configuration.export({'format': template_type, 'options': {'templates': template_id}})
if template_type == 'xml':
return str(ET.tostring(ET.fromstring(dump.encode('utf-8')), encoding='utf-8').decode('utf-8'))
else:
return self.load_json_template(dump)
except ZabbixAPIException as e:
self._module.fail_json(msg='Unable to export template: %s' % e)
def main():
module = AnsibleModule(
argument_spec=dict(
server_url=dict(type='str', required=True, aliases=['url']),
login_user=dict(type='str', required=True),
login_password=dict(type='str', required=True, no_log=True),
http_login_user=dict(type='str', required=False, default=None),
http_login_password=dict(type='str', required=False, default=None, no_log=True),
validate_certs=dict(type='bool', required=False, default=True),
timeout=dict(type='int', default=10),
template_name=dict(type='str', required=True),
format=dict(type='str', choices=['json', 'xml'], default='json')
),
supports_check_mode=False
)
if not HAS_ZABBIX_API:
module.fail_json(msg=missing_required_lib('zabbix-api', url='https://pypi.org/project/zabbix-api/'),
exception=ZBX_IMP_ERR)
server_url = module.params['server_url']
login_user = module.params['login_user']
login_password = module.params['login_password']
http_login_user = module.params['http_login_user']
http_login_password = module.params['http_login_password']
validate_certs = module.params['validate_certs']
timeout = module.params['timeout']
template_name = module.params['template_name']
format = module.params['format']
try:
zbx = ZabbixAPI(server_url, timeout=timeout, user=http_login_user, passwd=http_login_password,
validate_certs=validate_certs)
zbx.login(login_user, login_password)
atexit.register(zbx.logout)
except Exception as e:
module.fail_json(msg="Failed to connect to Zabbix server: %s" % e)
template_info = TemplateInfo(module, zbx)
template_id = template_info.get_template_id(template_name)
if not template_id:
module.fail_json(msg='Template not found: %s' % template_name)
if format == 'json':
module.exit_json(changed=False, template_json=template_info.dump_template(template_id, template_type='json'))
elif format == 'xml':
module.exit_json(changed=False, template_xml=template_info.dump_template(template_id, template_type='xml'))
if __name__ == "__main__":
main()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,100 |
vmware_guest and vmware_guest_disk modules fail to return Storage DRS recommendations when using datastore clusters
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
When passing a `datastore_cluster` name to the `vmware_guest_disk` or `vmware_guest` modules, the datastore recommendation request fails and the modules fall back on the default behavior (iterating through available datastores). This breaks Storage DRS behavior such as "Keep VMDKs Together".
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
vmware_guest_disk
vmware_guest
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.10.0.dev0
config file = /etc/ansible/ansible.cfg
configured module search path = ['/export/home/rr86949e/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /tech/home/rr86949e/debug/ansible/lib/ansible
executable location = /tech/home/rr86949e/debug/ansible/bin/ansible
python version = 3.6.8 (default, Jun 11 2019, 15:15:01) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
DEFAULT_ROLES_PATH(/etc/ansible/ansible.cfg) = ['/etc/ansible/roles', '/usr/share/ansible/roles']
HOST_KEY_CHECKING(/etc/ansible/ansible.cfg) = False
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
RedHat Linux Enterprise Server 7.6
vSphere Client version 6.5.0.23000
pyvmomi 6.7.3
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
- name: Add disks to virtual machine, extend existing disk or remove disk
vmware_guest_disk:
hostname: "{{ lookup('env', 'VMWARE_HOST') }}"
username: "{{ lookup('env', 'VMWARE_USER') }}"
password: "{{ lookup('env', 'VMWARE_PASSWORD') }}"
datacenter: "{{ datacenter_name }}"
validate_certs: no
name: "{{ vm_hostname }}"
disk: "{{ disk_params }}"
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
With Storage DRS enabled and "Keep VMDKs together" checked in vSphere, the recommended datastore for the disk should be the same as the datastore used for the VM's OS disk. If you follow the manual steps through the vSphere UI, you can see the storage recommendation to place any new hard disks on the same datastore in the datastore cluster as the VM's OS disk.
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
The new disk is on a separate datastore in the datastore cluster from that of the VM's OS disk. This does not match behavior through the vSphere UI with the same settings enabled.
##### ADDITIONAL
Using the steps from https://docs.ansible.com/ansible/latest/dev_guide/developing_modules_general.html I was able
to debug this using `pdb` and see the following error messages from executing the line below: https://github.com/ansible/ansible/blob/fe454d27a1aa5386801563ffe8dde44064f84302/lib/ansible/modules/cloud/vmware/vmware_guest_disk.py#L765:
```
-> rec = self.content.storageResourceManager.RecommendDatastores(storageSpec=storage_spec)
(Pdb) n
pyVmomi.VmomiSupport.vmodl.fault.InvalidArgument: (vmodl.fault.InvalidArgument) {
dynamicType = <unset>,
dynamicProperty = (vmodl.DynamicProperty) [],
msg = 'A specified parameter was not correct: configSpec',
faultCause = <unset>,
faultMessage = (vmodl.LocalizableMessage) [],
invalidProperty = 'configSpec'
}
```
I was able to fix it by adding the following lines before the `try` block to update the `storage_spec` object:
```python
if sdrs_status:
# We can get storage recommendation only if SDRS is enabled on given datastorage cluster
pod_sel_spec = vim.storageDrs.PodSelectionSpec()
pod_sel_spec.storagePod = datastore_cluster_obj
storage_spec = vim.storageDrs.StoragePlacementSpec()
storage_spec.podSelectionSpec = pod_sel_spec
storage_spec.type = 'create'
# my changes here
storage_spec.configSpec = self.config_spec
storage_spec.resourcePool = self.vm.resourcePool
# end of my changes
try:
```
It is no longer failing and falling back to default behavior here, but I am still seeing a different recommendation returned from what I would get via the UI. At the very least, I would like to fix this initial issue and have the actual return recommendation looked into as well.
|
https://github.com/ansible/ansible/issues/67100
|
https://github.com/ansible/ansible/pull/67221
|
faa9533734a1ee3f0bb563704b277ffcc3a2423f
|
59289183527940162ae8c59a3746baf664171390
| 2020-02-04T20:17:14Z |
python
| 2020-02-19T13:42:27Z |
changelogs/fragments/67221-vmware-guest-disk-storage-drs-fix.yml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,100 |
vmware_guest and vmware_guest_disk modules fail to return Storage DRS recommendations when using datastore clusters
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
When passing a `datastore_cluster` name to the `vmware_guest_disk` or `vmware_guest` modules, the datastore recommendation request fails and the modules fall back on the default behavior (iterating through available datastores). This breaks Storage DRS behavior such as "Keep VMDKs Together".
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
vmware_guest_disk
vmware_guest
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.10.0.dev0
config file = /etc/ansible/ansible.cfg
configured module search path = ['/export/home/rr86949e/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /tech/home/rr86949e/debug/ansible/lib/ansible
executable location = /tech/home/rr86949e/debug/ansible/bin/ansible
python version = 3.6.8 (default, Jun 11 2019, 15:15:01) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
DEFAULT_ROLES_PATH(/etc/ansible/ansible.cfg) = ['/etc/ansible/roles', '/usr/share/ansible/roles']
HOST_KEY_CHECKING(/etc/ansible/ansible.cfg) = False
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
RedHat Linux Enterprise Server 7.6
vSphere Client version 6.5.0.23000
pyvmomi 6.7.3
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
- name: Add disks to virtual machine, extend existing disk or remove disk
vmware_guest_disk:
hostname: "{{ lookup('env', 'VMWARE_HOST') }}"
username: "{{ lookup('env', 'VMWARE_USER') }}"
password: "{{ lookup('env', 'VMWARE_PASSWORD') }}"
datacenter: "{{ datacenter_name }}"
validate_certs: no
name: "{{ vm_hostname }}"
disk: "{{ disk_params }}"
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
With Storage DRS enabled and "Keep VMDKs together" checked in vSphere, the recommended datastore for the disk should be the same as the datastore used for the VM's OS disk. If you follow the manual steps through the vSphere UI, you can see the storage recommendation to place any new hard disks on the same datastore in the datastore cluster as the VM's OS disk.
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
The new disk is on a separate datastore in the datastore cluster from that of the VM's OS disk. This does not match behavior through the vSphere UI with the same settings enabled.
##### ADDITIONAL
Using the steps from https://docs.ansible.com/ansible/latest/dev_guide/developing_modules_general.html I was able
to debug this using `pdb` and see the following error messages from executing the line below: https://github.com/ansible/ansible/blob/fe454d27a1aa5386801563ffe8dde44064f84302/lib/ansible/modules/cloud/vmware/vmware_guest_disk.py#L765:
```
-> rec = self.content.storageResourceManager.RecommendDatastores(storageSpec=storage_spec)
(Pdb) n
pyVmomi.VmomiSupport.vmodl.fault.InvalidArgument: (vmodl.fault.InvalidArgument) {
dynamicType = <unset>,
dynamicProperty = (vmodl.DynamicProperty) [],
msg = 'A specified parameter was not correct: configSpec',
faultCause = <unset>,
faultMessage = (vmodl.LocalizableMessage) [],
invalidProperty = 'configSpec'
}
```
I was able to fix it by adding the following lines before the `try` block to update the `storage_spec` object:
```python
if sdrs_status:
# We can get storage recommendation only if SDRS is enabled on given datastorage cluster
pod_sel_spec = vim.storageDrs.PodSelectionSpec()
pod_sel_spec.storagePod = datastore_cluster_obj
storage_spec = vim.storageDrs.StoragePlacementSpec()
storage_spec.podSelectionSpec = pod_sel_spec
storage_spec.type = 'create'
# my changes here
storage_spec.configSpec = self.config_spec
storage_spec.resourcePool = self.vm.resourcePool
# end of my changes
try:
```
It is no longer failing and falling back to default behavior here, but I am still seeing a different recommendation returned from what I would get via the UI. At the very least, I would like to fix this initial issue and have the actual return recommendation looked into as well.
|
https://github.com/ansible/ansible/issues/67100
|
https://github.com/ansible/ansible/pull/67221
|
faa9533734a1ee3f0bb563704b277ffcc3a2423f
|
59289183527940162ae8c59a3746baf664171390
| 2020-02-04T20:17:14Z |
python
| 2020-02-19T13:42:27Z |
lib/ansible/modules/cloud/vmware/vmware_guest_disk.py
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright: (c) 2018, Ansible Project
# Copyright: (c) 2018, Abhijeet Kasurde <[email protected]>
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
ANSIBLE_METADATA = {
'metadata_version': '1.1',
'status': ['preview'],
'supported_by': 'community'
}
DOCUMENTATION = '''
---
module: vmware_guest_disk
short_description: Manage disks related to virtual machine in given vCenter infrastructure
description:
- This module can be used to add, remove and update disks belonging to given virtual machine.
- All parameters and VMware object names are case sensitive.
- This module is destructive in nature, please read documentation carefully before proceeding.
- Be careful while removing disk specified as this may lead to data loss.
version_added: 2.8
author:
- Abhijeet Kasurde (@Akasurde) <[email protected]>
notes:
- Tested on vSphere 6.0 and 6.5
requirements:
- "python >= 2.6"
- PyVmomi
options:
name:
description:
- Name of the virtual machine.
- This is a required parameter, if parameter C(uuid) or C(moid) is not supplied.
type: str
uuid:
description:
- UUID of the instance to gather facts if known, this is VMware's unique identifier.
- This is a required parameter, if parameter C(name) or C(moid) is not supplied.
type: str
moid:
description:
- Managed Object ID of the instance to manage if known, this is a unique identifier only within a single vCenter instance.
- This is required if C(name) or C(uuid) is not supplied.
version_added: '2.9'
type: str
folder:
description:
- Destination folder, absolute or relative path to find an existing guest.
- This is a required parameter, only if multiple VMs are found with same name.
- The folder should include the datacenter. ESX's datacenter is ha-datacenter
- 'Examples:'
- ' folder: /ha-datacenter/vm'
- ' folder: ha-datacenter/vm'
- ' folder: /datacenter1/vm'
- ' folder: datacenter1/vm'
- ' folder: /datacenter1/vm/folder1'
- ' folder: datacenter1/vm/folder1'
- ' folder: /folder1/datacenter1/vm'
- ' folder: folder1/datacenter1/vm'
- ' folder: /folder1/datacenter1/vm/folder2'
type: str
datacenter:
description:
- The datacenter name to which virtual machine belongs to.
required: True
type: str
use_instance_uuid:
description:
- Whether to use the VMware instance UUID rather than the BIOS UUID.
default: no
type: bool
version_added: '2.8'
disk:
description:
- A list of disks to add.
- The virtual disk related information is provided using this list.
- All values and parameters are case sensitive.
- 'Valid attributes are:'
- ' - C(size[_tb,_gb,_mb,_kb]) (integer): Disk storage size in specified unit.'
- ' If C(size) specified then unit must be specified. There is no space allowed in between size number and unit.'
- ' Only first occurrence in disk element will be considered, even if there are multiple size* parameters available.'
- ' - C(type) (string): Valid values are:'
- ' - C(thin) thin disk'
- ' - C(eagerzeroedthick) eagerzeroedthick disk'
- ' - C(thick) thick disk'
- ' Default: C(thick) thick disk, no eagerzero.'
- ' - C(disk_mode) (string): Type of disk mode. Valid values are:'
- ' - C(persistent) Changes are immediately and permanently written to the virtual disk. This is default.'
- ' - C(independent_persistent) Same as persistent, but not affected by snapshots.'
- ' - C(independent_nonpersistent) Changes to virtual disk are made to a redo log and discarded at power off, but not affected by snapshots.'
- ' - C(datastore) (string): Name of datastore or datastore cluster to be used for the disk.'
- ' - C(autoselect_datastore) (bool): Select the less used datastore. Specify only if C(datastore) is not specified.'
- ' - C(scsi_controller) (integer): SCSI controller number. Valid value range from 0 to 3.'
- ' Only 4 SCSI controllers are allowed per VM.'
- ' Care should be taken while specifying C(scsi_controller) is 0 and C(unit_number) as 0 as this disk may contain OS.'
- ' - C(unit_number) (integer): Disk Unit Number. Valid value range from 0 to 15. Only 15 disks are allowed per SCSI Controller.'
- ' - C(scsi_type) (string): Type of SCSI controller. This value is required only for the first occurrence of SCSI Controller.'
- ' This value is ignored, if SCSI Controller is already present or C(state) is C(absent).'
- ' Valid values are C(buslogic), C(lsilogic), C(lsilogicsas) and C(paravirtual).'
- ' C(paravirtual) is default value for this parameter.'
- ' - C(destroy) (bool): If C(state) is C(absent), make sure the disk file is deleted from the datastore (default C(yes)).'
- ' Added in version 2.10.'
- ' - C(filename) (string): Existing disk image to be used. Filename must already exist on the datastore.'
- ' Specify filename string in C([datastore_name] path/to/file.vmdk) format. Added in version 2.10.'
- ' - C(state) (string): State of disk. This is either "absent" or "present".'
- ' If C(state) is set to C(absent), disk will be removed permanently from virtual machine configuration and from VMware storage.'
- ' If C(state) is set to C(present), disk will be added if not present at given SCSI Controller and Unit Number.'
- ' If C(state) is set to C(present) and disk exists with different size, disk size is increased.'
- ' Reducing disk size is not allowed.'
suboptions:
iolimit:
description:
- Section specifies the shares and limit for storage I/O resource.
suboptions:
limit:
description:
- Section specifies the upper limit for storage I/O utilization that the virtual machine will not exceed, even if there are available resources.
shares:
description:
- Specifies different types of shares user can add for the given disk.
suboptions:
level:
description:
- Specifies different level for the shares section.
- Valid values are low, normal, high, custom.
level_value:
description:
- Custom value when C(level) is set as C(custom).
type: int
type: list
elements: dict
shares:
description:
- Specifies different types of shares user can add for the given disk.
suboptions:
level:
description:
- Specifies the level for the shares section. Valid values are low, normal, high, custom.
type: str
level_value:
description:
- custom value when level is set as custom.
type: int
type: list
elements: dict
default: []
type: list
extends_documentation_fragment: vmware.documentation
'''
EXAMPLES = '''
- name: Add disks to virtual machine using UUID
vmware_guest_disk:
hostname: "{{ vcenter_hostname }}"
username: "{{ vcenter_username }}"
password: "{{ vcenter_password }}"
datacenter: "{{ datacenter_name }}"
validate_certs: no
uuid: 421e4592-c069-924d-ce20-7e7533fab926
disk:
- size_mb: 10
type: thin
datastore: datacluster0
state: present
scsi_controller: 1
unit_number: 1
scsi_type: 'paravirtual'
disk_mode: 'persistent'
- size_gb: 10
type: eagerzeroedthick
state: present
autoselect_datastore: True
scsi_controller: 2
scsi_type: 'buslogic'
unit_number: 12
disk_mode: 'independent_persistent'
- size: 10Gb
type: eagerzeroedthick
state: present
autoselect_datastore: True
scsi_controller: 2
scsi_type: 'buslogic'
unit_number: 1
disk_mode: 'independent_nonpersistent'
- filename: "[datastore1] path/to/existing/disk.vmdk"
delegate_to: localhost
register: disk_facts
- name: Add disks with specified shares to the virtual machine
vmware_guest_disk:
hostname: "{{ vcenter_hostname }}"
username: "{{ vcenter_username }}"
password: "{{ vcenter_password }}"
datacenter: "{{ datacenter_name }}"
validate_certs: no
disk:
- size_gb: 1
type: thin
datastore: datacluster0
state: present
scsi_controller: 1
unit_number: 1
disk_mode: 'independent_persistent'
shares:
level: custom
level_value: 1300
delegate_to: localhost
register: test_custom_shares
- name: create new disk with custom IO limits and shares in IO Limits
vmware_guest_disk:
hostname: "{{ vcenter_hostname }}"
username: "{{ vcenter_username }}"
password: "{{ vcenter_password }}"
datacenter: "{{ datacenter_name }}"
validate_certs: no
disk:
- size_gb: 1
type: thin
datastore: datacluster0
state: present
scsi_controller: 1
unit_number: 1
disk_mode: 'independent_persistent'
iolimit:
limit: 1506
shares:
level: custom
level_value: 1305
delegate_to: localhost
register: test_custom_IoLimit_shares
- name: Remove disks from virtual machine using name
vmware_guest_disk:
hostname: "{{ vcenter_hostname }}"
username: "{{ vcenter_username }}"
password: "{{ vcenter_password }}"
datacenter: "{{ datacenter_name }}"
validate_certs: no
name: VM_225
disk:
- state: absent
scsi_controller: 1
unit_number: 1
delegate_to: localhost
register: disk_facts
- name: Remove disk from virtual machine using moid
vmware_guest_disk:
hostname: "{{ vcenter_hostname }}"
username: "{{ vcenter_username }}"
password: "{{ vcenter_password }}"
datacenter: "{{ datacenter_name }}"
validate_certs: no
moid: vm-42
disk:
- state: absent
scsi_controller: 1
unit_number: 1
delegate_to: localhost
register: disk_facts
- name: Remove disk from virtual machine but keep the VMDK file on the datastore
vmware_guest_disk:
hostname: "{{ vcenter_hostname }}"
username: "{{ vcenter_username }}"
password: "{{ vcenter_password }}"
datacenter: "{{ datacenter_name }}"
validate_certs: no
name: VM_225
disk:
- state: absent
scsi_controller: 1
unit_number: 2
destroy: no
delegate_to: localhost
register: disk_facts
'''
RETURN = """
disk_status:
description: metadata about the virtual machine's disks after managing them
returned: always
type: dict
sample: {
"0": {
"backing_datastore": "datastore2",
"backing_disk_mode": "persistent",
"backing_eagerlyscrub": false,
"backing_filename": "[datastore2] VM_225/VM_225.vmdk",
"backing_thinprovisioned": false,
"backing_writethrough": false,
"capacity_in_bytes": 10485760,
"capacity_in_kb": 10240,
"controller_key": 1000,
"key": 2000,
"label": "Hard disk 1",
"summary": "10,240 KB",
"unit_number": 0
},
}
"""
import re
try:
from pyVmomi import vim
except ImportError:
pass
from ansible.module_utils.basic import AnsibleModule
from ansible.module_utils._text import to_native
from ansible.module_utils.vmware import PyVmomi, vmware_argument_spec, wait_for_task, find_obj, get_all_objs
class PyVmomiHelper(PyVmomi):
def __init__(self, module):
super(PyVmomiHelper, self).__init__(module)
self.desired_disks = self.params['disk'] # Match with vmware_guest parameter
self.vm = None
self.scsi_device_type = dict(lsilogic=vim.vm.device.VirtualLsiLogicController,
paravirtual=vim.vm.device.ParaVirtualSCSIController,
buslogic=vim.vm.device.VirtualBusLogicController,
lsilogicsas=vim.vm.device.VirtualLsiLogicSASController)
self.config_spec = vim.vm.ConfigSpec()
self.config_spec.deviceChange = []
def create_scsi_controller(self, scsi_type, scsi_bus_number):
"""
Create SCSI Controller with given SCSI Type and SCSI Bus Number
Args:
scsi_type: Type of SCSI
scsi_bus_number: SCSI Bus number to be assigned
Returns: Virtual device spec for SCSI Controller
"""
scsi_ctl = vim.vm.device.VirtualDeviceSpec()
scsi_ctl.operation = vim.vm.device.VirtualDeviceSpec.Operation.add
scsi_ctl.device = self.scsi_device_type[scsi_type]()
scsi_ctl.device.unitNumber = 3
scsi_ctl.device.busNumber = scsi_bus_number
scsi_ctl.device.hotAddRemove = True
scsi_ctl.device.sharedBus = 'noSharing'
scsi_ctl.device.scsiCtlrUnitNumber = 7
return scsi_ctl
@staticmethod
def create_scsi_disk(scsi_ctl_key, disk_index, disk_mode, disk_filename):
"""
Create Virtual Device Spec for virtual disk
Args:
scsi_ctl_key: Unique SCSI Controller Key
disk_index: Disk unit number at which disk needs to be attached
disk_mode: Disk mode
disk_filename: Path to the disk file on the datastore
Returns: Virtual Device Spec for virtual disk
"""
disk_spec = vim.vm.device.VirtualDeviceSpec()
disk_spec.operation = vim.vm.device.VirtualDeviceSpec.Operation.add
disk_spec.device = vim.vm.device.VirtualDisk()
disk_spec.device.backing = vim.vm.device.VirtualDisk.FlatVer2BackingInfo()
disk_spec.device.backing.diskMode = disk_mode
disk_spec.device.controllerKey = scsi_ctl_key
disk_spec.device.unitNumber = disk_index
if disk_filename is not None:
disk_spec.device.backing.fileName = disk_filename
else:
disk_spec.fileOperation = vim.vm.device.VirtualDeviceSpec.FileOperation.create
return disk_spec
def reconfigure_vm(self, config_spec, device_type):
"""
Reconfigure virtual machine after modifying device spec
Args:
config_spec: Config Spec
device_type: Type of device being modified
Returns: Boolean status 'changed' and actual task result
"""
changed, results = (False, '')
try:
# Perform actual VM reconfiguration
task = self.vm.ReconfigVM_Task(spec=config_spec)
changed, results = wait_for_task(task)
except vim.fault.InvalidDeviceSpec as invalid_device_spec:
self.module.fail_json(msg="Failed to manage %s on given virtual machine due to invalid"
" device spec : %s" % (device_type, to_native(invalid_device_spec.msg)),
details="Please check ESXi server logs for more details.")
except vim.fault.RestrictedVersion as e:
self.module.fail_json(msg="Failed to reconfigure virtual machine due to"
" product versioning restrictions: %s" % to_native(e.msg))
return changed, results
def get_ioandshares_diskconfig(self, disk_spec, disk):
io_disk_spec = vim.StorageResourceManager.IOAllocationInfo()
if 'iolimit' in disk:
io_disk_spec.limit = disk['iolimit']['limit']
if 'shares' in disk['iolimit']:
shares_spec = vim.SharesInfo()
shares_spec.level = disk['iolimit']['shares']['level']
if shares_spec.level == 'custom':
shares_spec.shares = disk['iolimit']['shares']['level_value']
io_disk_spec.shares = shares_spec
disk_spec.device.storageIOAllocation = io_disk_spec
if 'shares' in disk:
shares_spec = vim.SharesInfo()
shares_spec.level = disk['shares']['level']
if shares_spec.level == 'custom':
shares_spec.shares = disk['shares']['level_value']
io_disk_spec.shares = shares_spec
disk_spec.device.storageIOAllocation = io_disk_spec
return disk_spec
def ensure_disks(self, vm_obj=None):
"""
Manage internal state of virtual machine disks
Args:
vm_obj: Managed object of virtual machine
"""
# Set vm object
self.vm = vm_obj
# Sanitize user input
disk_data = self.sanitize_disk_inputs()
# Create stateful information about SCSI devices
current_scsi_info = dict()
results = dict(changed=False, disk_data=None, disk_changes=dict())
# Deal with SCSI Controller
for device in vm_obj.config.hardware.device:
if isinstance(device, tuple(self.scsi_device_type.values())):
# Found SCSI device
if device.busNumber not in current_scsi_info:
device_bus_number = 1000 + device.busNumber
current_scsi_info[device_bus_number] = dict(disks=dict())
scsi_changed = False
for disk in disk_data:
scsi_controller = disk['scsi_controller'] + 1000
if scsi_controller not in current_scsi_info and disk['state'] == 'present':
scsi_ctl = self.create_scsi_controller(disk['scsi_type'], disk['scsi_controller'])
current_scsi_info[scsi_controller] = dict(disks=dict())
self.config_spec.deviceChange.append(scsi_ctl)
scsi_changed = True
if scsi_changed:
self.reconfigure_vm(self.config_spec, 'SCSI Controller')
self.config_spec = vim.vm.ConfigSpec()
self.config_spec.deviceChange = []
# Deal with Disks
for device in vm_obj.config.hardware.device:
if isinstance(device, vim.vm.device.VirtualDisk):
# Found Virtual Disk device
if device.controllerKey not in current_scsi_info:
current_scsi_info[device.controllerKey] = dict(disks=dict())
current_scsi_info[device.controllerKey]['disks'][device.unitNumber] = device
vm_name = self.vm.name
disk_change_list = []
for disk in disk_data:
disk_change = False
scsi_controller = disk['scsi_controller'] + 1000 # VMware auto assign 1000 + SCSI Controller
if disk['disk_unit_number'] not in current_scsi_info[scsi_controller]['disks'] and disk['state'] == 'present':
# Add new disk
disk_spec = self.create_scsi_disk(scsi_controller, disk['disk_unit_number'], disk['disk_mode'], disk['filename'])
if disk['filename'] is None:
disk_spec.device.capacityInKB = disk['size']
if disk['disk_type'] == 'thin':
disk_spec.device.backing.thinProvisioned = True
elif disk['disk_type'] == 'eagerzeroedthick':
disk_spec.device.backing.eagerlyScrub = True
if disk['filename'] is None:
disk_spec.device.backing.fileName = "[%s] %s/%s_%s_%s.vmdk" % (
disk['datastore'].name,
vm_name, vm_name,
str(scsi_controller),
str(disk['disk_unit_number']))
else:
disk_spec.device.backing.fileName = disk['filename']
disk_spec.device.backing.datastore = disk['datastore']
disk_spec = self.get_ioandshares_diskconfig(disk_spec, disk)
self.config_spec.deviceChange.append(disk_spec)
disk_change = True
current_scsi_info[scsi_controller]['disks'][disk['disk_unit_number']] = disk_spec.device
results['disk_changes'][disk['disk_index']] = "Disk created."
elif disk['disk_unit_number'] in current_scsi_info[scsi_controller]['disks']:
if disk['state'] == 'present':
disk_spec = vim.vm.device.VirtualDeviceSpec()
# set the operation to edit so that it knows to keep other settings
disk_spec.device = current_scsi_info[scsi_controller]['disks'][disk['disk_unit_number']]
# Edit and no resizing allowed
if disk['size'] < disk_spec.device.capacityInKB:
self.module.fail_json(msg="Given disk size at disk index [%s] is smaller than found (%d < %d)."
"Reducing disks is not allowed." % (disk['disk_index'],
disk['size'],
disk_spec.device.capacityInKB))
if disk['size'] != disk_spec.device.capacityInKB:
disk_spec.operation = vim.vm.device.VirtualDeviceSpec.Operation.edit
disk_spec = self.get_ioandshares_diskconfig(disk_spec, disk)
disk_spec.device.capacityInKB = disk['size']
self.config_spec.deviceChange.append(disk_spec)
disk_change = True
results['disk_changes'][disk['disk_index']] = "Disk size increased."
else:
results['disk_changes'][disk['disk_index']] = "Disk already exists."
elif disk['state'] == 'absent':
# Disk already exists, deleting
disk_spec = vim.vm.device.VirtualDeviceSpec()
disk_spec.operation = vim.vm.device.VirtualDeviceSpec.Operation.remove
if disk['destroy'] is True:
disk_spec.fileOperation = vim.vm.device.VirtualDeviceSpec.FileOperation.destroy
disk_spec.device = current_scsi_info[scsi_controller]['disks'][disk['disk_unit_number']]
self.config_spec.deviceChange.append(disk_spec)
disk_change = True
results['disk_changes'][disk['disk_index']] = "Disk deleted."
if disk_change:
# Adding multiple disks in a single attempt raises weird errors
# So adding single disk at a time.
self.reconfigure_vm(self.config_spec, 'disks')
self.config_spec = vim.vm.ConfigSpec()
self.config_spec.deviceChange = []
disk_change_list.append(disk_change)
if any(disk_change_list):
results['changed'] = True
results['disk_data'] = self.gather_disk_facts(vm_obj=self.vm)
self.module.exit_json(**results)
def sanitize_disk_inputs(self):
"""
Check correctness of disk input provided by user
Returns: A list of dictionary containing disk information
"""
disks_data = list()
if not self.desired_disks:
self.module.exit_json(changed=False, msg="No disks provided for virtual"
" machine '%s' for management." % self.vm.name)
for disk_index, disk in enumerate(self.desired_disks):
# Initialize default value for disk
current_disk = dict(disk_index=disk_index,
state='present',
destroy=True,
filename=None,
datastore=None,
autoselect_datastore=True,
disk_unit_number=0,
scsi_controller=0,
disk_mode='persistent')
# Check state
if 'state' in disk:
if disk['state'] not in ['absent', 'present']:
self.module.fail_json(msg="Invalid state provided '%s' for disk index [%s]."
" State can be either - 'absent', 'present'" % (disk['state'],
disk_index))
else:
current_disk['state'] = disk['state']
if current_disk['state'] == 'absent':
current_disk['destroy'] = disk['destroy']
elif current_disk['state'] == 'present':
# Select datastore or datastore cluster
if 'datastore' in disk:
if 'autoselect_datastore' in disk:
self.module.fail_json(msg="Please specify either 'datastore' "
"or 'autoselect_datastore' for disk index [%s]" % disk_index)
# Check if given value is datastore or datastore cluster
datastore_name = disk['datastore']
datastore_cluster = find_obj(self.content, [vim.StoragePod], datastore_name)
if datastore_cluster:
# If user specified datastore cluster so get recommended datastore
datastore_name = self.get_recommended_datastore(datastore_cluster_obj=datastore_cluster)
# Check if get_recommended_datastore or user specified datastore exists or not
datastore = find_obj(self.content, [vim.Datastore], datastore_name)
if datastore is None:
self.module.fail_json(msg="Failed to find datastore named '%s' "
"in given configuration." % disk['datastore'])
current_disk['datastore'] = datastore
current_disk['autoselect_datastore'] = False
elif 'autoselect_datastore' in disk:
# Find datastore which fits requirement
datastores = get_all_objs(self.content, [vim.Datastore])
if not datastores:
self.module.fail_json(msg="Failed to gather information about"
" available datastores in given datacenter.")
datastore = None
datastore_freespace = 0
for ds in datastores:
if ds.summary.freeSpace > datastore_freespace:
# If datastore field is provided, filter destination datastores
datastore = ds
datastore_freespace = ds.summary.freeSpace
current_disk['datastore'] = datastore
if 'datastore' not in disk and 'autoselect_datastore' not in disk and 'filename' not in disk:
self.module.fail_json(msg="Either 'datastore' or 'autoselect_datastore' is"
" required parameter while creating disk for "
"disk index [%s]." % disk_index)
if 'filename' in disk:
current_disk['filename'] = disk['filename']
if [x for x in disk.keys() if x.startswith('size_') or x == 'size']:
# size, size_tb, size_gb, size_mb, size_kb
disk_size_parse_failed = False
if 'size' in disk:
size_regex = re.compile(r'(\d+(?:\.\d+)?)([tgmkTGMK][bB])')
disk_size_m = size_regex.match(disk['size'])
if disk_size_m:
expected = disk_size_m.group(1)
unit = disk_size_m.group(2)
else:
disk_size_parse_failed = True
try:
if re.match(r'\d+\.\d+', expected):
# We found float value in string, let's typecast it
expected = float(expected)
else:
# We found int value in string, let's typecast it
expected = int(expected)
except (TypeError, ValueError, NameError):
disk_size_parse_failed = True
else:
# Even multiple size_ parameter provided by user,
# consider first value only
param = [x for x in disk.keys() if x.startswith('size_')][0]
unit = param.split('_')[-1]
disk_size = disk[param]
if isinstance(disk_size, (float, int)):
disk_size = str(disk_size)
try:
if re.match(r'\d+\.\d+', disk_size):
# We found float value in string, let's typecast it
expected = float(disk_size)
else:
# We found int value in string, let's typecast it
expected = int(disk_size)
except (TypeError, ValueError, NameError):
disk_size_parse_failed = True
if disk_size_parse_failed:
# Common failure
self.module.fail_json(msg="Failed to parse disk size for disk index [%s],"
" please review value provided"
" using documentation." % disk_index)
disk_units = dict(tb=3, gb=2, mb=1, kb=0)
unit = unit.lower()
if unit in disk_units:
current_disk['size'] = expected * (1024 ** disk_units[unit])
else:
self.module.fail_json(msg="%s is not a supported unit for disk size for disk index [%s]."
" Supported units are ['%s']." % (unit,
disk_index,
"', '".join(disk_units.keys())))
elif current_disk['filename'] is None:
# No size attribute found and no existing filename given for the disk, fail
self.module.fail_json(msg="No size, size_kb, size_mb, size_gb or size_tb"
" attribute found into disk index [%s] configuration." % disk_index)
# Check SCSI controller key
if 'scsi_controller' in disk:
try:
temp_disk_controller = int(disk['scsi_controller'])
except ValueError:
self.module.fail_json(msg="Invalid SCSI controller ID '%s' specified"
" at index [%s]" % (disk['scsi_controller'], disk_index))
if temp_disk_controller not in range(0, 4):
# Only 4 SCSI controllers are allowed per VM
self.module.fail_json(msg="Invalid SCSI controller ID specified [%s],"
" please specify value between 0 to 3 only." % temp_disk_controller)
current_disk['scsi_controller'] = temp_disk_controller
else:
self.module.fail_json(msg="Please specify 'scsi_controller' under disk parameter"
" at index [%s], which is required while creating disk." % disk_index)
# Check for disk unit number
if 'unit_number' in disk:
try:
temp_disk_unit_number = int(disk['unit_number'])
except ValueError:
self.module.fail_json(msg="Invalid Disk unit number ID '%s'"
" specified at index [%s]" % (disk['unit_number'], disk_index))
if temp_disk_unit_number not in range(0, 16):
self.module.fail_json(msg="Invalid Disk unit number ID specified for disk [%s] at index [%s],"
" please specify value between 0 to 15"
" only (excluding 7)." % (temp_disk_unit_number, disk_index))
if temp_disk_unit_number == 7:
self.module.fail_json(msg="Invalid Disk unit number ID specified for disk at index [%s],"
" please specify value other than 7 as it is reserved"
"for SCSI Controller" % disk_index)
current_disk['disk_unit_number'] = temp_disk_unit_number
else:
self.module.fail_json(msg="Please specify 'unit_number' under disk parameter"
" at index [%s], which is required while creating disk." % disk_index)
# Type of Disk
disk_type = disk.get('type', 'thick').lower()
if disk_type not in ['thin', 'thick', 'eagerzeroedthick']:
self.module.fail_json(msg="Invalid 'disk_type' specified for disk index [%s]. Please specify"
" 'disk_type' value from ['thin', 'thick', 'eagerzeroedthick']." % disk_index)
current_disk['disk_type'] = disk_type
# Mode of Disk
temp_disk_mode = disk.get('disk_mode', 'persistent').lower()
if temp_disk_mode not in ['persistent', 'independent_persistent', 'independent_nonpersistent']:
self.module.fail_json(msg="Invalid 'disk_mode' specified for disk index [%s]. Please specify"
" 'disk_mode' value from ['persistent', 'independent_persistent', 'independent_nonpersistent']." % disk_index)
current_disk['disk_mode'] = temp_disk_mode
# SCSI Controller Type
scsi_contrl_type = disk.get('scsi_type', 'paravirtual').lower()
if scsi_contrl_type not in self.scsi_device_type.keys():
self.module.fail_json(msg="Invalid 'scsi_type' specified for disk index [%s]. Please specify"
" 'scsi_type' value from ['%s']" % (disk_index,
"', '".join(self.scsi_device_type.keys())))
current_disk['scsi_type'] = scsi_contrl_type
if 'shares' in disk:
current_disk['shares'] = disk['shares']
if 'iolimit' in disk:
current_disk['iolimit'] = disk['iolimit']
disks_data.append(current_disk)
return disks_data
def get_recommended_datastore(self, datastore_cluster_obj):
"""
Return Storage DRS recommended datastore from datastore cluster
Args:
datastore_cluster_obj: datastore cluster managed object
Returns: Name of recommended datastore from the given datastore cluster,
Returns None if no datastore recommendation found.
"""
# Check if Datastore Cluster provided by user is SDRS ready
sdrs_status = datastore_cluster_obj.podStorageDrsEntry.storageDrsConfig.podConfig.enabled
if sdrs_status:
# We can get storage recommendation only if SDRS is enabled on given datastorage cluster
pod_sel_spec = vim.storageDrs.PodSelectionSpec()
pod_sel_spec.storagePod = datastore_cluster_obj
storage_spec = vim.storageDrs.StoragePlacementSpec()
storage_spec.podSelectionSpec = pod_sel_spec
storage_spec.type = 'create'
try:
rec = self.content.storageResourceManager.RecommendDatastores(storageSpec=storage_spec)
rec_action = rec.recommendations[0].action[0]
return rec_action.destination.name
except Exception:
# There is some error so we fall back to general workflow
pass
datastore = None
datastore_freespace = 0
for ds in datastore_cluster_obj.childEntity:
if ds.summary.freeSpace > datastore_freespace:
# If datastore field is provided, filter destination datastores
datastore = ds
datastore_freespace = ds.summary.freeSpace
if datastore:
return datastore.name
return None
@staticmethod
def gather_disk_facts(vm_obj):
"""
Gather facts about VM's disks
Args:
vm_obj: Managed object of virtual machine
Returns: A list of dict containing disks information
"""
disks_facts = dict()
if vm_obj is None:
return disks_facts
disk_index = 0
for disk in vm_obj.config.hardware.device:
if isinstance(disk, vim.vm.device.VirtualDisk):
if disk.storageIOAllocation is None:
disk.storageIOAllocation = vim.StorageResourceManager.IOAllocationInfo()
disk.storageIOAllocation.shares = vim.SharesInfo()
if disk.shares is None:
disk.shares = vim.SharesInfo()
disks_facts[disk_index] = dict(
key=disk.key,
label=disk.deviceInfo.label,
summary=disk.deviceInfo.summary,
backing_filename=disk.backing.fileName,
backing_datastore=disk.backing.datastore.name,
backing_disk_mode=disk.backing.diskMode,
backing_writethrough=disk.backing.writeThrough,
backing_thinprovisioned=disk.backing.thinProvisioned,
backing_eagerlyscrub=bool(disk.backing.eagerlyScrub),
controller_key=disk.controllerKey,
unit_number=disk.unitNumber,
iolimit_limit=disk.storageIOAllocation.limit,
iolimit_shares_level=disk.storageIOAllocation.shares.level,
iolimit_shares_limit=disk.storageIOAllocation.shares.shares,
shares_level=disk.shares.level,
shares_limit=disk.shares.shares,
capacity_in_kb=disk.capacityInKB,
capacity_in_bytes=disk.capacityInBytes,
)
disk_index += 1
return disks_facts
def main():
argument_spec = vmware_argument_spec()
argument_spec.update(
name=dict(type='str'),
uuid=dict(type='str'),
moid=dict(type='str'),
folder=dict(type='str'),
datacenter=dict(type='str', required=True),
disk=dict(type='list', default=[]),
use_instance_uuid=dict(type='bool', default=False),
)
module = AnsibleModule(
argument_spec=argument_spec,
required_one_of=[
['name', 'uuid', 'moid']
]
)
if module.params['folder']:
# FindByInventoryPath() does not require an absolute path
# so we should leave the input folder path unmodified
module.params['folder'] = module.params['folder'].rstrip('/')
pyv = PyVmomiHelper(module)
# Check if the VM exists before continuing
vm = pyv.get_vm()
if not vm:
# We were unable to find the virtual machine the user specified
# Bail out
vm_id = (module.params.get('name') or module.params.get('uuid') or module.params.get('moid'))
module.fail_json(msg="Unable to manage disks for non-existing"
" virtual machine '%s'." % vm_id)
# VM exists
try:
pyv.ensure_disks(vm_obj=vm)
except Exception as exc:
module.fail_json(msg="Failed to manage disks for virtual machine"
" '%s' with exception : %s" % (vm.name,
to_native(exc)))
if __name__ == '__main__':
main()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 61,985 |
Multiline support for the regex_replace filter
|
##### SUMMARY
I think the `regex_replace` Jinja filter should support `multiline`, as `regex_search` does.
Or maybe use a `flags` parameter to support any `re` flag.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
- `lib/ansible/plugins/filter/core.py`
##### ADDITIONAL INFORMATION
Example: comment a multiline variable.
```yaml
val: "{{ val | regex_replace('^', '#', multiline=True) }}"
```
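For reference, a rough standalone sketch (not the actual Ansible implementation) of how such options could be mapped onto `re` flags:
```python
# Hypothetical regex_replace-style filter with multiline/ignorecase support.
import re

def regex_replace(value, pattern, replacement='', multiline=False, ignorecase=False):
    flags = 0
    if multiline:
        flags |= re.MULTILINE   # '^' and '$' then match at every line boundary
    if ignorecase:
        flags |= re.IGNORECASE
    return re.sub(pattern, replacement, value, flags=flags)

print(regex_replace('first\nsecond', '^', '#', multiline=True))
# -> '#first\n#second'
```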
|
https://github.com/ansible/ansible/issues/61985
|
https://github.com/ansible/ansible/pull/65051
|
b3db41e6d8d8fe0b2d559a35ab2a4557b26c56a8
|
e867535a5700ab5cc38e57593e24926bb3f4b903
| 2019-09-09T09:57:21Z |
python
| 2020-02-19T17:19:40Z |
changelogs/fragments/65051-regex-replace-multiline.yaml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 61,985 |
Multiline support for the regex_replace filter
|
##### SUMMARY
I think the `regex_replace` Jinja filter should support `multiline`, as `regex_search` does.
Or maybe use a `flags` parameter to support any `re` flag.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
- `lib/ansible/plugins/filter/core.py`
##### ADDITIONAL INFORMATION
Example: comment a multiline variable.
```yaml
val: "{{ val | regex_replace('^', '#', multiline=True) }}"
```
|
https://github.com/ansible/ansible/issues/61985
|
https://github.com/ansible/ansible/pull/65051
|
b3db41e6d8d8fe0b2d559a35ab2a4557b26c56a8
|
e867535a5700ab5cc38e57593e24926bb3f4b903
| 2019-09-09T09:57:21Z |
python
| 2020-02-19T17:19:40Z |
docs/docsite/rst/user_guide/playbooks_filters.rst
|
.. _playbooks_filters:
*******
Filters
*******
Filters let you transform data inside template expressions. This page documents mainly Ansible-specific filters, but you can use any of the standard filters shipped with Jinja2 - see the list of :ref:`builtin filters <jinja2:builtin-filters>` in the official Jinja2 template documentation. You can also use :ref:`Python methods <jinja2:python-methods>` to manipulate variables. A few useful filters are typically added with each new Ansible release. The development documentation shows
how to create custom Ansible filters as plugins, though we generally welcome new filters into the core code so everyone can use them.
Templating happens on the Ansible controller, **not** on the target host, so filters execute on the controller and manipulate data locally.
.. contents::
:local:
Handling undefined variables
============================
Filters can help you manage missing or undefined variables by providing defaults or making some variables optional. If you configure Ansible to ignore most undefined variables, you can mark some variables as requiring values with the ``mandatory`` filter.
.. _defaulting_undefined_variables:
Providing default values
------------------------
You can provide default values for variables directly in your templates using the Jinja2 'default' filter. This is often a better approach than failing if a variable is not defined::
{{ some_variable | default(5) }}
In the above example, if the variable 'some_variable' is not defined, Ansible uses the default value 5, rather than raising an "undefined variable" error and failing. If you are working within a role, you can also add a ``defaults/main.yml`` to define the default values for variables in your role.
Beginning in version 2.8, attempting to access an attribute of an Undefined value in Jinja will return another Undefined value, rather than throwing an error immediately. This means that you can now simply use
a default with a value in a nested data structure (for example, :code:`{{ foo.bar.baz | default('DEFAULT') }}`) when you do not know if the intermediate values are defined.
If you want to use the default value when variables evaluate to false or an empty string you have to set the second parameter to ``true``::
{{ lookup('env', 'MY_USER') | default('admin', true) }}
.. _omitting_undefined_variables:
Making variables optional
-------------------------
In some cases, you want to make a variable optional. For example, if you want to use a system default for some items and control the value for others. To make a variable optional, set the default value to the special variable ``omit``::
- name: touch files with an optional mode
file:
dest: "{{ item.path }}"
state: touch
mode: "{{ item.mode | default(omit) }}"
loop:
- path: /tmp/foo
- path: /tmp/bar
- path: /tmp/baz
mode: "0444"
In this example, the default mode for the files ``/tmp/foo`` and ``/tmp/bar`` is determined by the umask of the system. Ansible does not send a value for ``mode``. Only the third file, ``/tmp/baz``, receives the `mode=0444` option.
.. note:: If you are "chaining" additional filters after the ``default(omit)`` filter, you should instead do something like this:
``"{{ foo | default(None) | some_filter or omit }}"``. In this example, the default ``None`` (Python null) value will cause the
later filters to fail, which will trigger the ``or omit`` portion of the logic. Using ``omit`` in this manner is very specific to
the later filters you're chaining though, so be prepared for some trial and error if you do this.
.. _forcing_variables_to_be_defined:
Defining mandatory values
-------------------------
If you configure Ansible to ignore undefined variables, you may want to define some values as mandatory. By default, Ansible fails if a variable in your playbook or command is undefined. You can configure Ansible to allow undefined variables by setting :ref:`DEFAULT_UNDEFINED_VAR_BEHAVIOR` to ``false``. In that case, you may want to require some variables to be defined. You can do this with::
{{ variable | mandatory }}
The variable value will be used as is, but the template evaluation will raise an error if it is undefined.
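The ``mandatory`` implementation in ``lib/ansible/plugins/filter/core.py`` (included later in this document) also accepts an optional custom error message. A minimal sketch, with an illustrative message text::
{{ variable | mandatory('variable must be set, see the group_vars README') }}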
Defining different values for true/false/null
=============================================
You can create a test, then define one value to use when the test returns true and another when the test returns false (new in version 1.9)::
{{ (name == "John") | ternary('Mr','Ms') }}
In addition, you can define one value to use on true, one value on false and a third value on null (new in version 2.8)::
{{ enabled | ternary('no shutdown', 'shutdown', omit) }}
Manipulating data types
=======================
Sometimes a variables file or registered variable contains a dictionary when your playbook needs a list. Sometimes you have a list when your template needs a dictionary. These filters help you transform these data types.
.. _dict_filter:
Transforming dictionaries into lists
------------------------------------
.. versionadded:: 2.6
To turn a dictionary into a list of items, suitable for looping, use `dict2items`::
{{ dict | dict2items }}
Which turns::
tags:
Application: payment
Environment: dev
into::
- key: Application
value: payment
- key: Environment
value: dev
.. versionadded:: 2.8
``dict2items`` accepts 2 keyword arguments, ``key_name`` and ``value_name`` that allow configuration of the names of the keys to use for the transformation::
{{ files | dict2items(key_name='file', value_name='path') }}
Which turns::
files:
users: /etc/passwd
groups: /etc/group
into::
- file: users
path: /etc/passwd
- file: groups
path: /etc/group
Transforming lists into dictionaries
------------------------------------
.. versionadded:: 2.7
This filter turns a list of dicts with 2 keys into a dict, mapping the values of those keys into ``key: value`` pairs::
{{ tags | items2dict }}
Which turns::
tags:
- key: Application
value: payment
- key: Environment
value: dev
into::
Application: payment
Environment: dev
This is the reverse of the ``dict2items`` filter.
``items2dict`` accepts 2 keyword arguments, ``key_name`` and ``value_name`` that allow configuration of the names of the keys to use for the transformation::
{{ tags | items2dict(key_name='key', value_name='value') }}
Discovering the data type
-------------------------
.. versionadded:: 2.3
If you are unsure of the underlying Python type of a variable, you can use the ``type_debug`` filter to display it. This is useful in debugging when you need a particular type of variable::
{{ myvar | type_debug }}
Forcing the data type
---------------------
You can cast values as certain types. For example, if you expect the input "True" from a :ref:`vars_prompt <playbooks_prompts>` and you want Ansible to recognize it as a Boolean value instead of a string::
- debug:
msg: test
when: some_string_value | bool
.. versionadded:: 1.6
.. _filters_for_formatting_data:
Controlling data formats: YAML and JSON
=======================================
The following filters will take a data structure in a template and manipulate it or switch it from or to JSON or YAML format. These are occasionally useful for debugging::
{{ some_variable | to_json }}
{{ some_variable | to_yaml }}
For human readable output, you can use::
{{ some_variable | to_nice_json }}
{{ some_variable | to_nice_yaml }}
You can change the indentation of either format::
{{ some_variable | to_nice_json(indent=2) }}
{{ some_variable | to_nice_yaml(indent=8) }}
The ``to_yaml`` and ``to_nice_yaml`` filters use the `PyYAML library`_ which has a default 80 symbol string length limit. That causes an unexpected line break after the 80th symbol (if there is a space after the 80th symbol)
To avoid such behavior and generate long lines, use the ``width`` option. You must use a hardcoded number to define the width, instead of a construction like ``float("inf")``, because the filter does not support proxying Python functions. For example::
{{ some_variable | to_yaml(indent=8, width=1337) }}
{{ some_variable | to_nice_yaml(indent=8, width=1337) }}
The filter does support passing through other YAML parameters. For a full list, see the `PyYAML documentation`_.
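For example, ``explicit_start`` is one standard PyYAML dump parameter that can be passed through; a quick sketch::
{{ some_variable | to_nice_yaml(indent=2, explicit_start=True) }}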
If you are reading in some already formatted data::
{{ some_variable | from_json }}
{{ some_variable | from_yaml }}
for example::
tasks:
- shell: cat /some/path/to/file.json
register: result
- set_fact:
myvar: "{{ result.stdout | from_json }}"
.. versionadded:: 2.7
To parse multi-document YAML strings, the ``from_yaml_all`` filter is provided.
The ``from_yaml_all`` filter will return a generator of parsed YAML documents.
for example::
tasks:
- shell: cat /some/path/to/multidoc-file.yaml
register: result
- debug:
msg: '{{ item }}'
loop: '{{ result.stdout | from_yaml_all | list }}'
Combining and selecting data
============================
These filters let you manipulate data from multiple sources and types and manage large data structures, giving you precise control over complex data.
.. _zip_filter:
Combining items from multiple lists: zip and zip_longest
--------------------------------------------------------
.. versionadded:: 2.3
To get a list combining the elements of other lists use ``zip``::
- name: give me list combo of two lists
debug:
msg: "{{ [1,2,3,4,5] | zip(['a','b','c','d','e','f']) | list }}"
- name: give me shortest combo of two lists
debug:
msg: "{{ [1,2,3] | zip(['a','b','c','d','e','f']) | list }}"
To always exhaust all lists, use ``zip_longest``::
- name: give me longest combo of three lists , fill with X
debug:
msg: "{{ [1,2,3] | zip_longest(['a','b','c','d','e','f'], [21, 22, 23], fillvalue='X') | list }}"
Similarly to the output of the ``items2dict`` filter mentioned above, these filters can be used to construct a ``dict``::
{{ dict(keys_list | zip(values_list)) }}
Which turns::
keys_list:
- one
- two
values_list:
- apple
- orange
into::
one: apple
two: orange
Combining objects and subelements
---------------------------------
.. versionadded:: 2.7
The ``subelements`` filter produces a product of an object and the subelement values of that object, similar to the ``subelements`` lookup. This lets you specify individual subelements to use in a template. For example, this expression::
{{ users | subelements('groups', skip_missing=True) }}
turns this data::
users:
- name: alice
authorized:
- /tmp/alice/onekey.pub
- /tmp/alice/twokey.pub
groups:
- wheel
- docker
- name: bob
authorized:
- /tmp/bob/id_rsa.pub
groups:
- docker
Into this data::
-
- name: alice
groups:
- wheel
- docker
authorized:
- /tmp/alice/onekey.pub
- /tmp/alice/twokey.pub
- wheel
-
- name: alice
groups:
- wheel
- docker
authorized:
- /tmp/alice/onekey.pub
- /tmp/alice/twokey.pub
- docker
-
- name: bob
authorized:
- /tmp/bob/id_rsa.pub
groups:
- docker
- docker
You can use the transformed data with ``loop`` to iterate over the same subelement for multiple objects::
- name: Set authorized ssh key, extracting just that data from 'users'
authorized_key:
user: "{{ item.0.name }}"
key: "{{ lookup('file', item.1) }}"
loop: "{{ users | subelements('authorized') }}"
.. _combine_filter:
Combining hashes/dictionaries
-----------------------------
.. versionadded:: 2.0
The ``combine`` filter allows hashes to be merged.
For example, the following would override keys in one hash::
{{ {'a':1, 'b':2} | combine({'b':3}) }}
The resulting hash would be::
{'a':1, 'b':3}
The filter can also take multiple arguments to merge::
{{ a | combine(b, c, d) }}
{{ [a, b, c, d] | combine }}
In this case, keys in ``d`` would override those in ``c``, which would
override those in ``b``, and so on.
The filter also accepts two optional parameters: ``recursive`` and ``list_merge``.
recursive
Is a boolean, default to ``False``.
Should the ``combine`` recursively merge nested hashes.
Note: It does **not** depend on the value of the ``hash_behaviour`` setting in ``ansible.cfg``.
list_merge
Is a string, its possible values are ``replace`` (default), ``keep``, ``append``, ``prepend``, ``append_rp`` or ``prepend_rp``.
It modifies the behaviour of ``combine`` when the hashes to merge contain arrays/lists.
.. code-block:: yaml
default:
a:
x: default
y: default
b: default
c: default
patch:
a:
y: patch
z: patch
b: patch
If ``recursive=False`` (the default), nested hashes aren't merged::
{{ default | combine(patch) }}
This would result in::
a:
y: patch
z: patch
b: patch
c: default
If ``recursive=True``, recurse into nested hashes and merge their keys::
{{ default | combine(patch, recursive=True) }}
This would result in::
a:
x: default
y: patch
z: patch
b: patch
c: default
If ``list_merge='replace'`` (the default), arrays from the right hash will "replace" the ones in the left hash::
default:
a:
- default
patch:
a:
- patch
.. code-block:: jinja
{{ default | combine(patch) }}
This would result in::
a:
- patch
If ``list_merge='keep'``, arrays from the left hash will be kept::
{{ default | combine(patch, list_merge='keep') }}
This would result in::
a:
- default
If ``list_merge='append'``, arrays from the right hash will be appended to the ones in the left hash::
{{ default | combine(patch, list_merge='append') }}
This would result in::
a:
- default
- patch
If ``list_merge='prepend'``, arrays from the right hash will be prepended to the ones in the left hash::
{{ default | combine(patch, list_merge='prepend') }}
This would result in::
a:
- patch
- default
If ``list_merge='append_rp'``, arrays from the right hash will be appended to the ones in the left hash.
Elements of arrays in the left hash that are also in the corresponding array of the right hash will be removed ("rp" stands for "remove present").
Duplicate elements that aren't in both hashes are kept::
default:
a:
- 1
- 1
- 2
- 3
patch:
a:
- 3
- 4
- 5
- 5
.. code-block:: jinja
{{ default | combine(patch, list_merge='append_rp') }}
This would result in::
a:
- 1
- 1
- 2
- 3
- 4
- 5
- 5
If ``list_merge='prepend_rp'``, the behavior is similar to the one for ``append_rp``, but elements of arrays in the right hash are prepended::
{{ default | combine(patch, list_merge='prepend_rp') }}
This would result in::
a:
- 3
- 4
- 5
- 5
- 1
- 1
- 2
``recursive`` and ``list_merge`` can be used together::
default:
a:
a':
x: default_value
y: default_value
list:
- default_value
b:
- 1
- 1
- 2
- 3
patch:
a:
a':
y: patch_value
z: patch_value
list:
- patch_value
b:
- 3
- 4
- 4
- key: value
.. code-block:: jinja
{{ default | combine(patch, recursive=True, list_merge='append_rp') }}
This would result in::
a:
a':
x: default_value
y: patch_value
z: patch_value
list:
- default_value
- patch_value
b:
- 1
- 1
- 2
- 3
- 4
- 4
- key: value
.. _extract_filter:
Selecting values from arrays or hashtables
-------------------------------------------
.. versionadded:: 2.1
The `extract` filter is used to map from a list of indices to a list of
values from a container (hash or array)::
{{ [0,2] | map('extract', ['x','y','z']) | list }}
{{ ['x','y'] | map('extract', {'x': 42, 'y': 31}) | list }}
The results of the above expressions would be::
['x', 'z']
[42, 31]
The filter can take another argument::
{{ groups['x'] | map('extract', hostvars, 'ec2_ip_address') | list }}
This takes the list of hosts in group 'x', looks them up in `hostvars`,
and then looks up the `ec2_ip_address` of the result. The final result
is a list of IP addresses for the hosts in group 'x'.
The third argument to the filter can also be a list, for a recursive
lookup inside the container::
{{ ['a'] | map('extract', b, ['x','y']) | list }}
This would return a list containing the value of `b['a']['x']['y']`.
Combining lists
---------------
This set of filters returns a list of combined lists.
permutations
^^^^^^^^^^^^
To get permutations of a list::
- name: give me largest permutations (order matters)
debug:
msg: "{{ [1,2,3,4,5] | permutations | list }}"
- name: give me permutations of sets of three
debug:
msg: "{{ [1,2,3,4,5] | permutations(3) | list }}"
combinations
^^^^^^^^^^^^
Combinations always require a set size::
- name: give me combinations for sets of two
debug:
msg: "{{ [1,2,3,4,5] | combinations(2) | list }}"
Also see the :ref:`zip_filter`
products
^^^^^^^^
The product filter returns the `cartesian product <https://docs.python.org/3/library/itertools.html#itertools.product>`_ of the input iterables.
This is roughly equivalent to nested for-loops in a generator expression.
For example::
- name: generate multiple hostnames
debug:
msg: "{{ ['foo', 'bar'] | product(['com']) | map('join', '.') | join(',') }}"
This would result in::
{ "msg": "foo.com,bar.com" }
.. _json_query_filter:
Selecting JSON data: JSON queries
---------------------------------
Sometimes you end up with a complex data structure in JSON format and you need to extract only a small set of data within it. The **json_query** filter lets you query a complex JSON structure and iterate over it using a loop structure.
.. note:: This filter is built upon **jmespath**, and you can use the same syntax. For examples, see `jmespath examples <http://jmespath.org/examples.html>`_.
Consider this data structure::
{
"domain_definition": {
"domain": {
"cluster": [
{
"name": "cluster1"
},
{
"name": "cluster2"
}
],
"server": [
{
"name": "server11",
"cluster": "cluster1",
"port": "8080"
},
{
"name": "server12",
"cluster": "cluster1",
"port": "8090"
},
{
"name": "server21",
"cluster": "cluster2",
"port": "9080"
},
{
"name": "server22",
"cluster": "cluster2",
"port": "9090"
}
],
"library": [
{
"name": "lib1",
"target": "cluster1"
},
{
"name": "lib2",
"target": "cluster2"
}
]
}
}
}
To extract all clusters from this structure, you can use the following query::
- name: "Display all cluster names"
debug:
var: item
loop: "{{ domain_definition | json_query('domain.cluster[*].name') }}"
Same thing for all server names::
- name: "Display all server names"
debug:
var: item
loop: "{{ domain_definition | json_query('domain.server[*].name') }}"
This example shows ports from cluster1::
- name: "Display all ports from cluster1"
debug:
var: item
loop: "{{ domain_definition | json_query(server_name_cluster1_query) }}"
vars:
server_name_cluster1_query: "domain.server[?cluster=='cluster1'].port"
.. note:: You can use a variable to make the query more readable.
Or, alternatively print out the ports in a comma separated string::
- name: "Display all ports from cluster1 as a string"
debug:
msg: "{{ domain_definition | json_query('domain.server[?cluster==`cluster1`].port') | join(', ') }}"
.. note:: Here, quoting literals using backticks avoids escaping quotes and maintains readability.
Or, using YAML `single quote escaping <https://yaml.org/spec/current.html#id2534365>`_::
- name: "Display all ports from cluster1"
debug:
var: item
loop: "{{ domain_definition | json_query('domain.server[?cluster==''cluster1''].port') }}"
.. note:: Escaping single quotes within single quotes in YAML is done by doubling the single quote.
In this example, we get a hash map with all ports and names of a cluster::
- name: "Display all server ports and names from cluster1"
debug:
var: item
loop: "{{ domain_definition | json_query(server_name_cluster1_query) }}"
vars:
server_name_cluster1_query: "domain.server[?cluster=='cluster1'].{name: name, port: port}"
Randomizing data
================
When you need a randomly generated value, use one of these filters.
.. _random_mac_filter:
Random MAC addresses
--------------------
.. versionadded:: 2.6
This filter can be used to generate a random MAC address from a string prefix.
To get a random MAC address from a string prefix starting with '52:54:00'::
"{{ '52:54:00' | random_mac }}"
# => '52:54:00:ef:1c:03'
Note that if anything is wrong with the prefix string, the filter will issue an error.
.. versionadded:: 2.9
As of Ansible version 2.9, you can also initialize the random number generator from a seed. This way, you can create random-but-idempotent MAC addresses::
"{{ '52:54:00' | random_mac(seed=inventory_hostname) }}"
.. _random_filter:
Random items or numbers
-----------------------
This filter can be used similarly to the default Jinja2 random filter (returning a random item from a sequence of
items), but it can also generate a random number based on a range.
To get a random item from a list::
"{{ ['a','b','c'] | random }}"
# => 'c'
To get a random number between 0 and a specified number::
"{{ 60 | random }} * * * * root /script/from/cron"
# => '21 * * * * root /script/from/cron'
Get a random number from 0 to 100 but in steps of 10::
{{ 101 | random(step=10) }}
# => 70
Get a random number from 1 to 100 but in steps of 10::
{{ 101 | random(1, 10) }}
# => 31
{{ 101 | random(start=1, step=10) }}
# => 51
It's also possible to initialize the random number generator from a seed. This way, you can create random-but-idempotent numbers::
"{{ 60 | random(seed=inventory_hostname) }} * * * * root /script/from/cron"
Shuffling a list
----------------
This filter will randomize an existing list, giving a different order every invocation.
To get a random list from an existing list::
{{ ['a','b','c'] | shuffle }}
# => ['c','a','b']
{{ ['a','b','c'] | shuffle }}
# => ['b','c','a']
It's also possible to shuffle a list idempotently. All you need is a seed::
{{ ['a','b','c'] | shuffle(seed=inventory_hostname) }}
# => ['b','a','c']
The shuffle filter returns a list whenever possible. If you use it with a non 'listable' item, the filter does nothing.
.. _list_filters:
List filters
============
These filters all operate on list variables.
To get the minimum value from a list of numbers::
{{ list1 | min }}
To get the maximum value from a list of numbers::
{{ [3, 4, 2] | max }}
.. versionadded:: 2.5
Flatten a list (same thing the `flatten` lookup does)::
{{ [3, [4, 2] ] | flatten }}
Flatten only the first level of a list (akin to the `items` lookup)::
{{ [3, [4, [2]] ] | flatten(levels=1) }}
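For reference, and based on the ``flatten`` implementation in ``lib/ansible/plugins/filter/core.py`` shown later in this document, the two expressions above evaluate to::
{{ [3, [4, 2] ] | flatten }}
# => [3, 4, 2]
{{ [3, [4, [2]] ] | flatten(levels=1) }}
# => [3, 4, [2]]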
.. _set_theory_filters:
Set theory filters
==================
These functions return a unique set from sets or lists.
.. versionadded:: 1.4
To get a unique set from a list::
{{ list1 | unique }}
To get a union of two lists::
{{ list1 | union(list2) }}
To get the intersection of 2 lists (unique list of all items in both)::
{{ list1 | intersect(list2) }}
To get the difference of 2 lists (items in 1 that don't exist in 2)::
{{ list1 | difference(list2) }}
To get the symmetric difference of 2 lists (items exclusive to each list)::
{{ list1 | symmetric_difference(list2) }}
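A quick, illustrative sketch with concrete values (element ordering of the results may vary)::
{{ [1, 2, 5, 1] | union([2, 3]) }}
# => contains 1, 2, 5 and 3, each once
{{ [1, 2, 5, 1] | difference([2, 3]) }}
# => contains 1 and 5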
.. _math_stuff:
Math filters
============
.. versionadded:: 1.9
Get the logarithm (default is e)::
{{ myvar | log }}
Get the base 10 logarithm::
{{ myvar | log(10) }}
Give me the power of 2! (or 5)::
{{ myvar | pow(2) }}
{{ myvar | pow(5) }}
Square root, or the 5th::
{{ myvar | root }}
{{ myvar | root(5) }}
Note that jinja2 already provides some like abs() and round().
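A few concrete invocations, shown as a sketch (results are returned as floats)::
{{ 100 | log(10) }}   # => 2.0
{{ 2 | pow(10) }}     # => 1024.0
{{ 9 | root }}        # => 3.0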
Network filters
===============
These filters help you with common network tasks.
.. _ipaddr_filter:
IP address filters
------------------
.. versionadded:: 1.9
To test if a string is a valid IP address::
{{ myvar | ipaddr }}
You can also require a specific IP protocol version::
{{ myvar | ipv4 }}
{{ myvar | ipv6 }}
The IP address filter can also be used to extract specific information from an IP
address. For example, to get the IP address itself from a CIDR, you can use::
{{ '192.0.2.1/24' | ipaddr('address') }}
More information about the ``ipaddr`` filter and a complete usage guide can be found
in :ref:`playbooks_filters_ipaddr`.
.. _network_filters:
Network CLI filters
-------------------
.. versionadded:: 2.4
To convert the output of a network device CLI command into structured JSON
output, use the ``parse_cli`` filter::
{{ output | parse_cli('path/to/spec') }}
The ``parse_cli`` filter will load the spec file and pass the command output
through it, returning JSON output. The spec file should be valid, formatted YAML.
It defines how to parse the CLI output and return JSON data. Below is an example
of a valid spec file that will parse the output from the ``show vlan`` command.
.. code-block:: yaml
---
vars:
vlan:
vlan_id: "{{ item.vlan_id }}"
name: "{{ item.name }}"
enabled: "{{ item.state != 'act/lshut' }}"
state: "{{ item.state }}"
keys:
vlans:
value: "{{ vlan }}"
items: "^(?P<vlan_id>\\d+)\\s+(?P<name>\\w+)\\s+(?P<state>active|act/lshut|suspended)"
state_static:
value: present
The spec file above will return a JSON data structure that is a list of hashes
with the parsed VLAN information.
The same command could be parsed into a hash by using the key and values
directives. Here is an example of how to parse the output into a hash
value using the same ``show vlan`` command.
.. code-block:: yaml
---
vars:
vlan:
key: "{{ item.vlan_id }}"
values:
vlan_id: "{{ item.vlan_id }}"
name: "{{ item.name }}"
enabled: "{{ item.state != 'act/lshut' }}"
state: "{{ item.state }}"
keys:
vlans:
value: "{{ vlan }}"
items: "^(?P<vlan_id>\\d+)\\s+(?P<name>\\w+)\\s+(?P<state>active|act/lshut|suspended)"
state_static:
value: present
Another common use case for parsing CLI commands is to break the output of a large command
into blocks that can be parsed individually. This can be done using the ``start_block`` and
``end_block`` directives.
.. code-block:: yaml
---
vars:
interface:
name: "{{ item[0].match[0] }}"
state: "{{ item[1].state }}"
mode: "{{ item[2].match[0] }}"
keys:
interfaces:
value: "{{ interface }}"
start_block: "^Ethernet.*$"
end_block: "^$"
items:
- "^(?P<name>Ethernet\\d\\/\\d*)"
- "admin state is (?P<state>.+),"
- "Port mode is (.+)"
The example above will parse the output of ``show interface`` into a list of
hashes.
The network filters also support parsing the output of a CLI command using the
TextFSM library. To parse the CLI output with TextFSM use the following
filter::
{{ output.stdout[0] | parse_cli_textfsm('path/to/fsm') }}
Use of the TextFSM filter requires the TextFSM library to be installed.
Network XML filters
-------------------
.. versionadded:: 2.5
To convert the XML output of a network device command into structured JSON
output, use the ``parse_xml`` filter::
{{ output | parse_xml('path/to/spec') }}
The ``parse_xml`` filter will load the spec file and pass the command output
through it, returning JSON output.
The spec file should be valid, formatted YAML. It defines how to parse the XML
output and return JSON data.
Below is an example of a valid spec file that
will parse the output from the ``show vlan | display xml`` command.
.. code-block:: yaml
---
vars:
vlan:
vlan_id: "{{ item.vlan_id }}"
name: "{{ item.name }}"
desc: "{{ item.desc }}"
enabled: "{{ item.state.get('inactive') != 'inactive' }}"
state: "{% if item.state.get('inactive') == 'inactive'%} inactive {% else %} active {% endif %}"
keys:
vlans:
value: "{{ vlan }}"
top: configuration/vlans/vlan
items:
vlan_id: vlan-id
name: name
desc: description
state: ".[@inactive='inactive']"
The spec file above will return a JSON data structure that is a list of hashes
with the parsed VLAN information.
The same command could be parsed into a hash by using the key and values
directives. Here is an example of how to parse the output into a hash
value using the same ``show vlan | display xml`` command.
.. code-block:: yaml
---
vars:
vlan:
key: "{{ item.vlan_id }}"
values:
vlan_id: "{{ item.vlan_id }}"
name: "{{ item.name }}"
desc: "{{ item.desc }}"
enabled: "{{ item.state.get('inactive') != 'inactive' }}"
state: "{% if item.state.get('inactive') == 'inactive'%} inactive {% else %} active {% endif %}"
keys:
vlans:
value: "{{ vlan }}"
top: configuration/vlans/vlan
items:
vlan_id: vlan-id
name: name
desc: description
state: ".[@inactive='inactive']"
The value of ``top`` is the XPath relative to the XML root node.
In the example XML output given below, the value of ``top`` is ``configuration/vlans/vlan``,
which is an XPath expression relative to the root node (<rpc-reply>).
``configuration`` in the value of ``top`` is the outermost container node, and ``vlan``
is the innermost container node.
``items`` is a dictionary of key-value pairs that map user-defined names to XPath expressions
that select elements. The XPath expression is relative to the XPath expression contained in ``top``.
For example, ``vlan_id`` in the spec file is a user-defined name, and its value ``vlan-id`` is an
XPath expression relative to the value of ``top``.
Attributes of XML tags can be extracted using XPath expressions. The value of ``state`` in the spec
is an XPath expression used to get the attributes of the ``vlan`` tag in output XML.::
<rpc-reply>
<configuration>
<vlans>
<vlan inactive="inactive">
<name>vlan-1</name>
<vlan-id>200</vlan-id>
<description>This is vlan-1</description>
</vlan>
</vlans>
</configuration>
</rpc-reply>
.. note:: For more information on supported XPath expressions, see `<https://docs.python.org/2/library/xml.etree.elementtree.html#xpath-support>`_.
Network VLAN filters
--------------------
.. versionadded:: 2.8
Use the ``vlan_parser`` filter to transform an unsorted list of VLAN integers into a
sorted list of VLAN range strings according to IOS-like VLAN list rules. This list has the following properties:
* Vlans are listed in ascending order.
* Three or more consecutive VLANs are listed with a dash.
* The first line of the list can be ``first_line_len`` characters long.
* Subsequent list lines can be ``other_line_len`` characters long.
To sort a VLAN list::
{{ [3003, 3004, 3005, 100, 1688, 3002, 3999] | vlan_parser }}
This example renders the following sorted list::
['100,1688,3002-3005,3999']
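The line-length limits described above can be set explicitly. A minimal sketch, assuming the ``first_line_len`` and ``other_line_len`` parameter names listed in the properties above (the values shown are only illustrative)::
{{ [100, 1688, 3002, 3003, 3004, 3005, 3999] | vlan_parser(first_line_len=20, other_line_len=20) }}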
Another example Jinja template::
{% set parsed_vlans = vlans | vlan_parser %}
switchport trunk allowed vlan {{ parsed_vlans[0] }}
{% for i in range (1, parsed_vlans | count) %}
switchport trunk allowed vlan add {{ parsed_vlans[i] }}
{% endfor %}
This allows for dynamic generation of VLAN lists on a Cisco IOS tagged interface. You can store an exhaustive raw list of the exact VLANs required for an interface and then compare that to the parsed IOS output that would actually be generated for the configuration.
.. _hash_filters:
Encryption filters
==================
.. versionadded:: 1.9
To get the sha1 hash of a string::
{{ 'test1' | hash('sha1') }}
To get the md5 hash of a string::
{{ 'test1' | hash('md5') }}
Get a string checksum::
{{ 'test2' | checksum }}
Other hashes (platform dependent)::
{{ 'test2' | hash('blowfish') }}
To get a sha512 password hash (random salt)::
{{ 'passwordsaresecret' | password_hash('sha512') }}
To get a sha256 password hash with a specific salt::
{{ 'secretpassword' | password_hash('sha256', 'mysecretsalt') }}
An idempotent method to generate unique hashes per system is to use a salt that is consistent between runs::
{{ 'secretpassword' | password_hash('sha512', 65534 | random(seed=inventory_hostname) | string) }}
Hash types available depend on the master system running Ansible:
'hash' depends on hashlib, and 'password_hash' depends on passlib (https://passlib.readthedocs.io/en/stable/lib/passlib.hash.html).
.. versionadded:: 2.7
Some hash types allow providing a rounds parameter::
{{ 'secretpassword' | password_hash('sha256', 'mysecretsalt', rounds=10000) }}
.. _other_useful_filters:
Text filters
============
These filters work with strings and text.
.. _comment_filter:
Adding comments to files
------------------------
The `comment` filter lets you turn text in a template into comments in a file, with a variety of comment styles. By default Ansible uses ``#`` to start a comment line and adds a blank comment line above and below your comment text. For example the following::
{{ "Plain style (default)" | comment }}
produces this output:
.. code-block:: text
#
# Plain style (default)
#
Ansible offers styles for comments in C (``//...``), C block
(``/*...*/``), Erlang (``%...``) and XML (``<!--...-->``)::
{{ "C style" | comment('c') }}
{{ "C block style" | comment('cblock') }}
{{ "Erlang style" | comment('erlang') }}
{{ "XML style" | comment('xml') }}
You can define a custom comment character. This filter::
{{ "My Special Case" | comment(decoration="! ") }}
produces:
.. code-block:: text
!
! My Special Case
!
You can fully customize the comment style::
{{ "Custom style" | comment('plain', prefix='#######\n#', postfix='#\n#######\n ###\n #') }}
That creates the following output:
.. code-block:: text
#######
#
# Custom style
#
#######
###
#
The filter can also be applied to any Ansible variable. For example to
make the output of the ``ansible_managed`` variable more readable, we can
change the definition in the ``ansible.cfg`` file to this:
.. code-block:: jinja
[defaults]
ansible_managed = This file is managed by Ansible.%n
template: {file}
date: %Y-%m-%d %H:%M:%S
user: {uid}
host: {host}
and then use the variable with the `comment` filter::
{{ ansible_managed | comment }}
which produces this output:
.. code-block:: sh
#
# This file is managed by Ansible.
#
# template: /home/ansible/env/dev/ansible_managed/roles/role1/templates/test.j2
# date: 2015-09-10 11:02:58
# user: ansible
# host: myhost
#
Splitting URLs
--------------
.. versionadded:: 2.4
The ``urlsplit`` filter extracts the fragment, hostname, netloc, password, path, port, query, scheme, and username from a URL. With no arguments, it returns a dictionary of all the fields::
{{ "http://user:[email protected]:9000/dir/index.html?query=term#fragment" | urlsplit('hostname') }}
# => 'www.acme.com'
{{ "http://user:[email protected]:9000/dir/index.html?query=term#fragment" | urlsplit('netloc') }}
# => 'user:[email protected]:9000'
{{ "http://user:[email protected]:9000/dir/index.html?query=term#fragment" | urlsplit('username') }}
# => 'user'
{{ "http://user:[email protected]:9000/dir/index.html?query=term#fragment" | urlsplit('password') }}
# => 'password'
{{ "http://user:[email protected]:9000/dir/index.html?query=term#fragment" | urlsplit('path') }}
# => '/dir/index.html'
{{ "http://user:[email protected]:9000/dir/index.html?query=term#fragment" | urlsplit('port') }}
# => '9000'
{{ "http://user:[email protected]:9000/dir/index.html?query=term#fragment" | urlsplit('scheme') }}
# => 'http'
{{ "http://user:[email protected]:9000/dir/index.html?query=term#fragment" | urlsplit('query') }}
# => 'query=term'
{{ "http://user:[email protected]:9000/dir/index.html?query=term#fragment" | urlsplit('fragment') }}
# => 'fragment'
{{ "http://user:[email protected]:9000/dir/index.html?query=term#fragment" | urlsplit }}
# =>
# {
# "fragment": "fragment",
# "hostname": "www.acme.com",
# "netloc": "user:[email protected]:9000",
# "password": "password",
# "path": "/dir/index.html",
# "port": 9000,
# "query": "query=term",
# "scheme": "http",
# "username": "user"
# }
Searching strings with regular expressions
------------------------------------------
To search a string with a regex, use the "regex_search" filter::
# search for "foo" in "foobar"
{{ 'foobar' | regex_search('(foo)') }}
# will return empty if it cannot find a match
{{ 'ansible' | regex_search('(foobar)') }}
# case insensitive search in multiline mode
{{ 'foo\nBAR' | regex_search("^bar", multiline=True, ignorecase=True) }}
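The ``regex_search`` implementation in ``lib/ansible/plugins/filter/core.py`` (included later in this document) also accepts backreference arguments to return selected capture groups as a list; a minimal sketch::
# extract the named groups, returning ['localhost', '80']
{{ 'localhost:80' | regex_search('^(?P<host>.+):(?P<port>\\d+)$', '\\g<host>', '\\g<port>') }}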
To search for all occurrences of regex matches, use the "regex_findall" filter::
# Return a list of all IPv4 addresses in the string
{{ 'Some DNS servers are 8.8.8.8 and 8.8.4.4' | regex_findall('\\b(?:[0-9]{1,3}\\.){3}[0-9]{1,3}\\b') }}
To replace text in a string with regex, use the "regex_replace" filter::
# convert "ansible" to "able"
{{ 'ansible' | regex_replace('^a.*i(.*)$', 'a\\1') }}
# convert "foobar" to "bar"
{{ 'foobar' | regex_replace('^f.*o(.*)$', '\\1') }}
# convert "localhost:80" to "localhost, 80" using named groups
{{ 'localhost:80' | regex_replace('^(?P<host>.+):(?P<port>\\d+)$', '\\g<host>, \\g<port>') }}
# convert "localhost:80" to "localhost"
{{ 'localhost:80' | regex_replace(':80') }}
.. note:: If you want to match the whole string and you are using ``*`` make sure to always wrap your regular expression with the start/end anchors.
For example ``^(.*)$`` will always match only one result, while ``(.*)`` on some Python versions will match the whole string and an empty string at the
end, which means it will make two replacements::
# add "https://" prefix to each item in a list
GOOD:
{{ hosts | map('regex_replace', '^(.*)$', 'https://\\1') | list }}
{{ hosts | map('regex_replace', '(.+)', 'https://\\1') | list }}
{{ hosts | map('regex_replace', '^', 'https://') | list }}
BAD:
{{ hosts | map('regex_replace', '(.*)', 'https://\\1') | list }}
# append ':80' to each item in a list
GOOD:
{{ hosts | map('regex_replace', '^(.*)$', '\\1:80') | list }}
{{ hosts | map('regex_replace', '(.+)', '\\1:80') | list }}
{{ hosts | map('regex_replace', '$', ':80') | list }}
BAD:
{{ hosts | map('regex_replace', '(.*)', '\\1:80') | list }}
.. note:: Prior to ansible 2.0, if the "regex_replace" filter was used with variables inside YAML arguments (as opposed to simpler 'key=value' arguments),
then you needed to escape backreferences (e.g. ``\\1``) with 4 backslashes (``\\\\``) instead of 2 (``\\``).
.. versionadded:: 2.0
To escape special characters within a standard Python regex, use the "regex_escape" filter (using the default re_type='python' option)::
# convert '^f.*o(.*)$' to '\^f\.\*o\(\.\*\)\$'
{{ '^f.*o(.*)$' | regex_escape() }}
.. versionadded:: 2.8
To escape special characters within a POSIX basic regex, use the "regex_escape" filter with the re_type='posix_basic' option::
# convert '^f.*o(.*)$' to '\^f\.\*o(\.\*)\$'
{{ '^f.*o(.*)$' | regex_escape('posix_basic') }}
Working with filenames and pathnames
------------------------------------
To get the last name of a file path, like 'foo.txt' out of '/etc/asdf/foo.txt'::
{{ path | basename }}
To get the last name of a windows style file path (new in version 2.0)::
{{ path | win_basename }}
To separate the windows drive letter from the rest of a file path (new in version 2.0)::
{{ path | win_splitdrive }}
To get only the windows drive letter::
{{ path | win_splitdrive | first }}
To get the rest of the path without the drive letter::
{{ path | win_splitdrive | last }}
To get the directory from a path::
{{ path | dirname }}
To get the directory from a windows path (new in version 2.0)::
{{ path | win_dirname }}
To expand a path containing a tilde (`~`) character (new in version 1.5)::
{{ path | expanduser }}
To expand a path containing environment variables::
{{ path | expandvars }}
.. note:: `expandvars` expands local variables; using it on remote paths can lead to errors.
.. versionadded:: 2.6
To get the real path of a link (new in version 1.8)::
{{ path | realpath }}
To get the relative path of a link, from a start point (new in version 1.7)::
{{ path | relpath('/etc') }}
To get the root and extension of a path or filename (new in version 2.0)::
# with path == 'nginx.conf' the return would be ('nginx', '.conf')
{{ path | splitext }}
To join one or more path components::
{{ ('/etc', path, 'subdir', file) | path_join }}
.. versionadded:: 2.10
String filters
==============
To add quotes for shell usage::
- shell: echo {{ string_value | quote }}
To concatenate a list into a string::
{{ list | join(" ") }}
To work with Base64 encoded strings::
{{ encoded | b64decode }}
{{ decoded | string | b64encode }}
As of version 2.6, you can define the type of encoding to use, the default is ``utf-8``::
{{ encoded | b64decode(encoding='utf-16-le') }}
{{ decoded | string | b64encode(encoding='utf-16-le') }}
.. note:: The ``string`` filter is only required for Python 2 and ensures that text to encode is a unicode string.
Without that filter before b64encode the wrong value will be encoded.
.. versionadded:: 2.6
UUID filters
============
To create a namespaced UUIDv5::
{{ string | to_uuid(namespace='11111111-2222-3333-4444-555555555555') }}
.. versionadded:: 2.10
To create a namespaced UUIDv5 using the default Ansible namespace '361E6D51-FAEC-444A-9079-341386DA8E2E'::
{{ string | to_uuid }}
.. versionadded:: 1.9
To make use of one attribute from each item in a list of complex variables, use the :func:`Jinja2 map filter <jinja2:map>`::
# get a comma-separated list of the mount points (e.g. "/,/mnt/stuff") on a host
{{ ansible_mounts | map(attribute='mount') | join(',') }}
Date and time filters
=====================
To get a date object from a string use the `to_datetime` filter::
# Get total amount of seconds between two dates. Default date format is %Y-%m-%d %H:%M:%S but you can pass your own format
{{ (("2016-08-14 20:00:12" | to_datetime) - ("2015-12-25" | to_datetime('%Y-%m-%d'))).total_seconds() }}
# Get remaining seconds after delta has been calculated. NOTE: This does NOT convert years, days, hours, etc to seconds. For that, use total_seconds()
{{ (("2016-08-14 20:00:12" | to_datetime) - ("2016-08-14 18:00:00" | to_datetime)).seconds }}
# This expression evaluates to "12" and not "132". Delta is 2 hours, 12 seconds
# get amount of days between two dates. This returns only number of days and discards remaining hours, minutes, and seconds
{{ (("2016-08-14 20:00:12" | to_datetime) - ("2015-12-25" | to_datetime('%Y-%m-%d'))).days }}
.. versionadded:: 2.4
To format a date using a string (like with the shell date command), use the "strftime" filter::
# Display year-month-day
{{ '%Y-%m-%d' | strftime }}
# Display hour:min:sec
{{ '%H:%M:%S' | strftime }}
# Use ansible_date_time.epoch fact
{{ '%Y-%m-%d %H:%M:%S' | strftime(ansible_date_time.epoch) }}
# Use arbitrary epoch value
{{ '%Y-%m-%d' | strftime(0) }} # => 1970-01-01
{{ '%Y-%m-%d' | strftime(1441357287) }} # => 2015-09-04
.. note:: To get all string possibilities, check https://docs.python.org/2/library/time.html#time.strftime
Kubernetes filters
==================
Use the "k8s_config_resource_name" filter to obtain the name of a Kubernetes ConfigMap or Secret,
including its hash::
{{ configmap_resource_definition | k8s_config_resource_name }}
This can then be used to reference hashes in Pod specifications::
my_secret:
kind: Secret
name: my_secret_name
deployment_resource:
kind: Deployment
spec:
template:
spec:
containers:
- envFrom:
- secretRef:
name: {{ my_secret | k8s_config_resource_name }}
.. versionadded:: 2.8
.. _PyYAML library: https://pyyaml.org/
.. _PyYAML documentation: https://pyyaml.org/wiki/PyYAMLDocumentation
.. seealso::
:ref:`about_playbooks`
An introduction to playbooks
:ref:`playbooks_conditionals`
Conditional statements in playbooks
:ref:`playbooks_variables`
All about variables
:ref:`playbooks_loops`
Looping in playbooks
:ref:`playbooks_reuse_roles`
Playbook organization by roles
:ref:`playbooks_best_practices`
Best practices in playbooks
`User Mailing List <https://groups.google.com/group/ansible-devel>`_
Have a question? Stop by the google group!
`irc.freenode.net <http://irc.freenode.net>`_
#ansible IRC chat channel
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 61,985 |
Multiline support for the regex_replace filter
|
##### SUMMARY
I think the `regex_replace` jinja filter should support `multiline` as the `regex_search` do.
Or maybe use a `flags` parameter to support any `re` flag.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
- `lib/ansible/plugins/filter/core.py`
##### ADDITIONAL INFORMATION
Example: comment a multiline variable.
```yaml
val: "{{ val | regex_replace('^', '#', multiline=True) }}"
```
|
https://github.com/ansible/ansible/issues/61985
|
https://github.com/ansible/ansible/pull/65051
|
b3db41e6d8d8fe0b2d559a35ab2a4557b26c56a8
|
e867535a5700ab5cc38e57593e24926bb3f4b903
| 2019-09-09T09:57:21Z |
python
| 2020-02-19T17:19:40Z |
lib/ansible/plugins/filter/core.py
|
# (c) 2012, Jeroen Hoekx <[email protected]>
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import base64
import crypt
import glob
import hashlib
import itertools
import json
import ntpath
import os.path
import re
import string
import sys
import time
import uuid
import yaml
import datetime
from functools import partial
from random import Random, SystemRandom, shuffle
from jinja2.filters import environmentfilter, do_groupby as _do_groupby
from ansible.errors import AnsibleError, AnsibleFilterError
from ansible.module_utils.six import iteritems, string_types, integer_types, reraise
from ansible.module_utils.six.moves import reduce, shlex_quote
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.module_utils.common.collections import is_sequence
from ansible.module_utils.common._collections_compat import Mapping
from ansible.parsing.ajson import AnsibleJSONEncoder
from ansible.parsing.yaml.dumper import AnsibleDumper
from ansible.template import recursive_check_defined
from ansible.utils.display import Display
from ansible.utils.encrypt import passlib_or_crypt
from ansible.utils.hashing import md5s, checksum_s
from ansible.utils.unicode import unicode_wrap
from ansible.utils.vars import merge_hash
display = Display()
UUID_NAMESPACE_ANSIBLE = uuid.UUID('361E6D51-FAEC-444A-9079-341386DA8E2E')
def to_yaml(a, *args, **kw):
'''Make verbose, human readable yaml'''
default_flow_style = kw.pop('default_flow_style', None)
transformed = yaml.dump(a, Dumper=AnsibleDumper, allow_unicode=True, default_flow_style=default_flow_style, **kw)
return to_text(transformed)
def to_nice_yaml(a, indent=4, *args, **kw):
'''Make verbose, human readable yaml'''
transformed = yaml.dump(a, Dumper=AnsibleDumper, indent=indent, allow_unicode=True, default_flow_style=False, **kw)
return to_text(transformed)
def to_json(a, *args, **kw):
''' Convert the value to JSON '''
return json.dumps(a, cls=AnsibleJSONEncoder, *args, **kw)
def to_nice_json(a, indent=4, sort_keys=True, *args, **kw):
'''Make verbose, human readable JSON'''
try:
return json.dumps(a, indent=indent, sort_keys=sort_keys, separators=(',', ': '), cls=AnsibleJSONEncoder, *args, **kw)
except Exception as e:
# Fallback to the to_json filter
display.warning(u'Unable to convert data using to_nice_json, falling back to to_json: %s' % to_text(e))
return to_json(a, *args, **kw)
def to_bool(a):
''' return a bool for the arg '''
if a is None or isinstance(a, bool):
return a
if isinstance(a, string_types):
a = a.lower()
if a in ('yes', 'on', '1', 'true', 1):
return True
return False
def to_datetime(string, format="%Y-%m-%d %H:%M:%S"):
return datetime.datetime.strptime(string, format)
def strftime(string_format, second=None):
''' return a date string using string. See https://docs.python.org/2/library/time.html#time.strftime for format '''
if second is not None:
try:
second = int(second)
except Exception:
raise AnsibleFilterError('Invalid value for epoch value (%s)' % second)
return time.strftime(string_format, time.localtime(second))
def quote(a):
''' return its argument quoted for shell usage '''
return shlex_quote(to_text(a))
def fileglob(pathname):
''' return list of matched regular files for glob '''
return [g for g in glob.glob(pathname) if os.path.isfile(g)]
def regex_replace(value='', pattern='', replacement='', ignorecase=False):
''' Perform a `re.sub` returning a string '''
value = to_text(value, errors='surrogate_or_strict', nonstring='simplerepr')
if ignorecase:
flags = re.I
else:
flags = 0
_re = re.compile(pattern, flags=flags)
return _re.sub(replacement, value)
def regex_findall(value, regex, multiline=False, ignorecase=False):
''' Perform re.findall and return the list of matches '''
flags = 0
if ignorecase:
flags |= re.I
if multiline:
flags |= re.M
return re.findall(regex, value, flags)
def regex_search(value, regex, *args, **kwargs):
''' Perform re.search and return the list of matches or a backref '''
groups = list()
for arg in args:
if arg.startswith('\\g'):
match = re.match(r'\\g<(\S+)>', arg).group(1)
groups.append(match)
elif arg.startswith('\\'):
match = int(re.match(r'\\(\d+)', arg).group(1))
groups.append(match)
else:
raise AnsibleFilterError('Unknown argument')
flags = 0
if kwargs.get('ignorecase'):
flags |= re.I
if kwargs.get('multiline'):
flags |= re.M
match = re.search(regex, value, flags)
if match:
if not groups:
return match.group()
else:
items = list()
for item in groups:
items.append(match.group(item))
return items
def ternary(value, true_val, false_val, none_val=None):
''' value ? true_val : false_val '''
if value is None and none_val is not None:
return none_val
elif bool(value):
return true_val
else:
return false_val
def regex_escape(string, re_type='python'):
'''Escape all regular expressions special characters from STRING.'''
if re_type == 'python':
return re.escape(string)
elif re_type == 'posix_basic':
# list of BRE special chars:
# https://en.wikibooks.org/wiki/Regular_Expressions/POSIX_Basic_Regular_Expressions
return regex_replace(string, r'([].[^$*\\])', r'\\\1')
# TODO: implement posix_extended
# It's similar to, but different from python regex, which is similar to,
# but different from PCRE. It's possible that re.escape would work here.
# https://remram44.github.io/regex-cheatsheet/regex.html#programs
elif re_type == 'posix_extended':
raise AnsibleFilterError('Regex type (%s) not yet implemented' % re_type)
else:
raise AnsibleFilterError('Invalid regex type (%s)' % re_type)
def from_yaml(data):
if isinstance(data, string_types):
return yaml.safe_load(data)
return data
def from_yaml_all(data):
if isinstance(data, string_types):
return yaml.safe_load_all(data)
return data
@environmentfilter
def rand(environment, end, start=None, step=None, seed=None):
if seed is None:
r = SystemRandom()
else:
r = Random(seed)
if isinstance(end, integer_types):
if not start:
start = 0
if not step:
step = 1
return r.randrange(start, end, step)
elif hasattr(end, '__iter__'):
if start or step:
raise AnsibleFilterError('start and step can only be used with integer values')
return r.choice(end)
else:
raise AnsibleFilterError('random can only be used on sequences and integers')
def randomize_list(mylist, seed=None):
try:
mylist = list(mylist)
if seed:
r = Random(seed)
r.shuffle(mylist)
else:
shuffle(mylist)
except Exception:
pass
return mylist
def get_hash(data, hashtype='sha1'):
try: # see if hash is supported
h = hashlib.new(hashtype)
except Exception:
return None
h.update(to_bytes(data, errors='surrogate_or_strict'))
return h.hexdigest()
def get_encrypted_password(password, hashtype='sha512', salt=None, salt_size=None, rounds=None):
passlib_mapping = {
'md5': 'md5_crypt',
'blowfish': 'bcrypt',
'sha256': 'sha256_crypt',
'sha512': 'sha512_crypt',
}
hashtype = passlib_mapping.get(hashtype, hashtype)
try:
return passlib_or_crypt(password, hashtype, salt=salt, salt_size=salt_size, rounds=rounds)
except AnsibleError as e:
reraise(AnsibleFilterError, AnsibleFilterError(to_native(e), orig_exc=e), sys.exc_info()[2])
def to_uuid(string, namespace=UUID_NAMESPACE_ANSIBLE):
uuid_namespace = namespace
if not isinstance(uuid_namespace, uuid.UUID):
try:
uuid_namespace = uuid.UUID(namespace)
except (AttributeError, ValueError) as e:
raise AnsibleFilterError("Invalid value '%s' for 'namespace': %s" % (to_native(namespace), to_native(e)))
# uuid.uuid5() requires bytes on Python 2 and bytes or text on Python 3
return to_text(uuid.uuid5(uuid_namespace, to_native(string, errors='surrogate_or_strict')))
def mandatory(a, msg=None):
from jinja2.runtime import Undefined
''' Make a variable mandatory '''
if isinstance(a, Undefined):
if a._undefined_name is not None:
name = "'%s' " % to_text(a._undefined_name)
else:
name = ''
if msg is not None:
raise AnsibleFilterError(to_native(msg))
else:
raise AnsibleFilterError("Mandatory variable %s not defined." % name)
return a
def combine(*terms, **kwargs):
recursive = kwargs.pop('recursive', False)
list_merge = kwargs.pop('list_merge', 'replace')
if kwargs:
raise AnsibleFilterError("'recursive' and 'list_merge' are the only valid keyword arguments")
# allow the user to do `[dict1, dict2, ...] | combine`
dictionaries = flatten(terms, levels=1)
# recursively check that every elements are defined (for jinja2)
recursive_check_defined(dictionaries)
if not dictionaries:
return {}
if len(dictionaries) == 1:
return dictionaries[0]
# merge all the dicts so that the dict at the end of the array have precedence
# over the dict at the beginning.
# we merge the dicts from the highest to the lowest priority because there is
# a huge probability that the lowest priority dict will be the biggest in size
# (as the low prio dict will hold the "default" values and the others will be "patches")
# and merge_hash create a copy of it's first argument.
# so high/right -> low/left is more efficient than low/left -> high/right
high_to_low_prio_dict_iterator = reversed(dictionaries)
result = next(high_to_low_prio_dict_iterator)
for dictionary in high_to_low_prio_dict_iterator:
result = merge_hash(dictionary, result, recursive, list_merge)
return result
def comment(text, style='plain', **kw):
# Predefined comment types
comment_styles = {
'plain': {
'decoration': '# '
},
'erlang': {
'decoration': '% '
},
'c': {
'decoration': '// '
},
'cblock': {
'beginning': '/*',
'decoration': ' * ',
'end': ' */'
},
'xml': {
'beginning': '<!--',
'decoration': ' - ',
'end': '-->'
}
}
# Pointer to the right comment type
style_params = comment_styles[style]
if 'decoration' in kw:
prepostfix = kw['decoration']
else:
prepostfix = style_params['decoration']
# Default params
p = {
'newline': '\n',
'beginning': '',
'prefix': (prepostfix).rstrip(),
'prefix_count': 1,
'decoration': '',
'postfix': (prepostfix).rstrip(),
'postfix_count': 1,
'end': ''
}
# Update default params
p.update(style_params)
p.update(kw)
# Compose substrings for the final string
str_beginning = ''
if p['beginning']:
str_beginning = "%s%s" % (p['beginning'], p['newline'])
str_prefix = ''
if p['prefix']:
if p['prefix'] != p['newline']:
str_prefix = str(
"%s%s" % (p['prefix'], p['newline'])) * int(p['prefix_count'])
else:
str_prefix = str(
"%s" % (p['newline'])) * int(p['prefix_count'])
str_text = ("%s%s" % (
p['decoration'],
# Prepend each line of the text with the decorator
text.replace(
p['newline'], "%s%s" % (p['newline'], p['decoration'])))).replace(
# Remove trailing spaces when only decorator is on the line
"%s%s" % (p['decoration'], p['newline']),
"%s%s" % (p['decoration'].rstrip(), p['newline']))
str_postfix = p['newline'].join(
[''] + [p['postfix'] for x in range(p['postfix_count'])])
str_end = ''
if p['end']:
str_end = "%s%s" % (p['newline'], p['end'])
# Return the final string
return "%s%s%s%s%s" % (
str_beginning,
str_prefix,
str_text,
str_postfix,
str_end)
@environmentfilter
def extract(environment, item, container, morekeys=None):
if morekeys is None:
keys = [item]
elif isinstance(morekeys, list):
keys = [item] + morekeys
else:
keys = [item, morekeys]
value = container
for key in keys:
value = environment.getitem(value, key)
return value
@environmentfilter
def do_groupby(environment, value, attribute):
"""Overridden groupby filter for jinja2, to address an issue with
jinja2>=2.9.0,<2.9.5 where a namedtuple was returned which
has repr that prevents ansible.template.safe_eval.safe_eval from being
able to parse and eval the data.
jinja2<2.9.0,>=2.9.5 is not affected, as <2.9.0 uses a tuple, and
>=2.9.5 uses a standard tuple repr on the namedtuple.
The adaptation here, is to run the jinja2 `do_groupby` function, and
cast all of the namedtuples to a regular tuple.
See https://github.com/ansible/ansible/issues/20098
We may be able to remove this in the future.
"""
return [tuple(t) for t in _do_groupby(environment, value, attribute)]
def b64encode(string, encoding='utf-8'):
return to_text(base64.b64encode(to_bytes(string, encoding=encoding, errors='surrogate_or_strict')))
def b64decode(string, encoding='utf-8'):
return to_text(base64.b64decode(to_bytes(string, errors='surrogate_or_strict')), encoding=encoding)
def flatten(mylist, levels=None):
ret = []
for element in mylist:
if element in (None, 'None', 'null'):
# ignore undefined items
break
elif is_sequence(element):
if levels is None:
ret.extend(flatten(element))
elif levels >= 1:
# decrement as we go down the stack
ret.extend(flatten(element, levels=(int(levels) - 1)))
else:
ret.append(element)
else:
ret.append(element)
return ret
def subelements(obj, subelements, skip_missing=False):
'''Accepts a dict or list of dicts, and a dotted accessor and produces a product
of the element and the results of the dotted accessor
>>> obj = [{"name": "alice", "groups": ["wheel"], "authorized": ["/tmp/alice/onekey.pub"]}]
>>> subelements(obj, 'groups')
[({'name': 'alice', 'groups': ['wheel'], 'authorized': ['/tmp/alice/onekey.pub']}, 'wheel')]
'''
if isinstance(obj, dict):
element_list = list(obj.values())
elif isinstance(obj, list):
element_list = obj[:]
else:
raise AnsibleFilterError('obj must be a list of dicts or a nested dict')
if isinstance(subelements, list):
subelement_list = subelements[:]
elif isinstance(subelements, string_types):
subelement_list = subelements.split('.')
else:
raise AnsibleFilterError('subelements must be a list or a string')
results = []
for element in element_list:
values = element
for subelement in subelement_list:
try:
values = values[subelement]
except KeyError:
if skip_missing:
values = []
break
raise AnsibleFilterError("could not find %r key in iterated item %r" % (subelement, values))
except TypeError:
raise AnsibleFilterError("the key %s should point to a dictionary, got '%s'" % (subelement, values))
if not isinstance(values, list):
raise AnsibleFilterError("the key %r should point to a list, got %r" % (subelement, values))
for value in values:
results.append((element, value))
return results
def dict_to_list_of_dict_key_value_elements(mydict, key_name='key', value_name='value'):
    ''' takes a dictionary and transforms it into a list of dictionaries,
        each having 'key' and 'value' entries that correspond to a key and value of the original dictionary '''
if not isinstance(mydict, Mapping):
raise AnsibleFilterError("dict2items requires a dictionary, got %s instead." % type(mydict))
ret = []
for key in mydict:
ret.append({key_name: key, value_name: mydict[key]})
return ret
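# Illustrative behaviour (assumed example):
#   {'a': 1, 'b': 2} | dict2items  -> [{'key': 'a', 'value': 1}, {'key': 'b', 'value': 2}]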
def list_of_dict_key_value_elements_to_dict(mylist, key_name='key', value_name='value'):
    ''' takes a list of dicts, each having 'key' and 'value' entries, and transforms the list into a dictionary,
        effectively the reverse of dict2items '''
if not is_sequence(mylist):
raise AnsibleFilterError("items2dict requires a list, got %s instead." % type(mylist))
return dict((item[key_name], item[value_name]) for item in mylist)
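# Illustrative behaviour (assumed example, the inverse of dict2items above):
#   [{'key': 'a', 'value': 1}, {'key': 'b', 'value': 2}] | items2dict  -> {'a': 1, 'b': 2}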
def path_join(paths):
    ''' takes a sequence or a string and returns a concatenation
        of its members '''
if isinstance(paths, string_types):
return os.path.join(paths)
elif is_sequence(paths):
return os.path.join(*paths)
else:
raise AnsibleFilterError("|path_join expects string or sequence, got %s instead." % type(paths))
class FilterModule(object):
''' Ansible core jinja2 filters '''
def filters(self):
return {
# jinja2 overrides
'groupby': do_groupby,
# base 64
'b64decode': b64decode,
'b64encode': b64encode,
# uuid
'to_uuid': to_uuid,
# json
'to_json': to_json,
'to_nice_json': to_nice_json,
'from_json': json.loads,
# yaml
'to_yaml': to_yaml,
'to_nice_yaml': to_nice_yaml,
'from_yaml': from_yaml,
'from_yaml_all': from_yaml_all,
# path
'basename': partial(unicode_wrap, os.path.basename),
'dirname': partial(unicode_wrap, os.path.dirname),
'expanduser': partial(unicode_wrap, os.path.expanduser),
'expandvars': partial(unicode_wrap, os.path.expandvars),
'path_join': path_join,
'realpath': partial(unicode_wrap, os.path.realpath),
'relpath': partial(unicode_wrap, os.path.relpath),
'splitext': partial(unicode_wrap, os.path.splitext),
'win_basename': partial(unicode_wrap, ntpath.basename),
'win_dirname': partial(unicode_wrap, ntpath.dirname),
'win_splitdrive': partial(unicode_wrap, ntpath.splitdrive),
# file glob
'fileglob': fileglob,
# types
'bool': to_bool,
'to_datetime': to_datetime,
# date formatting
'strftime': strftime,
# quote string for shell usage
'quote': quote,
# hash filters
# md5 hex digest of string
'md5': md5s,
# sha1 hex digest of string
'sha1': checksum_s,
# checksum of string as used by ansible for checksumming files
'checksum': checksum_s,
# generic hashing
'password_hash': get_encrypted_password,
'hash': get_hash,
# regex
'regex_replace': regex_replace,
'regex_escape': regex_escape,
'regex_search': regex_search,
'regex_findall': regex_findall,
# ? : ;
'ternary': ternary,
# random stuff
'random': rand,
'shuffle': randomize_list,
# undefined
'mandatory': mandatory,
# comment-style decoration
'comment': comment,
# debug
'type_debug': lambda o: o.__class__.__name__,
# Data structures
'combine': combine,
'extract': extract,
'flatten': flatten,
'dict2items': dict_to_list_of_dict_key_value_elements,
'items2dict': list_of_dict_key_value_elements_to_dict,
'subelements': subelements,
}
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 61,985 |
Multiline support for the regex_replace filter
|
##### SUMMARY
I think the `regex_replace` jinja filter should support `multiline` as the `regex_search` do.
Or maybe use a `flags` parameter to support any `re` flag.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
- `lib/ansible/plugins/filter/core.py`
##### ADDITIONAL INFORMATION
Example: comment a multiline variable.
```yaml
val: "{{ val | regex_replace('^', '#', multiline=True) }}"
```
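A minimal sketch of what such a filter could look like, assuming a `multiline` keyword mirroring `regex_search` (illustrative only, not the implementation merged in the linked PR):
```python
import re


def regex_replace(value='', pattern='', replacement='', ignorecase=False, multiline=False):
    """Perform a re.sub on value, with opt-in ignorecase/multiline flags."""
    flags = 0
    if ignorecase:
        flags |= re.I
    if multiline:
        flags |= re.M
    return re.sub(pattern, replacement, value, flags=flags)
```
With `multiline=True`, the `'^' -> '#'` example above would prefix every line of a multiline value instead of only the first.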
|
https://github.com/ansible/ansible/issues/61985
|
https://github.com/ansible/ansible/pull/65051
|
b3db41e6d8d8fe0b2d559a35ab2a4557b26c56a8
|
e867535a5700ab5cc38e57593e24926bb3f4b903
| 2019-09-09T09:57:21Z |
python
| 2020-02-19T17:19:40Z |
test/integration/targets/filter_core/files/foo.txt
|
This is a test of various filter plugins found in Ansible (ex: core.py), and
not so much a test of the core filters in Jinja2.
Dumping the same structure to YAML
- this is a list element
- this: is a hash element in a list
warp: 9
where: endor
Dumping the same structure to JSON, but don't pretty print
["this is a list element", {"this": "is a hash element in a list", "warp": 9, "where": "endor"}]
Dumping the same structure to YAML, but don't pretty print
- this is a list element
- {this: is a hash element in a list, warp: 9, where: endor}
From a recorded task, the changed, failed, success, and skipped
tests are shortcuts to ask if those tasks produced changes, failed,
succeeded, or skipped (as one might guess).
Changed = True
Failed = False
Success = True
Skipped = False
The mandatory filter fails if a variable is not defined and returns the value.
To avoid breaking this test, this variable is already defined.
a = 1
There are various casts available
int = 1
bool = True
String quoting
quoted = quoted
The fileglob module returns the list of things matching a pattern.
fileglob = one.txt, two.txt
There are also various string operations that work on paths. These do not require
files to exist and are passthrus to the python os.path functions
/etc/motd with basename = motd
/etc/motd with dirname = /etc
path_join_simple = /etc/subdir/test
path_join_with_slash = /test
path_join_relative = etc/subdir/test
TODO: realpath follows symlinks. There isn't a test for this just now.
TODO: add tests for set theory operations like union
regex_replace = bar
regex_search = 0001
regex_findall = "['car', 'tar', 'bar']"
regex_escape = \^f\.\*o\(\.\*\)\$
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 61,985 |
Multiline support for the regex_replace filter
|
##### SUMMARY
I think the `regex_replace` jinja filter should support `multiline` as the `regex_search` do.
Or maybe use a `flags` parameter to support any `re` flag.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
- `lib/ansible/plugins/filter/core.py`
##### ADDITIONAL INFORMATION
Example: comment a multiline variable.
```yaml
val: "{{ val | regex_replace('^', '#', multiline=True) }}"
```
|
https://github.com/ansible/ansible/issues/61985
|
https://github.com/ansible/ansible/pull/65051
|
b3db41e6d8d8fe0b2d559a35ab2a4557b26c56a8
|
e867535a5700ab5cc38e57593e24926bb3f4b903
| 2019-09-09T09:57:21Z |
python
| 2020-02-19T17:19:40Z |
test/integration/targets/filter_core/templates/foo.j2
|
This is a test of various filter plugins found in Ansible (ex: core.py), and
not so much a test of the core filters in Jinja2.
Dumping the same structure to YAML
{{ some_structure | to_nice_yaml }}
Dumping the same structure to JSON, but don't pretty print
{{ some_structure | to_json(sort_keys=true) }}
Dumping the same structure to YAML, but don't pretty print
{{ some_structure | to_yaml }}
From a recorded task, the changed, failed, success, and skipped
tests are shortcuts to ask if those tasks produced changes, failed,
succeeded, or skipped (as one might guess).
Changed = {{ some_registered_var is changed }}
Failed = {{ some_registered_var is failed }}
Success = {{ some_registered_var is successful }}
Skipped = {{ some_registered_var is skipped }}
The mandatory filter fails if a variable is not defined and returns the value.
To avoid breaking this test, this variable is already defined.
a = {{ a | mandatory }}
There are various casts available
int = {{ a | int }}
bool = {{ 1 | bool }}
String quoting
quoted = {{ 'quoted' | quote }}
The fileglob module returns the list of things matching a pattern.
fileglob = {{ (playbook_dir + '/files/fileglob/*') | fileglob | map('basename') | sort | join(', ') }}
There are also various string operations that work on paths. These do not require
files to exist and are passthrus to the python os.path functions
/etc/motd with basename = {{ '/etc/motd' | basename }}
/etc/motd with dirname = {{ '/etc/motd' | dirname }}
path_join_simple = {{ ('/etc', 'subdir', 'test') | path_join }}
path_join_with_slash = {{ ('/etc', 'subdir', '/test') | path_join }}
path_join_relative = {{ ('etc', 'subdir', 'test') | path_join }}
TODO: realpath follows symlinks. There isn't a test for this just now.
TODO: add tests for set theory operations like union
regex_replace = {{ 'foo' | regex_replace('^foo', 'bar') }}
regex_search = {{ 'test_value_0001' | regex_search('([0-9]+)$')}}
regex_findall = "{{ 'car\ntar\nfoo\nbar\n' | regex_findall('^.ar$', multiline=True) }}"
regex_escape = {{ '^f.*o(.*)$' | regex_escape() }}
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,591 |
ansible-test validate-modules for PS modules not loading Ansible utils when in collection
|
##### SUMMARY
When running `ansible-test sanity --test validate-modules` on a PowerShell module in a collection it will fail to import any module util that is in Ansible itself. This causes issues if it requires access to those utils to build the arg spec as we can see with https://app.shippable.com/github/ansible-collections/ansible.windows/runs/16/5/console.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible-test validate-modules
##### ANSIBLE VERSION
```paste below
devel
```
##### CONFIGURATION
N/A
##### OS / ENVIRONMENT
N/A - ansible-test against a collection
|
https://github.com/ansible/ansible/issues/67591
|
https://github.com/ansible/ansible/pull/67596
|
b54e64bbc9f039c3706137ce09f035bf4f55ac7c
|
36def8bf03b7d954c79bd53f690105a40c2b9bd3
| 2020-02-19T22:49:53Z |
python
| 2020-02-20T04:32:21Z |
changelogs/fragments/valdate-modules-ps-arg-util.yaml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,591 |
ansible-test validate-modules for PS modules not loading Ansible utils when in collection
|
##### SUMMARY
When running `ansible-test sanity --test validate-modules` on a PowerShell module in a collection it will fail to import any module util that is in Ansible itself. This causes issues if it requires access to those utils to build the arg spec as we can see with https://app.shippable.com/github/ansible-collections/ansible.windows/runs/16/5/console.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible-test validate-modules
##### ANSIBLE VERSION
```paste below
devel
```
##### CONFIGURATION
N/A
##### OS / ENVIRONMENT
N/A - ansible-test against a collection
|
https://github.com/ansible/ansible/issues/67591
|
https://github.com/ansible/ansible/pull/67596
|
b54e64bbc9f039c3706137ce09f035bf4f55ac7c
|
36def8bf03b7d954c79bd53f690105a40c2b9bd3
| 2020-02-19T22:49:53Z |
python
| 2020-02-20T04:32:21Z |
lib/ansible/executor/powershell/module_manifest.py
|
# (c) 2018 Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import base64
import errno
import json
import os
import pkgutil
import random
import re
from distutils.version import LooseVersion
# HACK: keep Python 2.6 controller tests happy in CI until they're properly split
try:
from importlib import import_module
except ImportError:
import_module = __import__
from ansible import constants as C
from ansible.errors import AnsibleError
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.plugins.loader import ps_module_utils_loader
class PSModuleDepFinder(object):
def __init__(self):
self.ps_modules = dict()
self.exec_scripts = dict()
# by defining an explicit dict of cs utils and where they are used, we
# can potentially save time by not adding the type multiple times if it
# isn't needed
self.cs_utils_wrapper = dict()
self.cs_utils_module = dict()
self.ps_version = None
self.os_version = None
self.become = False
self._re_cs_module = [
# Reference C# module_util in another C# util
# 'using ansible_collections.{namespace}.{collection}.plugins.module_utils.{name}'
re.compile(to_bytes(r'(?i)^using\s((Ansible\..+)|'
r'(ansible_collections\.\w+\.\w+\.plugins\.module_utils\.[\w\.]+));\s*$')),
]
self._re_cs_in_ps_module = [
# Reference C# module_util in a PowerShell module
# '#AnsibleRequires -CSharpUtil Ansible.{name}'
# '#AnsibleRequires -CSharpUtil ansible_collections.{namespace}.{collection}.plugins.module_utils.{name}'
re.compile(to_bytes(r'(?i)^#\s*ansiblerequires\s+-csharputil\s+((Ansible\..+)|'
r'(ansible_collections\.\w+\.\w+\.plugins\.module_utils\.[\w\.]+))')),
]
self._re_ps_module = [
# Original way of referencing a builtin module_util
# '#Requires -Module Ansible.ModuleUtils.{name}
re.compile(to_bytes(r'(?i)^#\s*requires\s+\-module(?:s?)\s*(Ansible\.ModuleUtils\..+)')),
# New way of referencing a builtin and collection module_util
# '#AnsibleRequires -PowerShell ansible_collections.{namespace}.{collection}.plugins.module_utils.{name}'
# '#AnsibleRequires -PowerShell Ansible.ModuleUtils.{name}'
re.compile(to_bytes(r'(?i)^#\s*ansiblerequires\s+-powershell\s+(((Ansible\.ModuleUtils\..+))|'
r'(ansible_collections\.\w+\.\w+\.plugins\.module_utils\.[\w\.]+))')),
]
self._re_wrapper = re.compile(to_bytes(r'(?i)^#\s*ansiblerequires\s+-wrapper\s+(\w*)'))
self._re_ps_version = re.compile(to_bytes(r'(?i)^#requires\s+\-version\s+([0-9]+(\.[0-9]+){0,3})$'))
self._re_os_version = re.compile(to_bytes(r'(?i)^#ansiblerequires\s+\-osversion\s+([0-9]+(\.[0-9]+){0,3})$'))
self._re_become = re.compile(to_bytes(r'(?i)^#ansiblerequires\s+\-become$'))
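    # Illustrative module header lines (assumed example) that the patterns above
    # are designed to match:
    #   #Requires -Module Ansible.ModuleUtils.Legacy
    #   #AnsibleRequires -CSharpUtil Ansible.Basic
    #   #AnsibleRequires -PowerShell Ansible.ModuleUtils.ArgvParser
    #   #Requires -Version 5.1
    #   #AnsibleRequires -OSVersion 6.2
    #   #AnsibleRequires -Become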
def scan_module(self, module_data, wrapper=False, powershell=True):
lines = module_data.split(b'\n')
module_utils = set()
if wrapper:
cs_utils = self.cs_utils_wrapper
else:
cs_utils = self.cs_utils_module
if powershell:
checks = [
# PS module contains '#Requires -Module Ansible.ModuleUtils.*'
# PS module contains '#AnsibleRequires -Powershell Ansible.*' (or FQ collections module_utils ref)
(self._re_ps_module, self.ps_modules, ".psm1"),
# PS module contains '#AnsibleRequires -CSharpUtil Ansible.*'
(self._re_cs_in_ps_module, cs_utils, ".cs"),
]
else:
checks = [
# CS module contains 'using Ansible.*;' or 'using ansible_collections.ns.coll.plugins.module_utils.*;'
(self._re_cs_module, cs_utils, ".cs"),
]
for line in lines:
for check in checks:
for pattern in check[0]:
match = pattern.match(line)
if match:
# tolerate windows line endings by stripping any remaining
# newline chars
module_util_name = to_text(match.group(1).rstrip())
if module_util_name not in check[1].keys():
module_utils.add((module_util_name, check[2]))
break
if powershell:
ps_version_match = self._re_ps_version.match(line)
if ps_version_match:
self._parse_version_match(ps_version_match, "ps_version")
os_version_match = self._re_os_version.match(line)
if os_version_match:
self._parse_version_match(os_version_match, "os_version")
# once become is set, no need to keep on checking recursively
if not self.become:
become_match = self._re_become.match(line)
if become_match:
self.become = True
if wrapper:
wrapper_match = self._re_wrapper.match(line)
if wrapper_match:
self.scan_exec_script(wrapper_match.group(1).rstrip())
# recursively drill into each Requires to see if there are any more
# requirements
for m in set(module_utils):
self._add_module(m, wrapper=wrapper)
def scan_exec_script(self, name):
# scans lib/ansible/executor/powershell for scripts used in the module
# exec side. It also scans these scripts for any dependencies
name = to_text(name)
if name in self.exec_scripts.keys():
return
data = pkgutil.get_data("ansible.executor.powershell", name + ".ps1")
if data is None:
raise AnsibleError("Could not find executor powershell script "
"for '%s'" % name)
b_data = to_bytes(data)
# remove comments to reduce the payload size in the exec wrappers
if C.DEFAULT_DEBUG:
exec_script = b_data
else:
exec_script = _strip_comments(b_data)
self.exec_scripts[name] = to_bytes(exec_script)
self.scan_module(b_data, wrapper=True, powershell=True)
def _add_module(self, name, wrapper=False):
m, ext = name
m = to_text(m)
if m.startswith("Ansible."):
# Builtin util, use plugin loader to get the data
mu_path = ps_module_utils_loader.find_plugin(m, ext)
if not mu_path:
raise AnsibleError('Could not find imported module support code '
'for \'%s\'' % m)
module_util_data = to_bytes(_slurp(mu_path))
else:
# Collection util, load the package data based on the util import.
submodules = tuple(m.split("."))
n_package_name = to_native('.'.join(submodules[:-1]), errors='surrogate_or_strict')
n_resource_name = to_native(submodules[-1] + ext, errors='surrogate_or_strict')
try:
module_util = import_module(to_native(n_package_name))
module_util_data = to_bytes(pkgutil.get_data(n_package_name, n_resource_name),
errors='surrogate_or_strict')
# Get the path of the util which is required for coverage collection.
resource_paths = list(module_util.__path__)
if len(resource_paths) != 1:
# This should never happen with a collection but we are just being defensive about it.
raise AnsibleError("Internal error: Referenced module_util package '%s' contains 0 or multiple "
"import locations when we only expect 1." % n_package_name)
mu_path = os.path.join(resource_paths[0], n_resource_name)
except OSError as err:
if err.errno == errno.ENOENT:
raise AnsibleError('Could not find collection imported module support code for \'%s\''
% to_native(m))
else:
raise
util_info = {
'data': module_util_data,
'path': to_text(mu_path),
}
if ext == ".psm1":
self.ps_modules[m] = util_info
else:
if wrapper:
self.cs_utils_wrapper[m] = util_info
else:
self.cs_utils_module[m] = util_info
self.scan_module(module_util_data, wrapper=wrapper,
powershell=(ext == ".psm1"))
def _parse_version_match(self, match, attribute):
new_version = to_text(match.group(1)).rstrip()
# PowerShell cannot cast a string of "1" to Version, it must have at
# least the major.minor for it to be valid so we append 0
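        # e.g. '#Requires -Version 3' captures '3' with no minor part, which
        # becomes '3.0' below (illustrative example, not from the original file)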
if match.group(2) is None:
new_version = "%s.0" % new_version
existing_version = getattr(self, attribute, None)
if existing_version is None:
setattr(self, attribute, new_version)
else:
# determine which is the latest version and set that
if LooseVersion(new_version) > LooseVersion(existing_version):
setattr(self, attribute, new_version)
def _slurp(path):
if not os.path.exists(path):
raise AnsibleError("imported module support code does not exist at %s"
% os.path.abspath(path))
fd = open(path, 'rb')
data = fd.read()
fd.close()
return data
def _strip_comments(source):
# Strip comments and blank lines from the wrapper
buf = []
start_block = False
for line in source.splitlines():
l = line.strip()
if start_block and l.endswith(b'#>'):
start_block = False
continue
elif start_block:
continue
elif l.startswith(b'<#'):
start_block = True
continue
elif not l or l.startswith(b'#'):
continue
buf.append(line)
return b'\n'.join(buf)
def _create_powershell_wrapper(b_module_data, module_path, module_args,
environment, async_timeout, become,
become_method, become_user, become_password,
become_flags, substyle, task_vars):
# creates the manifest/wrapper used in PowerShell/C# modules to enable
# things like become and async - this is also called in action/script.py
# FUTURE: add process_wrapper.ps1 to run module_wrapper in a new process
# if running under a persistent connection and substyle is C# so we
# don't have type conflicts
finder = PSModuleDepFinder()
if substyle != 'script':
# don't scan the module for util dependencies and other Ansible related
# flags if the substyle is 'script' which is set by action/script
finder.scan_module(b_module_data, powershell=(substyle == "powershell"))
module_wrapper = "module_%s_wrapper" % substyle
exec_manifest = dict(
module_entry=to_text(base64.b64encode(b_module_data)),
powershell_modules=dict(),
csharp_utils=dict(),
csharp_utils_module=list(), # csharp_utils only required by a module
module_args=module_args,
actions=[module_wrapper],
environment=environment,
encoded_output=False,
)
finder.scan_exec_script(module_wrapper)
if async_timeout > 0:
finder.scan_exec_script('exec_wrapper')
finder.scan_exec_script('async_watchdog')
finder.scan_exec_script('async_wrapper')
exec_manifest["actions"].insert(0, 'async_watchdog')
exec_manifest["actions"].insert(0, 'async_wrapper')
exec_manifest["async_jid"] = str(random.randint(0, 999999999999))
exec_manifest["async_timeout_sec"] = async_timeout
exec_manifest["async_startup_timeout"] = C.config.get_config_value("WIN_ASYNC_STARTUP_TIMEOUT", variables=task_vars)
if become and become_method == 'runas':
finder.scan_exec_script('exec_wrapper')
finder.scan_exec_script('become_wrapper')
exec_manifest["actions"].insert(0, 'become_wrapper')
exec_manifest["become_user"] = become_user
exec_manifest["become_password"] = become_password
exec_manifest['become_flags'] = become_flags
exec_manifest['min_ps_version'] = finder.ps_version
exec_manifest['min_os_version'] = finder.os_version
if finder.become and 'become_wrapper' not in exec_manifest['actions']:
finder.scan_exec_script('exec_wrapper')
finder.scan_exec_script('become_wrapper')
exec_manifest['actions'].insert(0, 'become_wrapper')
exec_manifest['become_user'] = 'SYSTEM'
exec_manifest['become_password'] = None
exec_manifest['become_flags'] = None
coverage_manifest = dict(
module_path=module_path,
module_util_paths=dict(),
output=None,
)
coverage_output = C.config.get_config_value('COVERAGE_REMOTE_OUTPUT', variables=task_vars)
if coverage_output and substyle == 'powershell':
finder.scan_exec_script('coverage_wrapper')
coverage_manifest['output'] = coverage_output
coverage_whitelist = C.config.get_config_value('COVERAGE_REMOTE_WHITELIST', variables=task_vars)
coverage_manifest['whitelist'] = coverage_whitelist
# make sure Ansible.ModuleUtils.AddType is added if any C# utils are used
if len(finder.cs_utils_wrapper) > 0 or len(finder.cs_utils_module) > 0:
finder._add_module((b"Ansible.ModuleUtils.AddType", ".psm1"),
wrapper=False)
# exec_wrapper is only required to be part of the payload if using
# become or async, to save on payload space we check if exec_wrapper has
# already been added, and remove it manually if it hasn't later
exec_required = "exec_wrapper" in finder.exec_scripts.keys()
finder.scan_exec_script("exec_wrapper")
# must contain an empty newline so it runs the begin/process/end block
finder.exec_scripts["exec_wrapper"] += b"\n\n"
exec_wrapper = finder.exec_scripts["exec_wrapper"]
if not exec_required:
finder.exec_scripts.pop("exec_wrapper")
for name, data in finder.exec_scripts.items():
b64_data = to_text(base64.b64encode(data))
exec_manifest[name] = b64_data
for name, data in finder.ps_modules.items():
b64_data = to_text(base64.b64encode(data['data']))
exec_manifest['powershell_modules'][name] = b64_data
coverage_manifest['module_util_paths'][name] = data['path']
cs_utils = {}
for cs_util in [finder.cs_utils_wrapper, finder.cs_utils_module]:
for name, data in cs_util.items():
cs_utils[name] = data['data']
for name, data in cs_utils.items():
b64_data = to_text(base64.b64encode(data))
exec_manifest['csharp_utils'][name] = b64_data
exec_manifest['csharp_utils_module'] = list(finder.cs_utils_module.keys())
# To save on the data we are sending across we only add the coverage info if coverage is being run
if 'coverage_wrapper' in exec_manifest:
exec_manifest['coverage'] = coverage_manifest
b_json = to_bytes(json.dumps(exec_manifest))
# delimit the payload JSON from the wrapper to keep sensitive contents out of scriptblocks (which can be logged)
b_data = exec_wrapper + b'\0\0\0\0' + b_json
return b_data
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,591 |
ansible-test validate-modules for PS modules not loading Ansible utils when in collection
|
##### SUMMARY
When running `ansible-test sanity --test validate-modules` on a PowerShell module in a collection it will fail to import any module util that is in Ansible itself. This causes issues if it requires access to those utils to build the arg spec as we can see with https://app.shippable.com/github/ansible-collections/ansible.windows/runs/16/5/console.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible-test validate-modules
##### ANSIBLE VERSION
```paste below
devel
```
##### CONFIGURATION
N/A
##### OS / ENVIRONMENT
N/A - ansible-test against a collection
|
https://github.com/ansible/ansible/issues/67591
|
https://github.com/ansible/ansible/pull/67596
|
b54e64bbc9f039c3706137ce09f035bf4f55ac7c
|
36def8bf03b7d954c79bd53f690105a40c2b9bd3
| 2020-02-19T22:49:53Z |
python
| 2020-02-20T04:32:21Z |
test/lib/ansible_test/_data/sanity/validate-modules/validate_modules/module_args.py
|
# -*- coding: utf-8 -*-
#
# Copyright (C) 2016 Matt Martz <[email protected]>
# Copyright (C) 2016 Rackspace US, Inc.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import runpy
import json
import os
import subprocess
import sys
from contextlib import contextmanager
from ansible.module_utils.basic import FILE_COMMON_ARGUMENTS
from ansible.module_utils.six import reraise
from .utils import CaptureStd, find_executable, get_module_name_from_filename
class AnsibleModuleCallError(RuntimeError):
pass
class AnsibleModuleImportError(ImportError):
pass
class AnsibleModuleNotInitialized(Exception):
pass
class _FakeAnsibleModuleInit:
def __init__(self):
self.args = tuple()
self.kwargs = {}
self.called = False
def __call__(self, *args, **kwargs):
self.args = args
self.kwargs = kwargs
self.called = True
raise AnsibleModuleCallError('AnsibleModuleCallError')
def _fake_load_params():
pass
@contextmanager
def setup_env(filename):
# Used to clean up imports later
pre_sys_modules = list(sys.modules.keys())
fake = _FakeAnsibleModuleInit()
module = __import__('ansible.module_utils.basic').module_utils.basic
_original_init = module.AnsibleModule.__init__
_original_load_params = module._load_params
setattr(module.AnsibleModule, '__init__', fake)
setattr(module, '_load_params', _fake_load_params)
try:
yield fake
finally:
setattr(module.AnsibleModule, '__init__', _original_init)
setattr(module, '_load_params', _original_load_params)
# Clean up imports to prevent issues with mutable data being used in modules
for k in list(sys.modules.keys()):
# It's faster if we limit to items in ansible.module_utils
# But if this causes problems later, we should remove it
if k not in pre_sys_modules and k.startswith('ansible.module_utils.'):
del sys.modules[k]
def get_ps_argument_spec(filename):
# This uses a very small skeleton of Ansible.Basic.AnsibleModule to return the argspec defined by the module. This
# is pretty rudimentary and will probably require something better going forward.
pwsh = find_executable('pwsh')
if not pwsh:
raise FileNotFoundError('Required program for PowerShell arg spec inspection "pwsh" not found.')
script_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'ps_argspec.ps1')
proc = subprocess.Popen([script_path, filename], stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=False)
stdout, stderr = proc.communicate()
if proc.returncode != 0:
raise AnsibleModuleImportError(stderr.decode('utf-8'))
kwargs = json.loads(stdout)
# the validate-modules code expects the options spec to be under the argument_spec key not options as set in PS
kwargs['argument_spec'] = kwargs.pop('options', {})
return kwargs['argument_spec'], (), kwargs
def get_py_argument_spec(filename, collection):
name = get_module_name_from_filename(filename, collection)
with setup_env(filename) as fake:
try:
with CaptureStd():
runpy.run_module(name, run_name='__main__', alter_sys=True)
except AnsibleModuleCallError:
pass
except BaseException as e:
# we want to catch all exceptions here, including sys.exit
reraise(AnsibleModuleImportError, AnsibleModuleImportError('%s' % e), sys.exc_info()[2])
if not fake.called:
raise AnsibleModuleNotInitialized()
try:
try:
# for ping kwargs == {'argument_spec':{'data':{'type':'str','default':'pong'}}, 'supports_check_mode':True}
argument_spec = fake.kwargs['argument_spec']
# If add_file_common_args is truish, add options from FILE_COMMON_ARGUMENTS when not present.
# This is the only modification to argument_spec done by AnsibleModule itself, and which is
# not caught by setup_env's AnsibleModule replacement
if fake.kwargs.get('add_file_common_args'):
for k, v in FILE_COMMON_ARGUMENTS.items():
if k not in argument_spec:
argument_spec[k] = v
return argument_spec, fake.args, fake.kwargs
except KeyError:
return fake.args[0], fake.args, fake.kwargs
except (TypeError, IndexError):
return {}, (), {}
def get_argument_spec(filename, collection):
if filename.endswith('.py'):
return get_py_argument_spec(filename, collection)
else:
return get_ps_argument_spec(filename)
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,591 |
ansible-test validate-modules for PS modules not loading Ansible utils when in collection
|
##### SUMMARY
When running `ansible-test sanity --test validate-modules` on a PowerShell module in a collection it will fail to import any module util that is in Ansible itself. This causes issues if it requires access to those utils to build the arg spec as we can see with https://app.shippable.com/github/ansible-collections/ansible.windows/runs/16/5/console.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible-test validate-modules
##### ANSIBLE VERSION
```paste below
devel
```
##### CONFIGURATION
N/A
##### OS / ENVIRONMENT
N/A - ansible-test against a collection
|
https://github.com/ansible/ansible/issues/67591
|
https://github.com/ansible/ansible/pull/67596
|
b54e64bbc9f039c3706137ce09f035bf4f55ac7c
|
36def8bf03b7d954c79bd53f690105a40c2b9bd3
| 2020-02-19T22:49:53Z |
python
| 2020-02-20T04:32:21Z |
test/lib/ansible_test/_data/sanity/validate-modules/validate_modules/ps_argspec.ps1
|
#!/usr/bin/env pwsh
#Requires -Version 6
Set-StrictMode -Version 2.0
$ErrorActionPreference = "Stop"
$WarningPreference = "Stop"
$module_path = $args[0]
if (-not $module_path) {
Write-Error -Message "No module specified."
exit 1
}
# Check if the path is relative and get the full path to the module
if (-not ([System.IO.Path]::IsPathRooted($module_path))) {
$module_path = $ExecutionContext.SessionState.Path.GetUnresolvedProviderPathFromPSPath($module_path)
}
if (-not (Test-Path -LiteralPath $module_path -PathType Leaf)) {
Write-Error -Message "The module at '$module_path' does not exist."
exit 1
}
$dummy_ansible_basic = @'
using System;
using System.Collections;
using System.Management.Automation;
namespace Ansible.Basic
{
public class AnsibleModule
{
public AnsibleModule(string[] args, IDictionary argumentSpec)
{
PSObject rawOut = ScriptBlock.Create("ConvertTo-Json -InputObject $args[0] -Depth 99 -Compress").Invoke(argumentSpec)[0];
Console.WriteLine(rawOut.BaseObject.ToString());
ScriptBlock.Create("Set-Variable -Name LASTEXITCODE -Value 0 -Scope Global; exit 0").Invoke();
}
public static AnsibleModule Create(string[] args, IDictionary argumentSpec)
{
return new AnsibleModule(args, argumentSpec);
}
}
}
'@
Add-Type -TypeDefinition $dummy_ansible_basic
$module_code = Get-Content -LiteralPath $module_path -Raw
$powershell = [PowerShell]::Create()
$powershell.Runspace.SessionStateProxy.SetVariable("ErrorActionPreference", "Stop")
# Load the PowerShell module utils as the module may be using them to refer to shared module options
# FUTURE: Lookup utils in the role or collection's module_utils dir based on #AnsibleRequires
$script_requirements = [ScriptBlock]::Create($module_code).Ast.ScriptRequirements
$required_modules = @()
if ($null -ne $script_requirements) {
$required_modules = $script_requirements.RequiredModules
}
foreach ($required_module in $required_modules) {
if (-not $required_module.Name.StartsWith('Ansible.ModuleUtils.')) {
continue
}
$module_util_path = [System.IO.Path]::GetFullPath([System.IO.Path]::Combine($module_path, '..', '..', '..',
'module_utils', 'powershell', "$($required_module.Name).psm1"))
if (-not (Test-Path -LiteralPath $module_util_path -PathType Leaf)) {
# Failed to find path, just silently ignore for now and hope for the best
continue
}
$module_util_sb = [ScriptBlock]::Create((Get-Content -LiteralPath $module_util_path -Raw))
$powershell.AddCommand('New-Module').AddParameters(@{
Name = $required_module.Name
ScriptBlock = $module_util_sb
}) > $null
$powershell.AddCommand('Import-Module').AddParameter('WarningAction', 'SilentlyContinue') > $null
$powershell.AddCommand('Out-Null').AddStatement() > $null
}
$powershell.AddScript($module_code) > $null
$powershell.Invoke() > $null
if ($powershell.HadErrors) {
$powershell.Streams.Error
exit 1
}
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 61,804 |
azure_rm_virtualmachinescaleset module fails to create a VMSS when data disks are specified in custom image
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
The Ansible azure_rm_virtualmachinescaleset module fails to create a VMSS when data disks are specified, and the source for the VM is a custom image which also has exactly the same disk profile specified.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
azure_rm_virtualmachinescaleset
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.8.2
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/home/someuser/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.5 (default, Jun 20 2019, 20:27:34) [GCC 4.8.5 20150623 (Red Hat 4.8.5-36)]
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
-- No output --
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
CentOS 7.5
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
1. Create a CentOS VM with 1 attached data disk (Standard_LRS) of size 1GB
2. Capture an image of the above VM using steps like [this](https://docs.microsoft.com/en-us/azure/virtual-machines/linux/capture-image)
3. Run an Ansible playbook like below to create a VMSS based on the above custom image
<!--- Paste example playbooks or commands between quotes below -->
```yaml
- name: create a VMSS
hosts: localhost
connection: local
tasks:
- name: Create luns dictionary
set_fact:
luns_dict: "{{ luns_dict | default ([]) + [{ 'lun': item, 'managed_disk_type': Standard_LRS, 'disk_size_gb': 1 , 'caching': None } ] }}"
with_sequence: start=0 end=1
- name: Create Scale Set
azure_rm_virtualmachinescaleset:
resource_group: "azure_rg"
location: "westus2"
name: "somevmss"
vm_size: "Standard_D3_v2"
admin_username: "someuser"
ssh_password_enabled: false
ssh_public_keys:
- path: /home/someuser/.ssh/authorized_keys
key_data: "{{ lookup('file', '~/.ssh/id_rsa.pub') }}"
capacity: 2
virtual_network_name: "vnet1"
subnet_name: "subnet1"
upgrade_policy: Manual
tier: Standard
managed_disk_type: "Standard_LRS"
os_disk_caching: ReadWrite
enable_accelerated_networking: yes
image:
resource_group: "somergwithimage"
name: "someimg"
data_disks: "{{ luns_dict }}"
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
VMSS is created successfully with the data disks from the image.
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
Get error message as below when trying to create from Ansible:
<!--- Paste verbatim command output between quotes -->
```paste below
"msg": "Error creating or updating virtual machine somevmss - Azure Error: InvalidParameter\nMessage: Cannot specify user image overrides for a disk already defined in the specified image reference.\nTarget: storageProfile"
```
Note that creating a VMSS from the same image using Azure CLI succeeds; sample command below:
```
az vmss create -n vmss_2 --admin-username azureadmin --admin-password somepass -g some-rg --instance-count 4 --image someimgresourceid --location westus2 --vnet-name vnet1 --subnet subnet1 --subnet-address-prefix 10.1.0.0/16 --vnet-address-prefix 10.0.0.0/8 --vm-sku Standard_D4s_v3 --storage-sku os=Premium_LRS 1=Standard_LRS
```
|
https://github.com/ansible/ansible/issues/61804
|
https://github.com/ansible/ansible/pull/62357
|
36def8bf03b7d954c79bd53f690105a40c2b9bd3
|
23995fef48568412e91739a58f12e4a6b309bd54
| 2019-09-04T18:40:12Z |
python
| 2020-02-20T07:19:48Z |
lib/ansible/modules/cloud/azure/azure_rm_virtualmachinescaleset.py
|
#!/usr/bin/python
#
# Copyright (c) 2016 Sertac Ozercan, <[email protected]>
#
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
ANSIBLE_METADATA = {'metadata_version': '1.1',
'status': ['preview'],
'supported_by': 'community'}
DOCUMENTATION = '''
---
module: azure_rm_virtualmachinescaleset
version_added: "2.4"
short_description: Manage Azure virtual machine scale sets
description:
- Create and update a virtual machine scale set.
- Note that this module was called M(azure_rm_virtualmachine_scaleset) before Ansible 2.8. The usage did not change.
options:
resource_group:
description:
- Name of the resource group containing the virtual machine scale set.
required: true
name:
description:
- Name of the virtual machine.
required: true
state:
description:
- Assert the state of the virtual machine scale set.
- State C(present) will check that the machine exists with the requested configuration. If the configuration
of the existing machine does not match, the machine will be updated.
- State C(absent) will remove the virtual machine scale set.
default: present
choices:
- absent
- present
location:
description:
- Valid Azure location. Defaults to location of the resource group.
short_hostname:
description:
- Short host name.
vm_size:
description:
- A valid Azure VM size value. For example, C(Standard_D4).
- The list of choices varies depending on the subscription and location. Check your subscription for available choices.
capacity:
description:
- Capacity of VMSS.
default: 1
tier:
description:
- SKU Tier.
choices:
- Basic
- Standard
upgrade_policy:
description:
- Upgrade policy.
- Required when creating the Azure virtual machine scale sets.
choices:
- Manual
- Automatic
admin_username:
description:
- Admin username used to access the host after it is created. Required when creating a VM.
admin_password:
description:
- Password for the admin username.
- Not required if the os_type is Linux and SSH password authentication is disabled by setting I(ssh_password_enabled=false).
ssh_password_enabled:
description:
- When the os_type is Linux, setting I(ssh_password_enabled=false) will disable SSH password authentication and require use of SSH keys.
type: bool
default: true
ssh_public_keys:
description:
- For I(os_type=Linux) provide a list of SSH keys.
- Each item in the list should be a dictionary where the dictionary contains two keys, C(path) and C(key_data).
- Set the C(path) to the default location of the authorized_keys files.
- On an Enterprise Linux host, for example, the I(path=/home/<admin username>/.ssh/authorized_keys).
Set C(key_data) to the actual value of the public key.
image:
description:
- Specifies the image used to build the VM.
- If a string, the image is sourced from a custom image based on the name.
- If a dict with the keys I(publisher), I(offer), I(sku), and I(version), the image is sourced from a Marketplace image.
              Note that setting I(version=latest) gets the most recent version of a given image.
- If a dict with the keys I(name) and I(resource_group), the image is sourced from a custom image based on the I(name) and I(resource_group) set.
Note that the key I(resource_group) is optional and if omitted, all images in the subscription will be searched for by I(name).
- Custom image support was added in Ansible 2.5.
required: true
os_disk_caching:
description:
- Type of OS disk caching.
choices:
- ReadOnly
- ReadWrite
default: ReadOnly
aliases:
- disk_caching
os_type:
description:
- Base type of operating system.
choices:
- Windows
- Linux
default: Linux
managed_disk_type:
description:
- Managed disk type.
choices:
- Standard_LRS
- Premium_LRS
data_disks:
description:
- Describes list of data disks.
version_added: "2.4"
suboptions:
lun:
description:
- The logical unit number for data disk.
default: 0
version_added: "2.4"
disk_size_gb:
description:
- The initial disk size in GB for blank data disks.
version_added: "2.4"
managed_disk_type:
description:
- Managed data disk type.
choices:
- Standard_LRS
- Premium_LRS
version_added: "2.4"
caching:
description:
- Type of data disk caching.
choices:
- ReadOnly
- ReadWrite
default: ReadOnly
version_added: "2.4"
virtual_network_resource_group:
description:
- When creating a virtual machine, if a specific virtual network from another resource group should be
used.
- Use this parameter to specify the resource group to use.
version_added: "2.5"
virtual_network_name:
description:
- Virtual Network name.
aliases:
- virtual_network
subnet_name:
description:
- Subnet name.
aliases:
- subnet
load_balancer:
description:
- Load balancer name.
version_added: "2.5"
application_gateway:
description:
- Application gateway name.
version_added: "2.8"
remove_on_absent:
description:
- When removing a VM using I(state=absent), also remove associated resources.
- It can be C(all) or a list with any of the following ['network_interfaces', 'virtual_storage', 'public_ips'].
- Any other input will be ignored.
default: ['all']
enable_accelerated_networking:
description:
            - Indicates whether to enable accelerated networking for the virtual machines in the scale set being created.
version_added: "2.7"
type: bool
security_group:
description:
- Existing security group with which to associate the subnet.
- It can be the security group name which is in the same resource group.
- It can be the resource ID.
- It can be a dict which contains I(name) and I(resource_group) of the security group.
version_added: "2.7"
aliases:
- security_group_name
overprovision:
description:
- Specifies whether the Virtual Machine Scale Set should be overprovisioned.
type: bool
default: True
version_added: "2.8"
single_placement_group:
description:
- When true this limits the scale set to a single placement group, of max size 100 virtual machines.
type: bool
default: True
version_added: "2.9"
plan:
description:
- Third-party billing plan for the VM.
version_added: "2.10"
type: dict
suboptions:
name:
description:
- Billing plan name.
required: true
product:
description:
- Product name.
required: true
publisher:
description:
- Publisher offering the plan.
required: true
promotion_code:
description:
- Optional promotion code.
zones:
description:
- A list of Availability Zones for your virtual machine scale set.
type: list
version_added: "2.8"
custom_data:
description:
            - Data which is made available to the virtual machine and used by, for example, C(cloud-init).
- Many images in the marketplace are not cloud-init ready. Thus, data sent to I(custom_data) would be ignored.
- If the image you are attempting to use is not listed in
U(https://docs.microsoft.com/en-us/azure/virtual-machines/linux/using-cloud-init#cloud-init-overview),
follow these steps U(https://docs.microsoft.com/en-us/azure/virtual-machines/linux/cloudinit-prepare-custom-image).
version_added: "2.8"
scale_in_policy:
description:
            - Define the order in which VMSS instances are scaled in.
choices:
- Default
- NewestVM
- OldestVM
version_added: "2.10"
terminate_event_timeout_minutes:
description:
            - Timeout (in minutes) for the termination notification event.
            - Must be in the range 5 to 15.
version_added: "2.10"
priority:
description:
            - If you want to request low-priority VMs for the VMSS, set this to C(Low). The default is C(Regular).
default: Regular
choices:
- Regular
- Low
version_added: "2.10"
extends_documentation_fragment:
- azure
- azure_tags
author:
- Sertac Ozercan (@sozercan)
'''
EXAMPLES = '''
- name: Create VMSS
azure_rm_virtualmachinescaleset:
resource_group: myResourceGroup
name: testvmss
vm_size: Standard_DS1_v2
capacity: 2
virtual_network_name: testvnet
upgrade_policy: Manual
subnet_name: testsubnet
terminate_event_timeout_minutes: 10
scale_in_policy: NewestVM
admin_username: adminUser
ssh_password_enabled: false
ssh_public_keys:
- path: /home/adminUser/.ssh/authorized_keys
        key_data: < insert your ssh public key here... >
managed_disk_type: Standard_LRS
image:
offer: CoreOS
publisher: CoreOS
sku: Stable
version: latest
data_disks:
- lun: 0
disk_size_gb: 64
caching: ReadWrite
managed_disk_type: Standard_LRS
- name: Create VMSS with an image that requires plan information
azure_rm_virtualmachinescaleset:
resource_group: myResourceGroup
name: testvmss
vm_size: Standard_DS1_v2
capacity: 3
virtual_network_name: testvnet
upgrade_policy: Manual
subnet_name: testsubnet
admin_username: adminUser
ssh_password_enabled: false
ssh_public_keys:
- path: /home/adminUser/.ssh/authorized_keys
        key_data: < insert your ssh public key here... >
managed_disk_type: Standard_LRS
image:
offer: cis-ubuntu-linux-1804-l1
publisher: center-for-internet-security-inc
sku: Stable
version: latest
plan:
name: cis-ubuntu-linux-1804-l1
product: cis-ubuntu-linux-1804-l1
publisher: center-for-internet-security-inc
data_disks:
- lun: 0
disk_size_gb: 64
caching: ReadWrite
managed_disk_type: Standard_LRS
- name: Create a VMSS with a custom image
azure_rm_virtualmachinescaleset:
resource_group: myResourceGroup
name: testvmss
vm_size: Standard_DS1_v2
capacity: 2
virtual_network_name: testvnet
upgrade_policy: Manual
subnet_name: testsubnet
admin_username: adminUser
admin_password: password01
managed_disk_type: Standard_LRS
image: customimage001
- name: Create a VMSS with over 100 instances
azure_rm_virtualmachinescaleset:
resource_group: myResourceGroup
name: testvmss
vm_size: Standard_DS1_v2
capacity: 120
single_placement_group: False
virtual_network_name: testvnet
upgrade_policy: Manual
subnet_name: testsubnet
admin_username: adminUser
admin_password: password01
managed_disk_type: Standard_LRS
image: customimage001
- name: Create a VMSS with a custom image from a particular resource group
azure_rm_virtualmachinescaleset:
resource_group: myResourceGroup
name: testvmss
vm_size: Standard_DS1_v2
capacity: 2
virtual_network_name: testvnet
upgrade_policy: Manual
subnet_name: testsubnet
admin_username: adminUser
admin_password: password01
managed_disk_type: Standard_LRS
image:
name: customimage001
resource_group: myResourceGroup
'''
RETURN = '''
azure_vmss:
description:
- Facts about the current state of the object.
- Note that facts are not part of the registered output but available directly.
returned: always
type: dict
sample: {
"properties": {
"overprovision": true,
"scaleInPolicy": {
"rules": [
"NewestVM"
]
},
"singlePlacementGroup": true,
"upgradePolicy": {
"mode": "Manual"
},
"virtualMachineProfile": {
"networkProfile": {
"networkInterfaceConfigurations": [
{
"name": "testvmss",
"properties": {
"dnsSettings": {
"dnsServers": []
},
"enableAcceleratedNetworking": false,
"ipConfigurations": [
{
"name": "default",
"properties": {
"privateIPAddressVersion": "IPv4",
"subnet": {
"id": "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/resourceGroup/myResourceGroup/providers/Microsoft.Network/virtualNetworks/testvnet/subnets/testsubnet"
}
}
}
],
"primary": true
}
}
]
},
"osProfile": {
"adminUsername": "testuser",
"computerNamePrefix": "testvmss",
"linuxConfiguration": {
"disablePasswordAuthentication": true,
"ssh": {
"publicKeys": [
{
"keyData": "",
"path": "/home/testuser/.ssh/authorized_keys"
}
]
}
},
"secrets": []
},
"scheduledEventsProfile": {
"terminateNotificationProfile": {
"enable": true,
"notBeforeTimeout": "PT10M"
}
},
"storageProfile": {
"dataDisks": [
{
"caching": "ReadWrite",
"createOption": "empty",
"diskSizeGB": 64,
"lun": 0,
"managedDisk": {
"storageAccountType": "Standard_LRS"
}
}
],
"imageReference": {
"offer": "CoreOS",
"publisher": "CoreOS",
"sku": "Stable",
"version": "899.17.0"
},
"osDisk": {
"caching": "ReadWrite",
"createOption": "fromImage",
"managedDisk": {
"storageAccountType": "Standard_LRS"
}
}
}
}
},
"sku": {
"capacity": 2,
"name": "Standard_DS1_v2",
"tier": "Standard"
},
"tags": null,
"type": "Microsoft.Compute/virtualMachineScaleSets"
}
''' # NOQA
import base64
try:
from msrestazure.azure_exceptions import CloudError
from msrestazure.tools import parse_resource_id
except ImportError:
# This is handled in azure_rm_common
pass
from ansible.module_utils.azure_rm_common import AzureRMModuleBase, azure_id_to_dict, format_resource_id
from ansible.module_utils.basic import to_native, to_bytes
AZURE_OBJECT_CLASS = 'VirtualMachineScaleSet'
AZURE_ENUM_MODULES = ['azure.mgmt.compute.models']
class AzureRMVirtualMachineScaleSet(AzureRMModuleBase):
def __init__(self):
self.module_arg_spec = dict(
resource_group=dict(type='str', required=True),
name=dict(type='str', required=True),
state=dict(choices=['present', 'absent'], default='present', type='str'),
location=dict(type='str'),
short_hostname=dict(type='str'),
vm_size=dict(type='str'),
tier=dict(type='str', choices=['Basic', 'Standard']),
capacity=dict(type='int', default=1),
upgrade_policy=dict(type='str', choices=['Automatic', 'Manual']),
admin_username=dict(type='str'),
admin_password=dict(type='str', no_log=True),
ssh_password_enabled=dict(type='bool', default=True),
ssh_public_keys=dict(type='list'),
image=dict(type='raw'),
os_disk_caching=dict(type='str', aliases=['disk_caching'], choices=['ReadOnly', 'ReadWrite'],
default='ReadOnly'),
os_type=dict(type='str', choices=['Linux', 'Windows'], default='Linux'),
managed_disk_type=dict(type='str', choices=['Standard_LRS', 'Premium_LRS']),
data_disks=dict(type='list'),
subnet_name=dict(type='str', aliases=['subnet']),
load_balancer=dict(type='str'),
application_gateway=dict(type='str'),
virtual_network_resource_group=dict(type='str'),
virtual_network_name=dict(type='str', aliases=['virtual_network']),
remove_on_absent=dict(type='list', default=['all']),
enable_accelerated_networking=dict(type='bool'),
security_group=dict(type='raw', aliases=['security_group_name']),
overprovision=dict(type='bool', default=True),
single_placement_group=dict(type='bool', default=True),
zones=dict(type='list'),
custom_data=dict(type='str'),
plan=dict(type='dict', options=dict(publisher=dict(type='str', required=True),
product=dict(type='str', required=True), name=dict(type='str', required=True),
promotion_code=dict(type='str'))),
scale_in_policy=dict(type='str', choices=['Default', 'OldestVM', 'NewestVM']),
terminate_event_timeout_minutes=dict(type='int'),
priority=dict(type='str', choices=['Regular', 'Low'], default='Regular')
)
self.resource_group = None
self.name = None
self.state = None
self.location = None
self.short_hostname = None
self.vm_size = None
self.capacity = None
self.tier = None
self.upgrade_policy = None
self.admin_username = None
self.admin_password = None
self.ssh_password_enabled = None
self.ssh_public_keys = None
self.image = None
self.os_disk_caching = None
self.managed_disk_type = None
self.data_disks = None
self.os_type = None
self.subnet_name = None
self.virtual_network_resource_group = None
self.virtual_network_name = None
self.tags = None
self.differences = None
self.load_balancer = None
self.application_gateway = None
self.enable_accelerated_networking = None
self.security_group = None
self.overprovision = None
self.single_placement_group = None
self.zones = None
self.custom_data = None
self.plan = None
self.scale_in_policy = None
self.terminate_event_timeout_minutes = None
self.priority = None
mutually_exclusive = [('load_balancer', 'application_gateway')]
self.results = dict(
changed=False,
actions=[],
ansible_facts=dict(azure_vmss=None)
)
super(AzureRMVirtualMachineScaleSet, self).__init__(
derived_arg_spec=self.module_arg_spec,
supports_check_mode=True,
mutually_exclusive=mutually_exclusive)
def exec_module(self, **kwargs):
for key in list(self.module_arg_spec.keys()) + ['tags']:
setattr(self, key, kwargs[key])
if self.module._name == 'azure_rm_virtualmachine_scaleset':
self.module.deprecate("The 'azure_rm_virtualmachine_scaleset' module has been renamed to 'azure_rm_virtualmachinescaleset'", version='2.12')
# make sure options are lower case
self.remove_on_absent = set([resource.lower() for resource in self.remove_on_absent])
# convert elements to ints
self.zones = [int(i) for i in self.zones] if self.zones else None
# default virtual_network_resource_group to resource_group
if not self.virtual_network_resource_group:
self.virtual_network_resource_group = self.resource_group
changed = False
results = dict()
vmss = None
disable_ssh_password = None
subnet = None
image_reference = None
load_balancer_backend_address_pools = None
load_balancer_inbound_nat_pools = None
load_balancer = None
application_gateway = None
application_gateway_backend_address_pools = None
support_lb_change = True
resource_group = self.get_resource_group(self.resource_group)
if not self.location:
# Set default location
self.location = resource_group.location
if self.custom_data:
self.custom_data = to_native(base64.b64encode(to_bytes(self.custom_data)))
if self.state == 'present':
# Verify parameters and resolve any defaults
if self.vm_size and not self.vm_size_is_valid():
self.fail("Parameter error: vm_size {0} is not valid for your subscription and location.".format(
self.vm_size
))
# if self.virtual_network_name:
# virtual_network = self.get_virtual_network(self.virtual_network_name)
if self.ssh_public_keys:
msg = "Parameter error: expecting ssh_public_keys to be a list of type dict where " \
"each dict contains keys: path, key_data."
for key in self.ssh_public_keys:
if not isinstance(key, dict):
self.fail(msg)
if not key.get('path') or not key.get('key_data'):
self.fail(msg)
if self.image and isinstance(self.image, dict):
if all(key in self.image for key in ('publisher', 'offer', 'sku', 'version')):
marketplace_image = self.get_marketplace_image_version()
if self.image['version'] == 'latest':
self.image['version'] = marketplace_image.name
self.log("Using image version {0}".format(self.image['version']))
image_reference = self.compute_models.ImageReference(
publisher=self.image['publisher'],
offer=self.image['offer'],
sku=self.image['sku'],
version=self.image['version']
)
elif self.image.get('name'):
custom_image = True
image_reference = self.get_custom_image_reference(
self.image.get('name'),
self.image.get('resource_group'))
elif self.image.get('id'):
try:
image_reference = self.compute_models.ImageReference(id=self.image['id'])
except Exception as exc:
self.fail("id Error: Cannot get image from the reference id - {0}".format(self.image['id']))
else:
self.fail("parameter error: expecting image to contain [publisher, offer, sku, version], [name, resource_group] or [id]")
elif self.image and isinstance(self.image, str):
custom_image = True
image_reference = self.get_custom_image_reference(self.image)
elif self.image:
self.fail("parameter error: expecting image to be a string or dict not {0}".format(type(self.image).__name__))
disable_ssh_password = not self.ssh_password_enabled
if self.load_balancer:
load_balancer = self.get_load_balancer(self.load_balancer)
load_balancer_backend_address_pools = ([self.compute_models.SubResource(id=resource.id)
for resource in load_balancer.backend_address_pools]
if load_balancer.backend_address_pools else None)
load_balancer_inbound_nat_pools = ([self.compute_models.SubResource(id=resource.id)
for resource in load_balancer.inbound_nat_pools]
if load_balancer.inbound_nat_pools else None)
if self.application_gateway:
application_gateway = self.get_application_gateway(self.application_gateway)
application_gateway_backend_address_pools = ([self.compute_models.SubResource(id=resource.id)
for resource in application_gateway.backend_address_pools]
if application_gateway.backend_address_pools else None)
try:
self.log("Fetching virtual machine scale set {0}".format(self.name))
vmss = self.compute_client.virtual_machine_scale_sets.get(self.resource_group, self.name)
self.check_provisioning_state(vmss, self.state)
vmss_dict = self.serialize_vmss(vmss)
if self.state == 'present':
differences = []
results = vmss_dict
if self.os_disk_caching and \
self.os_disk_caching != vmss_dict['properties']['virtualMachineProfile']['storageProfile']['osDisk']['caching']:
self.log('CHANGED: virtual machine scale set {0} - OS disk caching'.format(self.name))
differences.append('OS Disk caching')
changed = True
vmss_dict['properties']['virtualMachineProfile']['storageProfile']['osDisk']['caching'] = self.os_disk_caching
if self.capacity and \
self.capacity != vmss_dict['sku']['capacity']:
self.log('CHANGED: virtual machine scale set {0} - Capacity'.format(self.name))
differences.append('Capacity')
changed = True
vmss_dict['sku']['capacity'] = self.capacity
if self.data_disks and \
len(self.data_disks) != len(vmss_dict['properties']['virtualMachineProfile']['storageProfile'].get('dataDisks', [])):
self.log('CHANGED: virtual machine scale set {0} - Data Disks'.format(self.name))
differences.append('Data Disks')
changed = True
if self.upgrade_policy and \
self.upgrade_policy != vmss_dict['properties']['upgradePolicy']['mode']:
self.log('CHANGED: virtual machine scale set {0} - Upgrade Policy'.format(self.name))
differences.append('Upgrade Policy')
changed = True
vmss_dict['properties']['upgradePolicy']['mode'] = self.upgrade_policy
if image_reference and \
image_reference.as_dict() != vmss_dict['properties']['virtualMachineProfile']['storageProfile']['imageReference']:
self.log('CHANGED: virtual machine scale set {0} - Image'.format(self.name))
differences.append('Image')
changed = True
vmss_dict['properties']['virtualMachineProfile']['storageProfile']['imageReference'] = image_reference.as_dict()
update_tags, vmss_dict['tags'] = self.update_tags(vmss_dict.get('tags', dict()))
if update_tags:
differences.append('Tags')
changed = True
if bool(self.overprovision) != bool(vmss_dict['properties']['overprovision']):
differences.append('overprovision')
changed = True
if bool(self.single_placement_group) != bool(vmss_dict['properties']['singlePlacementGroup']):
differences.append('single_placement_group')
changed = True
vmss_dict['zones'] = [int(i) for i in vmss_dict['zones']] if 'zones' in vmss_dict and vmss_dict['zones'] else None
if self.zones != vmss_dict['zones']:
self.log("CHANGED: virtual machine scale sets {0} zones".format(self.name))
differences.append('Zones')
changed = True
vmss_dict['zones'] = self.zones
if self.terminate_event_timeout_minutes:
timeout = self.terminate_event_timeout_minutes
if timeout < 5 or timeout > 15:
self.fail("terminate_event_timeout_minutes should >= 5 and <= 15")
iso_8601_format = "PT" + str(timeout) + "M"
old = vmss_dict['properties']['virtualMachineProfile'].get('scheduledEventsProfile', {}).\
get('terminateNotificationProfile', {}).get('notBeforeTimeout', "")
if old != iso_8601_format:
differences.append('terminateNotification')
changed = True
vmss_dict['properties']['virtualMachineProfile'].setdefault('scheduledEventsProfile', {})['terminateNotificationProfile'] = {
'notBeforeTimeout': iso_8601_format,
"enable": 'true'
}
if self.scale_in_policy and self.scale_in_policy != vmss_dict['properties'].get('scaleInPolicy', {}).get('rules', [""])[0]:
self.log("CHANGED: virtual machine sale sets {0} scale in policy".format(self.name))
differences.append('scaleInPolicy')
changed = True
vmss_dict['properties'].setdefault('scaleInPolicy', {})['rules'] = [self.scale_in_policy]
nicConfigs = vmss_dict['properties']['virtualMachineProfile']['networkProfile']['networkInterfaceConfigurations']
backend_address_pool = nicConfigs[0]['properties']['ipConfigurations'][0]['properties'].get('loadBalancerBackendAddressPools', [])
backend_address_pool += nicConfigs[0]['properties']['ipConfigurations'][0]['properties'].get('applicationGatewayBackendAddressPools', [])
lb_or_ag_id = None
if (len(nicConfigs) != 1 or len(backend_address_pool) != 1):
                    support_lb_change = False  # updating is currently not supported when the VMSS contains more than one load balancer
self.module.warn('Updating more than one load balancer on VMSS is currently not supported')
else:
if load_balancer:
lb_or_ag_id = "{0}/".format(load_balancer.id)
elif application_gateway:
lb_or_ag_id = "{0}/".format(application_gateway.id)
backend_address_pool_id = backend_address_pool[0].get('id')
if lb_or_ag_id is not None and (bool(lb_or_ag_id) != bool(backend_address_pool_id) or not backend_address_pool_id.startswith(lb_or_ag_id)):
differences.append('load_balancer')
changed = True
if self.custom_data:
if self.custom_data != vmss_dict['properties']['virtualMachineProfile']['osProfile'].get('customData'):
differences.append('custom_data')
changed = True
vmss_dict['properties']['virtualMachineProfile']['osProfile']['customData'] = self.custom_data
self.differences = differences
elif self.state == 'absent':
self.log("CHANGED: virtual machine scale set {0} exists and requested state is 'absent'".format(self.name))
results = dict()
changed = True
except CloudError:
self.log('Virtual machine scale set {0} does not exist'.format(self.name))
if self.state == 'present':
self.log("CHANGED: virtual machine scale set {0} does not exist but state is 'present'.".format(self.name))
changed = True
self.results['changed'] = changed
self.results['ansible_facts']['azure_vmss'] = results
if self.check_mode:
return self.results
if changed:
if self.state == 'present':
if not vmss:
# Create the VMSS
if self.vm_size is None:
self.fail("vm size must be set")
self.log("Create virtual machine scale set {0}".format(self.name))
self.results['actions'].append('Created VMSS {0}'.format(self.name))
if self.os_type == 'Linux':
if disable_ssh_password and not self.ssh_public_keys:
self.fail("Parameter error: ssh_public_keys required when disabling SSH password.")
if not self.virtual_network_name:
self.fail("virtual network name is required")
if self.subnet_name:
subnet = self.get_subnet(self.virtual_network_name, self.subnet_name)
if not self.short_hostname:
self.short_hostname = self.name
if not image_reference:
self.fail("Parameter error: an image is required when creating a virtual machine.")
managed_disk = self.compute_models.VirtualMachineScaleSetManagedDiskParameters(storage_account_type=self.managed_disk_type)
if self.security_group:
nsg = self.parse_nsg()
if nsg:
self.security_group = self.network_models.NetworkSecurityGroup(id=nsg.get('id'))
plan = None
if self.plan:
plan = self.compute_models.Plan(name=self.plan.get('name'), product=self.plan.get('product'),
publisher=self.plan.get('publisher'),
promotion_code=self.plan.get('promotion_code'))
os_profile = None
if self.admin_username or self.custom_data or self.ssh_public_keys:
os_profile = self.compute_models.VirtualMachineScaleSetOSProfile(
admin_username=self.admin_username,
computer_name_prefix=self.short_hostname,
custom_data=self.custom_data
)
vmss_resource = self.compute_models.VirtualMachineScaleSet(
location=self.location,
overprovision=self.overprovision,
single_placement_group=self.single_placement_group,
tags=self.tags,
upgrade_policy=self.compute_models.UpgradePolicy(
mode=self.upgrade_policy
),
sku=self.compute_models.Sku(
name=self.vm_size,
capacity=self.capacity,
tier=self.tier,
),
plan=plan,
virtual_machine_profile=self.compute_models.VirtualMachineScaleSetVMProfile(
priority=self.priority,
os_profile=os_profile,
storage_profile=self.compute_models.VirtualMachineScaleSetStorageProfile(
os_disk=self.compute_models.VirtualMachineScaleSetOSDisk(
managed_disk=managed_disk,
create_option=self.compute_models.DiskCreateOptionTypes.from_image,
caching=self.os_disk_caching,
),
image_reference=image_reference,
),
network_profile=self.compute_models.VirtualMachineScaleSetNetworkProfile(
network_interface_configurations=[
self.compute_models.VirtualMachineScaleSetNetworkConfiguration(
name=self.name,
primary=True,
ip_configurations=[
self.compute_models.VirtualMachineScaleSetIPConfiguration(
name='default',
subnet=self.compute_models.ApiEntityReference(
id=subnet.id
),
primary=True,
load_balancer_backend_address_pools=load_balancer_backend_address_pools,
load_balancer_inbound_nat_pools=load_balancer_inbound_nat_pools,
application_gateway_backend_address_pools=application_gateway_backend_address_pools
)
],
enable_accelerated_networking=self.enable_accelerated_networking,
network_security_group=self.security_group
)
]
)
),
zones=self.zones
)
if self.scale_in_policy:
vmss_resource.scale_in_policy = self.gen_scale_in_policy()
if self.terminate_event_timeout_minutes:
vmss_resource.virtual_machine_profile.scheduled_events_profile = self.gen_scheduled_event_profile()
if self.admin_password:
vmss_resource.virtual_machine_profile.os_profile.admin_password = self.admin_password
if self.os_type == 'Linux' and os_profile:
vmss_resource.virtual_machine_profile.os_profile.linux_configuration = self.compute_models.LinuxConfiguration(
disable_password_authentication=disable_ssh_password
)
if self.ssh_public_keys:
ssh_config = self.compute_models.SshConfiguration()
ssh_config.public_keys = \
[self.compute_models.SshPublicKey(path=key['path'], key_data=key['key_data']) for key in self.ssh_public_keys]
vmss_resource.virtual_machine_profile.os_profile.linux_configuration.ssh = ssh_config
if self.data_disks:
data_disks = []
for data_disk in self.data_disks:
data_disk_managed_disk = self.compute_models.VirtualMachineScaleSetManagedDiskParameters(
storage_account_type=data_disk.get('managed_disk_type', None)
)
data_disk['caching'] = data_disk.get(
'caching',
self.compute_models.CachingTypes.read_only
)
data_disks.append(self.compute_models.VirtualMachineScaleSetDataDisk(
lun=data_disk.get('lun', None),
caching=data_disk.get('caching', None),
create_option=self.compute_models.DiskCreateOptionTypes.empty,
disk_size_gb=data_disk.get('disk_size_gb', None),
managed_disk=data_disk_managed_disk,
))
vmss_resource.virtual_machine_profile.storage_profile.data_disks = data_disks
if self.plan:
try:
plan_name = self.plan.get('name')
plan_product = self.plan.get('product')
plan_publisher = self.plan.get('publisher')
term = self.marketplace_client.marketplace_agreements.get(
publisher_id=plan_publisher, offer_id=plan_product, plan_id=plan_name)
term.accepted = True
self.marketplace_client.marketplace_agreements.create(
publisher_id=plan_publisher, offer_id=plan_product, plan_id=plan_name, parameters=term)
except Exception as exc:
self.fail(("Error accepting terms for virtual machine {0} with plan {1}. " +
"Only service admin/account admin users can purchase images " +
"from the marketplace. - {2}").format(self.name, self.plan, str(exc)))
self.log("Create virtual machine with parameters:")
self.create_or_update_vmss(vmss_resource)
elif self.differences and len(self.differences) > 0:
self.log("Update virtual machine scale set {0}".format(self.name))
self.results['actions'].append('Updated VMSS {0}'.format(self.name))
vmss_resource = self.get_vmss()
vmss_resource.virtual_machine_profile.storage_profile.os_disk.caching = self.os_disk_caching
vmss_resource.sku.capacity = self.capacity
vmss_resource.overprovision = self.overprovision
vmss_resource.single_placement_group = self.single_placement_group
if support_lb_change:
if self.load_balancer:
vmss_resource.virtual_machine_profile.network_profile.network_interface_configurations[0] \
.ip_configurations[0].load_balancer_backend_address_pools = load_balancer_backend_address_pools
vmss_resource.virtual_machine_profile.network_profile.network_interface_configurations[0] \
.ip_configurations[0].load_balancer_inbound_nat_pools = load_balancer_inbound_nat_pools
vmss_resource.virtual_machine_profile.network_profile.network_interface_configurations[0] \
.ip_configurations[0].application_gateway_backend_address_pools = None
elif self.application_gateway:
vmss_resource.virtual_machine_profile.network_profile.network_interface_configurations[0] \
.ip_configurations[0].application_gateway_backend_address_pools = application_gateway_backend_address_pools
vmss_resource.virtual_machine_profile.network_profile.network_interface_configurations[0] \
.ip_configurations[0].load_balancer_backend_address_pools = None
vmss_resource.virtual_machine_profile.network_profile.network_interface_configurations[0] \
.ip_configurations[0].load_balancer_inbound_nat_pools = None
if self.data_disks is not None:
data_disks = []
for data_disk in self.data_disks:
data_disks.append(self.compute_models.VirtualMachineScaleSetDataDisk(
lun=data_disk['lun'],
caching=data_disk['caching'],
create_option=self.compute_models.DiskCreateOptionTypes.empty,
disk_size_gb=data_disk['disk_size_gb'],
managed_disk=self.compute_models.VirtualMachineScaleSetManagedDiskParameters(
storage_account_type=data_disk.get('managed_disk_type', None)
),
))
vmss_resource.virtual_machine_profile.storage_profile.data_disks = data_disks
if self.scale_in_policy:
vmss_resource.scale_in_policy = self.gen_scale_in_policy()
if self.terminate_event_timeout_minutes:
vmss_resource.virtual_machine_profile.scheduled_events_profile = self.gen_scheduled_event_profile()
if image_reference is not None:
vmss_resource.virtual_machine_profile.storage_profile.image_reference = image_reference
self.log("Update virtual machine with parameters:")
self.create_or_update_vmss(vmss_resource)
self.results['ansible_facts']['azure_vmss'] = self.serialize_vmss(self.get_vmss())
elif self.state == 'absent':
# delete the VM
self.log("Delete virtual machine scale set {0}".format(self.name))
self.results['ansible_facts']['azure_vmss'] = None
self.delete_vmss(vmss)
# until we sort out how we want to do this globally
del self.results['actions']
return self.results
def get_vmss(self):
'''
Get the VMSS
:return: VirtualMachineScaleSet object
'''
try:
vmss = self.compute_client.virtual_machine_scale_sets.get(self.resource_group, self.name)
return vmss
except CloudError as exc:
self.fail("Error getting virtual machine scale set {0} - {1}".format(self.name, str(exc)))
def get_virtual_network(self, name):
try:
vnet = self.network_client.virtual_networks.get(self.virtual_network_resource_group, name)
return vnet
except CloudError as exc:
self.fail("Error fetching virtual network {0} - {1}".format(name, str(exc)))
def get_subnet(self, vnet_name, subnet_name):
self.log("Fetching subnet {0} in virtual network {1}".format(subnet_name, vnet_name))
try:
subnet = self.network_client.subnets.get(self.virtual_network_resource_group, vnet_name, subnet_name)
except CloudError as exc:
self.fail("Error: fetching subnet {0} in virtual network {1} - {2}".format(
subnet_name,
vnet_name,
str(exc)))
return subnet
def get_load_balancer(self, id):
id_dict = parse_resource_id(id)
try:
return self.network_client.load_balancers.get(id_dict.get('resource_group', self.resource_group), id_dict.get('name'))
except CloudError as exc:
self.fail("Error fetching load balancer {0} - {1}".format(id, str(exc)))
def get_application_gateway(self, id):
id_dict = parse_resource_id(id)
try:
return self.network_client.application_gateways.get(id_dict.get('resource_group', self.resource_group), id_dict.get('name'))
except CloudError as exc:
self.fail("Error fetching application_gateway {0} - {1}".format(id, str(exc)))
def serialize_vmss(self, vmss):
'''
Convert a VirtualMachineScaleSet object to dict.
:param vm: VirtualMachineScaleSet object
:return: dict
'''
result = self.serialize_obj(vmss, AZURE_OBJECT_CLASS, enum_modules=AZURE_ENUM_MODULES)
result['id'] = vmss.id
result['name'] = vmss.name
result['type'] = vmss.type
result['location'] = vmss.location
result['tags'] = vmss.tags
return result
def delete_vmss(self, vmss):
self.log("Deleting virtual machine scale set {0}".format(self.name))
self.results['actions'].append("Deleted virtual machine scale set {0}".format(self.name))
try:
poller = self.compute_client.virtual_machine_scale_sets.delete(self.resource_group, self.name)
# wait for the poller to finish
self.get_poller_result(poller)
except CloudError as exc:
self.fail("Error deleting virtual machine scale set {0} - {1}".format(self.name, str(exc)))
return True
def get_marketplace_image_version(self):
try:
versions = self.compute_client.virtual_machine_images.list(self.location,
self.image['publisher'],
self.image['offer'],
self.image['sku'])
except CloudError as exc:
self.fail("Error fetching image {0} {1} {2} - {3}".format(self.image['publisher'],
self.image['offer'],
self.image['sku'],
str(exc)))
if versions and len(versions) > 0:
if self.image['version'] == 'latest':
return versions[len(versions) - 1]
for version in versions:
if version.name == self.image['version']:
return version
self.fail("Error could not find image {0} {1} {2} {3}".format(self.image['publisher'],
self.image['offer'],
self.image['sku'],
self.image['version']))
def get_custom_image_reference(self, name, resource_group=None):
try:
if resource_group:
vm_images = self.compute_client.images.list_by_resource_group(resource_group)
else:
vm_images = self.compute_client.images.list()
except Exception as exc:
self.fail("Error fetching custom images from subscription - {0}".format(str(exc)))
for vm_image in vm_images:
if vm_image.name == name:
self.log("Using custom image id {0}".format(vm_image.id))
return self.compute_models.ImageReference(id=vm_image.id)
self.fail("Error could not find image with name {0}".format(name))
def create_or_update_vmss(self, params):
try:
poller = self.compute_client.virtual_machine_scale_sets.create_or_update(self.resource_group, self.name, params)
self.get_poller_result(poller)
except CloudError as exc:
self.fail("Error creating or updating virtual machine {0} - {1}".format(self.name, str(exc)))
def vm_size_is_valid(self):
'''
Validate self.vm_size against the list of virtual machine sizes available for the account and location.
:return: boolean
'''
try:
sizes = self.compute_client.virtual_machine_sizes.list(self.location)
except CloudError as exc:
self.fail("Error retrieving available machine sizes - {0}".format(str(exc)))
for size in sizes:
if size.name == self.vm_size:
return True
return False
def parse_nsg(self):
nsg = self.security_group
resource_group = self.resource_group
if isinstance(self.security_group, dict):
nsg = self.security_group.get('name')
resource_group = self.security_group.get('resource_group', self.resource_group)
id = format_resource_id(val=nsg,
subscription_id=self.subscription_id,
namespace='Microsoft.Network',
types='networkSecurityGroups',
resource_group=resource_group)
name = azure_id_to_dict(id).get('name')
return dict(id=id, name=name)
def gen_scheduled_event_profile(self):
if self.terminate_event_timeout_minutes is None:
return None
scheduledEventProfile = self.compute_models.ScheduledEventsProfile()
terminationProfile = self.compute_models.TerminateNotificationProfile()
terminationProfile.not_before_timeout = "PT" + str(self.terminate_event_timeout_minutes) + "M"
terminationProfile.enable = True
scheduledEventProfile.terminate_notification_profile = terminationProfile
return scheduledEventProfile
def gen_scale_in_policy(self):
if self.scale_in_policy is None:
return None
return self.compute_models.ScaleInPolicy(rules=[self.scale_in_policy])
def main():
AzureRMVirtualMachineScaleSet()
if __name__ == '__main__':
main()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,574 |
Null secondary dependencies will error collection installs
|
##### SUMMARY
If a dependency of my collection uses a specific syntax (a bare `dependencies:` key with no value) in its `galaxy.yml`, then it will break installs of my own collection.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
lib/ansible/galaxy/collection.py
lib/ansible/cli/galaxy.py
##### ANSIBLE VERSION
```paste below
$ ansible --version
ansible 2.10.0.dev0
config file = None
configured module search path = ['/Users/alancoding/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /Users/alancoding/Documents/repos/ansible/lib/ansible
executable location = /Users/alancoding/.virtualenvs/ansible3/bin/ansible
python version = 3.6.5 (default, Apr 25 2018, 14:23:58) [GCC 4.2.1 Compatible Apple LLVM 9.1.0 (clang-902.0.39.1)]
```
##### CONFIGURATION
defaults
##### OS / ENVIRONMENT
N/A
##### STEPS TO REPRODUCE
Create a collection like this:
```yaml
namespace: alancoding
name: bug
version: 0.0.1
readme: README.md
authors:
- Alan Rominger <[email protected]>
description: A testing collection that depends on debops collection
license:
- GPL-2.0-or-later
license_file: ''
tags: []
dependencies:
softasap.redis_box: "*"
repository: https://github.com/AlanCoding/collection-dependencies-demo
documentation: https://github.com/AlanCoding/collection-dependencies-demo
homepage: https://github.com/AlanCoding/collection-dependencies-demo
issues: http://example.com/issue/tracker
build_ignore:
- target
- output*.txt
- '*.tar.gz'
```
Now build and then install it.
```
repro_bug:
rm -rf bug_debops/alancoding-bug-0.0.1.tar.gz
ansible-galaxy collection build bug_debops --output-path=bug_debops -vvv
ANSIBLE_COLLECTIONS_PATHS=bug_debops/target ansible-galaxy collection install bug_debops/alancoding-bug-0.0.1.tar.gz -f -p bug_debops/target -vvv
```
(ignore my folder name, originally I mis-attributed the error)
##### EXPECTED RESULTS
Installs. Nothing is wrong with this setup.
##### ACTUAL RESULTS
```paste below
$ make repro_bug
# rm -rf bug_debops/target
rm -rf bug_debops/alancoding-bug-0.0.1.tar.gz
ansible-galaxy collection build bug_debops --output-path=bug_debops -vvv
ansible-galaxy 2.10.0.dev0
config file = None
configured module search path = ['/Users/alancoding/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /Users/alancoding/Documents/repos/ansible/lib/ansible
executable location = /Users/alancoding/.virtualenvs/ansible3/bin/ansible-galaxy
python version = 3.6.5 (default, Apr 25 2018, 14:23:58) [GCC 4.2.1 Compatible Apple LLVM 9.1.0 (clang-902.0.39.1)]
No config file found; using defaults
Skipping '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target' for collection build
Skipping '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/galaxy.yml' for collection build
Created collection for alancoding.bug at /Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/alancoding-bug-0.0.1.tar.gz
ANSIBLE_COLLECTIONS_PATHS=bug_debops/target ansible-galaxy collection install bug_debops/alancoding-bug-0.0.1.tar.gz -f -p bug_debops/target -vvv
ansible-galaxy 2.10.0.dev0
config file = None
configured module search path = ['/Users/alancoding/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /Users/alancoding/Documents/repos/ansible/lib/ansible
executable location = /Users/alancoding/.virtualenvs/ansible3/bin/ansible-galaxy
python version = 3.6.5 (default, Apr 25 2018, 14:23:58) [GCC 4.2.1 Compatible Apple LLVM 9.1.0 (clang-902.0.39.1)]
No config file found; using defaults
Found installed collection gavinfish.azuretest:1.0.3 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/gavinfish/azuretest'
Found installed collection debops.roles03:2.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/debops/roles03'
Found installed collection debops.roles02:2.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/debops/roles02'
Found installed collection debops.debops:2.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/debops/debops'
Found installed collection debops.roles01:2.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/debops/roles01'
Found installed collection fragmentedpacket.netbox_modules:0.1.4 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/fragmentedpacket/netbox_modules'
Found installed collection alancoding.bug:0.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/alancoding/bug'
Found installed collection softasap.redis_box:0.1.0 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/softasap/redis_box'
Process install dependency map
Opened /Users/alancoding/.ansible/galaxy_token
Processing requirement collection 'bug_debops/alancoding-bug-0.0.1.tar.gz'
Processing requirement collection 'softasap.redis_box' - as dependency of alancoding.bug
Opened /Users/alancoding/.ansible/galaxy_token
Collection 'softasap.redis_box' obtained from server default https://galaxy.ansible.com/api/
ERROR! Unexpected Exception, this is probably a bug: object of type 'NoneType' has no len()
the full traceback was:
Traceback (most recent call last):
File "/Users/alancoding/Documents/repos/ansible/bin/ansible-galaxy", line 123, in <module>
exit_code = cli.run()
File "/Users/alancoding/Documents/repos/ansible/lib/ansible/cli/galaxy.py", line 456, in run
context.CLIARGS['func']()
File "/Users/alancoding/Documents/repos/ansible/lib/ansible/cli/galaxy.py", line 944, in execute_install
no_deps, force, force_deps)
File "/Users/alancoding/Documents/repos/ansible/lib/ansible/galaxy/collection.py", line 511, in install_collections
validate_certs, force, force_deps, no_deps)
File "/Users/alancoding/Documents/repos/ansible/lib/ansible/galaxy/collection.py", line 955, in _build_dependency_map
if no_deps or len(dependency_map[collection].dependencies) == 0:
TypeError: object of type 'NoneType' has no len()
make: *** [repro_bug] Error 250
```
It seems that the root of this issue is:
https://github.com/oops-to-devops/redis_box/blob/develop/galaxy.yml
```
cat bug_debops/target/ansible_collections/softasap/redis_box/MANIFEST.json
{
"collection_info": {
"description": "wraps sa-redis and installs redis server",
"repository": "https://github.com/oops-to-devops/redis_box",
"tags": [
"softasap",
"redis"
],
"dependencies": null,
"authors": [
"Vyacheslav Voronenko"
],
"issues": "https://github.com/orgs/oops-to-devops/projects/1",
"name": "redis_box",
...
```
Corresponding to:
```yaml
namespace: "softasap"
name: "redis_box"
description: wraps sa-redis and installs redis server
version: 0.2.0
readme: "Readme.md"
authors:
- "Vyacheslav Voronenko"
dependencies:
license:
- "MIT"
tags:
- softasap
- redis
repository: "https://github.com/oops-to-devops/redis_box"
documentation: "https://github.com/oops-to-devops/mariadb_box/blob/master/README.md"
homepage: "https://www.softasap.com"
issues: "https://github.com/orgs/oops-to-devops/projects/1"
```
I have confirmed that I can build this collection locally.
So Galaxy shouldn't let me build a collection with an entry that breaks dependency processing, or the install code should handle this case.
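A minimal sketch of the second option, using a hypothetical `normalize_dependencies` helper (an illustration only, not the change from the linked PR): treat a null `dependencies` value as an empty dict before anything calls `len()` on it.
```python
# Illustration only: `normalize_dependencies` is a made-up helper, not part of
# ansible-galaxy. It shows the guard that would avoid the TypeError above.
def normalize_dependencies(collection_info):
    """Return the dependencies mapping, treating a null value as 'no dependencies'."""
    deps = collection_info.get('dependencies')
    return deps if deps is not None else {}


# MANIFEST.json built from a galaxy.yml with a bare `dependencies:` key:
manifest_collection_info = {"name": "redis_box", "dependencies": None}
deps = normalize_dependencies(manifest_collection_info)
assert deps == {} and len(deps) == 0  # no longer raises TypeError
```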
|
https://github.com/ansible/ansible/issues/67574
|
https://github.com/ansible/ansible/pull/67575
|
2de4e55650d729bf8a0cb59b55f504267fa94689
|
cffead4631fa3795f66c6da700a9a46d6e95870f
| 2020-02-19T14:59:41Z |
python
| 2020-02-20T16:23:23Z |
changelogs/fragments/67574-null_collection_dependency_list.yml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,574 |
Null secondary dependencies will error collection installs
|
##### SUMMARY
If a dependency of my collection uses a specific syntax (a bare `dependencies:` key with no value) in its `galaxy.yml`, then it will break installs of my own collection.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
lib/ansible/galaxy/collection.py
lib/ansible/cli/galaxy.py
##### ANSIBLE VERSION
```paste below
$ ansible --version
ansible 2.10.0.dev0
config file = None
configured module search path = ['/Users/alancoding/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /Users/alancoding/Documents/repos/ansible/lib/ansible
executable location = /Users/alancoding/.virtualenvs/ansible3/bin/ansible
python version = 3.6.5 (default, Apr 25 2018, 14:23:58) [GCC 4.2.1 Compatible Apple LLVM 9.1.0 (clang-902.0.39.1)]
```
##### CONFIGURATION
defaults
##### OS / ENVIRONMENT
N/A
##### STEPS TO REPRODUCE
Create a collection like this:
```yaml
namespace: alancoding
name: bug
version: 0.0.1
readme: README.md
authors:
- Alan Rominger <[email protected]>
description: A testing collection that depends on debops collection
license:
- GPL-2.0-or-later
license_file: ''
tags: []
dependencies:
softasap.redis_box: "*"
repository: https://github.com/AlanCoding/collection-dependencies-demo
documentation: https://github.com/AlanCoding/collection-dependencies-demo
homepage: https://github.com/AlanCoding/collection-dependencies-demo
issues: http://example.com/issue/tracker
build_ignore:
- target
- output*.txt
- '*.tar.gz'
```
Now build and then install it.
```
repro_bug:
rm -rf bug_debops/alancoding-bug-0.0.1.tar.gz
ansible-galaxy collection build bug_debops --output-path=bug_debops -vvv
ANSIBLE_COLLECTIONS_PATHS=bug_debops/target ansible-galaxy collection install bug_debops/alancoding-bug-0.0.1.tar.gz -f -p bug_debops/target -vvv
```
(ignore my folder name, originally I mis-attributed the error)
##### EXPECTED RESULTS
Installs. Nothing is wrong with this setup.
##### ACTUAL RESULTS
```paste below
$ make repro_bug
# rm -rf bug_debops/target
rm -rf bug_debops/alancoding-bug-0.0.1.tar.gz
ansible-galaxy collection build bug_debops --output-path=bug_debops -vvv
ansible-galaxy 2.10.0.dev0
config file = None
configured module search path = ['/Users/alancoding/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /Users/alancoding/Documents/repos/ansible/lib/ansible
executable location = /Users/alancoding/.virtualenvs/ansible3/bin/ansible-galaxy
python version = 3.6.5 (default, Apr 25 2018, 14:23:58) [GCC 4.2.1 Compatible Apple LLVM 9.1.0 (clang-902.0.39.1)]
No config file found; using defaults
Skipping '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target' for collection build
Skipping '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/galaxy.yml' for collection build
Created collection for alancoding.bug at /Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/alancoding-bug-0.0.1.tar.gz
ANSIBLE_COLLECTIONS_PATHS=bug_debops/target ansible-galaxy collection install bug_debops/alancoding-bug-0.0.1.tar.gz -f -p bug_debops/target -vvv
ansible-galaxy 2.10.0.dev0
config file = None
configured module search path = ['/Users/alancoding/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /Users/alancoding/Documents/repos/ansible/lib/ansible
executable location = /Users/alancoding/.virtualenvs/ansible3/bin/ansible-galaxy
python version = 3.6.5 (default, Apr 25 2018, 14:23:58) [GCC 4.2.1 Compatible Apple LLVM 9.1.0 (clang-902.0.39.1)]
No config file found; using defaults
Found installed collection gavinfish.azuretest:1.0.3 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/gavinfish/azuretest'
Found installed collection debops.roles03:2.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/debops/roles03'
Found installed collection debops.roles02:2.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/debops/roles02'
Found installed collection debops.debops:2.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/debops/debops'
Found installed collection debops.roles01:2.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/debops/roles01'
Found installed collection fragmentedpacket.netbox_modules:0.1.4 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/fragmentedpacket/netbox_modules'
Found installed collection alancoding.bug:0.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/alancoding/bug'
Found installed collection softasap.redis_box:0.1.0 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/softasap/redis_box'
Process install dependency map
Opened /Users/alancoding/.ansible/galaxy_token
Processing requirement collection 'bug_debops/alancoding-bug-0.0.1.tar.gz'
Processing requirement collection 'softasap.redis_box' - as dependency of alancoding.bug
Opened /Users/alancoding/.ansible/galaxy_token
Collection 'softasap.redis_box' obtained from server default https://galaxy.ansible.com/api/
ERROR! Unexpected Exception, this is probably a bug: object of type 'NoneType' has no len()
the full traceback was:
Traceback (most recent call last):
File "/Users/alancoding/Documents/repos/ansible/bin/ansible-galaxy", line 123, in <module>
exit_code = cli.run()
File "/Users/alancoding/Documents/repos/ansible/lib/ansible/cli/galaxy.py", line 456, in run
context.CLIARGS['func']()
File "/Users/alancoding/Documents/repos/ansible/lib/ansible/cli/galaxy.py", line 944, in execute_install
no_deps, force, force_deps)
File "/Users/alancoding/Documents/repos/ansible/lib/ansible/galaxy/collection.py", line 511, in install_collections
validate_certs, force, force_deps, no_deps)
File "/Users/alancoding/Documents/repos/ansible/lib/ansible/galaxy/collection.py", line 955, in _build_dependency_map
if no_deps or len(dependency_map[collection].dependencies) == 0:
TypeError: object of type 'NoneType' has no len()
make: *** [repro_bug] Error 250
```
It seems that the root of this issue is:
https://github.com/oops-to-devops/redis_box/blob/develop/galaxy.yml
```
cat bug_debops/target/ansible_collections/softasap/redis_box/MANIFEST.json
{
"collection_info": {
"description": "wraps sa-redis and installs redis server",
"repository": "https://github.com/oops-to-devops/redis_box",
"tags": [
"softasap",
"redis"
],
"dependencies": null,
"authors": [
"Vyacheslav Voronenko"
],
"issues": "https://github.com/orgs/oops-to-devops/projects/1",
"name": "redis_box",
...
```
Corresponding to:
```yaml
namespace: "softasap"
name: "redis_box"
description: wraps sa-redis and installs redis server
version: 0.2.0
readme: "Readme.md"
authors:
- "Vyacheslav Voronenko"
dependencies:
license:
- "MIT"
tags:
- softasap
- redis
repository: "https://github.com/oops-to-devops/redis_box"
documentation: "https://github.com/oops-to-devops/mariadb_box/blob/master/README.md"
homepage: "https://www.softasap.com"
issues: "https://github.com/orgs/oops-to-devops/projects/1"
```
I have confirmed that I can build this collection locally.
So Galaxy shouldn't let me build a collection with an entry that breaks dependency processing, or the install code should handle this case.
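A rough sketch of what "handling it" could look like at the call site from the traceback. The names `has_no_dependencies` and `FakeRequirement` are made up for this illustration and are not ansible-galaxy code:
```python
# Illustration only: a defensive version of the `len(...dependencies) == 0`
# check from the traceback, tolerating dependencies being None.
def has_no_dependencies(requirement, no_deps=False):
    """True when there are no further dependencies to resolve for this requirement."""
    return no_deps or not (requirement.dependencies or {})


class FakeRequirement:
    # mimics a collection whose MANIFEST.json contains "dependencies": null
    dependencies = None


assert has_no_dependencies(FakeRequirement()) is True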
|
https://github.com/ansible/ansible/issues/67574
|
https://github.com/ansible/ansible/pull/67575
|
2de4e55650d729bf8a0cb59b55f504267fa94689
|
cffead4631fa3795f66c6da700a9a46d6e95870f
| 2020-02-19T14:59:41Z |
python
| 2020-02-20T16:23:23Z |
lib/ansible/galaxy/collection.py
|
# Copyright: (c) 2019, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import fnmatch
import json
import operator
import os
import shutil
import sys
import tarfile
import tempfile
import threading
import time
import yaml
from collections import namedtuple
from contextlib import contextmanager
from distutils.version import LooseVersion, StrictVersion
from hashlib import sha256
from io import BytesIO
from yaml.error import YAMLError
try:
import queue
except ImportError:
import Queue as queue # Python 2
import ansible.constants as C
from ansible.errors import AnsibleError
from ansible.galaxy import get_collections_galaxy_meta_info
from ansible.galaxy.api import CollectionVersionMetadata, GalaxyError
from ansible.galaxy.user_agent import user_agent
from ansible.module_utils import six
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.utils.collection_loader import AnsibleCollectionRef
from ansible.utils.display import Display
from ansible.utils.hashing import secure_hash, secure_hash_s
from ansible.module_utils.urls import open_url
urlparse = six.moves.urllib.parse.urlparse
urllib_error = six.moves.urllib.error
display = Display()
MANIFEST_FORMAT = 1
ModifiedContent = namedtuple('ModifiedContent', ['filename', 'expected', 'installed'])
class CollectionRequirement:
_FILE_MAPPING = [(b'MANIFEST.json', 'manifest_file'), (b'FILES.json', 'files_file')]
def __init__(self, namespace, name, b_path, api, versions, requirement, force, parent=None, metadata=None,
files=None, skip=False):
"""
Represents a collection requirement, the versions that are available to be installed as well as any
dependencies the collection has.
:param namespace: The collection namespace.
:param name: The collection name.
:param b_path: Byte str of the path to the collection tarball if it has already been downloaded.
:param api: The GalaxyAPI to use if the collection is from Galaxy.
:param versions: A list of versions of the collection that are available.
:param requirement: The version requirement string used to verify the list of versions fit the requirements.
:param force: Whether the force flag applied to the collection.
:param parent: The name of the parent the collection is a dependency of.
:param metadata: The galaxy.api.CollectionVersionMetadata that has already been retrieved from the Galaxy
server.
:param files: The files that exist inside the collection. This is based on the FILES.json file inside the
collection artifact.
:param skip: Whether to skip installing the collection. Should be set if the collection is already installed
and force is not set.
"""
self.namespace = namespace
self.name = name
self.b_path = b_path
self.api = api
self.versions = set(versions)
self.force = force
self.skip = skip
self.required_by = []
self._metadata = metadata
self._files = files
self.add_requirement(parent, requirement)
def __str__(self):
return to_native("%s.%s" % (self.namespace, self.name))
def __unicode__(self):
return u"%s.%s" % (self.namespace, self.name)
@property
def metadata(self):
self._get_metadata()
return self._metadata
@property
def latest_version(self):
try:
return max([v for v in self.versions if v != '*'], key=LooseVersion)
except ValueError: # ValueError: max() arg is an empty sequence
return '*'
@property
def dependencies(self):
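        # NOTE: when the metadata comes from a MANIFEST.json that was built from a
        # galaxy.yml with a bare `dependencies:` key, this property can return None
        # rather than a dict (see issue #67574 above), so callers should not assume
        # the value supports len() or iteration.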
if self._metadata:
return self._metadata.dependencies
elif len(self.versions) > 1:
return None
self._get_metadata()
return self._metadata.dependencies
def add_requirement(self, parent, requirement):
self.required_by.append((parent, requirement))
new_versions = set(v for v in self.versions if self._meets_requirements(v, requirement, parent))
if len(new_versions) == 0:
if self.skip:
force_flag = '--force-with-deps' if parent else '--force'
version = self.latest_version if self.latest_version != '*' else 'unknown'
msg = "Cannot meet requirement %s:%s as it is already installed at version '%s'. Use %s to overwrite" \
% (to_text(self), requirement, version, force_flag)
raise AnsibleError(msg)
elif parent is None:
msg = "Cannot meet requirement %s for dependency %s" % (requirement, to_text(self))
else:
msg = "Cannot meet dependency requirement '%s:%s' for collection %s" \
% (to_text(self), requirement, parent)
collection_source = to_text(self.b_path, nonstring='passthru') or self.api.api_server
req_by = "\n".join(
"\t%s - '%s:%s'" % (to_text(p) if p else 'base', to_text(self), r)
for p, r in self.required_by
)
versions = ", ".join(sorted(self.versions, key=LooseVersion))
raise AnsibleError(
"%s from source '%s'. Available versions before last requirement added: %s\nRequirements from:\n%s"
% (msg, collection_source, versions, req_by)
)
self.versions = new_versions
def install(self, path, b_temp_path):
if self.skip:
display.display("Skipping '%s' as it is already installed" % to_text(self))
return
        # Not already installed (or force was given), so install it now
collection_path = os.path.join(path, self.namespace, self.name)
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
display.display("Installing '%s:%s' to '%s'" % (to_text(self), self.latest_version, collection_path))
if self.b_path is None:
download_url = self._metadata.download_url
artifact_hash = self._metadata.artifact_sha256
headers = {}
self.api._add_auth_token(headers, download_url, required=False)
self.b_path = _download_file(download_url, b_temp_path, artifact_hash, self.api.validate_certs,
headers=headers)
if os.path.exists(b_collection_path):
shutil.rmtree(b_collection_path)
os.makedirs(b_collection_path)
with tarfile.open(self.b_path, mode='r') as collection_tar:
files_member_obj = collection_tar.getmember('FILES.json')
with _tarfile_extract(collection_tar, files_member_obj) as files_obj:
files = json.loads(to_text(files_obj.read(), errors='surrogate_or_strict'))
_extract_tar_file(collection_tar, 'MANIFEST.json', b_collection_path, b_temp_path)
_extract_tar_file(collection_tar, 'FILES.json', b_collection_path, b_temp_path)
for file_info in files['files']:
file_name = file_info['name']
if file_name == '.':
continue
if file_info['ftype'] == 'file':
_extract_tar_file(collection_tar, file_name, b_collection_path, b_temp_path,
expected_hash=file_info['chksum_sha256'])
else:
os.makedirs(os.path.join(b_collection_path, to_bytes(file_name, errors='surrogate_or_strict')))
def set_latest_version(self):
self.versions = set([self.latest_version])
self._get_metadata()
def verify(self, remote_collection, path, b_temp_tar_path):
if not self.skip:
display.display("'%s' has not been installed, nothing to verify" % (to_text(self)))
return
collection_path = os.path.join(path, self.namespace, self.name)
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
display.vvv("Verifying '%s:%s'." % (to_text(self), self.latest_version))
display.vvv("Installed collection found at '%s'" % collection_path)
display.vvv("Remote collection found at '%s'" % remote_collection.metadata.download_url)
# Compare installed version versus requirement version
if self.latest_version != remote_collection.latest_version:
err = "%s has the version '%s' but is being compared to '%s'" % (to_text(self), self.latest_version, remote_collection.latest_version)
display.display(err)
return
modified_content = []
# Verify the manifest hash matches before verifying the file manifest
expected_hash = _get_tar_file_hash(b_temp_tar_path, 'MANIFEST.json')
self._verify_file_hash(b_collection_path, 'MANIFEST.json', expected_hash, modified_content)
manifest = _get_json_from_tar_file(b_temp_tar_path, 'MANIFEST.json')
# Use the manifest to verify the file manifest checksum
file_manifest_data = manifest['file_manifest_file']
file_manifest_filename = file_manifest_data['name']
expected_hash = file_manifest_data['chksum_%s' % file_manifest_data['chksum_type']]
# Verify the file manifest before using it to verify individual files
self._verify_file_hash(b_collection_path, file_manifest_filename, expected_hash, modified_content)
file_manifest = _get_json_from_tar_file(b_temp_tar_path, file_manifest_filename)
# Use the file manifest to verify individual file checksums
for manifest_data in file_manifest['files']:
if manifest_data['ftype'] == 'file':
expected_hash = manifest_data['chksum_%s' % manifest_data['chksum_type']]
self._verify_file_hash(b_collection_path, manifest_data['name'], expected_hash, modified_content)
if modified_content:
display.display("Collection %s contains modified content in the following files:" % to_text(self))
display.display(to_text(self))
display.vvv(to_text(self.b_path))
for content_change in modified_content:
display.display(' %s' % content_change.filename)
display.vvv(" Expected: %s\n Found: %s" % (content_change.expected, content_change.installed))
else:
display.vvv("Successfully verified that checksums for '%s:%s' match the remote collection" % (to_text(self), self.latest_version))
def _verify_file_hash(self, b_path, filename, expected_hash, error_queue):
b_file_path = to_bytes(os.path.join(to_text(b_path), filename), errors='surrogate_or_strict')
if not os.path.isfile(b_file_path):
actual_hash = None
else:
with open(b_file_path, mode='rb') as file_object:
actual_hash = _consume_file(file_object)
if expected_hash != actual_hash:
error_queue.append(ModifiedContent(filename=filename, expected=expected_hash, installed=actual_hash))
def _get_metadata(self):
if self._metadata:
return
self._metadata = self.api.get_collection_version_metadata(self.namespace, self.name, self.latest_version)
def _meets_requirements(self, version, requirements, parent):
"""
        Supported version identifiers are '==', '!=', '>', '>=', '<', '<=' and '*'. Multiple requirements are delimited by ','.
"""
op_map = {
'!=': operator.ne,
'==': operator.eq,
'=': operator.eq,
'>=': operator.ge,
'>': operator.gt,
'<=': operator.le,
'<': operator.lt,
}
for req in list(requirements.split(',')):
op_pos = 2 if len(req) > 1 and req[1] == '=' else 1
op = op_map.get(req[:op_pos])
requirement = req[op_pos:]
if not op:
requirement = req
op = operator.eq
# In the case we are checking a new requirement on a base requirement (parent != None) we can't accept
# version as '*' (unknown version) unless the requirement is also '*'.
if parent and version == '*' and requirement != '*':
display.warning("Failed to validate the collection requirement '%s:%s' for %s when the existing "
"install does not have a version set, the collection may not work."
% (to_text(self), req, parent))
continue
elif requirement == '*' or version == '*':
continue
if not op(LooseVersion(version), LooseVersion(requirement)):
break
else:
return True
# The loop was broken early, it does not meet all the requirements
return False
@staticmethod
def from_tar(b_path, force, parent=None):
if not tarfile.is_tarfile(b_path):
raise AnsibleError("Collection artifact at '%s' is not a valid tar file." % to_native(b_path))
info = {}
with tarfile.open(b_path, mode='r') as collection_tar:
for b_member_name, property_name in CollectionRequirement._FILE_MAPPING:
n_member_name = to_native(b_member_name)
try:
member = collection_tar.getmember(n_member_name)
except KeyError:
raise AnsibleError("Collection at '%s' does not contain the required file %s."
% (to_native(b_path), n_member_name))
with _tarfile_extract(collection_tar, member) as member_obj:
try:
info[property_name] = json.loads(to_text(member_obj.read(), errors='surrogate_or_strict'))
except ValueError:
raise AnsibleError("Collection tar file member %s does not contain a valid json string."
% n_member_name)
meta = info['manifest_file']['collection_info']
files = info['files_file']['files']
namespace = meta['namespace']
name = meta['name']
version = meta['version']
meta = CollectionVersionMetadata(namespace, name, version, None, None, meta['dependencies'])
return CollectionRequirement(namespace, name, b_path, None, [version], version, force, parent=parent,
metadata=meta, files=files)
@staticmethod
def from_path(b_path, force, parent=None):
info = {}
for b_file_name, property_name in CollectionRequirement._FILE_MAPPING:
b_file_path = os.path.join(b_path, b_file_name)
if not os.path.exists(b_file_path):
continue
with open(b_file_path, 'rb') as file_obj:
try:
info[property_name] = json.loads(to_text(file_obj.read(), errors='surrogate_or_strict'))
except ValueError:
raise AnsibleError("Collection file at '%s' does not contain a valid json string."
% to_native(b_file_path))
if 'manifest_file' in info:
manifest = info['manifest_file']['collection_info']
namespace = manifest['namespace']
name = manifest['name']
version = to_text(manifest['version'], errors='surrogate_or_strict')
if not hasattr(LooseVersion(version), 'version'):
display.warning("Collection at '%s' does not have a valid version set, falling back to '*'. Found "
"version: '%s'" % (to_text(b_path), version))
version = '*'
dependencies = manifest['dependencies']
else:
display.warning("Collection at '%s' does not have a MANIFEST.json file, cannot detect version."
% to_text(b_path))
parent_dir, name = os.path.split(to_text(b_path, errors='surrogate_or_strict'))
namespace = os.path.split(parent_dir)[1]
version = '*'
dependencies = {}
meta = CollectionVersionMetadata(namespace, name, version, None, None, dependencies)
files = info.get('files_file', {}).get('files', {})
return CollectionRequirement(namespace, name, b_path, None, [version], version, force, parent=parent,
metadata=meta, files=files, skip=True)
@staticmethod
def from_name(collection, apis, requirement, force, parent=None):
namespace, name = collection.split('.', 1)
galaxy_meta = None
for api in apis:
try:
if not (requirement == '*' or requirement.startswith('<') or requirement.startswith('>') or
requirement.startswith('!=')):
if requirement.startswith('='):
requirement = requirement.lstrip('=')
resp = api.get_collection_version_metadata(namespace, name, requirement)
galaxy_meta = resp
versions = [resp.version]
else:
resp = api.get_collection_versions(namespace, name)
# Galaxy supports semver but ansible-galaxy does not. We ignore any versions that don't match
# StrictVersion (x.y.z) and only support pre-releases if an explicit version was set (done above).
versions = [v for v in resp if StrictVersion.version_re.match(v)]
except GalaxyError as err:
if err.http_code == 404:
display.vvv("Collection '%s' is not available from server %s %s"
% (collection, api.name, api.api_server))
continue
raise
display.vvv("Collection '%s' obtained from server %s %s" % (collection, api.name, api.api_server))
break
else:
raise AnsibleError("Failed to find collection %s:%s" % (collection, requirement))
req = CollectionRequirement(namespace, name, None, api, versions, requirement, force, parent=parent,
metadata=galaxy_meta)
return req
def build_collection(collection_path, output_path, force):
"""
Creates the Ansible collection artifact in a .tar.gz file.
:param collection_path: The path to the collection to build. This should be the directory that contains the
galaxy.yml file.
:param output_path: The path to create the collection build artifact. This should be a directory.
:param force: Whether to overwrite an existing collection build artifact or fail.
:return: The path to the collection build artifact.
"""
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
b_galaxy_path = os.path.join(b_collection_path, b'galaxy.yml')
if not os.path.exists(b_galaxy_path):
raise AnsibleError("The collection galaxy.yml path '%s' does not exist." % to_native(b_galaxy_path))
collection_meta = _get_galaxy_yml(b_galaxy_path)
file_manifest = _build_files_manifest(b_collection_path, collection_meta['namespace'], collection_meta['name'],
collection_meta['build_ignore'])
collection_manifest = _build_manifest(**collection_meta)
collection_output = os.path.join(output_path, "%s-%s-%s.tar.gz" % (collection_meta['namespace'],
collection_meta['name'],
collection_meta['version']))
b_collection_output = to_bytes(collection_output, errors='surrogate_or_strict')
if os.path.exists(b_collection_output):
if os.path.isdir(b_collection_output):
raise AnsibleError("The output collection artifact '%s' already exists, "
"but is a directory - aborting" % to_native(collection_output))
elif not force:
raise AnsibleError("The file '%s' already exists. You can use --force to re-create "
"the collection artifact." % to_native(collection_output))
_build_collection_tar(b_collection_path, b_collection_output, collection_manifest, file_manifest)
def publish_collection(collection_path, api, wait, timeout):
"""
Publish an Ansible collection tarball into an Ansible Galaxy server.
:param collection_path: The path to the collection tarball to publish.
:param api: A GalaxyAPI to publish the collection to.
:param wait: Whether to wait until the import process is complete.
:param timeout: The time in seconds to wait for the import process to finish, 0 is indefinite.
"""
import_uri = api.publish_collection(collection_path)
if wait:
# Galaxy returns a url fragment which differs between v2 and v3. The second to last entry is
# always the task_id, though.
# v2: {"task": "https://galaxy-dev.ansible.com/api/v2/collection-imports/35573/"}
# v3: {"task": "/api/automation-hub/v3/imports/collections/838d1308-a8f4-402c-95cb-7823f3806cd8/"}
task_id = None
for path_segment in reversed(import_uri.split('/')):
if path_segment:
task_id = path_segment
break
if not task_id:
raise AnsibleError("Publishing the collection did not return valid task info. Cannot wait for task status. Returned task info: '%s'" % import_uri)
display.display("Collection has been published to the Galaxy server %s %s" % (api.name, api.api_server))
with _display_progress():
api.wait_import_task(task_id, timeout)
display.display("Collection has been successfully published and imported to the Galaxy server %s %s"
% (api.name, api.api_server))
else:
display.display("Collection has been pushed to the Galaxy server %s %s, not waiting until import has "
"completed due to --no-wait being set. Import task results can be found at %s"
% (api.name, api.api_server, import_uri))
def install_collections(collections, output_path, apis, validate_certs, ignore_errors, no_deps, force, force_deps):
"""
Install Ansible collections to the path specified.
:param collections: The collections to install, should be a list of tuples with (name, requirement, Galaxy server).
:param output_path: The path to install the collections to.
:param apis: A list of GalaxyAPIs to query when searching for a collection.
:param validate_certs: Whether to validate the certificates if downloading a tarball.
:param ignore_errors: Whether to ignore any errors when installing the collection.
:param no_deps: Ignore any collection dependencies and only install the base requirements.
:param force: Re-install a collection if it has already been installed.
:param force_deps: Re-install a collection as well as its dependencies if they have already been installed.
"""
existing_collections = find_existing_collections(output_path)
with _tempdir() as b_temp_path:
display.display("Process install dependency map")
with _display_progress():
dependency_map = _build_dependency_map(collections, existing_collections, b_temp_path, apis,
validate_certs, force, force_deps, no_deps)
display.display("Starting collection install process")
with _display_progress():
for collection in dependency_map.values():
try:
collection.install(output_path, b_temp_path)
except AnsibleError as err:
if ignore_errors:
display.warning("Failed to install collection %s but skipping due to --ignore-errors being set. "
"Error: %s" % (to_text(collection), to_text(err)))
else:
raise
def validate_collection_name(name):
"""
Validates the collection name as an input from the user or a requirements file fit the requirements.
:param name: The input name with optional range specifier split by ':'.
:return: The input value, required for argparse validation.
"""
collection, dummy, dummy = name.partition(':')
if AnsibleCollectionRef.is_valid_collection_name(collection):
return name
raise AnsibleError("Invalid collection name '%s', "
"name must be in the format <namespace>.<collection>. \n"
"Please make sure namespace and collection name contains "
"characters from [a-zA-Z0-9_] only." % name)
def validate_collection_path(collection_path):
""" Ensure a given path ends with 'ansible_collections'
:param collection_path: The path that should end in 'ansible_collections'
:return: collection_path ending in 'ansible_collections' if it does not already.
"""
if os.path.split(collection_path)[1] != 'ansible_collections':
return os.path.join(collection_path, 'ansible_collections')
return collection_path
def verify_collections(collections, search_paths, apis, validate_certs, ignore_errors):
with _display_progress():
with _tempdir() as b_temp_path:
for collection in collections:
try:
local_collection = None
b_collection = to_bytes(collection[0], errors='surrogate_or_strict')
if os.path.isfile(b_collection) or urlparse(collection[0]).scheme.lower() in ['http', 'https'] or len(collection[0].split('.')) != 2:
raise AnsibleError(message="'%s' is not a valid collection name. The format namespace.name is expected." % collection[0])
collection_name = collection[0]
namespace, name = collection_name.split('.')
collection_version = collection[1]
# Verify local collection exists before downloading it from a galaxy server
for search_path in search_paths:
b_search_path = to_bytes(os.path.join(search_path, namespace, name), errors='surrogate_or_strict')
if os.path.isdir(b_search_path):
local_collection = CollectionRequirement.from_path(b_search_path, False)
break
if local_collection is None:
raise AnsibleError(message='Collection %s is not installed in any of the collection paths.' % collection_name)
# Download collection on a galaxy server for comparison
try:
remote_collection = CollectionRequirement.from_name(collection_name, apis, collection_version, False, parent=None)
except AnsibleError as e:
if e.message == 'Failed to find collection %s:%s' % (collection[0], collection[1]):
raise AnsibleError('Failed to find remote collection %s:%s on any of the galaxy servers' % (collection[0], collection[1]))
raise
download_url = remote_collection.metadata.download_url
headers = {}
remote_collection.api._add_auth_token(headers, download_url, required=False)
b_temp_tar_path = _download_file(download_url, b_temp_path, None, validate_certs, headers=headers)
local_collection.verify(remote_collection, search_path, b_temp_tar_path)
except AnsibleError as err:
if ignore_errors:
display.warning("Failed to verify collection %s but skipping due to --ignore-errors being set. "
"Error: %s" % (collection[0], to_text(err)))
else:
raise
@contextmanager
def _tempdir():
b_temp_path = tempfile.mkdtemp(dir=to_bytes(C.DEFAULT_LOCAL_TMP, errors='surrogate_or_strict'))
yield b_temp_path
shutil.rmtree(b_temp_path)
@contextmanager
def _tarfile_extract(tar, member):
tar_obj = tar.extractfile(member)
yield tar_obj
tar_obj.close()
@contextmanager
def _display_progress():
config_display = C.GALAXY_DISPLAY_PROGRESS
display_wheel = sys.stdout.isatty() if config_display is None else config_display
if not display_wheel:
yield
return
def progress(display_queue, actual_display):
actual_display.debug("Starting display_progress display thread")
t = threading.current_thread()
while True:
for c in "|/-\\":
actual_display.display(c + "\b", newline=False)
time.sleep(0.1)
# Drain and display any messages queued by the main thread
while True:
try:
method, args, kwargs = display_queue.get(block=False, timeout=0.1)
except queue.Empty:
break
else:
func = getattr(actual_display, method)
func(*args, **kwargs)
if getattr(t, "finish", False):
actual_display.debug("Received end signal for display_progress display thread")
return
class DisplayThread(object):
def __init__(self, display_queue):
self.display_queue = display_queue
def __getattr__(self, attr):
def call_display(*args, **kwargs):
self.display_queue.put((attr, args, kwargs))
return call_display
# Temporarily override the global display class with our own, which adds the calls to a queue for the thread to call.
global display
old_display = display
try:
display_queue = queue.Queue()
display = DisplayThread(display_queue)
t = threading.Thread(target=progress, args=(display_queue, old_display))
t.daemon = True
t.start()
try:
yield
finally:
t.finish = True
t.join()
except Exception:
# The exception is re-raised so we can be sure the thread is finished and not using the display anymore
raise
finally:
display = old_display
def _get_galaxy_yml(b_galaxy_yml_path):
meta_info = get_collections_galaxy_meta_info()
mandatory_keys = set()
string_keys = set()
list_keys = set()
dict_keys = set()
for info in meta_info:
if info.get('required', False):
mandatory_keys.add(info['key'])
key_list_type = {
'str': string_keys,
'list': list_keys,
'dict': dict_keys,
}[info.get('type', 'str')]
key_list_type.add(info['key'])
all_keys = frozenset(list(mandatory_keys) + list(string_keys) + list(list_keys) + list(dict_keys))
try:
with open(b_galaxy_yml_path, 'rb') as g_yaml:
galaxy_yml = yaml.safe_load(g_yaml)
except YAMLError as err:
raise AnsibleError("Failed to parse the galaxy.yml at '%s' with the following error:\n%s"
% (to_native(b_galaxy_yml_path), to_native(err)))
set_keys = set(galaxy_yml.keys())
missing_keys = mandatory_keys.difference(set_keys)
if missing_keys:
raise AnsibleError("The collection galaxy.yml at '%s' is missing the following mandatory keys: %s"
% (to_native(b_galaxy_yml_path), ", ".join(sorted(missing_keys))))
extra_keys = set_keys.difference(all_keys)
if len(extra_keys) > 0:
display.warning("Found unknown keys in collection galaxy.yml at '%s': %s"
% (to_text(b_galaxy_yml_path), ", ".join(extra_keys)))
# Add the defaults if they have not been set
for optional_string in string_keys:
if optional_string not in galaxy_yml:
galaxy_yml[optional_string] = None
for optional_list in list_keys:
list_val = galaxy_yml.get(optional_list, None)
if list_val is None:
galaxy_yml[optional_list] = []
elif not isinstance(list_val, list):
galaxy_yml[optional_list] = [list_val]
for optional_dict in dict_keys:
if optional_dict not in galaxy_yml:
galaxy_yml[optional_dict] = {}
# license is a builtin name in Python; to avoid confusion we rename it to license_ids
galaxy_yml['license_ids'] = galaxy_yml['license']
del galaxy_yml['license']
return galaxy_yml
def _build_files_manifest(b_collection_path, namespace, name, ignore_patterns):
# We always ignore .pyc and .retry files as well as some well known version control directories. The ignore
# patterns can be extended by the build_ignore key in galaxy.yml
b_ignore_patterns = [
b'galaxy.yml',
b'*.pyc',
b'*.retry',
b'tests/output', # Ignore ansible-test result output directory.
to_bytes('{0}-{1}-*.tar.gz'.format(namespace, name)), # Ignores previously built artifacts in the root dir.
]
b_ignore_patterns += [to_bytes(p) for p in ignore_patterns]
b_ignore_dirs = frozenset([b'CVS', b'.bzr', b'.hg', b'.git', b'.svn', b'__pycache__', b'.tox'])
entry_template = {
'name': None,
'ftype': None,
'chksum_type': None,
'chksum_sha256': None,
'format': MANIFEST_FORMAT
}
manifest = {
'files': [
{
'name': '.',
'ftype': 'dir',
'chksum_type': None,
'chksum_sha256': None,
'format': MANIFEST_FORMAT,
},
],
'format': MANIFEST_FORMAT,
}
def _walk(b_path, b_top_level_dir):
for b_item in os.listdir(b_path):
b_abs_path = os.path.join(b_path, b_item)
b_rel_base_dir = b'' if b_path == b_top_level_dir else b_path[len(b_top_level_dir) + 1:]
b_rel_path = os.path.join(b_rel_base_dir, b_item)
rel_path = to_text(b_rel_path, errors='surrogate_or_strict')
if os.path.isdir(b_abs_path):
if any(b_item == b_path for b_path in b_ignore_dirs) or \
any(fnmatch.fnmatch(b_rel_path, b_pattern) for b_pattern in b_ignore_patterns):
display.vvv("Skipping '%s' for collection build" % to_text(b_abs_path))
continue
if os.path.islink(b_abs_path):
b_link_target = os.path.realpath(b_abs_path)
if not b_link_target.startswith(b_top_level_dir):
display.warning("Skipping '%s' as it is a symbolic link to a directory outside the collection"
% to_text(b_abs_path))
continue
manifest_entry = entry_template.copy()
manifest_entry['name'] = rel_path
manifest_entry['ftype'] = 'dir'
manifest['files'].append(manifest_entry)
_walk(b_abs_path, b_top_level_dir)
else:
if any(fnmatch.fnmatch(b_rel_path, b_pattern) for b_pattern in b_ignore_patterns):
display.vvv("Skipping '%s' for collection build" % to_text(b_abs_path))
continue
manifest_entry = entry_template.copy()
manifest_entry['name'] = rel_path
manifest_entry['ftype'] = 'file'
manifest_entry['chksum_type'] = 'sha256'
manifest_entry['chksum_sha256'] = secure_hash(b_abs_path, hash_func=sha256)
manifest['files'].append(manifest_entry)
_walk(b_collection_path, b_collection_path)
return manifest
def _build_manifest(namespace, name, version, authors, readme, tags, description, license_ids, license_file,
dependencies, repository, documentation, homepage, issues, **kwargs):
manifest = {
'collection_info': {
'namespace': namespace,
'name': name,
'version': version,
'authors': authors,
'readme': readme,
'tags': tags,
'description': description,
'license': license_ids,
'license_file': license_file if license_file else None, # Handle galaxy.yml having an empty string (None)
'dependencies': dependencies,
'repository': repository,
'documentation': documentation,
'homepage': homepage,
'issues': issues,
},
'file_manifest_file': {
'name': 'FILES.json',
'ftype': 'file',
'chksum_type': 'sha256',
'chksum_sha256': None, # Filled out in _build_collection_tar
'format': MANIFEST_FORMAT
},
'format': MANIFEST_FORMAT,
}
return manifest
def _build_collection_tar(b_collection_path, b_tar_path, collection_manifest, file_manifest):
files_manifest_json = to_bytes(json.dumps(file_manifest, indent=True), errors='surrogate_or_strict')
collection_manifest['file_manifest_file']['chksum_sha256'] = secure_hash_s(files_manifest_json, hash_func=sha256)
collection_manifest_json = to_bytes(json.dumps(collection_manifest, indent=True), errors='surrogate_or_strict')
with _tempdir() as b_temp_path:
b_tar_filepath = os.path.join(b_temp_path, os.path.basename(b_tar_path))
with tarfile.open(b_tar_filepath, mode='w:gz') as tar_file:
# Add the MANIFEST.json and FILES.json file to the archive
for name, b in [('MANIFEST.json', collection_manifest_json), ('FILES.json', files_manifest_json)]:
b_io = BytesIO(b)
tar_info = tarfile.TarInfo(name)
tar_info.size = len(b)
tar_info.mtime = time.time()
tar_info.mode = 0o0644
tar_file.addfile(tarinfo=tar_info, fileobj=b_io)
for file_info in file_manifest['files']:
if file_info['name'] == '.':
continue
# arcname expects a native string, cannot be bytes
filename = to_native(file_info['name'], errors='surrogate_or_strict')
b_src_path = os.path.join(b_collection_path, to_bytes(filename, errors='surrogate_or_strict'))
def reset_stat(tarinfo):
tarinfo.mode = 0o0755 if tarinfo.isdir() else 0o0644
tarinfo.uid = tarinfo.gid = 0
tarinfo.uname = tarinfo.gname = ''
return tarinfo
tar_file.add(os.path.realpath(b_src_path), arcname=filename, recursive=False, filter=reset_stat)
shutil.copy(b_tar_filepath, b_tar_path)
collection_name = "%s.%s" % (collection_manifest['collection_info']['namespace'],
collection_manifest['collection_info']['name'])
display.display('Created collection for %s at %s' % (collection_name, to_text(b_tar_path)))
def find_existing_collections(path):
collections = []
b_path = to_bytes(path, errors='surrogate_or_strict')
for b_namespace in os.listdir(b_path):
b_namespace_path = os.path.join(b_path, b_namespace)
if os.path.isfile(b_namespace_path):
continue
for b_collection in os.listdir(b_namespace_path):
b_collection_path = os.path.join(b_namespace_path, b_collection)
if os.path.isdir(b_collection_path):
req = CollectionRequirement.from_path(b_collection_path, False)
display.vvv("Found installed collection %s:%s at '%s'" % (to_text(req), req.latest_version,
to_text(b_collection_path)))
collections.append(req)
return collections
def _build_dependency_map(collections, existing_collections, b_temp_path, apis, validate_certs, force, force_deps,
no_deps):
dependency_map = {}
# First build the dependency map on the actual requirements
for name, version, source in collections:
_get_collection_info(dependency_map, existing_collections, name, version, source, b_temp_path, apis,
validate_certs, (force or force_deps))
checked_parents = set([to_text(c) for c in dependency_map.values() if c.skip])
while len(dependency_map) != len(checked_parents):
while not no_deps: # Only parse dependencies if no_deps was not set
parents_to_check = set(dependency_map.keys()).difference(checked_parents)
deps_exhausted = True
for parent in parents_to_check:
parent_info = dependency_map[parent]
if parent_info.dependencies:
deps_exhausted = False
for dep_name, dep_requirement in parent_info.dependencies.items():
_get_collection_info(dependency_map, existing_collections, dep_name, dep_requirement,
parent_info.api, b_temp_path, apis, validate_certs, force_deps,
parent=parent)
checked_parents.add(parent)
# No extra dependencies were resolved, exit loop
if deps_exhausted:
break
# Now that we have resolved the deps to the best extent possible, select the latest version for collections with
# multiple versions found and go from there
deps_not_checked = set(dependency_map.keys()).difference(checked_parents)
for collection in deps_not_checked:
dependency_map[collection].set_latest_version()
if no_deps or len(dependency_map[collection].dependencies) == 0:
checked_parents.add(collection)
return dependency_map
def _get_collection_info(dep_map, existing_collections, collection, requirement, source, b_temp_path, apis,
validate_certs, force, parent=None):
dep_msg = ""
if parent:
dep_msg = " - as dependency of %s" % parent
display.vvv("Processing requirement collection '%s'%s" % (to_text(collection), dep_msg))
b_tar_path = None
if os.path.isfile(to_bytes(collection, errors='surrogate_or_strict')):
display.vvvv("Collection requirement '%s' is a tar artifact" % to_text(collection))
b_tar_path = to_bytes(collection, errors='surrogate_or_strict')
elif urlparse(collection).scheme.lower() in ['http', 'https']:
display.vvvv("Collection requirement '%s' is a URL to a tar artifact" % collection)
try:
b_tar_path = _download_file(collection, b_temp_path, None, validate_certs)
except urllib_error.URLError as err:
raise AnsibleError("Failed to download collection tar from '%s': %s"
% (to_native(collection), to_native(err)))
if b_tar_path:
req = CollectionRequirement.from_tar(b_tar_path, force, parent=parent)
collection_name = to_text(req)
if collection_name in dep_map:
collection_info = dep_map[collection_name]
collection_info.add_requirement(None, req.latest_version)
else:
collection_info = req
else:
validate_collection_name(collection)
display.vvvv("Collection requirement '%s' is the name of a collection" % collection)
if collection in dep_map:
collection_info = dep_map[collection]
collection_info.add_requirement(parent, requirement)
else:
apis = [source] if source else apis
collection_info = CollectionRequirement.from_name(collection, apis, requirement, force, parent=parent)
existing = [c for c in existing_collections if to_text(c) == to_text(collection_info)]
if existing and not collection_info.force:
# Test that the installed collection fits the requirement
existing[0].add_requirement(parent, requirement)
collection_info = existing[0]
dep_map[to_text(collection_info)] = collection_info
def _download_file(url, b_path, expected_hash, validate_certs, headers=None):
bufsize = 65536
digest = sha256()
urlsplit = os.path.splitext(to_text(url.rsplit('/', 1)[1]))
b_file_name = to_bytes(urlsplit[0], errors='surrogate_or_strict')
b_file_ext = to_bytes(urlsplit[1], errors='surrogate_or_strict')
b_file_path = tempfile.NamedTemporaryFile(dir=b_path, prefix=b_file_name, suffix=b_file_ext, delete=False).name
display.vvv("Downloading %s to %s" % (url, to_text(b_path)))
# Galaxy redirects downloads to S3, which rejects the request if an Authorization header is attached, so don't send that header on the redirect
resp = open_url(to_native(url, errors='surrogate_or_strict'), validate_certs=validate_certs, headers=headers,
unredirected_headers=['Authorization'], http_agent=user_agent())
with open(b_file_path, 'wb') as download_file:
actual_hash = _consume_file(resp, download_file)
if expected_hash:
display.vvvv("Validating downloaded file hash %s with expected hash %s" % (actual_hash, expected_hash))
if expected_hash != actual_hash:
raise AnsibleError("Mismatch artifact hash with downloaded file")
return b_file_path
def _extract_tar_file(tar, filename, b_dest, b_temp_path, expected_hash=None):
with _get_tar_file_member(tar, filename) as tar_obj:
with tempfile.NamedTemporaryFile(dir=b_temp_path, delete=False) as tmpfile_obj:
actual_hash = _consume_file(tar_obj, tmpfile_obj)
if expected_hash and actual_hash != expected_hash:
raise AnsibleError("Checksum mismatch for '%s' inside collection at '%s'"
% (to_native(filename, errors='surrogate_or_strict'), to_native(tar.name)))
b_dest_filepath = os.path.join(b_dest, to_bytes(filename, errors='surrogate_or_strict'))
b_parent_dir = os.path.split(b_dest_filepath)[0]
if not os.path.exists(b_parent_dir):
# Galaxy does not seem to validate that every file entry has a corresponding dir ftype entry. This check
# makes sure we create the parent directory even if it was not listed in the metadata.
os.makedirs(b_parent_dir)
shutil.move(to_bytes(tmpfile_obj.name, errors='surrogate_or_strict'), b_dest_filepath)
def _get_tar_file_member(tar, filename):
n_filename = to_native(filename, errors='surrogate_or_strict')
try:
member = tar.getmember(n_filename)
except KeyError:
raise AnsibleError("Collection tar at '%s' does not contain the expected file '%s'." % (
to_native(tar.name),
n_filename))
return _tarfile_extract(tar, member)
def _get_json_from_tar_file(b_path, filename):
file_contents = ''
with tarfile.open(b_path, mode='r') as collection_tar:
with _get_tar_file_member(collection_tar, filename) as tar_obj:
bufsize = 65536
data = tar_obj.read(bufsize)
while data:
file_contents += to_text(data)
data = tar_obj.read(bufsize)
return json.loads(file_contents)
def _get_tar_file_hash(b_path, filename):
with tarfile.open(b_path, mode='r') as collection_tar:
with _get_tar_file_member(collection_tar, filename) as tar_obj:
return _consume_file(tar_obj)
def _consume_file(read_from, write_to=None):
bufsize = 65536
sha256_digest = sha256()
data = read_from.read(bufsize)
while data:
if write_to is not None:
write_to.write(data)
write_to.flush()
sha256_digest.update(data)
data = read_from.read(bufsize)
return sha256_digest.hexdigest()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,574 |
Null secondary dependencies will error collection installs
|
##### SUMMARY
If a dependency of my collection uses a specific syntax in its `galaxy.yml` (an empty `dependencies:` key), then it will break installs of my own collection.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
lib/ansible/galaxy/collection.py
lib/ansible/cli/galaxy.py
##### ANSIBLE VERSION
```paste below
$ ansible --version
ansible 2.10.0.dev0
config file = None
configured module search path = ['/Users/alancoding/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /Users/alancoding/Documents/repos/ansible/lib/ansible
executable location = /Users/alancoding/.virtualenvs/ansible3/bin/ansible
python version = 3.6.5 (default, Apr 25 2018, 14:23:58) [GCC 4.2.1 Compatible Apple LLVM 9.1.0 (clang-902.0.39.1)]
```
##### CONFIGURATION
defaults
##### OS / ENVIRONMENT
N/A
##### STEPS TO REPRODUCE
Create a collection like this:
```yaml
namespace: alancoding
name: bug
version: 0.0.1
readme: README.md
authors:
- Alan Rominger <[email protected]>
description: A testing collection that depends on debops collection
license:
- GPL-2.0-or-later
license_file: ''
tags: []
dependencies:
softasap.redis_box: "*"
repository: https://github.com/AlanCoding/collection-dependencies-demo
documentation: https://github.com/AlanCoding/collection-dependencies-demo
homepage: https://github.com/AlanCoding/collection-dependencies-demo
issues: http://example.com/issue/tracker
build_ignore:
- target
- output*.txt
- '*.tar.gz'
```
Now build and then install it.
```
repro_bug:
rm -rf bug_debops/alancoding-bug-0.0.1.tar.gz
ansible-galaxy collection build bug_debops --output-path=bug_debops -vvv
ANSIBLE_COLLECTIONS_PATHS=bug_debops/target ansible-galaxy collection install bug_debops/alancoding-bug-0.0.1.tar.gz -f -p bug_debops/target -vvv
```
(ignore my folder name, originally I mis-attributed the error)
##### EXPECTED RESULTS
Installs. Nothing is wrong with this setup.
##### ACTUAL RESULTS
```paste below
$ make repro_bug
# rm -rf bug_debops/target
rm -rf bug_debops/alancoding-bug-0.0.1.tar.gz
ansible-galaxy collection build bug_debops --output-path=bug_debops -vvv
ansible-galaxy 2.10.0.dev0
config file = None
configured module search path = ['/Users/alancoding/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /Users/alancoding/Documents/repos/ansible/lib/ansible
executable location = /Users/alancoding/.virtualenvs/ansible3/bin/ansible-galaxy
python version = 3.6.5 (default, Apr 25 2018, 14:23:58) [GCC 4.2.1 Compatible Apple LLVM 9.1.0 (clang-902.0.39.1)]
No config file found; using defaults
Skipping '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target' for collection build
Skipping '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/galaxy.yml' for collection build
Created collection for alancoding.bug at /Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/alancoding-bug-0.0.1.tar.gz
ANSIBLE_COLLECTIONS_PATHS=bug_debops/target ansible-galaxy collection install bug_debops/alancoding-bug-0.0.1.tar.gz -f -p bug_debops/target -vvv
ansible-galaxy 2.10.0.dev0
config file = None
configured module search path = ['/Users/alancoding/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /Users/alancoding/Documents/repos/ansible/lib/ansible
executable location = /Users/alancoding/.virtualenvs/ansible3/bin/ansible-galaxy
python version = 3.6.5 (default, Apr 25 2018, 14:23:58) [GCC 4.2.1 Compatible Apple LLVM 9.1.0 (clang-902.0.39.1)]
No config file found; using defaults
Found installed collection gavinfish.azuretest:1.0.3 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/gavinfish/azuretest'
Found installed collection debops.roles03:2.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/debops/roles03'
Found installed collection debops.roles02:2.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/debops/roles02'
Found installed collection debops.debops:2.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/debops/debops'
Found installed collection debops.roles01:2.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/debops/roles01'
Found installed collection fragmentedpacket.netbox_modules:0.1.4 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/fragmentedpacket/netbox_modules'
Found installed collection alancoding.bug:0.0.1 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/alancoding/bug'
Found installed collection softasap.redis_box:0.1.0 at '/Users/alancoding/Documents/repos/collection-dependencies-demo/bug_debops/target/ansible_collections/softasap/redis_box'
Process install dependency map
Opened /Users/alancoding/.ansible/galaxy_token
Processing requirement collection 'bug_debops/alancoding-bug-0.0.1.tar.gz'
Processing requirement collection 'softasap.redis_box' - as dependency of alancoding.bug
Opened /Users/alancoding/.ansible/galaxy_token
Collection 'softasap.redis_box' obtained from server default https://galaxy.ansible.com/api/
ERROR! Unexpected Exception, this is probably a bug: object of type 'NoneType' has no len()
the full traceback was:
Traceback (most recent call last):
File "/Users/alancoding/Documents/repos/ansible/bin/ansible-galaxy", line 123, in <module>
exit_code = cli.run()
File "/Users/alancoding/Documents/repos/ansible/lib/ansible/cli/galaxy.py", line 456, in run
context.CLIARGS['func']()
File "/Users/alancoding/Documents/repos/ansible/lib/ansible/cli/galaxy.py", line 944, in execute_install
no_deps, force, force_deps)
File "/Users/alancoding/Documents/repos/ansible/lib/ansible/galaxy/collection.py", line 511, in install_collections
validate_certs, force, force_deps, no_deps)
File "/Users/alancoding/Documents/repos/ansible/lib/ansible/galaxy/collection.py", line 955, in _build_dependency_map
if no_deps or len(dependency_map[collection].dependencies) == 0:
TypeError: object of type 'NoneType' has no len()
make: *** [repro_bug] Error 250
```
It seems that the root of this issue is:
https://github.com/oops-to-devops/redis_box/blob/develop/galaxy.yml
```
cat bug_debops/target/ansible_collections/softasap/redis_box/MANIFEST.json
{
"collection_info": {
"description": "wraps sa-redis and installs redis server",
"repository": "https://github.com/oops-to-devops/redis_box",
"tags": [
"softasap",
"redis"
],
"dependencies": null,
"authors": [
"Vyacheslav Voronenko"
],
"issues": "https://github.com/orgs/oops-to-devops/projects/1",
"name": "redis_box",
...
```
Corresponding to:
```yaml
namespace: "softasap"
name: "redis_box"
description: wraps sa-redis and installs redis server
version: 0.2.0
readme: "Readme.md"
authors:
- "Vyacheslav Voronenko"
dependencies:
license:
- "MIT"
tags:
- softasap
- redis
repository: "https://github.com/oops-to-devops/redis_box"
documentation: "https://github.com/oops-to-devops/mariadb_box/blob/master/README.md"
homepage: "https://www.softasap.com"
issues: "https://github.com/orgs/oops-to-devops/projects/1"
```
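For reference, PyYAML loads a key with no value as `None`, not as an empty dict, which is what later trips up the `len()` call in `_build_dependency_map`. A minimal standalone demonstration (not part of the reproduction itself):
```python
import yaml

# A bare "dependencies:" key has no value, so YAML parses it as null/None.
meta = yaml.safe_load("dependencies:\nlicense:\n  - MIT\n")
print(meta['dependencies'])              # None, not {}
print(len(meta['dependencies'] or {}))   # 0 once normalised to an empty dict
```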
I have confirmed that I can build this collection locally.
So Galaxy shouldn't let me build a collection with an entry that breaks dependency processing, or the installer should handle this case.
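One possible guard (a sketch only, under the assumption that normalising the metadata is acceptable; this is not necessarily how the actual fix in the linked PR was implemented) would be to treat dict-typed keys that YAML loaded as null the same way `_get_galaxy_yml` already treats keys that are missing entirely:
```python
# Hypothetical helper; the name and call site are illustrative, not from the codebase.
def normalise_collection_meta(metadata, dict_keys=('dependencies',)):
    """Coerce dict-typed keys that were loaded as null into empty dicts."""
    for key in dict_keys:
        if metadata.get(key) is None:
            metadata[key] = {}
    return metadata


info = {'namespace': 'softasap', 'name': 'redis_box', 'dependencies': None}
assert normalise_collection_meta(info)['dependencies'] == {}
```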
|
https://github.com/ansible/ansible/issues/67574
|
https://github.com/ansible/ansible/pull/67575
|
2de4e55650d729bf8a0cb59b55f504267fa94689
|
cffead4631fa3795f66c6da700a9a46d6e95870f
| 2020-02-19T14:59:41Z |
python
| 2020-02-20T16:23:23Z |
test/units/galaxy/test_collection_install.py
|
# -*- coding: utf-8 -*-
# Copyright: (c) 2019, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import copy
import json
import os
import pytest
import re
import shutil
import tarfile
import yaml
from io import BytesIO, StringIO
from units.compat.mock import MagicMock
import ansible.module_utils.six.moves.urllib.error as urllib_error
from ansible import context
from ansible.cli.galaxy import GalaxyCLI
from ansible.errors import AnsibleError
from ansible.galaxy import collection, api
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.utils import context_objects as co
from ansible.utils.display import Display
def call_galaxy_cli(args):
orig = co.GlobalCLIArgs._Singleton__instance
co.GlobalCLIArgs._Singleton__instance = None
try:
GalaxyCLI(args=['ansible-galaxy', 'collection'] + args).run()
finally:
co.GlobalCLIArgs._Singleton__instance = orig
def artifact_json(namespace, name, version, dependencies, server):
json_str = json.dumps({
'artifact': {
'filename': '%s-%s-%s.tar.gz' % (namespace, name, version),
'sha256': '2d76f3b8c4bab1072848107fb3914c345f71a12a1722f25c08f5d3f51f4ab5fd',
'size': 1234,
},
'download_url': '%s/download/%s-%s-%s.tar.gz' % (server, namespace, name, version),
'metadata': {
'namespace': namespace,
'name': name,
'dependencies': dependencies,
},
'version': version
})
return to_text(json_str)
def artifact_versions_json(namespace, name, versions, galaxy_api, available_api_versions=None):
results = []
available_api_versions = available_api_versions or {}
api_version = 'v2'
if 'v3' in available_api_versions:
api_version = 'v3'
for version in versions:
results.append({
'href': '%s/api/%s/%s/%s/versions/%s/' % (galaxy_api.api_server, api_version, namespace, name, version),
'version': version,
})
if api_version == 'v2':
json_str = json.dumps({
'count': len(versions),
'next': None,
'previous': None,
'results': results
})
if api_version == 'v3':
response = {'meta': {'count': len(versions)},
'data': results,
'links': {'first': None,
'last': None,
'next': None,
'previous': None},
}
json_str = json.dumps(response)
return to_text(json_str)
def error_json(galaxy_api, errors_to_return=None, available_api_versions=None):
errors_to_return = errors_to_return or []
available_api_versions = available_api_versions or {}
response = {}
api_version = 'v2'
if 'v3' in available_api_versions:
api_version = 'v3'
if api_version == 'v2':
assert len(errors_to_return) <= 1
if errors_to_return:
response = errors_to_return[0]
if api_version == 'v3':
response['errors'] = errors_to_return
json_str = json.dumps(response)
return to_text(json_str)
@pytest.fixture(autouse='function')
def reset_cli_args():
co.GlobalCLIArgs._Singleton__instance = None
yield
co.GlobalCLIArgs._Singleton__instance = None
@pytest.fixture()
def collection_artifact(request, tmp_path_factory):
test_dir = to_text(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
namespace = 'ansible_namespace'
collection = 'collection'
skeleton_path = os.path.join(os.path.dirname(os.path.split(__file__)[0]), 'cli', 'test_data', 'collection_skeleton')
collection_path = os.path.join(test_dir, namespace, collection)
call_galaxy_cli(['init', '%s.%s' % (namespace, collection), '-c', '--init-path', test_dir,
'--collection-skeleton', skeleton_path])
dependencies = getattr(request, 'param', None)
if dependencies:
galaxy_yml = os.path.join(collection_path, 'galaxy.yml')
with open(galaxy_yml, 'rb+') as galaxy_obj:
existing_yaml = yaml.safe_load(galaxy_obj)
existing_yaml['dependencies'] = dependencies
galaxy_obj.seek(0)
galaxy_obj.write(to_bytes(yaml.safe_dump(existing_yaml)))
galaxy_obj.truncate()
call_galaxy_cli(['build', collection_path, '--output-path', test_dir])
collection_tar = os.path.join(test_dir, '%s-%s-0.1.0.tar.gz' % (namespace, collection))
return to_bytes(collection_path), to_bytes(collection_tar)
@pytest.fixture()
def galaxy_server():
context.CLIARGS._store = {'ignore_certs': False}
galaxy_api = api.GalaxyAPI(None, 'test_server', 'https://galaxy.ansible.com')
return galaxy_api
def test_build_requirement_from_path(collection_artifact):
actual = collection.CollectionRequirement.from_path(collection_artifact[0], True)
assert actual.namespace == u'ansible_namespace'
assert actual.name == u'collection'
assert actual.b_path == collection_artifact[0]
assert actual.api is None
assert actual.skip is True
assert actual.versions == set([u'*'])
assert actual.latest_version == u'*'
assert actual.dependencies == {}
@pytest.mark.parametrize('version', ['1.1.1', 1.1, 1])
def test_build_requirement_from_path_with_manifest(version, collection_artifact):
manifest_path = os.path.join(collection_artifact[0], b'MANIFEST.json')
manifest_value = json.dumps({
'collection_info': {
'namespace': 'namespace',
'name': 'name',
'version': version,
'dependencies': {
'ansible_namespace.collection': '*'
}
}
})
with open(manifest_path, 'wb') as manifest_obj:
manifest_obj.write(to_bytes(manifest_value))
actual = collection.CollectionRequirement.from_path(collection_artifact[0], True)
# While the folder name suggests a different collection, we treat MANIFEST.json as the source of truth.
assert actual.namespace == u'namespace'
assert actual.name == u'name'
assert actual.b_path == collection_artifact[0]
assert actual.api is None
assert actual.skip is True
assert actual.versions == set([to_text(version)])
assert actual.latest_version == to_text(version)
assert actual.dependencies == {'ansible_namespace.collection': '*'}
def test_build_requirement_from_path_invalid_manifest(collection_artifact):
manifest_path = os.path.join(collection_artifact[0], b'MANIFEST.json')
with open(manifest_path, 'wb') as manifest_obj:
manifest_obj.write(b"not json")
expected = "Collection file at '%s' does not contain a valid json string." % to_native(manifest_path)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_path(collection_artifact[0], True)
def test_build_requirement_from_path_no_version(collection_artifact, monkeypatch):
manifest_path = os.path.join(collection_artifact[0], b'MANIFEST.json')
manifest_value = json.dumps({
'collection_info': {
'namespace': 'namespace',
'name': 'name',
'version': '',
'dependencies': {}
}
})
with open(manifest_path, 'wb') as manifest_obj:
manifest_obj.write(to_bytes(manifest_value))
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
actual = collection.CollectionRequirement.from_path(collection_artifact[0], True)
# While the folder name suggests a different collection, we treat MANIFEST.json as the source of truth.
assert actual.namespace == u'namespace'
assert actual.name == u'name'
assert actual.b_path == collection_artifact[0]
assert actual.api is None
assert actual.skip is True
assert actual.versions == set(['*'])
assert actual.latest_version == u'*'
assert actual.dependencies == {}
assert mock_display.call_count == 1
actual_warn = ' '.join(mock_display.mock_calls[0][1][0].split('\n'))
expected_warn = "Collection at '%s' does not have a valid version set, falling back to '*'. Found version: ''" \
% to_text(collection_artifact[0])
assert expected_warn in actual_warn
def test_build_requirement_from_tar(collection_artifact):
actual = collection.CollectionRequirement.from_tar(collection_artifact[1], True, True)
assert actual.namespace == u'ansible_namespace'
assert actual.name == u'collection'
assert actual.b_path == collection_artifact[1]
assert actual.api is None
assert actual.skip is False
assert actual.versions == set([u'0.1.0'])
assert actual.latest_version == u'0.1.0'
assert actual.dependencies == {}
def test_build_requirement_from_tar_fail_not_tar(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
test_file = os.path.join(test_dir, b'fake.tar.gz')
with open(test_file, 'wb') as test_obj:
test_obj.write(b"\x00\x01\x02\x03")
expected = "Collection artifact at '%s' is not a valid tar file." % to_native(test_file)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(test_file, True, True)
def test_build_requirement_from_tar_no_manifest(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
json_data = to_bytes(json.dumps(
{
'files': [],
'format': 1,
}
))
tar_path = os.path.join(test_dir, b'ansible-collections.tar.gz')
with tarfile.open(tar_path, 'w:gz') as tfile:
b_io = BytesIO(json_data)
tar_info = tarfile.TarInfo('FILES.json')
tar_info.size = len(json_data)
tar_info.mode = 0o0644
tfile.addfile(tarinfo=tar_info, fileobj=b_io)
expected = "Collection at '%s' does not contain the required file MANIFEST.json." % to_native(tar_path)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(tar_path, True, True)
def test_build_requirement_from_tar_no_files(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
json_data = to_bytes(json.dumps(
{
'collection_info': {},
}
))
tar_path = os.path.join(test_dir, b'ansible-collections.tar.gz')
with tarfile.open(tar_path, 'w:gz') as tfile:
b_io = BytesIO(json_data)
tar_info = tarfile.TarInfo('MANIFEST.json')
tar_info.size = len(json_data)
tar_info.mode = 0o0644
tfile.addfile(tarinfo=tar_info, fileobj=b_io)
expected = "Collection at '%s' does not contain the required file FILES.json." % to_native(tar_path)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(tar_path, True, True)
def test_build_requirement_from_tar_invalid_manifest(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
json_data = b"not a json"
tar_path = os.path.join(test_dir, b'ansible-collections.tar.gz')
with tarfile.open(tar_path, 'w:gz') as tfile:
b_io = BytesIO(json_data)
tar_info = tarfile.TarInfo('MANIFEST.json')
tar_info.size = len(json_data)
tar_info.mode = 0o0644
tfile.addfile(tarinfo=tar_info, fileobj=b_io)
expected = "Collection tar file member MANIFEST.json does not contain a valid json string."
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(tar_path, True, True)
def test_build_requirement_from_name(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['2.1.9', '2.1.10']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '*', True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.1.9', u'2.1.10'])
assert actual.latest_version == u'2.1.10'
assert actual.dependencies is None
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
def test_build_requirement_from_name_with_prerelease(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['1.0.1', '2.0.1-beta.1', '2.0.1']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '*', True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'1.0.1', u'2.0.1'])
assert actual.latest_version == u'2.0.1'
assert actual.dependencies is None
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
def test_build_requirment_from_name_with_prerelease_explicit(galaxy_server, monkeypatch):
mock_get_info = MagicMock()
mock_get_info.return_value = api.CollectionVersionMetadata('namespace', 'collection', '2.0.1-beta.1', None, None,
{})
monkeypatch.setattr(galaxy_server, 'get_collection_version_metadata', mock_get_info)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '2.0.1-beta.1', True,
True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.1-beta.1'])
assert actual.latest_version == u'2.0.1-beta.1'
assert actual.dependencies == {}
assert mock_get_info.call_count == 1
assert mock_get_info.mock_calls[0][1] == ('namespace', 'collection', '2.0.1-beta.1')
def test_build_requirement_from_name_second_server(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['1.0.1', '1.0.2', '1.0.3']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
broken_server = copy.copy(galaxy_server)
broken_server.api_server = 'https://broken.com/'
mock_404 = MagicMock()
mock_404.side_effect = api.GalaxyError(urllib_error.HTTPError('https://galaxy.server.com', 404, 'msg', {},
StringIO()), "custom msg")
monkeypatch.setattr(broken_server, 'get_collection_versions', mock_404)
actual = collection.CollectionRequirement.from_name('namespace.collection', [broken_server, galaxy_server],
'>1.0.1', False, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
# assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'1.0.2', u'1.0.3'])
assert actual.latest_version == u'1.0.3'
assert actual.dependencies is None
assert mock_404.call_count == 1
assert mock_404.mock_calls[0][1] == ('namespace', 'collection')
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
def test_build_requirement_from_name_missing(galaxy_server, monkeypatch):
mock_open = MagicMock()
mock_open.side_effect = api.GalaxyError(urllib_error.HTTPError('https://galaxy.server.com', 404, 'msg', {},
StringIO()), "")
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_open)
expected = "Failed to find collection namespace.collection:*"
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server, galaxy_server], '*', False,
True)
def test_build_requirement_from_name_401_unauthorized(galaxy_server, monkeypatch):
mock_open = MagicMock()
mock_open.side_effect = api.GalaxyError(urllib_error.HTTPError('https://galaxy.server.com', 401, 'msg', {},
StringIO()), "error")
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_open)
expected = "error (HTTP Code: 401, Message: msg)"
with pytest.raises(api.GalaxyError, match=re.escape(expected)):
collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server, galaxy_server], '*', False)
def test_build_requirement_from_name_single_version(galaxy_server, monkeypatch):
mock_get_info = MagicMock()
mock_get_info.return_value = api.CollectionVersionMetadata('namespace', 'collection', '2.0.0', None, None,
{})
monkeypatch.setattr(galaxy_server, 'get_collection_version_metadata', mock_get_info)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '2.0.0', True,
True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.0'])
assert actual.latest_version == u'2.0.0'
assert actual.dependencies == {}
assert mock_get_info.call_count == 1
assert mock_get_info.mock_calls[0][1] == ('namespace', 'collection', '2.0.0')
def test_build_requirement_from_name_multiple_versions_one_match(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['2.0.0', '2.0.1', '2.0.2']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
mock_get_info = MagicMock()
mock_get_info.return_value = api.CollectionVersionMetadata('namespace', 'collection', '2.0.1', None, None,
{})
monkeypatch.setattr(galaxy_server, 'get_collection_version_metadata', mock_get_info)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '>=2.0.1,<2.0.2',
True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.1'])
assert actual.latest_version == u'2.0.1'
assert actual.dependencies == {}
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
assert mock_get_info.call_count == 1
assert mock_get_info.mock_calls[0][1] == ('namespace', 'collection', '2.0.1')
def test_build_requirement_from_name_multiple_version_results(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['2.0.0', '2.0.1', '2.0.2', '2.0.3', '2.0.4', '2.0.5']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '!=2.0.2',
True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.0', u'2.0.1', u'2.0.3', u'2.0.4', u'2.0.5'])
assert actual.latest_version == u'2.0.5'
assert actual.dependencies is None
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
@pytest.mark.parametrize('versions, requirement, expected_filter, expected_latest', [
[['1.0.0', '1.0.1'], '*', ['1.0.0', '1.0.1'], '1.0.1'],
[['1.0.0', '1.0.5', '1.1.0'], '>1.0.0,<1.1.0', ['1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '>1.0.0,<=1.0.5', ['1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '>=1.1.0', ['1.1.0'], '1.1.0'],
[['1.0.0', '1.0.5', '1.1.0'], '!=1.1.0', ['1.0.0', '1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '==1.0.5', ['1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '1.0.5', ['1.0.5'], '1.0.5'],
[['1.0.0', '2.0.0', '3.0.0'], '>=2', ['2.0.0', '3.0.0'], '3.0.0'],
])
def test_add_collection_requirements(versions, requirement, expected_filter, expected_latest):
req = collection.CollectionRequirement('namespace', 'name', None, 'https://galaxy.com', versions, requirement,
False)
assert req.versions == set(expected_filter)
assert req.latest_version == expected_latest
def test_add_collection_requirement_to_unknown_installed_version(monkeypatch):
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
req = collection.CollectionRequirement('namespace', 'name', None, 'https://galaxy.com', ['*'], '*', False,
skip=True)
req.add_requirement('parent.collection', '1.0.0')
assert req.latest_version == '*'
assert mock_display.call_count == 1
actual_warn = ' '.join(mock_display.mock_calls[0][1][0].split('\n'))
assert "Failed to validate the collection requirement 'namespace.name:1.0.0' for parent.collection" in actual_warn
def test_add_collection_wildcard_requirement_to_unknown_installed_version():
req = collection.CollectionRequirement('namespace', 'name', None, 'https://galaxy.com', ['*'], '*', False,
skip=True)
req.add_requirement(str(req), '*')
assert req.versions == set('*')
assert req.latest_version == '*'
def test_add_collection_requirement_with_conflict(galaxy_server):
expected = "Cannot meet requirement ==1.0.2 for dependency namespace.name from source '%s'. Available versions " \
"before last requirement added: 1.0.0, 1.0.1\n" \
"Requirements from:\n" \
"\tbase - 'namespace.name:==1.0.2'" % galaxy_server.api_server
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement('namespace', 'name', None, galaxy_server, ['1.0.0', '1.0.1'], '==1.0.2',
False)
def test_add_requirement_to_existing_collection_with_conflict(galaxy_server):
req = collection.CollectionRequirement('namespace', 'name', None, galaxy_server, ['1.0.0', '1.0.1'], '*', False)
expected = "Cannot meet dependency requirement 'namespace.name:1.0.2' for collection namespace.collection2 from " \
"source '%s'. Available versions before last requirement added: 1.0.0, 1.0.1\n" \
"Requirements from:\n" \
"\tbase - 'namespace.name:*'\n" \
"\tnamespace.collection2 - 'namespace.name:1.0.2'" % galaxy_server.api_server
with pytest.raises(AnsibleError, match=re.escape(expected)):
req.add_requirement('namespace.collection2', '1.0.2')
def test_add_requirement_to_installed_collection_with_conflict():
source = 'https://galaxy.ansible.com'
req = collection.CollectionRequirement('namespace', 'name', None, source, ['1.0.0', '1.0.1'], '*', False,
skip=True)
expected = "Cannot meet requirement namespace.name:1.0.2 as it is already installed at version '1.0.1'. " \
"Use --force to overwrite"
with pytest.raises(AnsibleError, match=re.escape(expected)):
req.add_requirement(None, '1.0.2')
def test_add_requirement_to_installed_collection_with_conflict_as_dep():
source = 'https://galaxy.ansible.com'
req = collection.CollectionRequirement('namespace', 'name', None, source, ['1.0.0', '1.0.1'], '*', False,
skip=True)
expected = "Cannot meet requirement namespace.name:1.0.2 as it is already installed at version '1.0.1'. " \
"Use --force-with-deps to overwrite"
with pytest.raises(AnsibleError, match=re.escape(expected)):
req.add_requirement('namespace.collection2', '1.0.2')
def test_install_skipped_collection(monkeypatch):
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
req = collection.CollectionRequirement('namespace', 'name', None, 'source', ['1.0.0'], '*', False, skip=True)
req.install(None, None)
assert mock_display.call_count == 1
assert mock_display.mock_calls[0][1][0] == "Skipping 'namespace.name' as it is already installed"
def test_install_collection(collection_artifact, monkeypatch):
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
collection_tar = collection_artifact[1]
output_path = os.path.join(os.path.split(collection_tar)[0], b'output')
collection_path = os.path.join(output_path, b'ansible_namespace', b'collection')
os.makedirs(os.path.join(collection_path, b'delete_me')) # Create a folder to verify the install cleans out the dir
temp_path = os.path.join(os.path.split(collection_tar)[0], b'temp')
os.makedirs(temp_path)
req = collection.CollectionRequirement.from_tar(collection_tar, True, True)
req.install(to_text(output_path), temp_path)
# Ensure the temp directory is empty, nothing is left behind
assert os.listdir(temp_path) == []
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
assert mock_display.call_count == 1
assert mock_display.mock_calls[0][1][0] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" \
% to_text(collection_path)
def test_install_collection_with_download(galaxy_server, collection_artifact, monkeypatch):
collection_tar = collection_artifact[1]
output_path = os.path.join(os.path.split(collection_tar)[0], b'output')
collection_path = os.path.join(output_path, b'ansible_namespace', b'collection')
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
mock_download = MagicMock()
mock_download.return_value = collection_tar
monkeypatch.setattr(collection, '_download_file', mock_download)
monkeypatch.setattr(galaxy_server, '_available_api_versions', {'v2': 'v2/'})
temp_path = os.path.join(os.path.split(collection_tar)[0], b'temp')
os.makedirs(temp_path)
meta = api.CollectionVersionMetadata('ansible_namespace', 'collection', '0.1.0', 'https://downloadme.com',
'myhash', {})
req = collection.CollectionRequirement('ansible_namespace', 'collection', None, galaxy_server,
['0.1.0'], '*', False, metadata=meta)
req.install(to_text(output_path), temp_path)
# Ensure the temp directory is empty, nothing is left behind
assert os.listdir(temp_path) == []
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
assert mock_display.call_count == 1
assert mock_display.mock_calls[0][1][0] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" \
% to_text(collection_path)
assert mock_download.call_count == 1
assert mock_download.mock_calls[0][1][0] == 'https://downloadme.com'
assert mock_download.mock_calls[0][1][1] == temp_path
assert mock_download.mock_calls[0][1][2] == 'myhash'
assert mock_download.mock_calls[0][1][3] is True
def test_install_collections_from_tar(collection_artifact, monkeypatch):
collection_path, collection_tar = collection_artifact
temp_path = os.path.split(collection_tar)[0]
shutil.rmtree(collection_path)
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
[u'https://galaxy.ansible.com'], True, False, False, False, False)
assert os.path.isdir(collection_path)
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
with open(os.path.join(collection_path, b'MANIFEST.json'), 'rb') as manifest_obj:
actual_manifest = json.loads(to_text(manifest_obj.read()))
assert actual_manifest['collection_info']['namespace'] == 'ansible_namespace'
assert actual_manifest['collection_info']['name'] == 'collection'
assert actual_manifest['collection_info']['version'] == '0.1.0'
# Filter out the progress cursor display calls.
display_msgs = [m[1][0] for m in mock_display.mock_calls if 'newline' not in m[2] and len(m[1]) == 1]
assert len(display_msgs) == 3
assert display_msgs[0] == "Process install dependency map"
assert display_msgs[1] == "Starting collection install process"
assert display_msgs[2] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" % to_text(collection_path)
def test_install_collections_existing_without_force(collection_artifact, monkeypatch):
collection_path, collection_tar = collection_artifact
temp_path = os.path.split(collection_tar)[0]
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
# If we don't delete collection_path it will think the original build skeleton is installed so we expect a skip
collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
[u'https://galaxy.ansible.com'], True, False, False, False, False)
assert os.path.isdir(collection_path)
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'README.md', b'docs', b'galaxy.yml', b'playbooks', b'plugins', b'roles']
# Filter out the progress cursor display calls.
display_msgs = [m[1][0] for m in mock_display.mock_calls if 'newline' not in m[2] and len(m[1]) == 1]
assert len(display_msgs) == 4
# Msg 1 is the warning about the missing MANIFEST.json; we cannot really check that message as it has line breaks
# which vary based on the path size
assert display_msgs[1] == "Process install dependency map"
assert display_msgs[2] == "Starting collection install process"
assert display_msgs[3] == "Skipping 'ansible_namespace.collection' as it is already installed"
# Makes sure we don't get stuck in some recursive loop
@pytest.mark.parametrize('collection_artifact', [
{'ansible_namespace.collection': '>=0.0.1'},
], indirect=True)
def test_install_collection_with_circular_dependency(collection_artifact, monkeypatch):
collection_path, collection_tar = collection_artifact
temp_path = os.path.split(collection_tar)[0]
shutil.rmtree(collection_path)
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
[u'https://galaxy.ansible.com'], True, False, False, False, False)
assert os.path.isdir(collection_path)
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
with open(os.path.join(collection_path, b'MANIFEST.json'), 'rb') as manifest_obj:
actual_manifest = json.loads(to_text(manifest_obj.read()))
assert actual_manifest['collection_info']['namespace'] == 'ansible_namespace'
assert actual_manifest['collection_info']['name'] == 'collection'
assert actual_manifest['collection_info']['version'] == '0.1.0'
# Filter out the progress cursor display calls.
display_msgs = [m[1][0] for m in mock_display.mock_calls if 'newline' not in m[2] and len(m[1]) == 1]
assert len(display_msgs) == 3
assert display_msgs[0] == "Process install dependency map"
assert display_msgs[1] == "Starting collection install process"
assert display_msgs[2] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" % to_text(collection_path)
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 65,036 |
scaleway_user_data: multiline is broken, cloud-init doesn't work
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
The cloud-init field should keep its multiline content, but either the module or the API itself mangles it and sends the whole text as a single JSON-encoded string in double quotes.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
scaleway_user_data
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.8.3
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
```yaml
- scaleway_user_data:
server_id: "{{ scaleway_compute.msg.id }}"
region: "{{ region }}"
user_data:
cloud-init: |
#cloud-config
# final_message
# default: cloud-init boot finished at $TIMESTAMP. Up $UPTIME seconds
# this message is written by cloud-final when the system is finished
# its first boot
final_message: "The system is finally up, after $UPTIME seconds"
```
1. Update the cloud-init and start the server
2. Cloud-init throws an error
`2019-11-18 19:27:31,692 - __init__.py[WARNING]: Unhandled non-multipart (text/x-not-multipart) userdata: 'b'"#cloud-config\\\\nfinal_me'...'`
<!--- Paste example playbooks or commands between quotes below -->
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
```
root@test-cloudinit:~# curl --local-port 1-1023 http://169.254.42.42/user_data/cloud-init/ -w "\n"
#cloud-config
# final_message
# default: cloud-init boot finished at $TIMESTAMP. Up $UPTIME seconds
# this message is written by cloud-final when the system is finished
# its first boot
final_message: "The system is finally up, after $UPTIME seconds"
```
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```
root@test-cloudinit:~# curl --local-port 1-1023 http://169.254.42.42/user_data/cloud-init/ -w "\n"
"#cloud-config\n# final_message\n# default: cloud-init boot finished at $TIMESTAMP. Up $UPTIME seconds\n# this message is written by cloud-final when the system is finished\n# its first boot\nfinal_message: \"The system is finally up, after $UPTIME seconds\"\n"
```
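The changelog fragment attached to the linked fix ("scaleway-jsonify-only-for-json-requests") suggests the underlying problem: the payload is serialised to JSON even for endpoints that expect raw text. A rough sketch of that idea, using a hypothetical helper name (not the module's actual API):
```python
import json

# Hypothetical sketch: only JSON-encode the body when the request really is JSON;
# otherwise pass the text through unchanged so multiline cloud-init survives.
def prepare_body(data, content_type='application/json'):
    if data is None:
        return None
    if content_type == 'application/json':
        return json.dumps(data)
    return data  # e.g. text/plain for PATCH .../user_data/cloud-init


body = prepare_body("#cloud-config\nfinal_message: up after $UPTIME seconds\n",
                    content_type='text/plain')
assert body.startswith("#cloud-config")
```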
|
https://github.com/ansible/ansible/issues/65036
|
https://github.com/ansible/ansible/pull/66957
|
600d6278f9fa056593c94039699bd522118d65d7
|
f70dc261ccae79ba1ae303c14eb9bfbc12a5590a
| 2019-11-19T07:03:56Z |
python
| 2020-02-21T16:14:22Z |
changelogs/fragments/66957-scaleway-jsonify-only-for-json-requests.yml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 65,036 |
scaleway_user_data: multiline is broken, cloud-init doesn't work
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
The cloud-init field should remain a multiline document, but either the module or the API itself breaks it and sends the whole text as a single, JSON-quoted string.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
scaleway_user_data
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.8.3
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
```yaml
- scaleway_user_data:
server_id: "{{ scaleway_compute.msg.id }}"
region: "{{ region }}"
user_data:
cloud-init: |
#cloud-config
# final_message
# default: cloud-init boot finished at $TIMESTAMP. Up $UPTIME seconds
# this message is written by cloud-final when the system is finished
# its first boot
final_message: "The system is finally up, after $UPTIME seconds"
```
1. Update the cloud-init and start the server
2. Cloud-init throws an error
`2019-11-18 19:27:31,692 - __init__.py[WARNING]: Unhandled non-multipart (text/x-not-multipart) userdata: 'b'"#cloud-config\\\\nfinal_me'...'`
<!--- Paste example playbooks or commands between quotes below -->
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
```
root@test-cloudinit:~# curl --local-port 1-1023 http://169.254.42.42/user_data/cloud-init/ -w "\n"
#cloud-config
# final_message
# default: cloud-init boot finished at $TIMESTAMP. Up $UPTIME seconds
# this message is written by cloud-final when the system is finished
# its first boot
final_message: "The system is finally up, after $UPTIME seconds"
```
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```
root@test-cloudinit:~# curl --local-port 1-1023 http://169.254.42.42/user_data/cloud-init/ -w "\n"
"#cloud-config\n# final_message\n# default: cloud-init boot finished at $TIMESTAMP. Up $UPTIME seconds\n# this message is written by cloud-final when the system is finished\n# its first boot\nfinal_message: \"The system is finally up, after $UPTIME seconds\"\n"
```
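For reference, the quoting shown above can be reproduced directly: the shared `send()` helper in the module_utils listing further down always runs the body through the module's JSON serializer, which is what turns a multiline document into one escaped string. A small illustration using plain `json.dumps` as a stand-in for `module.jsonify`:
```python
import json

user_data = (
    '#cloud-config\n'
    '# final_message\n'
    'final_message: "The system is finally up, after $UPTIME seconds"\n'
)

encoded = json.dumps(user_data)                  # what an unconditional jsonify produces
assert encoded.startswith('"#cloud-config\\n')   # one quoted line ...
assert '\n' not in encoded                       # ... with no real newlines left
```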
|
https://github.com/ansible/ansible/issues/65036
|
https://github.com/ansible/ansible/pull/66957
|
600d6278f9fa056593c94039699bd522118d65d7
|
f70dc261ccae79ba1ae303c14eb9bfbc12a5590a
| 2019-11-19T07:03:56Z |
python
| 2020-02-21T16:14:22Z |
lib/ansible/module_utils/scaleway.py
|
import json
import re
import sys
from ansible.module_utils.basic import env_fallback
from ansible.module_utils.urls import fetch_url
from ansible.module_utils.six.moves.urllib.parse import urlencode
def scaleway_argument_spec():
return dict(
api_token=dict(required=True, fallback=(env_fallback, ['SCW_TOKEN', 'SCW_API_KEY', 'SCW_OAUTH_TOKEN', 'SCW_API_TOKEN']),
no_log=True, aliases=['oauth_token']),
api_url=dict(fallback=(env_fallback, ['SCW_API_URL']), default='https://api.scaleway.com', aliases=['base_url']),
api_timeout=dict(type='int', default=30, aliases=['timeout']),
query_parameters=dict(type='dict', default={}),
validate_certs=dict(default=True, type='bool'),
)
def payload_from_object(scw_object):
return dict(
(k, v)
for k, v in scw_object.items()
if k != 'id' and v is not None
)
class ScalewayException(Exception):
def __init__(self, message):
self.message = message
# Specify a complete Link header, for validation purposes
R_LINK_HEADER = r'''<[^>]+>;\srel="(first|previous|next|last)"
(,<[^>]+>;\srel="(first|previous|next|last)")*'''
# Specify a single relation, for iteration and string extraction purposes
R_RELATION = r'<(?P<target_IRI>[^>]+)>; rel="(?P<relation>first|previous|next|last)"'
def parse_pagination_link(header):
if not re.match(R_LINK_HEADER, header, re.VERBOSE):
raise ScalewayException('Scaleway API answered with an invalid Link pagination header')
else:
relations = header.split(',')
parsed_relations = {}
rc_relation = re.compile(R_RELATION)
for relation in relations:
match = rc_relation.match(relation)
if not match:
raise ScalewayException('Scaleway API answered with an invalid relation in the Link pagination header')
data = match.groupdict()
parsed_relations[data['relation']] = data['target_IRI']
return parsed_relations
class Response(object):
def __init__(self, resp, info):
self.body = None
if resp:
self.body = resp.read()
self.info = info
@property
def json(self):
if not self.body:
if "body" in self.info:
return json.loads(self.info["body"])
return None
try:
return json.loads(self.body)
except ValueError:
return None
@property
def status_code(self):
return self.info["status"]
@property
def ok(self):
return self.status_code in (200, 201, 202, 204)
class Scaleway(object):
def __init__(self, module):
self.module = module
self.headers = {
'X-Auth-Token': self.module.params.get('api_token'),
'User-Agent': self.get_user_agent_string(module),
'Content-type': 'application/json',
}
self.name = None
def get_resources(self):
results = self.get('/%s' % self.name)
if not results.ok:
raise ScalewayException('Error fetching {0} ({1}) [{2}: {3}]'.format(
self.name, '%s/%s' % (self.module.params.get('api_url'), self.name),
results.status_code, results.json['message']
))
return results.json.get(self.name)
def _url_builder(self, path, params):
d = self.module.params.get('query_parameters')
if params is not None:
d.update(params)
query_string = urlencode(d, doseq=True)
if path[0] == '/':
path = path[1:]
return '%s/%s?%s' % (self.module.params.get('api_url'), path, query_string)
def send(self, method, path, data=None, headers=None, params=None):
url = self._url_builder(path=path, params=params)
self.warn(url)
data = self.module.jsonify(data)
if headers is not None:
self.headers.update(headers)
resp, info = fetch_url(
self.module, url, data=data, headers=self.headers, method=method,
timeout=self.module.params.get('api_timeout')
)
# Exceptions in fetch_url may result in a status -1, the ensures a proper error to the user in all cases
if info['status'] == -1:
self.module.fail_json(msg=info['msg'])
return Response(resp, info)
@staticmethod
def get_user_agent_string(module):
return "ansible %s Python %s" % (module.ansible_version, sys.version.split(' ')[0])
def get(self, path, data=None, headers=None, params=None):
return self.send(method='GET', path=path, data=data, headers=headers, params=params)
def put(self, path, data=None, headers=None, params=None):
return self.send(method='PUT', path=path, data=data, headers=headers, params=params)
def post(self, path, data=None, headers=None, params=None):
return self.send(method='POST', path=path, data=data, headers=headers, params=params)
def delete(self, path, data=None, headers=None, params=None):
return self.send(method='DELETE', path=path, data=data, headers=headers, params=params)
def patch(self, path, data=None, headers=None, params=None):
return self.send(method="PATCH", path=path, data=data, headers=headers, params=params)
def update(self, path, data=None, headers=None, params=None):
return self.send(method="UPDATE", path=path, data=data, headers=headers, params=params)
def warn(self, x):
self.module.warn(str(x))
SCALEWAY_LOCATION = {
'par1': {'name': 'Paris 1', 'country': 'FR', "api_endpoint": 'https://api.scaleway.com/instance/v1/zones/fr-par-1'},
'EMEA-FR-PAR1': {'name': 'Paris 1', 'country': 'FR', "api_endpoint": 'https://api.scaleway.com/instance/v1/zones/fr-par-1'},
'ams1': {'name': 'Amsterdam 1', 'country': 'NL', "api_endpoint": 'https://api.scaleway.com/instance/v1/zones/nl-ams-1'},
'EMEA-NL-EVS': {'name': 'Amsterdam 1', 'country': 'NL', "api_endpoint": 'https://api.scaleway.com/instance/v1/zones/nl-ams-1'}
}
SCALEWAY_ENDPOINT = "https://api.scaleway.com"
SCALEWAY_REGIONS = [
"fr-par",
"nl-ams",
]
SCALEWAY_ZONES = [
"fr-par-1",
"nl-ams-1",
]
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,615 |
vmware_host_service_info fails with AttributeError: 'NoneType' object has no attribute 'service'
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
When using the vmware_host_service_info module, an AttributeError is raised with different versions of Ansible (2.9.0, 2.9.2 and 2.9.5)
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
vmware_host_service_info
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.2
config file = /home/wtwassim/Documents/Projects/CMSP/ansible.cfg
configured module search path = [u'/home/wtwassim/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /home/wtwassim/.local/lib/python2.7/site-packages/ansible
executable location = /home/wtwassim/.local/bin/ansible
python version = 2.7.5 (default, Aug 7 2019, 00:51:29) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
ACTION_WARNINGS(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = False
ANSIBLE_PIPELINING(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = True
ANSIBLE_SSH_RETRIES(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = 3
DEFAULT_FORKS(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = 60
DEFAULT_GATHERING(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = smart
DEFAULT_GATHER_SUBSET(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = [u'!all']
DEFAULT_HOST_LIST(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = [u'/home/wtwassim/Documents/Projects/CM
DEFAULT_TIMEOUT(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = 60
DISPLAY_SKIPPED_HOSTS(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = False
HOST_KEY_CHECKING(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = False
INTERPRETER_PYTHON(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = auto_silent
RETRY_FILES_ENABLED(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = False
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
CentOS Linux release 7.7.1908 (Core)
Linux wassimans01.cisco-cms.com 3.10.0-1062.12.1.el7.x86_64 #1 SMP Tue Feb 4 23:02:59 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
Target vsphere version: 6.5.0
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
To get the list of esxi hosts that are in a given cluster, I call the module in a task contained within a block
<!--- Paste example playbooks or commands between quotes below -->
```yaml
---
# tasks to get hosts list
- block:
- name: Get hosts list
vmware_host_service_info:
hostname: "{{ information.address }}"
username: "{{ credentials.username }}"
password: "{{ credentials.password }}"
cluster_name: "{{ information.cluster }}"
validate_certs: no
register: host_info
delegate_to: localhost
- name: define hosts_list
set_fact:
hosts_list: "{{hosts_list|default([]) + [item.key]}}"
loop: "{{host_info.host_service_info|dict2items}}"
- name: define information.resources
set_fact:
information: "{{information | combine(new_item, recursive=true)}}"
vars:
new_item: "{'resources': \"{{hosts_list}}\"}"
no_log: true
tags: ['capcheck', 'vm_creation']
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
The module should succeed in gathering the host service info for the specified cluster
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
Module fails with
<!--- Paste verbatim command output between quotes -->
```paste below
The full traceback is:
Traceback (most recent call last):
File "<stdin>", line 102, in <module>
File "<stdin>", line 94, in _ansiballz_main
File "<stdin>", line 40, in invoke_module
File "/usr/lib64/python2.7/runpy.py", line 176, in run_module
fname, loader, pkg_name)
File "/usr/lib64/python2.7/runpy.py", line 82, in _run_module_code
mod_name, mod_fname, mod_loader, pkg_name)
File "/usr/lib64/python2.7/runpy.py", line 72, in _run_code
exec code in run_globals
File "/tmp/ansible_vmware_host_service_info_payload_R9VIO2/ansible_vmware_host_service_info_payload.zip/ansible/modules/cloud/vmware/vmware_host_service_info.py", line 154, in <module>
File "/tmp/ansible_vmware_host_service_info_payload_R9VIO2/ansible_vmware_host_service_info_payload.zip/ansible/modules/cloud/vmware/vmware_host_service_info.py", line 150, in main
File "/tmp/ansible_vmware_host_service_info_payload_R9VIO2/ansible_vmware_host_service_info_payload.zip/ansible/modules/cloud/vmware/vmware_host_service_info.py", line 116, in gather_host_info
AttributeError: 'NoneType' object has no attribute 'service'
fatal: [vcenter01 -> localhost]: FAILED! => {
"changed": false,
"module_stderr": "Traceback (most recent call last):\n File \"<stdin>\", line 102, in <module>\n File \"<stdin>\", line 94, in _ansiballz_main\n File \"<stdin>\", line 40, in invoke_module\n File \"/usr/lib64/python2.7/runpy.py\", line 176, in run_module\n fname, loader, pkg_name)\n File \"/usr/lib64/python2.7/runpy.py\", line 82, in _run_module_code\n mod_name, mod_fname, mod_loader, pkg_name)\n File \"/usr/lib64/python2.7/runpy.py\", line 72, in _run_code\n exec code in run_globals\n File \"/tmp/ansible_vmware_host_service_info_payload_R9VIO2/ansible_vmware_host_service_info_payload.zip/ansible/modules/cloud/vmware/vmware_host_service_info.py\", line 154, in <module>\n File \"/tmp/ansible_vmware_host_service_info_payload_R9VIO2/ansible_vmware_host_service_info_payload.zip/ansible/modules/cloud/vmware/vmware_host_service_info.py\", line 150, in main\n File \"/tmp/ansible_vmware_host_service_info_payload_R9VIO2/ansible_vmware_host_service_info_payload.zip/ansible/modules/cloud/vmware/vmware_host_service_info.py\", line 116, in gather_host_info\nAttributeError: 'NoneType' object has no attribute 'service'\n",
"module_stdout": "",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"rc": 1
}
```
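The traceback points at the `host.configManager.serviceSystem.serviceInfo.service` attribute chain, any link of which can be `None` (for example on hosts that are disconnected or not responding). A minimal, hypothetical sketch of a defensive gather loop — not necessarily the merged fix — using a stand-in host object so it runs on its own:
```python
from types import SimpleNamespace as NS


def gather_host_services(hosts):
    """Collect per-host service facts, tolerating hosts whose serviceSystem
    or serviceInfo is unavailable, instead of raising AttributeError."""
    results = {}
    for host in hosts:
        service_system = getattr(host.configManager, 'serviceSystem', None)
        service_info = getattr(service_system, 'serviceInfo', None)
        services = getattr(service_info, 'service', None) or []
        results[host.name] = [
            {'key': s.key, 'label': s.label, 'running': s.running}
            for s in services
        ]
    return results


# A host that exposes no serviceSystem simply yields an empty list.
broken_host = NS(name='esx01.example.com', configManager=NS(serviceSystem=None))
print(gather_host_services([broken_host]))   # {'esx01.example.com': []}
```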
|
https://github.com/ansible/ansible/issues/67615
|
https://github.com/ansible/ansible/pull/67641
|
482885e0c8ebc9554ae5eb81dce67253f64455f2
|
6936e7b698d374076ccd5154a80dd23a00ab7d92
| 2020-02-20T14:08:05Z |
python
| 2020-02-22T16:16:18Z |
changelogs/fragments/67615-vmware_host_service_info_fix.yml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,615 |
vmware_host_service_info fails with AttributeError: 'NoneType' object has no attribute 'service'
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
When using the vmware_host_service_info module, an AttributeError is raised with different versions of Ansible (2.9.0, 2.9.2 and 2.9.5)
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
vmware_host_service_info
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.2
config file = /home/wtwassim/Documents/Projects/CMSP/ansible.cfg
configured module search path = [u'/home/wtwassim/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /home/wtwassim/.local/lib/python2.7/site-packages/ansible
executable location = /home/wtwassim/.local/bin/ansible
python version = 2.7.5 (default, Aug 7 2019, 00:51:29) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
ACTION_WARNINGS(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = False
ANSIBLE_PIPELINING(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = True
ANSIBLE_SSH_RETRIES(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = 3
DEFAULT_FORKS(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = 60
DEFAULT_GATHERING(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = smart
DEFAULT_GATHER_SUBSET(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = [u'!all']
DEFAULT_HOST_LIST(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = [u'/home/wtwassim/Documents/Projects/CM
DEFAULT_TIMEOUT(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = 60
DISPLAY_SKIPPED_HOSTS(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = False
HOST_KEY_CHECKING(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = False
INTERPRETER_PYTHON(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = auto_silent
RETRY_FILES_ENABLED(/home/wtwassim/Documents/Projects/CMSP/ansible.cfg) = False
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
CentOS Linux release 7.7.1908 (Core)
Linux wassimans01.cisco-cms.com 3.10.0-1062.12.1.el7.x86_64 #1 SMP Tue Feb 4 23:02:59 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
Target vsphere version: 6.5.0
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
To get the list of esxi hosts that are in a given cluster, I call the module in a task contained within a block
<!--- Paste example playbooks or commands between quotes below -->
```yaml
---
# tasks to get hosts list
- block:
- name: Get hosts list
vmware_host_service_info:
hostname: "{{ information.address }}"
username: "{{ credentials.username }}"
password: "{{ credentials.password }}"
cluster_name: "{{ information.cluster }}"
validate_certs: no
register: host_info
delegate_to: localhost
- name: define hosts_list
set_fact:
hosts_list: "{{hosts_list|default([]) + [item.key]}}"
loop: "{{host_info.host_service_info|dict2items}}"
- name: define information.resources
set_fact:
information: "{{information | combine(new_item, recursive=true)}}"
vars:
new_item: "{'resources': \"{{hosts_list}}\"}"
no_log: true
tags: ['capcheck', 'vm_creation']
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
The module should succeed in gathering the host service info for the specified cluster
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
Module fails with
<!--- Paste verbatim command output between quotes -->
```paste below
The full traceback is:
Traceback (most recent call last):
File "<stdin>", line 102, in <module>
File "<stdin>", line 94, in _ansiballz_main
File "<stdin>", line 40, in invoke_module
File "/usr/lib64/python2.7/runpy.py", line 176, in run_module
fname, loader, pkg_name)
File "/usr/lib64/python2.7/runpy.py", line 82, in _run_module_code
mod_name, mod_fname, mod_loader, pkg_name)
File "/usr/lib64/python2.7/runpy.py", line 72, in _run_code
exec code in run_globals
File "/tmp/ansible_vmware_host_service_info_payload_R9VIO2/ansible_vmware_host_service_info_payload.zip/ansible/modules/cloud/vmware/vmware_host_service_info.py", line 154, in <module>
File "/tmp/ansible_vmware_host_service_info_payload_R9VIO2/ansible_vmware_host_service_info_payload.zip/ansible/modules/cloud/vmware/vmware_host_service_info.py", line 150, in main
File "/tmp/ansible_vmware_host_service_info_payload_R9VIO2/ansible_vmware_host_service_info_payload.zip/ansible/modules/cloud/vmware/vmware_host_service_info.py", line 116, in gather_host_info
AttributeError: 'NoneType' object has no attribute 'service'
fatal: [vcenter01 -> localhost]: FAILED! => {
"changed": false,
"module_stderr": "Traceback (most recent call last):\n File \"<stdin>\", line 102, in <module>\n File \"<stdin>\", line 94, in _ansiballz_main\n File \"<stdin>\", line 40, in invoke_module\n File \"/usr/lib64/python2.7/runpy.py\", line 176, in run_module\n fname, loader, pkg_name)\n File \"/usr/lib64/python2.7/runpy.py\", line 82, in _run_module_code\n mod_name, mod_fname, mod_loader, pkg_name)\n File \"/usr/lib64/python2.7/runpy.py\", line 72, in _run_code\n exec code in run_globals\n File \"/tmp/ansible_vmware_host_service_info_payload_R9VIO2/ansible_vmware_host_service_info_payload.zip/ansible/modules/cloud/vmware/vmware_host_service_info.py\", line 154, in <module>\n File \"/tmp/ansible_vmware_host_service_info_payload_R9VIO2/ansible_vmware_host_service_info_payload.zip/ansible/modules/cloud/vmware/vmware_host_service_info.py\", line 150, in main\n File \"/tmp/ansible_vmware_host_service_info_payload_R9VIO2/ansible_vmware_host_service_info_payload.zip/ansible/modules/cloud/vmware/vmware_host_service_info.py\", line 116, in gather_host_info\nAttributeError: 'NoneType' object has no attribute 'service'\n",
"module_stdout": "",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"rc": 1
}
```
|
https://github.com/ansible/ansible/issues/67615
|
https://github.com/ansible/ansible/pull/67641
|
482885e0c8ebc9554ae5eb81dce67253f64455f2
|
6936e7b698d374076ccd5154a80dd23a00ab7d92
| 2020-02-20T14:08:05Z |
python
| 2020-02-22T16:16:18Z |
lib/ansible/modules/cloud/vmware/vmware_host_service_info.py
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright: (c) 2018, Abhijeet Kasurde <[email protected]>
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
ANSIBLE_METADATA = {
'metadata_version': '1.1',
'status': ['preview'],
'supported_by': 'community'
}
DOCUMENTATION = r'''
---
module: vmware_host_service_info
short_description: Gathers info about an ESXi host's services
description:
- This module can be used to gather information about an ESXi host's services.
version_added: '2.9'
author:
- Abhijeet Kasurde (@Akasurde)
notes:
- Tested on vSphere 6.5
- If source package name is not available then fact is populated as null.
requirements:
- python >= 2.6
- PyVmomi
options:
cluster_name:
description:
- Name of the cluster.
- Service information about each ESXi server will be returned for given cluster.
- If C(esxi_hostname) is not given, this parameter is required.
type: str
esxi_hostname:
description:
- ESXi hostname.
- Service information about this ESXi server will be returned.
- If C(cluster_name) is not given, this parameter is required.
type: str
extends_documentation_fragment: vmware.documentation
'''
EXAMPLES = r'''
- name: Gather info about all ESXi Host in given Cluster
vmware_host_service_info:
hostname: '{{ vcenter_hostname }}'
username: '{{ vcenter_username }}'
password: '{{ vcenter_password }}'
cluster_name: cluster_name
delegate_to: localhost
register: cluster_host_services
- name: Gather info about ESXi Host
vmware_host_service_info:
hostname: '{{ vcenter_hostname }}'
username: '{{ vcenter_username }}'
password: '{{ vcenter_password }}'
esxi_hostname: '{{ esxi_hostname }}'
delegate_to: localhost
register: host_services
'''
RETURN = r'''
host_service_info:
description:
- dict with hostname as key and dict with host service config information
returned: always
type: dict
sample: {
"10.76.33.226": [
{
"key": "DCUI",
"label": "Direct Console UI",
"policy": "on",
"required": false,
"running": true,
"uninstallable": false,
"source_package_name": "esx-base",
"source_package_desc": "This VIB contains all of the base functionality of vSphere ESXi."
},
{
"key": "TSM",
"label": "ESXi Shell",
"policy": "off",
"required": false,
"running": false,
"uninstallable": false,
"source_package_name": "esx-base",
"source_package_desc": "This VIB contains all of the base functionality of vSphere ESXi."
},
]
}
'''
from ansible.module_utils.basic import AnsibleModule
from ansible.module_utils.vmware import vmware_argument_spec, PyVmomi
class VmwareServiceManager(PyVmomi):
def __init__(self, module):
super(VmwareServiceManager, self).__init__(module)
cluster_name = self.params.get('cluster_name', None)
esxi_host_name = self.params.get('esxi_hostname', None)
self.hosts = self.get_all_host_objs(cluster_name=cluster_name, esxi_host_name=esxi_host_name)
def gather_host_info(self):
hosts_info = {}
for host in self.hosts:
host_service_info = []
host_service_system = host.configManager.serviceSystem
if host_service_system:
services = host_service_system.serviceInfo.service
for service in services:
host_service_info.append(
dict(
key=service.key,
label=service.label,
required=service.required,
uninstallable=service.uninstallable,
running=service.running,
policy=service.policy,
source_package_name=service.sourcePackage.sourcePackageName if service.sourcePackage else None,
source_package_desc=service.sourcePackage.description if service.sourcePackage else None,
)
)
hosts_info[host.name] = host_service_info
return hosts_info
def main():
argument_spec = vmware_argument_spec()
argument_spec.update(
cluster_name=dict(type='str', required=False),
esxi_hostname=dict(type='str', required=False),
)
module = AnsibleModule(
argument_spec=argument_spec,
required_one_of=[
['cluster_name', 'esxi_hostname'],
],
supports_check_mode=True,
)
vmware_host_service_config = VmwareServiceManager(module)
module.exit_json(changed=False, host_service_info=vmware_host_service_config.gather_host_info())
if __name__ == "__main__":
main()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,455 |
Nxos_l2_interfaces module fails with traceback if allowed vlans are not preconfigured
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
The nxos_l2_interfaces module fails with a traceback if allowed VLANs are not preconfigured on the interface; the regression was introduced by the fix in PR #66517
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
nxos_l2_interfaces
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
2.9
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
macos
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
Run the module to configure allowed VLANs on an interface for the first time.
<!--- Paste example playbooks or commands between quotes below -->
```yaml
- nxos_l2_interfaces:
config:
- name: Ethernet2/2
mode: trunk
trunk:
allowed_vlans: "10-12"
state: merged
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
If allowed vlans are not preconfigured, the module should go ahead and configure the user-provided allowed vlans
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
If allowed VLANs are not preconfigured, it fails with the traceback below
<!--- Paste verbatim command output between quotes -->
```paste below
The full traceback is:
Traceback (most recent call last):
File "/Users/sjaiswal/.ansible/tmp/ansible-local-34163u8gljr/ansible-tmp-1581928344.74-175330388156897/AnsiballZ_nxos_l2_interfaces.py", line 102, in <module>
_ansiballz_main()
File "/Users/sjaiswal/.ansible/tmp/ansible-local-34163u8gljr/ansible-tmp-1581928344.74-175330388156897/AnsiballZ_nxos_l2_interfaces.py", line 94, in _ansiballz_main
invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
File "/Users/sjaiswal/.ansible/tmp/ansible-local-34163u8gljr/ansible-tmp-1581928344.74-175330388156897/AnsiballZ_nxos_l2_interfaces.py", line 40, in invoke_module
runpy.run_module(mod_name='ansible.modules.network.nxos.nxos_l2_interfaces', init_globals=None, run_name='__main__', alter_sys=True)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/runpy.py", line 188, in run_module
fname, loader, pkg_name)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/runpy.py", line 82, in _run_module_code
mod_name, mod_fname, mod_loader, pkg_name)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/runpy.py", line 72, in _run_code
exec code in run_globals
File "/var/folders/n7/73s22vps0ls44ng__t8mwtl40000gn/T/ansible_nxos_l2_interfaces_payload_QGTQ63/ansible_nxos_l2_interfaces_payload.zip/ansible/modules/network/nxos/nxos_l2_interfaces.py", line 283, in <module>
File "/var/folders/n7/73s22vps0ls44ng__t8mwtl40000gn/T/ansible_nxos_l2_interfaces_payload_QGTQ63/ansible_nxos_l2_interfaces_payload.zip/ansible/modules/network/nxos/nxos_l2_interfaces.py", line 278, in main
File "/var/folders/n7/73s22vps0ls44ng__t8mwtl40000gn/T/ansible_nxos_l2_interfaces_payload_QGTQ63/ansible_nxos_l2_interfaces_payload.zip/ansible/module_utils/network/nxos/config/l2_interfaces/l2_interfaces.py", line 69, in execute_module
File "/var/folders/n7/73s22vps0ls44ng__t8mwtl40000gn/T/ansible_nxos_l2_interfaces_payload_QGTQ63/ansible_nxos_l2_interfaces_payload.zip/ansible/module_utils/network/nxos/config/l2_interfaces/l2_interfaces.py", line 103, in set_config
File "/var/folders/n7/73s22vps0ls44ng__t8mwtl40000gn/T/ansible_nxos_l2_interfaces_payload_QGTQ63/ansible_nxos_l2_interfaces_payload.zip/ansible/module_utils/network/nxos/config/l2_interfaces/l2_interfaces.py", line 136, in set_state
File "/var/folders/n7/73s22vps0ls44ng__t8mwtl40000gn/T/ansible_nxos_l2_interfaces_payload_QGTQ63/ansible_nxos_l2_interfaces_payload.zip/ansible/module_utils/network/nxos/config/l2_interfaces/l2_interfaces.py", line 205, in _state_merged
File "/var/folders/n7/73s22vps0ls44ng__t8mwtl40000gn/T/ansible_nxos_l2_interfaces_payload_QGTQ63/ansible_nxos_l2_interfaces_payload.zip/ansible/module_utils/network/nxos/config/l2_interfaces/l2_interfaces.py", line 287, in set_commands
KeyError: 'allowed_vlans'
```
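The `KeyError: 'allowed_vlans'` is raised because the merge path assumes the gathered facts for the interface already contain an `allowed_vlans` entry. A minimal sketch of a guarded comparison (hypothetical helper, not the module's actual `set_commands`):
```python
def vlans_to_add(wanted, have):
    """Return the trunk VLANs from `wanted` that are not already configured.

    `have` may lack the 'allowed_vlans' key entirely when no trunk allowed
    VLANs are configured on the interface yet (the first-run case above).
    """
    have_vlans = set(filter(None, have.get('allowed_vlans', '').split(',')))
    return [v for v in wanted.get('allowed_vlans', '').split(',')
            if v and v not in have_vlans]


# First-time configuration: the interface has no allowed VLANs yet.
print(vlans_to_add({'allowed_vlans': '10,11,12'}, {'name': 'Ethernet2/2'}))
# -> ['10', '11', '12']

# Subsequent run: only the genuinely new VLAN remains.
print(vlans_to_add({'allowed_vlans': '10,11,13'}, {'allowed_vlans': '10,11,12'}))
# -> ['13']
```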
|
https://github.com/ansible/ansible/issues/67455
|
https://github.com/ansible/ansible/pull/67457
|
0d72d2d4d21888cced5dce658384810bfde52f09
|
f292f21d867e7ab48b9021ee4aa2612f049e170e
| 2020-02-17T08:39:35Z |
python
| 2020-02-24T12:14:21Z |
lib/ansible/module_utils/network/nxos/config/l2_interfaces/l2_interfaces.py
|
#
# -*- coding: utf-8 -*-
# Copyright 2019 Red Hat
# GNU General Public License v3.0+
# (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
"""
The nxos_l2_interfaces class
It is in this file where the current configuration (as dict)
is compared to the provided configuration (as dict) and the command set
necessary to bring the current configuration to it's desired end-state is
created
"""
from __future__ import absolute_import, division, print_function
__metaclass__ = type
from ansible.module_utils.network.common.cfg.base import ConfigBase
from ansible.module_utils.network.common.utils import dict_diff, to_list, remove_empties
from ansible.module_utils.network.nxos.facts.facts import Facts
from ansible.module_utils.network.nxos.utils.utils import flatten_dict, normalize_interface, search_obj_in_list, vlan_range_to_list
class L2_interfaces(ConfigBase):
"""
The nxos_l2_interfaces class
"""
gather_subset = [
'!all',
'!min',
]
gather_network_resources = [
'l2_interfaces',
]
exclude_params = [
'vlan',
'allowed_vlans',
'native_vlans',
]
def __init__(self, module):
super(L2_interfaces, self).__init__(module)
def get_l2_interfaces_facts(self):
""" Get the 'facts' (the current configuration)
:rtype: A dictionary
:returns: The current configuration as a dictionary
"""
facts, _warnings = Facts(self._module).get_facts(self.gather_subset, self.gather_network_resources)
l2_interfaces_facts = facts['ansible_network_resources'].get('l2_interfaces')
if not l2_interfaces_facts:
return []
return l2_interfaces_facts
def execute_module(self):
""" Execute the module
:rtype: A dictionary
:returns: The result from module execution
"""
result = {'changed': False}
commands = list()
warnings = list()
existing_l2_interfaces_facts = self.get_l2_interfaces_facts()
commands.extend(self.set_config(existing_l2_interfaces_facts))
if commands:
if not self._module.check_mode:
self._connection.edit_config(commands)
result['changed'] = True
result['commands'] = commands
changed_l2_interfaces_facts = self.get_l2_interfaces_facts()
result['before'] = existing_l2_interfaces_facts
if result['changed']:
result['after'] = changed_l2_interfaces_facts
result['warnings'] = warnings
return result
def set_config(self, existing_l2_interfaces_facts):
""" Collect the configuration from the args passed to the module,
collect the current configuration (as a dict from facts)
:rtype: A list
:returns: the commands necessary to migrate the current configuration
to the desired configuration
"""
config = self._module.params.get('config')
want = []
if config:
for w in config:
w.update({'name': normalize_interface(w['name'])})
self.expand_trunk_allowed_vlans(w)
want.append(remove_empties(w))
have = existing_l2_interfaces_facts
for h in have:
self.expand_trunk_allowed_vlans(h)
resp = self.set_state(want, have)
return to_list(resp)
def expand_trunk_allowed_vlans(self, d):
if not d:
return None
if 'trunk' in d and d['trunk']:
if 'allowed_vlans' in d['trunk']:
allowed_vlans = vlan_range_to_list(d['trunk']['allowed_vlans'])
vlans_list = [str(l) for l in sorted(allowed_vlans)]
d['trunk']['allowed_vlans'] = ",".join(vlans_list)
def set_state(self, want, have):
""" Select the appropriate function based on the state provided
:param want: the desired configuration as a dictionary
:param have: the current configuration as a dictionary
:rtype: A list
:returns: the commands necessary to migrate the current configuration
to the desired configuration
"""
state = self._module.params['state']
if state in ('overridden', 'merged', 'replaced') and not want:
self._module.fail_json(msg='config is required for state {0}'.format(state))
commands = list()
if state == 'overridden':
commands.extend(self._state_overridden(want, have))
elif state == 'deleted':
commands.extend(self._state_deleted(want, have))
else:
for w in want:
if state == 'merged':
commands.extend(self._state_merged(flatten_dict(w), have))
elif state == 'replaced':
commands.extend(self._state_replaced(flatten_dict(w), have))
return commands
def _state_replaced(self, w, have):
""" The command generator when state is replaced
:rtype: A list
:returns: the commands necessary to migrate the current configuration
to the desired configuration
"""
commands = []
obj_in_have = flatten_dict(search_obj_in_list(w['name'], have, 'name'))
if obj_in_have:
diff = dict_diff(w, obj_in_have)
else:
diff = w
merged_commands = self.set_commands(w, have, True)
if 'name' not in diff:
diff['name'] = w['name']
wkeys = w.keys()
dkeys = diff.keys()
for k in w.copy():
if k in self.exclude_params and k in dkeys:
del diff[k]
replaced_commands = self.del_attribs(diff)
if merged_commands:
cmds = set(replaced_commands).intersection(set(merged_commands))
for cmd in cmds:
merged_commands.remove(cmd)
commands.extend(replaced_commands)
commands.extend(merged_commands)
return commands
def _state_overridden(self, want, have):
""" The command generator when state is overridden
:rtype: A list
:returns: the commands necessary to migrate the current configuration
to the desired configuration
"""
commands = []
for h in have:
h = flatten_dict(h)
obj_in_want = flatten_dict(search_obj_in_list(h['name'], want, 'name'))
if h == obj_in_want:
continue
for w in want:
w = flatten_dict(w)
if h['name'] == w['name']:
wkeys = w.keys()
hkeys = h.keys()
for k in wkeys:
if k in self.exclude_params and k in hkeys:
del h[k]
commands.extend(self.del_attribs(h))
for w in want:
commands.extend(self.set_commands(flatten_dict(w), have, True))
return commands
def _state_merged(self, w, have):
""" The command generator when state is merged
:rtype: A list
:returns: the commands necessary to merge the provided into
the current configuration
"""
return self.set_commands(w, have)
def _state_deleted(self, want, have):
""" The command generator when state is deleted
:rtype: A list
:returns: the commands necessary to remove the current configuration
of the provided objects
"""
commands = []
if want:
for w in want:
obj_in_have = flatten_dict(search_obj_in_list(w['name'], have, 'name'))
commands.extend(self.del_attribs(obj_in_have))
else:
if not have:
return commands
for h in have:
commands.extend(self.del_attribs(flatten_dict(h)))
return commands
def del_attribs(self, obj):
commands = []
if not obj or len(obj.keys()) == 1:
return commands
cmd = 'no switchport '
if 'vlan' in obj:
commands.append(cmd + 'access vlan')
if 'allowed_vlans' in obj:
commands.append(cmd + 'trunk allowed vlan')
if 'native_vlan' in obj:
commands.append(cmd + 'trunk native vlan')
if commands:
commands.insert(0, 'interface ' + obj['name'])
return commands
def diff_of_dicts(self, w, obj):
diff = set(w.items()) - set(obj.items())
diff = dict(diff)
if diff and w['name'] == obj['name']:
diff.update({'name': w['name']})
return diff
def add_commands(self, d, vlan_exists=False):
commands = []
if not d:
return commands
cmd = 'switchport '
if 'vlan' in d:
commands.append(cmd + 'access vlan ' + str(d['vlan']))
if 'allowed_vlans' in d:
if vlan_exists:
commands.append(cmd + 'trunk allowed vlan add ' + str(d['allowed_vlans']))
else:
commands.append(cmd + 'trunk allowed vlan ' + str(d['allowed_vlans']))
if 'native_vlan' in d:
commands.append(cmd + 'trunk native vlan ' + str(d['native_vlan']))
if commands:
commands.insert(0, 'interface ' + d['name'])
return commands
def set_commands(self, w, have, replace=False):
commands = []
vlan_tobe_added = []
obj_in_have = flatten_dict(search_obj_in_list(w['name'], have, 'name'))
if not obj_in_have:
commands = self.add_commands(w)
else:
diff = self.diff_of_dicts(w, obj_in_have)
if diff and not replace:
if "allowed_vlans" in diff.keys() and diff["allowed_vlans"]:
vlan_tobe_added = diff["allowed_vlans"].split(',')
vlan_list = vlan_tobe_added[:]
have_vlans = obj_in_have["allowed_vlans"].split(',')
for w_vlans in vlan_list:
if w_vlans in have_vlans:
vlan_tobe_added.pop(vlan_tobe_added.index(w_vlans))
if vlan_tobe_added:
diff.update({"allowed_vlans": ','.join(vlan_tobe_added)})
commands = self.add_commands(diff, True)
return commands
commands = self.add_commands(diff)
return commands
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,169 |
module nxos_lag_interfaces not idempotent when mode yes
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
When using nxos_lag_interfaces with mode: yes to configure "channel-group x mode on" (needed on Nexus switches for connecting FEXes), the task is not idempotent.
The output even reports the mode as "True" rather than "yes" (see below).
It is also unclear why the module offers an option "yes" instead of "on", which is what is actually used on the devices.
changed: [SWITCH-A] => (item={'key': 6, 'value': {u'members': [{u'member': u'Ethernet1/12', u'mode': True}, {u'member': u'Ethernet1/13', u'mode': True}]}})
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
modules/network/nxos/nxos_lag_interfaces
##### ANSIBLE VERSION
ansible 2.9.4
config file = /home/martin/git/ansible-evpn/ansible.cfg
configured module search path = [u'/home/martin/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.17 (default, Nov 7 2019, 10:07:09) [GCC 7.4.0]
##### CONFIGURATION
DEFAULT_CALLBACK_WHITELIST(/home/martin/git/ansible-evpn/ansible.cfg) = [u'timer', u'mail', u'profile_tasks']
DEFAULT_FORKS(/home/martin/git/ansible-evpn/ansible.cfg) = 10
DEFAULT_GATHERING(/home/martin/git/ansible-evpn/ansible.cfg) = explicit
DEFAULT_HOST_LIST(/home/martin/git/ansible-evpn/ansible.cfg) = [u'/home/martin/git/ansible-evpn/inventory.yml']
DEFAULT_STDOUT_CALLBACK(/home/martin/git/ansible-evpn/ansible.cfg) = skippy
DEFAULT_VAULT_PASSWORD_FILE(/home/martin/git/ansible-evpn/ansible.cfg) = /home/martin/git/ansible-evpn/group_vars/.va
HOST_KEY_CHECKING(/home/martin/git/ansible-evpn/ansible.cfg) = False
INTERPRETER_PYTHON(/home/martin/git/ansible-evpn/ansible.cfg) = auto_legacy_silent
PERSISTENT_COMMAND_TIMEOUT(/home/martin/git/ansible-evpn/ansible.cfg) = 180
PERSISTENT_CONNECT_TIMEOUT(/home/martin/git/ansible-evpn/ansible.cfg) = 180
##### OS / ENVIRONMENT
Ansible Server : Ubuntu18.04.3 LTS
Device: N9K running 9.2(3)
##### STEPS TO REPRODUCE
```
- name: Create Port-Channels
nxos_lag_interfaces:
config:
- name: port-channel7
members:
- member: Ethernet1/15
mode: yes
```
##### EXPECTED RESULTS
ok: [SWITCH-A]
##### ACTUAL RESULTS
changed: [SWITCH-A]
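YAML parses `mode: yes` as the boolean `True`, while the gathered facts report the device keyword `on`, so a raw comparison of want against have always looks like a change. A minimal sketch of one possible normalization step before diffing (hypothetical helper, not necessarily the merged fix):
```python
def normalize_member(member):
    """Map a boolean-ish channel-group mode to the keyword NX-OS reports."""
    member = dict(member)
    if member.get('mode') is True:   # YAML 'yes'/'on' arrives as the boolean True
        member['mode'] = 'on'
    return member


wanted = normalize_member({'member': 'Ethernet1/15', 'mode': True})
have = {'member': 'Ethernet1/15', 'mode': 'on'}
print(wanted == have)   # True -> no spurious 'changed' on the second run
```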
|
https://github.com/ansible/ansible/issues/67169
|
https://github.com/ansible/ansible/pull/67359
|
f292f21d867e7ab48b9021ee4aa2612f049e170e
|
3acd8f6f7f61b72a42a017060c3a718bbbeaf854
| 2020-02-06T16:15:02Z |
python
| 2020-02-24T12:57:11Z |
lib/ansible/module_utils/network/nxos/config/lag_interfaces/lag_interfaces.py
|
#
# -*- coding: utf-8 -*-
# Copyright 2019 Red Hat
# GNU General Public License v3.0+
# (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
"""
The junos_lag_interfaces class
It is in this file where the current configuration (as dict)
is compared to the provided configuration (as dict) and the command set
necessary to bring the current configuration to it's desired end-state is
created
"""
from __future__ import absolute_import, division, print_function
__metaclass__ = type
from ansible.module_utils.network.common.cfg.base import ConfigBase
from ansible.module_utils.network.common.utils import to_list, remove_empties, dict_diff, search_obj_in_list
from ansible.module_utils.network.nxos.facts.facts import Facts
from ansible.module_utils.network.nxos.utils.utils import normalize_interface
class Lag_interfaces(ConfigBase):
"""
The nxos_lag_interfaces class
"""
gather_subset = [
'!all',
'!min',
]
gather_network_resources = [
'lag_interfaces',
]
def __init__(self, module):
super(Lag_interfaces, self).__init__(module)
def get_lag_interfaces_facts(self):
""" Get the 'facts' (the current configuration)
:rtype: A dictionary
:returns: The current configuration as a dictionary
"""
facts, _warnings = Facts(self._module).get_facts(self.gather_subset, self.gather_network_resources)
lag_interfaces_facts = facts['ansible_network_resources'].get('lag_interfaces')
if not lag_interfaces_facts:
return []
return lag_interfaces_facts
def execute_module(self):
""" Execute the module
:rtype: A dictionary
:returns: The result from module execution
"""
result = {'changed': False}
commands = list()
warnings = list()
existing_lag_interfaces_facts = self.get_lag_interfaces_facts()
commands.extend(self.set_config(existing_lag_interfaces_facts))
if commands:
if not self._module.check_mode:
resp = self._connection.edit_config(commands)
if 'response' in resp:
for item in resp['response']:
if item:
err_str = item
if err_str.lower().startswith('cannot add'):
self._module.fail_json(msg=err_str)
result['changed'] = True
result['commands'] = commands
changed_lag_interfaces_facts = self.get_lag_interfaces_facts()
result['before'] = existing_lag_interfaces_facts
if result['changed']:
result['after'] = changed_lag_interfaces_facts
result['warnings'] = warnings
return result
def set_config(self, existing_lag_interfaces_facts):
""" Collect the configuration from the args passed to the module,
collect the current configuration (as a dict from facts)
:rtype: A list
:returns: the commands necessary to migrate the current configuration
to the desired configuration
"""
want = self._module.params.get('config')
if want:
for w in want:
w.update(remove_empties(w))
if 'members' in w and w['members']:
for item in w['members']:
item.update({'member': normalize_interface(item['member'])})
have = existing_lag_interfaces_facts
resp = self.set_state(want, have)
return to_list(resp)
def set_state(self, want, have):
""" Select the appropriate function based on the state provided
:param want: the desired configuration as a dictionary
:param have: the current configuration as a dictionary
:rtype: A list
:returns: the commands necessary to migrate the current configuration
to the desired configuration
"""
state = self._module.params['state']
commands = list()
if state == 'overridden':
commands.extend(self._state_overridden(want, have))
elif state == 'deleted':
commands.extend(self._state_deleted(want, have))
else:
for w in want:
if state == 'merged':
commands.extend(self._state_merged(w, have))
if state == 'replaced':
commands.extend(self._state_replaced(w, have))
return commands
def _state_replaced(self, w, have):
""" The command generator when state is replaced
:rtype: A list
:returns: the commands necessary to migrate the current configuration
to the desired configuration
"""
commands = []
merged_commands = self.set_commands(w, have)
replaced_commands = self.del_intf_commands(w, have)
if merged_commands:
commands.extend(replaced_commands)
commands.extend(merged_commands)
return commands
def _state_overridden(self, want, have):
""" The command generator when state is overridden
:rtype: A list
:returns: the commands necessary to migrate the current configuration
to the desired configuration
"""
commands = []
for h in have:
obj_in_want = search_obj_in_list(h['name'], want, 'name')
if h == obj_in_want:
continue
commands.extend(self.del_all_commands(h))
for w in want:
commands.extend(self.set_commands(w, have))
return commands
def _state_merged(self, w, have):
""" The command generator when state is merged
:rtype: A list
:returns: the commands necessary to merge the provided into
the current configuration
"""
return self.set_commands(w, have)
def _state_deleted(self, want, have):
""" The command generator when state is deleted
:rtype: A list
:returns: the commands necessary to remove the current configuration
of the provided objects
"""
commands = []
if want:
for w in want:
obj_in_have = search_obj_in_list(w['name'], have, 'name')
commands.extend(self.del_all_commands(obj_in_have))
else:
if not have:
return commands
for h in have:
commands.extend(self.del_all_commands(h))
return commands
def diff_list_of_dicts(self, want, have):
if not want:
want = []
if not have:
have = []
diff = []
for w_item in want:
h_item = search_obj_in_list(w_item['member'], have, key='member') or {}
delta = dict_diff(h_item, w_item)
if delta:
if 'member' not in delta.keys():
delta['member'] = w_item['member']
diff.append(delta)
return diff
def intersect_list_of_dicts(self, w, h):
intersect = []
wmem = []
hmem = []
for d in w:
wmem.append({'member': d['member']})
for d in h:
hmem.append({'member': d['member']})
set_w = set(tuple(sorted(d.items())) for d in wmem)
set_h = set(tuple(sorted(d.items())) for d in hmem)
intersection = set_w.intersection(set_h)
for element in intersection:
intersect.append(dict((x, y) for x, y in element))
return intersect
def add_commands(self, diff, name):
commands = []
name = name.strip('port-channel')
for d in diff:
commands.append('interface' + ' ' + d['member'])
cmd = ''
group_cmd = 'channel-group {0}'.format(name)
if 'force' in d:
cmd = group_cmd + ' force ' + d['force']
if 'mode' in d:
if cmd:
cmd = cmd + ' mode ' + d['mode']
else:
cmd = group_cmd + ' mode ' + d['mode']
if not cmd:
cmd = group_cmd
commands.append(cmd)
return commands
def set_commands(self, w, have):
commands = []
obj_in_have = search_obj_in_list(w['name'], have, 'name')
if not obj_in_have:
commands = self.add_commands(w['members'], w['name'])
else:
diff = self.diff_list_of_dicts(w['members'], obj_in_have['members'])
commands = self.add_commands(diff, w['name'])
return commands
def del_all_commands(self, obj_in_have):
commands = []
if not obj_in_have:
return commands
for m in obj_in_have['members']:
commands.append('interface' + ' ' + m['member'])
commands.append('no channel-group')
return commands
def del_intf_commands(self, w, have):
commands = []
obj_in_have = search_obj_in_list(w['name'], have, 'name')
if obj_in_have:
lst_to_del = self.intersect_list_of_dicts(w['members'], obj_in_have['members'])
if lst_to_del:
for item in lst_to_del:
commands.append('interface' + ' ' + item['member'])
commands.append('no channel-group')
return commands
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,169 |
module nxos_lag_interfaces not idempotent when mode yes
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
When using nxos_lag_interfaces with mode: yes to configure "channel-group x mode on" (needed on Nexus switches for connecting FEXes), the task is not idempotent.
The output even reports the mode as "True" rather than "yes" (see below).
It is also unclear why the module offers an option "yes" instead of "on", which is what is actually used on the devices.
changed: [SWITCH-A] => (item={'key': 6, 'value': {u'members': [{u'member': u'Ethernet1/12', u'mode': True}, {u'member': u'Ethernet1/13', u'mode': True}]}})
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
modules/network/nxos/nxos_lag_interfaces
##### ANSIBLE VERSION
ansible 2.9.4
config file = /home/martin/git/ansible-evpn/ansible.cfg
configured module search path = [u'/home/martin/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.17 (default, Nov 7 2019, 10:07:09) [GCC 7.4.0]
##### CONFIGURATION
DEFAULT_CALLBACK_WHITELIST(/home/martin/git/ansible-evpn/ansible.cfg) = [u'timer', u'mail', u'profile_tasks']
DEFAULT_FORKS(/home/martin/git/ansible-evpn/ansible.cfg) = 10
DEFAULT_GATHERING(/home/martin/git/ansible-evpn/ansible.cfg) = explicit
DEFAULT_HOST_LIST(/home/martin/git/ansible-evpn/ansible.cfg) = [u'/home/martin/git/ansible-evpn/inventory.yml']
DEFAULT_STDOUT_CALLBACK(/home/martin/git/ansible-evpn/ansible.cfg) = skippy
DEFAULT_VAULT_PASSWORD_FILE(/home/martin/git/ansible-evpn/ansible.cfg) = /home/martin/git/ansible-evpn/group_vars/.va
HOST_KEY_CHECKING(/home/martin/git/ansible-evpn/ansible.cfg) = False
INTERPRETER_PYTHON(/home/martin/git/ansible-evpn/ansible.cfg) = auto_legacy_silent
PERSISTENT_COMMAND_TIMEOUT(/home/martin/git/ansible-evpn/ansible.cfg) = 180
PERSISTENT_CONNECT_TIMEOUT(/home/martin/git/ansible-evpn/ansible.cfg) = 180
##### OS / ENVIRONMENT
Ansible Server : Ubuntu18.04.3 LTS
Device: N9K running 9.2(3)
##### STEPS TO REPRODUCE
```
- name: Create Port-Channels
nxos_lag_interfaces:
config:
- name: port-channel7
members:
- member: Ethernet1/15
mode: yes
```
##### EXPECTED RESULTS
ok: [SWITCH-A]
##### ACTUAL RESULTS
changed: [SWITCH-A]
|
https://github.com/ansible/ansible/issues/67169
|
https://github.com/ansible/ansible/pull/67359
|
f292f21d867e7ab48b9021ee4aa2612f049e170e
|
3acd8f6f7f61b72a42a017060c3a718bbbeaf854
| 2020-02-06T16:15:02Z |
python
| 2020-02-24T12:57:11Z |
test/integration/targets/nxos_lag_interfaces/tests/cli/merged.yaml
|
---
- debug:
msg: "Start nxos_lag_interfaces merged integration tests connection={{ ansible_connection }}"
- set_fact: test_int1="{{ nxos_int1 }}"
- set_fact: test_int2="{{ nxos_int2 }}"
- name: Enable feature lacp
nxos_feature:
feature: lacp
- name: Setup
nxos_config:
lines:
- no interface port-channel 10
ignore_errors: yes
- name: setup2
nxos_lag_interfaces: &remove_lags
state: deleted
- block:
- name: Merged
nxos_lag_interfaces: &merged
config:
- name: port-channel10
members:
- member: "{{ test_int1 }}"
- member: "{{ test_int2 }}"
state: merged
register: result
- assert:
that:
- "result.before|length == 0"
- "result.changed == true"
- name: Gather LAG interfaces facts
nxos_facts:
gather_subset:
- '!all'
- '!min'
gather_network_resources: lag_interfaces
- assert:
that:
- "ansible_facts.network_resources.lag_interfaces|symmetric_difference(result.after)|length == 0"
- name: Idempotence - Merged
nxos_lag_interfaces: *merged
register: result
- assert:
that:
- "result.changed == false"
always:
- name: Teardown
nxos_lag_interfaces: *remove_lags
ignore_errors: yes
- name: Disable feature lacp
nxos_feature:
feature: lacp
state: disabled
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,274 |
Edgeos config module handling of quotation marks
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
The current implementation of the edgeos_config module is inconsistent in how it handles the stripping of single quotation marks.
In this [line](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py#L197) the `item` var is defined with the single quotes stripped. It is then used for a series of comparisons that end with the original, non-stripped `line` var defined in the for loop being added to the `updates` list. On line [213](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py#L213), however, `item` is used instead. This inconsistency introduces the change of behaviour described below.
One potential method of addressing this is to use double quotes. Unfortunately these will be escaped, meaning that Ansible will always report a change on existing text values. It does work for empty values like `plaintext-password`:
```
set system login user test full-name "test user"
```
results in
```
"set system login user test full-name test \"user\""
```
Another potential resolution is to remove the stripping altogether and put the onus of correct command syntax on the user.
When a single quote is passed within a value, Ansible correctly returns an error:
```
set system login user test full-name "test' user"
```
Results in:
```
Cannot use the single quote (') character in a value string
```
The only issue the original stripping might have been added to address is a hanging single quote: if an invalid command line is passed through, Ansible will hang on the network connection waiting for input.
```
set system login user test full-name 'test' user'
```
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
[EdgeOS (edgeos_config) network module](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py)
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.2
ansible 2.10.0.dev0
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
Edgerouter 3/4
##### STEPS TO REPRODUCE
The following will be successfully added to the updates list, including the single quotes:
```
set system login user test level admin
set system login user test authentication encrypted-password <encrypted-value>
set system login user test authentication plaintext-password ''
set system login user test full-name 'test user'
```
With a delete command, though, the lines are added via the delete logic, which strips the single quotes:
```
delete system login user test
set system login user test level admin
set system login user test authentication encrypted-password <encrypted-value>
set system login user test authentication plaintext-password ''
set system login user test full-name 'test user'
```
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
Both with and without a delete the config should be handled the same
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
Adding the delete command results in Ansible failing in two ways. The `plaintext-password` will be invalid for having no value. The `full-name` will be invalid for having two values:
```
set system login user test authentication plaintext-password
set system login user test full-name test user
```
|
https://github.com/ansible/ansible/issues/67274
|
https://github.com/ansible/ansible/pull/67500
|
eab914426bc958c0eb9adfa26b3856d1be0f49ae
|
bd26b6c0b4c71be356efa727dc39396b41ba2b9a
| 2020-02-10T14:24:37Z |
python
| 2020-02-24T18:45:19Z |
changelogs/fragments/67500-fix-edgeos-config-single-quote-stripping.yaml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,274 |
Edgeos config module handling of quotation marks
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
The current implementation of the edgeos_config module handles the stripping of single quotation marks inconsistently.
In review of this [line](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py#L197), the `item` var is defined with its single quotes stripped. It is then used for a series of comparisons, at the end of which the original, non-stripped `line` var defined in the for loop is added to the `updates` list. On line [213](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py#L213), however, `item` is used instead. This introduces the change of behaviour described below.
A potential method of addressing this is to use double quotes. Unfortunately these will be escaped, meaning that Ansible will always report a change on existing text values. It does work for empty values like `plaintext-password`:
```
set system login user test full-name "test user"
```
results in
```
"set system login user test full-name test \"user\""
```
Another potential resolution is to remove the stripping altogether and put the onus for correct command syntax on the user.
When passing a single quote within a value, Ansible correctly returns a valid error code:
```
set system login user test full-name "test' user"
```
Results in:
```
Cannot use the single quote (') character in a value string
```
The only issue the original stripping might have been added to address is a hanging single quote being left behind: if an invalid command line is passed through, Ansible will hang on the network connection waiting for input.
```
set system login user test full-name 'test' user'
```
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
[EdgeOS (edgeos_config) network module](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py)
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.2
ansible 2.10.0.dev0
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
Edgerouter 3/4
##### STEPS TO REPRODUCE
The following will be successfully added to the updates list including single quotes:
```
set system login user test level admin
set system login user test authentication encrypted-password <encrypted-value>
set system login user test authentication plaintext-password ''
set system login user test full-name 'test user'
```
With a delete command, though, it will be added via the delete logic, which strips the single quotes:
```
delete system login user test
set system login user test level admin
set system login user test authentication encrypted-password <encrypted-value>
set system login user test authentication plaintext-password ''
set system login user test full-name 'test user'
```
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
Both with and without a delete the config should be handled the same
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
Adding the delete command results in Ansible failing in two ways. The `plaintext-password` will be invalid for having no value. The `full-name` will be invalid for having two values:
```
set system login user test authentication plaintext-password
set system login user test full-name test user
```
|
https://github.com/ansible/ansible/issues/67274
|
https://github.com/ansible/ansible/pull/67500
|
eab914426bc958c0eb9adfa26b3856d1be0f49ae
|
bd26b6c0b4c71be356efa727dc39396b41ba2b9a
| 2020-02-10T14:24:37Z |
python
| 2020-02-24T18:45:19Z |
lib/ansible/modules/network/edgeos/edgeos_config.py
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright (c) 2018 Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
ANSIBLE_METADATA = {'metadata_version': '1.1',
'status': ['preview'],
'supported_by': 'community'}
DOCUMENTATION = """
---
module: edgeos_config
version_added: "2.5"
author:
- "Nathaniel Case (@Qalthos)"
- "Sam Doran (@samdoran)"
short_description: Manage EdgeOS configuration on remote device
description:
- This module provides configuration file management of EdgeOS
devices. It provides arguments for managing both the
configuration file and state of the active configuration. All
configuration statements are based on `set` and `delete` commands
in the device configuration.
- "This is a network module and requires the C(connection: network_cli) in order
to work properly."
- For more information please see the L(Network Guide,../network/getting_started/index.html).
notes:
- Tested against EdgeOS 1.9.7
- Setting C(ANSIBLE_PERSISTENT_COMMAND_TIMEOUT) to 30 is recommended since
the save command can take longer than the default of 10 seconds on
some EdgeOS hardware.
options:
lines:
description:
- The ordered set of configuration lines to be managed and
compared with the existing configuration on the remote
device.
src:
description:
- The C(src) argument specifies the path to the source config
file to load. The source config file can either be in
bracket format or set format. The source file can include
Jinja2 template variables.
match:
description:
- The C(match) argument controls the method used to match
against the current active configuration. By default, the
desired config is matched against the active config and the
deltas are loaded. If the C(match) argument is set to C(none)
the active configuration is ignored and the configuration is
always loaded.
default: line
choices: ['line', 'none']
backup:
description:
- The C(backup) argument will backup the current device's active
configuration to the Ansible control host prior to making any
changes. If the C(backup_options) value is not given, the backup
file will be located in the backup folder in the playbook root
directory or role root directory if the playbook is part of an
ansible role. If the directory does not exist, it is created.
type: bool
default: 'no'
comment:
description:
- Allows a commit description to be specified to be included
when the configuration is committed. If the configuration is
not changed or committed, this argument is ignored.
default: 'configured by edgeos_config'
config:
description:
- The C(config) argument specifies the base configuration to use
to compare against the desired configuration. If this value
is not specified, the module will automatically retrieve the
current active configuration from the remote device.
save:
description:
- The C(save) argument controls whether or not changes made
to the active configuration are saved to disk. This is
independent of committing the config. When set to C(True), the
active configuration is saved.
type: bool
default: 'no'
backup_options:
description:
- This is a dict object containing configurable options related to backup file path.
The value of this option is read only when C(backup) is set to I(yes), if C(backup) is set
to I(no) this option will be silently ignored.
suboptions:
filename:
description:
- The filename to be used to store the backup configuration. If the filename
is not given it will be generated based on the hostname, current time and date
in format defined by <hostname>_config.<current-date>@<current-time>
dir_path:
description:
- This option provides the path ending with directory name in which the backup
configuration file will be stored. If the directory does not exist it will be first
created and the filename is either the value of C(filename) or default filename
as described in C(filename) options description. If the path value is not given
in that case a I(backup) directory will be created in the current working directory
and backup configuration will be copied in C(filename) within I(backup) directory.
type: path
type: dict
version_added: "2.8"
"""
EXAMPLES = """
- name: configure the remote device
edgeos_config:
lines:
- set system host-name {{ inventory_hostname }}
- set service lldp
- delete service dhcp-server
- name: backup and load from file
edgeos_config:
src: edgeos.cfg
backup: yes
- name: configurable backup path
edgeos_config:
src: edgeos.cfg
backup: yes
backup_options:
filename: backup.cfg
dir_path: /home/user
"""
RETURN = """
commands:
description: The list of configuration commands sent to the device
returned: always
type: list
sample: ['...', '...']
backup_path:
description: The full path to the backup file
returned: when backup is yes
type: str
sample: /playbooks/ansible/backup/edgeos_config.2016-07-16@22:28:34
"""
import re
from ansible.module_utils._text import to_native
from ansible.module_utils.basic import AnsibleModule
from ansible.module_utils.network.common.config import NetworkConfig
from ansible.module_utils.network.edgeos.edgeos import load_config, get_config, run_commands
DEFAULT_COMMENT = 'configured by edgeos_config'
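# Convert a configuration blob into a flat list of 'set'/'delete' commands:
# bracket-style configs are parsed with NetworkConfig, reduced to their most
# specific lines and prefixed with 'set'; set/delete-style configs are simply
# split into lines.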
def config_to_commands(config):
set_format = config.startswith('set') or config.startswith('delete')
candidate = NetworkConfig(indent=4, contents=config)
if not set_format:
candidate = [c.line for c in candidate.items]
commands = list()
# this filters out less specific lines
for item in candidate:
for index, entry in enumerate(commands):
if item.startswith(entry):
del commands[index]
break
commands.append(item)
commands = ['set %s' % cmd.replace(' {', '') for cmd in commands]
else:
commands = to_native(candidate).split('\n')
return commands
def get_candidate(module):
contents = module.params['src'] or module.params['lines']
if module.params['lines']:
contents = '\n'.join(contents)
return config_to_commands(contents)
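# Compare the candidate commands against the running config and return only what
# must be sent to the device: 'set' lines not already present (or re-applied after
# a matching 'delete'), and 'delete' lines that still match the running config
# (all of them if no running config was retrieved).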
def diff_config(commands, config):
config = [to_native(c).replace("'", '') for c in config.splitlines()]
updates = list()
visited = set()
delete_commands = [line for line in commands if line.startswith('delete')]
for line in commands:
item = to_native(line).replace("'", '')
if not item.startswith('set') and not item.startswith('delete'):
raise ValueError('line must start with either `set` or `delete`')
elif item.startswith('set'):
if item not in config:
updates.append(line)
# If there is a corresponding delete command in the desired config, make sure to append
# the set command even though it already exists in the running config
else:
ditem = re.sub('set', 'delete', item)
for line in delete_commands:
if ditem.startswith(line):
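# NOTE (ansible/ansible#67274): this branch appends `item`, whose single quotes
# were stripped above, whereas the plain `set` branch appends the original `line`
# with its quotes intact; the inner loop also reuses the name `line`, shadowing
# the outer loop variable.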
updates.append(item)
elif item.startswith('delete'):
if not config:
updates.append(line)
else:
item = re.sub(r'delete', 'set', item)
for entry in config:
if entry.startswith(item) and line not in visited:
updates.append(line)
visited.add(line)
return list(updates)
def run(module, result):
# get the current active config from the node or passed in via
# the config param
config = module.params['config'] or get_config(module)
# create the candidate config object from the arguments
candidate = get_candidate(module)
# create loadable config that includes only the configuration updates
commands = diff_config(candidate, config)
result['commands'] = commands
commit = not module.check_mode
comment = module.params['comment']
if commands:
load_config(module, commands, commit=commit, comment=comment)
result['changed'] = True
def main():
backup_spec = dict(
filename=dict(),
dir_path=dict(type='path')
)
spec = dict(
src=dict(type='path'),
lines=dict(type='list'),
match=dict(default='line', choices=['line', 'none']),
comment=dict(default=DEFAULT_COMMENT),
config=dict(),
backup=dict(type='bool', default=False),
backup_options=dict(type='dict', options=backup_spec),
save=dict(type='bool', default=False),
)
mutually_exclusive = [('lines', 'src')]
module = AnsibleModule(
argument_spec=spec,
mutually_exclusive=mutually_exclusive,
supports_check_mode=True
)
warnings = list()
result = dict(changed=False, warnings=warnings)
if module.params['backup']:
result['__backup__'] = get_config(module=module)
if any((module.params['src'], module.params['lines'])):
run(module, result)
if module.params['save']:
diff = run_commands(module, commands=['configure', 'compare saved'])[1]
if diff != '[edit]':
run_commands(module, commands=['save'])
result['changed'] = True
run_commands(module, commands=['exit'])
module.exit_json(**result)
if __name__ == '__main__':
main()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,274 |
Edgeos config module handling of quotation marks
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
The current implementation of the edgeos_config module handles the stripping of single quotation marks inconsistently.
In review of this [line](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py#L197), the `item` var is defined with its single quotes stripped. It is then used for a series of comparisons, at the end of which the original, non-stripped `line` var defined in the for loop is added to the `updates` list. On line [213](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py#L213), however, `item` is used instead. This introduces the change of behaviour described below.
A potential method of addressing this is to use double quotes. Unfortunately these will be escaped, meaning that Ansible will always report a change on existing text values. It does work for empty values like `plaintext-password`:
```
set system login user test full-name "test user"
```
results in
```
"set system login user test full-name test \"user\""
```
Another potential resolution is to remove the stripping altogether and put the onus for correct command syntax on the user.
When passing a single quote within a value, Ansible correctly returns a valid error code:
```
set system login user test full-name "test' user"
```
Results in:
```
Cannot use the single quote (') character in a value string
```
The only issue the original stripping might have been added to address is a hanging single quote being left behind: if an invalid command line is passed through, Ansible will hang on the network connection waiting for input.
```
set system login user test full-name 'test' user'
```
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
[EdgeOS (edgeos_config) network module](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py)
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.2
ansible 2.10.0.dev0
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
Edgerouter 3/4
##### STEPS TO REPRODUCE
The following will be successfully added to the updates list including single quotes:
```
set system login user test level admin
set system login user test authentication encrypted-password <encrypted-value>
set system login user test authentication plaintext-password ''
set system login user test full-name 'test user'
```
With a delete command, though, it will be added via the delete logic, which strips the single quotes:
```
delete system login user test
set system login user test level admin
set system login user test authentication encrypted-password <encrypted-value>
set system login user test authentication plaintext-password ''
set system login user test full-name 'test user'
```
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
Both with and without a delete the config should be handled the same
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
Adding the delete command results in Ansible failing in two ways. The `plaintext-password` will be invalid for having no value. The `full-name` will be invalid for having two values:
```
set system login user test authentication plaintext-password
set system login user test full-name test user
```
|
https://github.com/ansible/ansible/issues/67274
|
https://github.com/ansible/ansible/pull/67500
|
eab914426bc958c0eb9adfa26b3856d1be0f49ae
|
bd26b6c0b4c71be356efa727dc39396b41ba2b9a
| 2020-02-10T14:24:37Z |
python
| 2020-02-24T18:45:19Z |
test/units/modules/network/edgeos/fixtures/edgeos_config_config.cfg
|
set system host-name 'router'
set system domain-name 'acme.com'
set system domain-search domain 'acme.com'
set system name-server '208.67.220.220'
set system name-server '208.67.222.222'
set interfaces ethernet eth0 address '1.2.3.4/24'
set interfaces ethernet eth0 description 'Outside'
set interfaces ethernet eth1 address '10.77.88.1/24'
set interfaces ethernet eth1 description 'Inside'
set interfaces ethernet eth1 disable
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,274 |
Edgeos config module handling of quotation marks
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
The current implementation of the edgeos_config module handles the stripping of single quotation marks inconsistently.
In review of this [line](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py#L197), the `item` var is defined with its single quotes stripped. It is then used for a series of comparisons, at the end of which the original, non-stripped `line` var defined in the for loop is added to the `updates` list. On line [213](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py#L213), however, `item` is used instead. This introduces the change of behaviour described below.
A potential method of addressing this is to use double quotes. Unfortunately these will be escaped, meaning that Ansible will always report a change on existing text values. It does work for empty values like `plaintext-password`:
```
set system login user test full-name "test user"
```
results in
```
"set system login user test full-name test \"user\""
```
Another potential resolution is to remove the stripping altogether and put the onus for correct command syntax on the user.
When passing a single quote within a value, Ansible correctly returns a valid error code:
```
set system login user test full-name "test' user"
```
Results in:
```
Cannot use the single quote (') character in a value string
```
The only issue the original stripping might have been added to address is a hanging single quote being left behind: if an invalid command line is passed through, Ansible will hang on the network connection waiting for input.
```
set system login user test full-name 'test' user'
```
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
[EdgeOS (edgeos_config) network module](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py)
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.2
ansible 2.10.0.dev0
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
Edgerouter 3/4
##### STEPS TO REPRODUCE
The following will be successfully added to the updates list including single quotes:
```
set system login user test level admin
set system login user test authentication encrypted-password <encrypted-value>
set system login user test authentication plaintext-password ''
set system login user test full-name 'test user'
```
With a delete command, though, it will be added via the delete logic, which strips the single quotes:
```
delete system login user test
set system login user test level admin
set system login user test authentication encrypted-password <encrypted-value>
set system login user test authentication plaintext-password ''
set system login user test full-name 'test user'
```
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
Both with and without a delete the config should be handled the same
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
Adding the delete command results in Ansible failing in two ways. The `plaintext-password` will be invalid for having no value. The `full-name` will be invalid for having two values:
```
set system login user test authentication plaintext-password
set system login user test full-name test user
```
|
https://github.com/ansible/ansible/issues/67274
|
https://github.com/ansible/ansible/pull/67500
|
eab914426bc958c0eb9adfa26b3856d1be0f49ae
|
bd26b6c0b4c71be356efa727dc39396b41ba2b9a
| 2020-02-10T14:24:37Z |
python
| 2020-02-24T18:45:19Z |
test/units/modules/network/edgeos/fixtures/edgeos_config_src.cfg
|
set system host-name er01
delete interfaces ethernet eth0 address
set interfaces ethernet eth1 address '10.77.88.1/24'
set interfaces ethernet eth1 description 'Inside'
set interfaces ethernet eth1 disable
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,274 |
Edgeos config module handling of quotation marks
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
The current implementation of the edgeos_config module handles the stripping of single quotation marks inconsistently.
In review of this [line](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py#L197), the `item` var is defined with its single quotes stripped. It is then used for a series of comparisons, at the end of which the original, non-stripped `line` var defined in the for loop is added to the `updates` list. On line [213](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py#L213), however, `item` is used instead. This introduces the change of behaviour described below.
A potential method of addressing this is to use double quotes. Unfortunately these will be escaped, meaning that Ansible will always report a change on existing text values. It does work for empty values like `plaintext-password`:
```
set system login user test full-name "test user"
```
results in
```
"set system login user test full-name test \"user\""
```
Another potential resolution is to remove the stripping altogether and put the onus for correct command syntax on the user.
When passing a single quote within a value, Ansible correctly returns a valid error code:
```
set system login user test full-name "test' user"
```
Results in:
```
Cannot use the single quote (') character in a value string
```
The only issue the original stripping might have been added to address is a hanging single quote being left behind: if an invalid command line is passed through, Ansible will hang on the network connection waiting for input.
```
set system login user test full-name 'test' user'
```
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
[EdgeOS (edgeos_config) network module](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py)
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.2
ansible 2.10.0.dev0
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
Edgerouter 3/4
##### STEPS TO REPRODUCE
The following will be successfully added to the updates list including single quotes:
```
set system login user test level admin
set system login user test authentication encrypted-password <encrypted-value>
set system login user test authentication plaintext-password ''
set system login user test full-name 'test user'
```
With a delete command, though, it will be added via the delete logic, which strips the single quotes:
```
delete system login user test
set system login user test level admin
set system login user test authentication encrypted-password <encrypted-value>
set system login user test authentication plaintext-password ''
set system login user test full-name 'test user'
```
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
Both with and without a delete the config should be handled the same
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
Adding the delete command results in Ansible failing in two ways. The `plaintext-password` will be invalid for having no value. The `full-name` will be invalid for having two values:
```
set system login user test authentication plaintext-password
set system login user test full-name test user
```
|
https://github.com/ansible/ansible/issues/67274
|
https://github.com/ansible/ansible/pull/67500
|
eab914426bc958c0eb9adfa26b3856d1be0f49ae
|
bd26b6c0b4c71be356efa727dc39396b41ba2b9a
| 2020-02-10T14:24:37Z |
python
| 2020-02-24T18:45:19Z |
test/units/modules/network/edgeos/fixtures/edgeos_config_src_brackets.cfg
|
interfaces {
ethernet eth0 {
address 10.10.10.10/24
}
ethernet eth1 {
address 10.77.88.1/24
description Inside
disable
}
}
system {
host-name er01
}
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,274 |
Edgeos config module handling of quotation marks
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
The current implementation of the edgeos_config module handles the stripping of single quotation marks inconsistently.
In review of this [line](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py#L197), the `item` var is defined with its single quotes stripped. It is then used for a series of comparisons, at the end of which the original, non-stripped `line` var defined in the for loop is added to the `updates` list. On line [213](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py#L213), however, `item` is used instead. This introduces the change of behaviour described below.
A potential method of addressing this is to use double quotes. Unfortunately these will be escaped, meaning that Ansible will always report a change on existing text values. It does work for empty values like `plaintext-password`:
```
set system login user test full-name "test user"
```
results in
```
"set system login user test full-name test \"user\""
```
Another potential resolution is to remove the stripping altogether and put the onus for correct command syntax on the user.
When passing a single quote within a value, Ansible correctly returns a valid error code:
```
set system login user test full-name "test' user"
```
Results in:
```
Cannot use the single quote (') character in a value string
```
The only issue the original stripping might have been added to address is a hanging single quote being left behind: if an invalid command line is passed through, Ansible will hang on the network connection waiting for input.
```
set system login user test full-name 'test' user'
```
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
[EdgeOS (edgeos_config) network module](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/network/edgeos/edgeos_config.py)
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.2
ansible 2.10.0.dev0
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
Edgerouter 3/4
##### STEPS TO REPRODUCE
The following will be successfully added to the updates list including single quotes:
```
set system login user test level admin
set system login user test authentication encrypted-password <encrypted-value>
set system login user test authentication plaintext-password ''
set system login user test full-name 'test user'
```
With a delete command, though, it will be added via the delete logic, which strips the single quotes:
```
delete system login user test
set system login user test level admin
set system login user test authentication encrypted-password <encrypted-value>
set system login user test authentication plaintext-password ''
set system login user test full-name 'test user'
```
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
Both with and without a delete the config should be handled the same
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
Adding the delete command results in Ansible failing in two ways. The `plaintext-password` will be invalid for having no value. The `full-name` will be invalid for having two values:
```
set system login user test authentication plaintext-password
set system login user test full-name test user
```
|
https://github.com/ansible/ansible/issues/67274
|
https://github.com/ansible/ansible/pull/67500
|
eab914426bc958c0eb9adfa26b3856d1be0f49ae
|
bd26b6c0b4c71be356efa727dc39396b41ba2b9a
| 2020-02-10T14:24:37Z |
python
| 2020-02-24T18:45:19Z |
test/units/modules/network/edgeos/test_edgeos_config.py
|
#
# (c) 2018 Red Hat Inc.
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
from units.compat.mock import patch
from ansible.modules.network.edgeos import edgeos_config
from units.modules.utils import set_module_args
from .edgeos_module import TestEdgeosModule, load_fixture
class TestEdgeosConfigModule(TestEdgeosModule):
module = edgeos_config
def setUp(self):
super(TestEdgeosConfigModule, self).setUp()
self.mock_get_config = patch('ansible.modules.network.edgeos.edgeos_config.get_config')
self.get_config = self.mock_get_config.start()
self.mock_load_config = patch('ansible.modules.network.edgeos.edgeos_config.load_config')
self.load_config = self.mock_load_config.start()
self.mock_run_commands = patch('ansible.modules.network.edgeos.edgeos_config.run_commands')
self.run_commands = self.mock_run_commands.start()
def tearDown(self):
super(TestEdgeosConfigModule, self).tearDown()
self.mock_get_config.stop()
self.mock_load_config.stop()
self.mock_run_commands.stop()
def load_fixtures(self, commands=None):
config_file = 'edgeos_config_config.cfg'
self.get_config.return_value = load_fixture(config_file)
self.load_config.return_value = None
def test_edgeos_config_unchanged(self):
src = load_fixture('edgeos_config_config.cfg')
set_module_args(dict(src=src))
self.execute_module()
def test_edgeos_config_src(self):
src = load_fixture('edgeos_config_src.cfg')
set_module_args(dict(src=src))
commands = ['set system host-name er01', 'delete interfaces ethernet eth0 address']
self.execute_module(changed=True, commands=commands)
def test_edgeos_config_src_brackets(self):
src = load_fixture('edgeos_config_src_brackets.cfg')
set_module_args(dict(src=src))
commands = ['set interfaces ethernet eth0 address 10.10.10.10/24', 'set system host-name er01']
self.execute_module(changed=True, commands=commands)
def test_edgeos_config_backup(self):
set_module_args(dict(backup=True))
result = self.execute_module()
self.assertIn('__backup__', result)
def test_edgeos_config_lines(self):
commands = ['set system host-name er01']
set_module_args(dict(lines=commands))
self.execute_module(changed=True, commands=commands)
def test_edgeos_config_config(self):
config = 'set system host-name localhost'
new_config = ['set system host-name er01']
set_module_args(dict(lines=new_config, config=config))
self.execute_module(changed=True, commands=new_config)
def test_edgeos_config_match_none(self):
lines = ['set system interfaces ethernet eth0 address 1.2.3.4/24',
'set system interfaces ethernet eth0 description Outside']
set_module_args(dict(lines=lines, match='none'))
self.execute_module(changed=True, commands=lines, sort=False)
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,603 |
Not able to enable intelligence_packs from ansible module for loganalyticsworkspace
|
##### SUMMARY
I am using the azure_rm_loganalyticsworkspace Ansible module to enable container monitoring and backup, and it gives me an "unsupported parameters" error.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
azure_rm_loganalyticsworkspace
##### ANSIBLE VERSION
```
ansible-playbook 2.8.6
config file = /home/devans/ansible.cfg
configured module search path = [u'/home/devans/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /bin/ansible-playbook
python version = 2.7.5 (default, Apr 9 2019, 16:02:27) [GCC 4.8.5 20150623 (Red Hat 4.8.5-36.0.1)]
```
##### CONFIGURATION
```
COMMAND_WARNINGS(/home/devans/ansible.cfg) = False
DEFAULT_GATHERING(/home/devans/ansible.cfg) = smart
DEFAULT_HOST_LIST(/home/devans/ansible.cfg) = [u'/home/devans/hosts']
DEFAULT_STDOUT_CALLBACK(/home/devans/ansible.cfg) = yaml
DEPRECATION_WARNINGS(/home/devans/ansible.cfg) = False
HOST_KEY_CHECKING(/home/devans/ansible.cfg) = False
SYSTEM_WARNINGS(/home/devans/ansible.cfg) = False
```
##### OS / ENVIRONMENT
Oracle Linux 7.4
##### STEPS TO REPRODUCE
Run the YAML below:
```
azure_rm_loganalyticsworkspace:
resource_group: "{{ main_resource_group }}"
name: "{{ npe_loganalyticsworkspace_name }}"
intelligence_pack:
Backup: true
Containers: true
profile: "{{ profilename }}"
```
##### EXPECTED RESULTS
I assume it creates the workspace with those features enabled
##### ACTUAL RESULTS
```
fatal: [localhost]: FAILED! => changed=false
invocation:
module_args:
intelligence_pack:
Backup: true
Containers: true
name: bglnpeapglogs
profile: BGLNPE
resource_group: BGL-NPE-ARG-APG
msg: 'Unsupported parameters for (azure_rm_loganalyticsworkspace) module: intelligence_pack Supported parameters include: ad_user, adfs_authority_url, api_profile, append_tags, auth_source, cert_validation_mode, client_id, cloud_environment, intelligence_packs, location, name, password, profile, resource_group, retention_in_days, secret, sku, state, subscription_id, tags, tenant'
```
|
https://github.com/ansible/ansible/issues/67603
|
https://github.com/ansible/ansible/pull/67686
|
f520238d604d8de7d5f6e3ef8141933090ab0341
|
a173ce96302ce5f718865f3c8e8695f2a062b1c3
| 2020-02-20T06:41:49Z |
python
| 2020-02-24T19:25:23Z |
lib/ansible/modules/cloud/azure/azure_rm_loganalyticsworkspace.py
|
#!/usr/bin/python
#
# Copyright (c) 2019 Yuwei Zhou, <[email protected]>
#
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
ANSIBLE_METADATA = {'metadata_version': '1.1',
'status': ['preview'],
'supported_by': 'community'}
DOCUMENTATION = '''
---
module: azure_rm_loganalyticsworkspace
version_added: "2.8"
short_description: Manage Azure Log Analytics workspaces
description:
- Create, delete Azure Log Analytics workspaces.
options:
resource_group:
description:
- Name of resource group.
required: true
name:
description:
- Name of the workspace.
required: true
state:
description:
- Assert the state of the workspace. Use C(present) to create or update a workspace and C(absent) to delete a workspace.
default: present
choices:
- absent
- present
location:
description:
- Resource location.
sku:
description:
- The SKU of the workspace.
choices:
- free
- standard
- premium
- unlimited
- per_node
- per_gb2018
- standalone
default: per_gb2018
retention_in_days:
description:
- The workspace data retention in days.
- -1 means Unlimited retention for I(sku=unlimited).
- 730 days is the maximum allowed for all other SKUs.
intelligence_packs:
description:
- Manage intelligence packs possible for this workspace.
- Enable one pack by setting it to C(true). For example "Backup:true".
- Disable one pack by setting it to C(false). For example "Backup:false".
- Other intelligence packs not listed in this property will not be changed.
type: dict
extends_documentation_fragment:
- azure
- azure_tags
author:
- Yuwei Zhou (@yuwzho)
'''
EXAMPLES = '''
- name: Create a workspace with backup enabled
azure_rm_loganalyticsworkspace:
resource_group: myResourceGroup
name: myLogAnalyticsWorkspace
intelligence_pack:
Backup: true
'''
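# NOTE: the example above uses `intelligence_pack`, but the argument spec below
# defines the option as `intelligence_packs`; passing the singular key is what
# triggers the "Unsupported parameters" error reported in ansible/ansible#67603.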
RETURN = '''
id:
description:
- Workspace resource path.
type: str
returned: success
example: "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/resourceGroups/myResourceGroup/providers/Microsoft.OperationalInsights/workspaces/m
yLogAnalyticsWorkspace"
location:
description:
- Resource location.
type: str
returned: success
example: eastus
sku:
description:
- The SKU of the workspace.
type: str
returned: success
example: "per_gb2018"
retention_in_days:
description:
- The workspace data retention in days.
- -1 means Unlimited retention for I(sku=unlimited).
- 730 days is the maximum allowed for all other SKUs.
type: int
returned: success
example: 40
intelligence_packs:
description:
- Lists all the intelligence packs possible and whether they are enabled or disabled for a given workspace.
type: list
returned: success
example: ['name': 'CapacityPerformance', 'enabled': true]
management_groups:
description:
- Management groups connected to the workspace.
type: dict
returned: success
example: {'value': []}
shared_keys:
description:
- Shared keys for the workspace.
type: dict
returned: success
example: {
'primarySharedKey': 'BozLY1JnZbxu0jWUQSY8iRPEM8ObmpP8rW+8bUl3+HpDJI+n689SxXgTgU7k1qdxo/WugRLxechxbolAfHM5uA==',
'secondarySharedKey': '7tDt5W0JBrCQKtQA3igfFltLSzJeyr9LmuT+B/ibzd8cdC1neZ1ePOQLBx5NUzc0q2VUIK0cLhWNyFvo/hT8Ww=='
}
usages:
description:
- Usage metrics for the workspace.
type: dict
returned: success
example: {
'value': [
{
'name': {
'value': 'DataAnalyzed',
'localizedValue': 'Data Analyzed'
},
'unit': 'Bytes',
'currentValue': 0,
'limit': 524288000,
'nextResetTime': '2017-10-03T00:00:00Z',
'quotaPeriod': 'P1D'
}
]
}
''' # NOQA
from ansible.module_utils.azure_rm_common import AzureRMModuleBase, format_resource_id
from ansible.module_utils.common.dict_transformations import _snake_to_camel, _camel_to_snake
try:
from msrestazure.tools import parse_resource_id
from msrestazure.azure_exceptions import CloudError
except ImportError:
# This is handled in azure_rm_common
pass
class AzureRMLogAnalyticsWorkspace(AzureRMModuleBase):
def __init__(self):
self.module_arg_spec = dict(
resource_group=dict(type='str', required=True),
name=dict(type='str', required=True),
state=dict(type='str', default='present', choices=['present', 'absent']),
location=dict(type='str'),
sku=dict(type='str', default='per_gb2018', choices=['free', 'standard', 'premium', 'unlimited', 'per_node', 'per_gb2018', 'standalone']),
retention_in_days=dict(type='int'),
intelligence_packs=dict(type='dict')
)
self.results = dict(
changed=False,
id=None
)
self.resource_group = None
self.name = None
self.state = None
self.location = None
self.sku = None
self.retention_in_days = None
self.intelligence_packs = None
super(AzureRMLogAnalyticsWorkspace, self).__init__(self.module_arg_spec, supports_check_mode=True)
def exec_module(self, **kwargs):
for key in list(self.module_arg_spec.keys()) + ['tags']:
setattr(self, key, kwargs[key])
self.results = dict()
changed = False
if not self.location:
resource_group = self.get_resource_group(self.resource_group)
self.location = resource_group.location
if self.sku == 'per_gb2018':
self.sku = 'PerGB2018'
else:
self.sku = _snake_to_camel(self.sku)
workspace = self.get_workspace()
if not workspace and self.state == 'present':
changed = True
workspace = self.log_analytics_models.Workspace(sku=self.log_analytics_models.Sku(name=self.sku),
retention_in_days=self.retention_in_days,
location=self.location)
if not self.check_mode:
workspace = self.create_workspace(workspace)
elif workspace and self.state == 'absent':
changed = True
workspace = None
if not self.check_mode:
self.delete_workspace()
if workspace and workspace.id:
self.results = self.to_dict(workspace)
self.results['intelligence_packs'] = self.list_intelligence_packs()
self.results['management_groups'] = self.list_management_groups()
self.results['usages'] = self.list_usages()
self.results['shared_keys'] = self.get_shared_keys()
# handle the intelligence pack
if workspace and workspace.id and self.intelligence_packs:
intelligence_packs = self.results['intelligence_packs']
for key in self.intelligence_packs.keys():
enabled = self.intelligence_packs[key]
for x in intelligence_packs:
if x['name'].lower() == key.lower():
if x['enabled'] != enabled:
changed = True
if not self.check_mode:
self.change_intelligence(x['name'], enabled)
x['enabled'] = enabled
break
self.results['changed'] = changed
return self.results
def create_workspace(self, workspace):
try:
poller = self.log_analytics_client.workspaces.create_or_update(self.resource_group, self.name, workspace)
return self.get_poller_result(poller)
except CloudError as exc:
self.fail('Error when creating workspace {0} - {1}'.format(self.name, exc.message or str(exc)))
def get_workspace(self):
try:
return self.log_analytics_client.workspaces.get(self.resource_group, self.name)
except CloudError:
pass
def delete_workspace(self):
try:
self.log_analytics_client.workspaces.delete(self.resource_group, self.name)
except CloudError as exc:
self.fail('Error when deleting workspace {0} - {1}'.format(self.name, exc.message or str(exc)))
def to_dict(self, workspace):
result = workspace.as_dict()
result['sku'] = _camel_to_snake(workspace.sku.name)
return result
def list_intelligence_packs(self):
try:
response = self.log_analytics_client.workspaces.list_intelligence_packs(self.resource_group, self.name)
return [x.as_dict() for x in response]
except CloudError as exc:
self.fail('Error when listing intelligence packs {0}'.format(exc.message or str(exc)))
def change_intelligence(self, key, value):
try:
if value:
self.log_analytics_client.workspaces.enable_intelligence_pack(self.resource_group, self.name, key)
else:
self.log_analytics_client.workspaces.disable_intelligence_pack(self.resource_group, self.name, key)
except CloudError as exc:
self.fail('Error when changing intelligence pack {0} - {1}'.format(key, exc.message or str(exc)))
def list_management_groups(self):
result = []
try:
response = self.log_analytics_client.workspaces.list_management_groups(self.resource_group, self.name)
while True:
result.append(response.next().as_dict())
except StopIteration:
pass
except CloudError as exc:
self.fail('Error when listing management groups {0}'.format(exc.message or str(exc)))
return result
def list_usages(self):
result = []
try:
response = self.log_analytics_client.workspaces.list_usages(self.resource_group, self.name)
while True:
result.append(response.next().as_dict())
except StopIteration:
pass
except CloudError as exc:
self.fail('Error when listing usages {0}'.format(exc.message or str(exc)))
return result
def get_shared_keys(self):
try:
return self.log_analytics_client.workspaces.get_shared_keys(self.resource_group, self.name).as_dict()
except CloudError as exc:
self.fail('Error when getting shared key {0}'.format(exc.message or str(exc)))
def main():
AzureRMLogAnalyticsWorkspace()
if __name__ == '__main__':
main()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,459 |
s3_bucket: add StorageGRID support
|
<!--- Verify first that your feature was not already discussed on GitHub -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Describe the new feature/improvement briefly below -->
Support NetApp StorageGRID
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
s3_bucket
##### ADDITIONAL INFORMATION
<!--- Describe how the feature would be used, why it is needed and what it would solve -->
StorageGRID provides an S3 REST API, but it is not fully compatible with the AWS API.
The current StorageGRID 11.3 does not have the Requester Pays and tags features; furthermore, the API returns an XNotImplemented error instead of the standard NotImplemented error.
http://docs.netapp.com/sgws-113/topic/com.netapp.doc.sg-s3/GUID-6E737B19-441A-495F-A35C-9BDC8BA1A4A2.html
Here is the error message that s3_bucket module in ansible 2.9.5 outputs:
<!--- Paste example playbooks or commands between quotes below -->
```
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: botocore.exceptions.ClientError: An error occurred (XNotImplemented) when calling the GetBucketRequestPayment operation: The request you provided implies functionality that is not implemented.
fatal: [localhost]: FAILED! => {"boto3_version": "1.12.0", "botocore_version": "1.15.0", "changed": false, "error": {"code": "XNotImplemented", "message": "The request you provided implies functionality that is not implemented.", "resource": "/ansible-bucket-0217?requestPayment"}, "msg": "Failed to get bucket request payment: An error occurred (XNotImplemented) when calling the GetBucketRequestPayment operation: The request you provided implies functionality that is not implemented.", "resource_actions": ["objectstorage-s:CreateBucket", "objectstorage-s:GetBucketVersioning", "objectstorage-s:ListBuckets", "objectstorage-s:GetBucketRequestPayment", "objectstorage-s:HeadBucket"], "response_metadata": {"host_id": "12889128", "http_headers": {"cache-control": "no-cache", "connection": "keep-alive", "content-length": "267", "content-type": "application/xml", "date": "Mon, 17 Feb 2020 08:15:45 GMT", "x-amz-id-2": "12889128", "x-amz-request-id": "1581927345613682", "http_status_code": 501, "request_id": "1581927345613682", "retry_attempts": 0}}
```
<!--- HINT: You can also paste gist.github.com links for larger files -->
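One way to tolerate such endpoints (an illustrative sketch only; the helper name below is hypothetical, and this is not necessarily how the linked pull request addresses it) is to treat both error codes as "operation not implemented":
```
# Hypothetical helper: accept both AWS's standard code and StorageGRID's variant.
NOT_IMPLEMENTED_CODES = ('NotImplemented', 'XNotImplemented')
def is_not_implemented(client_error):
    """Return True if a botocore ClientError signals an unimplemented operation."""
    return client_error.response.get('Error', {}).get('Code') in NOT_IMPLEMENTED_CODES
```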
|
https://github.com/ansible/ansible/issues/67459
|
https://github.com/ansible/ansible/pull/67462
|
c5ec0fcb34af3dba4fdc6afcd1bd52ab617cc0c9
|
d317cc71c7086ecee74974ddc8aa71b4830ff998
| 2020-02-17T09:53:35Z |
python
| 2020-02-24T20:00:38Z |
changelogs/fragments/67462-s3_bucket-accept-storagegrid-response.yaml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,459 |
s3_bucket: add StorageGRID support
|
<!--- Verify first that your feature was not already discussed on GitHub -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Describe the new feature/improvement briefly below -->
Support NetApp StorageGRID
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
s3_bucket
##### ADDITIONAL INFORMATION
<!--- Describe how the feature would be used, why it is needed and what it would solve -->
StorageGRID provides an S3 REST API, but it is not fully compatible with the AWS API.
The current StorageGRID 11.3 does not have the Requester Pays and tags features; furthermore, the API returns an XNotImplemented error instead of the standard NotImplemented error.
http://docs.netapp.com/sgws-113/topic/com.netapp.doc.sg-s3/GUID-6E737B19-441A-495F-A35C-9BDC8BA1A4A2.html
Here is the error message that s3_bucket module in ansible 2.9.5 outputs:
<!--- Paste example playbooks or commands between quotes below -->
```
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: botocore.exceptions.ClientError: An error occurred (XNotImplemented) when calling the GetBucketRequestPayment operation: The request you provided implies functionality that is not implemented.
fatal: [localhost]: FAILED! => {"boto3_version": "1.12.0", "botocore_version": "1.15.0", "changed": false, "error": {"code": "XNotImplemented", "message": "The request you provided implies functionality that is not implemented.", "resource": "/ansible-bucket-0217?requestPayment"}, "msg": "Failed to get bucket request payment: An error occurred (XNotImplemented) when calling the GetBucketRequestPayment operation: The request you provided implies functionality that is not implemented.", "resource_actions": ["objectstorage-s:CreateBucket", "objectstorage-s:GetBucketVersioning", "objectstorage-s:ListBuckets", "objectstorage-s:GetBucketRequestPayment", "objectstorage-s:HeadBucket"], "response_metadata": {"host_id": "12889128", "http_headers": {"cache-control": "no-cache", "connection": "keep-alive", "content-length": "267", "content-type": "application/xml", "date": "Mon, 17 Feb 2020 08:15:45 GMT", "x-amz-id-2": "12889128", "x-amz-request-id": "1581927345613682", "http_status_code": 501, "request_id": "1581927345613682", "retry_attempts": 0}}
```
<!--- HINT: You can also paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/67459
|
https://github.com/ansible/ansible/pull/67462
|
c5ec0fcb34af3dba4fdc6afcd1bd52ab617cc0c9
|
d317cc71c7086ecee74974ddc8aa71b4830ff998
| 2020-02-17T09:53:35Z |
python
| 2020-02-24T20:00:38Z |
lib/ansible/modules/cloud/amazon/s3_bucket.py
|
#!/usr/bin/python
#
# This is a free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This Ansible library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this library. If not, see <http://www.gnu.org/licenses/>.
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
ANSIBLE_METADATA = {'metadata_version': '1.1',
'status': ['stableinterface'],
'supported_by': 'core'}
DOCUMENTATION = '''
---
module: s3_bucket
short_description: Manage S3 buckets in AWS, DigitalOcean, Ceph, Walrus and FakeS3
description:
- Manage S3 buckets in AWS, DigitalOcean, Ceph, Walrus and FakeS3
version_added: "2.0"
requirements: [ boto3 ]
author: "Rob White (@wimnat)"
options:
force:
description:
- When trying to delete a bucket, delete all keys (including versions and delete markers)
in the bucket first (an s3 bucket must be empty for a successful deletion)
type: bool
default: 'no'
name:
description:
- Name of the s3 bucket
required: true
type: str
policy:
description:
- The JSON policy as a string.
type: json
s3_url:
description:
- S3 URL endpoint for usage with DigitalOcean, Ceph, Eucalyptus and fakes3 etc.
- Assumes AWS if not specified.
- For Walrus, use FQDN of the endpoint without scheme nor path.
aliases: [ S3_URL ]
type: str
ceph:
description:
- Enable API compatibility with Ceph. It takes into account the S3 API subset working
with Ceph in order to provide the same module behaviour where possible.
type: bool
version_added: "2.2"
requester_pays:
description:
- With Requester Pays buckets, the requester instead of the bucket owner pays the cost
of the request and the data download from the bucket.
type: bool
default: False
state:
description:
- Create or remove the s3 bucket
required: false
default: present
choices: [ 'present', 'absent' ]
type: str
tags:
description:
- tags dict to apply to bucket
type: dict
purge_tags:
description:
- whether to remove tags that aren't present in the C(tags) parameter
type: bool
default: True
version_added: "2.9"
versioning:
description:
- Whether versioning is enabled or disabled (note that once versioning is enabled, it can only be suspended)
type: bool
encryption:
description:
- Describes the default server-side encryption to apply to new objects in the bucket.
In order to remove the server-side encryption, the encryption needs to be set to 'none' explicitly.
choices: [ 'none', 'AES256', 'aws:kms' ]
version_added: "2.9"
type: str
encryption_key_id:
description: KMS master key ID to use for the default encryption. This parameter is allowed if encryption is aws:kms. If
not specified then it will default to the AWS provided KMS key.
version_added: "2.9"
type: str
extends_documentation_fragment:
- aws
- ec2
notes:
- If C(requestPayment), C(policy), C(tagging) or C(versioning)
operations/API aren't implemented by the endpoint, module doesn't fail
if each parameter satisfies the following condition.
I(requester_pays) is C(False), I(policy), I(tags), and I(versioning) are C(None).
'''
EXAMPLES = '''
# Note: These examples do not set authentication details, see the AWS Guide for details.
# Create a simple s3 bucket
- s3_bucket:
name: mys3bucket
state: present
# Create a simple s3 bucket on Ceph Rados Gateway
- s3_bucket:
name: mys3bucket
s3_url: http://your-ceph-rados-gateway-server.xxx
ceph: true
# Remove an s3 bucket and any keys it contains
- s3_bucket:
name: mys3bucket
state: absent
force: yes
# Create a bucket, add a policy from a file, enable requester pays, enable versioning and tag
- s3_bucket:
name: mys3bucket
policy: "{{ lookup('file','policy.json') }}"
requester_pays: yes
versioning: yes
tags:
example: tag1
another: tag2
# Create a simple DigitalOcean Spaces bucket using their provided regional endpoint
- s3_bucket:
name: mydobucket
s3_url: 'https://nyc3.digitaloceanspaces.com'
# Create a bucket with AES256 encryption
- s3_bucket:
name: mys3bucket
state: present
encryption: "AES256"
# Create a bucket with aws:kms encryption, KMS key
- s3_bucket:
name: mys3bucket
state: present
encryption: "aws:kms"
encryption_key_id: "arn:aws:kms:us-east-1:1234/5678example"
# Create a bucket with aws:kms encryption, default key
- s3_bucket:
name: mys3bucket
state: present
encryption: "aws:kms"
'''
import json
import os
import time
from ansible.module_utils.six.moves.urllib.parse import urlparse
from ansible.module_utils.six import string_types
from ansible.module_utils.basic import to_text
from ansible.module_utils.aws.core import AnsibleAWSModule, is_boto3_error_code
from ansible.module_utils.ec2 import compare_policies, ec2_argument_spec, boto3_tag_list_to_ansible_dict, ansible_dict_to_boto3_tag_list
from ansible.module_utils.ec2 import get_aws_connection_info, boto3_conn, AWSRetry
try:
from botocore.exceptions import BotoCoreError, ClientError, EndpointConnectionError, WaiterError
except ImportError:
pass # handled by AnsibleAWSModule
def create_or_update_bucket(s3_client, module, location):
policy = module.params.get("policy")
name = module.params.get("name")
requester_pays = module.params.get("requester_pays")
tags = module.params.get("tags")
purge_tags = module.params.get("purge_tags")
versioning = module.params.get("versioning")
encryption = module.params.get("encryption")
encryption_key_id = module.params.get("encryption_key_id")
changed = False
result = {}
try:
bucket_is_present = bucket_exists(s3_client, name)
except EndpointConnectionError as e:
module.fail_json_aws(e, msg="Invalid endpoint provided: %s" % to_text(e))
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed to check bucket presence")
if not bucket_is_present:
try:
bucket_changed = create_bucket(s3_client, name, location)
s3_client.get_waiter('bucket_exists').wait(Bucket=name)
changed = changed or bucket_changed
except WaiterError as e:
module.fail_json_aws(e, msg='An error occurred waiting for the bucket to become available')
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed while creating bucket")
# Versioning
try:
versioning_status = get_bucket_versioning(s3_client, name)
except BotoCoreError as exp:
module.fail_json_aws(exp, msg="Failed to get bucket versioning")
except ClientError as exp:
if exp.response['Error']['Code'] != 'NotImplemented' or versioning is not None:
module.fail_json_aws(exp, msg="Failed to get bucket versioning")
else:
if versioning is not None:
required_versioning = None
if versioning and versioning_status.get('Status') != "Enabled":
required_versioning = 'Enabled'
elif not versioning and versioning_status.get('Status') == "Enabled":
required_versioning = 'Suspended'
if required_versioning:
try:
put_bucket_versioning(s3_client, name, required_versioning)
changed = True
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed to update bucket versioning")
versioning_status = wait_versioning_is_applied(module, s3_client, name, required_versioning)
# This output format is there to ensure compatibility with previous versions of the module
result['versioning'] = {
'Versioning': versioning_status.get('Status', 'Disabled'),
'MfaDelete': versioning_status.get('MFADelete', 'Disabled'),
}
# Requester pays
try:
requester_pays_status = get_bucket_request_payment(s3_client, name)
except BotoCoreError as exp:
module.fail_json_aws(exp, msg="Failed to get bucket request payment")
except ClientError as exp:
if exp.response['Error']['Code'] != 'NotImplemented' or requester_pays:
module.fail_json_aws(exp, msg="Failed to get bucket request payment")
else:
if requester_pays:
payer = 'Requester' if requester_pays else 'BucketOwner'
if requester_pays_status != payer:
put_bucket_request_payment(s3_client, name, payer)
requester_pays_status = wait_payer_is_applied(module, s3_client, name, payer, should_fail=False)
if requester_pays_status is None:
# We have seen that it happens quite a lot of times that the put request was not taken into
# account, so we retry one more time
put_bucket_request_payment(s3_client, name, payer)
requester_pays_status = wait_payer_is_applied(module, s3_client, name, payer, should_fail=True)
changed = True
result['requester_pays'] = requester_pays
# Policy
try:
current_policy = get_bucket_policy(s3_client, name)
except BotoCoreError as exp:
module.fail_json_aws(exp, msg="Failed to get bucket policy")
except ClientError as exp:
if exp.response['Error']['Code'] != 'NotImplemented' or policy is not None:
module.fail_json_aws(exp, msg="Failed to get bucket policy")
else:
if policy is not None:
if isinstance(policy, string_types):
policy = json.loads(policy)
if not policy and current_policy:
try:
delete_bucket_policy(s3_client, name)
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed to delete bucket policy")
current_policy = wait_policy_is_applied(module, s3_client, name, policy)
changed = True
elif compare_policies(current_policy, policy):
try:
put_bucket_policy(s3_client, name, policy)
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed to update bucket policy")
current_policy = wait_policy_is_applied(module, s3_client, name, policy, should_fail=False)
if current_policy is None:
# As for request payment, it happens quite a lot of times that the put request was not taken into
# account, so we retry one more time
put_bucket_policy(s3_client, name, policy)
current_policy = wait_policy_is_applied(module, s3_client, name, policy, should_fail=True)
changed = True
result['policy'] = current_policy
# Tags
try:
current_tags_dict = get_current_bucket_tags_dict(s3_client, name)
except BotoCoreError as exp:
module.fail_json_aws(exp, msg="Failed to get bucket tags")
except ClientError as exp:
if exp.response['Error']['Code'] != 'NotImplemented' or tags is not None:
module.fail_json_aws(exp, msg="Failed to get bucket tags")
else:
if tags is not None:
# Tags are always returned as text
tags = dict((to_text(k), to_text(v)) for k, v in tags.items())
if not purge_tags:
# Ensure existing tags that aren't updated by desired tags remain
current_copy = current_tags_dict.copy()
current_copy.update(tags)
tags = current_copy
if current_tags_dict != tags:
if tags:
try:
put_bucket_tagging(s3_client, name, tags)
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed to update bucket tags")
else:
if purge_tags:
try:
delete_bucket_tagging(s3_client, name)
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed to delete bucket tags")
current_tags_dict = wait_tags_are_applied(module, s3_client, name, tags)
changed = True
result['tags'] = current_tags_dict
# Encryption
if hasattr(s3_client, "get_bucket_encryption"):
try:
current_encryption = get_bucket_encryption(s3_client, name)
except (ClientError, BotoCoreError) as e:
module.fail_json_aws(e, msg="Failed to get bucket encryption")
elif encryption is not None:
module.fail_json(msg="Using bucket encryption requires botocore version >= 1.7.41")
if encryption is not None:
current_encryption_algorithm = current_encryption.get('SSEAlgorithm') if current_encryption else None
current_encryption_key = current_encryption.get('KMSMasterKeyID') if current_encryption else None
if encryption == 'none' and current_encryption_algorithm is not None:
try:
delete_bucket_encryption(s3_client, name)
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed to delete bucket encryption")
current_encryption = wait_encryption_is_applied(module, s3_client, name, None)
changed = True
elif encryption != 'none' and (encryption != current_encryption_algorithm) or (encryption == 'aws:kms' and current_encryption_key != encryption_key_id):
expected_encryption = {'SSEAlgorithm': encryption}
if encryption == 'aws:kms' and encryption_key_id is not None:
expected_encryption.update({'KMSMasterKeyID': encryption_key_id})
try:
put_bucket_encryption(s3_client, name, expected_encryption)
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed to set bucket encryption")
current_encryption = wait_encryption_is_applied(module, s3_client, name, expected_encryption)
changed = True
result['encryption'] = current_encryption
module.exit_json(changed=changed, name=name, **result)
def bucket_exists(s3_client, bucket_name):
# head_bucket appeared to be really inconsistent, so we use list_buckets instead,
# and loop over all the buckets, even if we know it's less performant :(
all_buckets = s3_client.list_buckets(Bucket=bucket_name)['Buckets']
return any(bucket['Name'] == bucket_name for bucket in all_buckets)
@AWSRetry.exponential_backoff(max_delay=120)
def create_bucket(s3_client, bucket_name, location):
try:
configuration = {}
if location not in ('us-east-1', None):
configuration['LocationConstraint'] = location
if len(configuration) > 0:
s3_client.create_bucket(Bucket=bucket_name, CreateBucketConfiguration=configuration)
else:
s3_client.create_bucket(Bucket=bucket_name)
return True
except ClientError as e:
error_code = e.response['Error']['Code']
if error_code == 'BucketAlreadyOwnedByYou':
# We should never get here since we check the bucket presence before calling the create_or_update_bucket
# method. However, the AWS API sometimes fails to report bucket presence, so we catch this exception
return False
else:
raise e
@AWSRetry.exponential_backoff(max_delay=120, catch_extra_error_codes=['NoSuchBucket'])
def put_bucket_tagging(s3_client, bucket_name, tags):
s3_client.put_bucket_tagging(Bucket=bucket_name, Tagging={'TagSet': ansible_dict_to_boto3_tag_list(tags)})
@AWSRetry.exponential_backoff(max_delay=120, catch_extra_error_codes=['NoSuchBucket'])
def put_bucket_policy(s3_client, bucket_name, policy):
s3_client.put_bucket_policy(Bucket=bucket_name, Policy=json.dumps(policy))
@AWSRetry.exponential_backoff(max_delay=120, catch_extra_error_codes=['NoSuchBucket'])
def delete_bucket_policy(s3_client, bucket_name):
s3_client.delete_bucket_policy(Bucket=bucket_name)
@AWSRetry.exponential_backoff(max_delay=120, catch_extra_error_codes=['NoSuchBucket'])
def get_bucket_policy(s3_client, bucket_name):
try:
current_policy = json.loads(s3_client.get_bucket_policy(Bucket=bucket_name).get('Policy'))
except ClientError as e:
if e.response['Error']['Code'] == 'NoSuchBucketPolicy':
current_policy = None
else:
raise e
return current_policy
@AWSRetry.exponential_backoff(max_delay=120, catch_extra_error_codes=['NoSuchBucket'])
def put_bucket_request_payment(s3_client, bucket_name, payer):
s3_client.put_bucket_request_payment(Bucket=bucket_name, RequestPaymentConfiguration={'Payer': payer})
@AWSRetry.exponential_backoff(max_delay=120, catch_extra_error_codes=['NoSuchBucket'])
def get_bucket_request_payment(s3_client, bucket_name):
return s3_client.get_bucket_request_payment(Bucket=bucket_name).get('Payer')
@AWSRetry.exponential_backoff(max_delay=120, catch_extra_error_codes=['NoSuchBucket'])
def get_bucket_versioning(s3_client, bucket_name):
return s3_client.get_bucket_versioning(Bucket=bucket_name)
@AWSRetry.exponential_backoff(max_delay=120, catch_extra_error_codes=['NoSuchBucket'])
def put_bucket_versioning(s3_client, bucket_name, required_versioning):
s3_client.put_bucket_versioning(Bucket=bucket_name, VersioningConfiguration={'Status': required_versioning})
@AWSRetry.exponential_backoff(max_delay=120, catch_extra_error_codes=['NoSuchBucket'])
def get_bucket_encryption(s3_client, bucket_name):
try:
result = s3_client.get_bucket_encryption(Bucket=bucket_name)
return result.get('ServerSideEncryptionConfiguration', {}).get('Rules', [])[0].get('ApplyServerSideEncryptionByDefault')
except ClientError as e:
if e.response['Error']['Code'] == 'ServerSideEncryptionConfigurationNotFoundError':
return None
else:
raise e
except (IndexError, KeyError):
return None
@AWSRetry.exponential_backoff(max_delay=120, catch_extra_error_codes=['NoSuchBucket'])
def put_bucket_encryption(s3_client, bucket_name, encryption):
server_side_encryption_configuration = {'Rules': [{'ApplyServerSideEncryptionByDefault': encryption}]}
s3_client.put_bucket_encryption(Bucket=bucket_name, ServerSideEncryptionConfiguration=server_side_encryption_configuration)
@AWSRetry.exponential_backoff(max_delay=120, catch_extra_error_codes=['NoSuchBucket'])
def delete_bucket_tagging(s3_client, bucket_name):
s3_client.delete_bucket_tagging(Bucket=bucket_name)
@AWSRetry.exponential_backoff(max_delay=120, catch_extra_error_codes=['NoSuchBucket'])
def delete_bucket_encryption(s3_client, bucket_name):
s3_client.delete_bucket_encryption(Bucket=bucket_name)
@AWSRetry.exponential_backoff(max_delay=120)
def delete_bucket(s3_client, bucket_name):
try:
s3_client.delete_bucket(Bucket=bucket_name)
except ClientError as e:
if e.response['Error']['Code'] == 'NoSuchBucket':
# This means the bucket should have been in a deleting state when we checked its existence
# We just ignore the error
pass
else:
raise e
def wait_policy_is_applied(module, s3_client, bucket_name, expected_policy, should_fail=True):
for dummy in range(0, 12):
try:
current_policy = get_bucket_policy(s3_client, bucket_name)
except (ClientError, BotoCoreError) as e:
module.fail_json_aws(e, msg="Failed to get bucket policy")
if compare_policies(current_policy, expected_policy):
time.sleep(5)
else:
return current_policy
if should_fail:
module.fail_json(msg="Bucket policy failed to apply in the expected time")
else:
return None
def wait_payer_is_applied(module, s3_client, bucket_name, expected_payer, should_fail=True):
for dummy in range(0, 12):
try:
requester_pays_status = get_bucket_request_payment(s3_client, bucket_name)
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed to get bucket request payment")
if requester_pays_status != expected_payer:
time.sleep(5)
else:
return requester_pays_status
if should_fail:
module.fail_json(msg="Bucket request payment failed to apply in the expected time")
else:
return None
def wait_encryption_is_applied(module, s3_client, bucket_name, expected_encryption):
for dummy in range(0, 12):
try:
encryption = get_bucket_encryption(s3_client, bucket_name)
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed to get updated encryption for bucket")
if encryption != expected_encryption:
time.sleep(5)
else:
return encryption
module.fail_json(msg="Bucket encryption failed to apply in the expected time")
def wait_versioning_is_applied(module, s3_client, bucket_name, required_versioning):
for dummy in range(0, 24):
try:
versioning_status = get_bucket_versioning(s3_client, bucket_name)
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed to get updated versioning for bucket")
if versioning_status.get('Status') != required_versioning:
time.sleep(8)
else:
return versioning_status
module.fail_json(msg="Bucket versioning failed to apply in the expected time")
def wait_tags_are_applied(module, s3_client, bucket_name, expected_tags_dict):
for dummy in range(0, 12):
try:
current_tags_dict = get_current_bucket_tags_dict(s3_client, bucket_name)
except (ClientError, BotoCoreError) as e:
module.fail_json_aws(e, msg="Failed to get bucket policy")
if current_tags_dict != expected_tags_dict:
time.sleep(5)
else:
return current_tags_dict
module.fail_json(msg="Bucket tags failed to apply in the expected time")
def get_current_bucket_tags_dict(s3_client, bucket_name):
try:
current_tags = s3_client.get_bucket_tagging(Bucket=bucket_name).get('TagSet')
except ClientError as e:
if e.response['Error']['Code'] == 'NoSuchTagSet':
return {}
raise e
return boto3_tag_list_to_ansible_dict(current_tags)
def paginated_list(s3_client, **pagination_params):
pg = s3_client.get_paginator('list_objects_v2')
for page in pg.paginate(**pagination_params):
yield [data['Key'] for data in page.get('Contents', [])]
def paginated_versions_list(s3_client, **pagination_params):
try:
pg = s3_client.get_paginator('list_object_versions')
for page in pg.paginate(**pagination_params):
# We have to merge the Versions and DeleteMarker lists here, as DeleteMarkers can still prevent a bucket deletion
yield [(data['Key'], data['VersionId']) for data in (page.get('Versions', []) + page.get('DeleteMarkers', []))]
except is_boto3_error_code('NoSuchBucket'):
yield []
def destroy_bucket(s3_client, module):
force = module.params.get("force")
name = module.params.get("name")
try:
bucket_is_present = bucket_exists(s3_client, name)
except EndpointConnectionError as e:
module.fail_json_aws(e, msg="Invalid endpoint provided: %s" % to_text(e))
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed to check bucket presence")
if not bucket_is_present:
module.exit_json(changed=False)
if force:
# if there are contents then we need to delete them (including versions) before we can delete the bucket
try:
for key_version_pairs in paginated_versions_list(s3_client, Bucket=name):
formatted_keys = [{'Key': key, 'VersionId': version} for key, version in key_version_pairs]
for fk in formatted_keys:
# remove VersionId from cases where they are `None` so that
# unversioned objects are deleted using `DeleteObject`
# rather than `DeleteObjectVersion`, improving backwards
# compatibility with older IAM policies.
if not fk.get('VersionId'):
fk.pop('VersionId')
if formatted_keys:
resp = s3_client.delete_objects(Bucket=name, Delete={'Objects': formatted_keys})
if resp.get('Errors'):
module.fail_json(
msg='Could not empty bucket before deleting. Could not delete objects: {0}'.format(
', '.join([k['Key'] for k in resp['Errors']])
),
errors=resp['Errors'], response=resp
)
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed while deleting bucket")
try:
delete_bucket(s3_client, name)
s3_client.get_waiter('bucket_not_exists').wait(Bucket=name, WaiterConfig=dict(Delay=5, MaxAttempts=60))
except WaiterError as e:
module.fail_json_aws(e, msg='An error occurred waiting for the bucket to be deleted.')
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed to delete bucket")
module.exit_json(changed=True)
def is_fakes3(s3_url):
""" Return True if s3_url has scheme fakes3:// """
if s3_url is not None:
return urlparse(s3_url).scheme in ('fakes3', 'fakes3s')
else:
return False
def get_s3_client(module, aws_connect_kwargs, location, ceph, s3_url):
if s3_url and ceph: # TODO - test this
ceph = urlparse(s3_url)
params = dict(module=module, conn_type='client', resource='s3', use_ssl=ceph.scheme == 'https', region=location, endpoint=s3_url, **aws_connect_kwargs)
elif is_fakes3(s3_url):
fakes3 = urlparse(s3_url)
port = fakes3.port
if fakes3.scheme == 'fakes3s':
protocol = "https"
if port is None:
port = 443
else:
protocol = "http"
if port is None:
port = 80
params = dict(module=module, conn_type='client', resource='s3', region=location,
endpoint="%s://%s:%s" % (protocol, fakes3.hostname, to_text(port)),
use_ssl=fakes3.scheme == 'fakes3s', **aws_connect_kwargs)
else:
params = dict(module=module, conn_type='client', resource='s3', region=location, endpoint=s3_url, **aws_connect_kwargs)
return boto3_conn(**params)
def main():
argument_spec = ec2_argument_spec()
argument_spec.update(
dict(
force=dict(default=False, type='bool'),
policy=dict(type='json'),
name=dict(required=True),
requester_pays=dict(default=False, type='bool'),
s3_url=dict(aliases=['S3_URL']),
state=dict(default='present', choices=['present', 'absent']),
tags=dict(type='dict'),
purge_tags=dict(type='bool', default=True),
versioning=dict(type='bool'),
ceph=dict(default=False, type='bool'),
encryption=dict(choices=['none', 'AES256', 'aws:kms']),
encryption_key_id=dict()
)
)
module = AnsibleAWSModule(
argument_spec=argument_spec,
)
region, ec2_url, aws_connect_kwargs = get_aws_connection_info(module, boto3=True)
if region in ('us-east-1', '', None):
# default to US Standard region
location = 'us-east-1'
else:
# Boto uses symbolic names for locations but region strings will
# actually work fine for everything except us-east-1 (US Standard)
location = region
s3_url = module.params.get('s3_url')
ceph = module.params.get('ceph')
# allow eucarc environment variables to be used if ansible vars aren't set
if not s3_url and 'S3_URL' in os.environ:
s3_url = os.environ['S3_URL']
if ceph and not s3_url:
module.fail_json(msg='ceph flavour requires s3_url')
# Look at s3_url and tweak connection settings
# if connecting to Ceph RGW, Walrus or fakes3
if s3_url:
for key in ['validate_certs', 'security_token', 'profile_name']:
aws_connect_kwargs.pop(key, None)
s3_client = get_s3_client(module, aws_connect_kwargs, location, ceph, s3_url)
if s3_client is None: # this should never happen
module.fail_json(msg='Unknown error, failed to create s3 connection, no information from boto.')
state = module.params.get("state")
encryption = module.params.get("encryption")
encryption_key_id = module.params.get("encryption_key_id")
# Parameter validation
if encryption_key_id is not None and encryption is None:
module.fail_json(msg="You must specify encryption parameter along with encryption_key_id.")
elif encryption_key_id is not None and encryption != 'aws:kms':
module.fail_json(msg="Only 'aws:kms' is a valid option for encryption parameter when you specify encryption_key_id.")
if state == 'present':
create_or_update_bucket(s3_client, module, location)
elif state == 'absent':
destroy_bucket(s3_client, module)
if __name__ == '__main__':
main()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,706 |
Cannot put space after comma in inventory list
|
##### SUMMARY
When configuring multiple inventory sources in `ansible.cfg`, you cannot add a space after the comma.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
config
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.10.0.dev0
config file = /home/alwyn/tmp/inventory-test/ansible.cfg
configured module search path = ['/home/alwyn/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/alwyn/.local/opt/pyenv/versions/3.8.1/envs/inventory-test/lib/python3.8/site-packages/ansible
executable location = /home/alwyn/.local/opt/pyenv/versions/3.8.1/envs/inventory-test/bin/ansible
python version = 3.8.1 (default, Feb 21 2020, 15:48:23) [GCC 9.2.1 20200130]
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
DEFAULT_HOST_LIST(/home/alwyn/tmp/inventory-test/ansible.cfg) = ['/home/alwyn/tmp/inventory-test/hosts', '/home/alwyn/tmp/inventory-test/ hosts2']
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
Any/all (Linux?) platforms. Not sure about Windows.
I'm on Archlinux with kernel 5.5.4-zen1-1-zen.
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
Configure `default.inventory` in `ansible.cfg` like so:
```
[defaults]
inventory = hosts-foo, hosts-bar
```
1. Create two inventory files
- `hosts-foo` with contents
```
[foo]
localhost
```
- `hosts-bar` with contents
```
[bar]
localhost
```
1. `ansible bar -m ping`
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
The `bar` hostgroup is found and the ping module runs against localhost.
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
The `bar` hostgroup is not found as it tries to find it in `$directory/ hosts-bar` instead of `$directory/hosts-bar`.
<!--- Paste verbatim command output between quotes -->
```paste below
[WARNING]: You are running the development version of Ansible. You should only run Ansible from "devel" if you are modifying the Ansible engine, or trying out features under development.
This is a rapidly changing source of code and can become unstable at any point.
[WARNING]: Unable to parse /home/alwyn/tmp/inventory-test/ hosts-bar as an inventory source
[WARNING]: Could not match supplied host pattern, ignoring: bar
[WARNING]: No hosts matched, nothing to do
```
##### FURTHER DETAILS
This should get fixed by #67701.
|
https://github.com/ansible/ansible/issues/67706
|
https://github.com/ansible/ansible/pull/67701
|
726d6455d813e40f37e1298f0dce8a2f76709de4
|
9ea5bb336400e20178cc6fb3816aef30dbdc8b60
| 2020-02-24T15:43:08Z |
python
| 2020-02-25T14:16:27Z |
changelogs/fragments/pathlist_strip.yml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,706 |
Cannot put space after comma in inventory list
|
##### SUMMARY
When configuring multiple inventory sources in `ansible.cfg`, you cannot add a space after the comma.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
config
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.10.0.dev0
config file = /home/alwyn/tmp/inventory-test/ansible.cfg
configured module search path = ['/home/alwyn/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/alwyn/.local/opt/pyenv/versions/3.8.1/envs/inventory-test/lib/python3.8/site-packages/ansible
executable location = /home/alwyn/.local/opt/pyenv/versions/3.8.1/envs/inventory-test/bin/ansible
python version = 3.8.1 (default, Feb 21 2020, 15:48:23) [GCC 9.2.1 20200130]
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
DEFAULT_HOST_LIST(/home/alwyn/tmp/inventory-test/ansible.cfg) = ['/home/alwyn/tmp/inventory-test/hosts', '/home/alwyn/tmp/inventory-test/ hosts2']
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
Any/all (Linux?) platforms. Not sure about Windows.
I'm on Archlinux with kernel 5.5.4-zen1-1-zen.
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
Configure `default.inventory` in `ansible.cfg` like so:
```
[defaults]
inventory = hosts-foo, hosts-bar
```
1. Create two inventory files
- `hosts-foo` with contents
```
[foo]
localhost
```
- `hosts-bar` with contents
```
[bar]
localhost
```
1. `ansible bar -m ping`
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
The `bar` hostgroup is found and the ping module runs against localhost.
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
The `bar` hostgroup is not found as it tries to find it in `$directory/ hosts-bar` instead of `$directory/hosts-bar`.
<!--- Paste verbatim command output between quotes -->
```paste below
[WARNING]: You are running the development version of Ansible. You should only run Ansible from "devel" if you are modifying the Ansible engine, or trying out features under development.
This is a rapidly changing source of code and can become unstable at any point.
[WARNING]: Unable to parse /home/alwyn/tmp/inventory-test/ hosts-bar as an inventory source
[WARNING]: Could not match supplied host pattern, ignoring: bar
[WARNING]: No hosts matched, nothing to do
```
##### FURTHER DETAILS
This should get fixed by #67701.
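For reference, the core of such a fix is just stripping the whitespace around each comma-separated entry before the paths are resolved. A minimal, standalone sketch (not the actual patch; the real change lives in the `pathlist` handling of `ensure_type()` in `lib/ansible/config/manager.py`):
```python
# Standalone illustration of whitespace-tolerant pathlist splitting.
def split_pathlist(value):
    """Split a comma separated path list, ignoring spaces around entries."""
    return [x.strip() for x in value.split(',')]


print(split_pathlist("hosts-foo, hosts-bar"))
# ['hosts-foo', 'hosts-bar'] -- both entries now resolve to real files
```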
|
https://github.com/ansible/ansible/issues/67706
|
https://github.com/ansible/ansible/pull/67701
|
726d6455d813e40f37e1298f0dce8a2f76709de4
|
9ea5bb336400e20178cc6fb3816aef30dbdc8b60
| 2020-02-24T15:43:08Z |
python
| 2020-02-25T14:16:27Z |
lib/ansible/config/manager.py
|
# Copyright: (c) 2017, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import atexit
import io
import os
import os.path
import sys
import stat
import tempfile
import traceback
from collections import namedtuple
from yaml import load as yaml_load
try:
# use C version if possible for speedup
from yaml import CSafeLoader as SafeLoader
except ImportError:
from yaml import SafeLoader
from ansible.config.data import ConfigData
from ansible.errors import AnsibleOptionsError, AnsibleError
from ansible.module_utils._text import to_text, to_bytes, to_native
from ansible.module_utils.common._collections_compat import Sequence
from ansible.module_utils.six import PY3, string_types
from ansible.module_utils.six.moves import configparser
from ansible.module_utils.parsing.convert_bool import boolean
from ansible.parsing.quoting import unquote
from ansible.parsing.yaml.objects import AnsibleVaultEncryptedUnicode
from ansible.utils import py3compat
from ansible.utils.path import cleanup_tmp_file, makedirs_safe, unfrackpath
Plugin = namedtuple('Plugin', 'name type')
Setting = namedtuple('Setting', 'name value origin type')
INTERNAL_DEFS = {'lookup': ('_terms',)}
def _get_entry(plugin_type, plugin_name, config):
''' construct entry for requested config '''
entry = ''
if plugin_type:
entry += 'plugin_type: %s ' % plugin_type
if plugin_name:
entry += 'plugin: %s ' % plugin_name
entry += 'setting: %s ' % config
return entry
# FIXME: see if we can unify in module_utils with similar function used by argspec
def ensure_type(value, value_type, origin=None):
''' return a configuration variable with casting
:arg value: The value to ensure correct typing of
:kwarg value_type: The type of the value. This can be any of the following strings:
:boolean: sets the value to a True or False value
:bool: Same as 'boolean'
:integer: Sets the value to an integer or raises a ValueType error
:int: Same as 'integer'
:float: Sets the value to a float or raises a ValueType error
:list: Treats the value as a comma separated list. Split the value
and return it as a python list.
:none: Sets the value to None
:path: Expands any environment variables and tildes in the value.
:tmppath: Create a unique temporary directory inside of the directory
specified by value and return its path.
:temppath: Same as 'tmppath'
:tmp: Same as 'tmppath'
:pathlist: Treat the value as a comma separated list of paths. Split the
value on commas and then expand each part for environment variables
and tildes.
:pathspec: Treat the value as a PATH string. Expands any environment variables
and tildes in the value.
:str: Sets the value to string types.
:string: Same as 'str'
'''
errmsg = ''
basedir = None
if origin and os.path.isabs(origin) and os.path.exists(to_bytes(origin)):
basedir = origin
if value_type:
value_type = value_type.lower()
if value is not None:
if value_type in ('boolean', 'bool'):
value = boolean(value, strict=False)
elif value_type in ('integer', 'int'):
value = int(value)
elif value_type == 'float':
value = float(value)
elif value_type == 'list':
if isinstance(value, string_types):
value = [x.strip() for x in value.split(',')]
elif not isinstance(value, Sequence):
errmsg = 'list'
elif value_type == 'none':
if value == "None":
value = None
if value is not None:
errmsg = 'None'
elif value_type == 'path':
if isinstance(value, string_types):
value = resolve_path(value, basedir=basedir)
else:
errmsg = 'path'
elif value_type in ('tmp', 'temppath', 'tmppath'):
if isinstance(value, string_types):
value = resolve_path(value, basedir=basedir)
if not os.path.exists(value):
makedirs_safe(value, 0o700)
prefix = 'ansible-local-%s' % os.getpid()
value = tempfile.mkdtemp(prefix=prefix, dir=value)
atexit.register(cleanup_tmp_file, value, warn=True)
else:
errmsg = 'temppath'
elif value_type == 'pathspec':
if isinstance(value, string_types):
value = value.split(os.pathsep)
if isinstance(value, Sequence):
value = [resolve_path(x, basedir=basedir) for x in value]
else:
errmsg = 'pathspec'
elif value_type == 'pathlist':
if isinstance(value, string_types):
value = value.split(',')
if isinstance(value, Sequence):
value = [resolve_path(x, basedir=basedir) for x in value]
else:
errmsg = 'pathlist'
elif value_type in ('str', 'string'):
if isinstance(value, string_types):
value = unquote(to_text(value, errors='surrogate_or_strict'))
else:
errmsg = 'string'
# defaults to string type
elif isinstance(value, string_types):
value = unquote(to_text(value, errors='surrogate_or_strict'))
if errmsg:
raise ValueError('Invalid type provided for "%s": %s' % (errmsg, to_native(value)))
return to_text(value, errors='surrogate_or_strict', nonstring='passthru')
# FIXME: see if this can live in utils/path
def resolve_path(path, basedir=None):
''' resolve relative or 'variable' paths '''
if '{{CWD}}' in path: # allow users to force CWD using 'magic' {{CWD}}
path = path.replace('{{CWD}}', os.getcwd())
return unfrackpath(path, follow=False, basedir=basedir)
# FIXME: generic file type?
def get_config_type(cfile):
ftype = None
if cfile is not None:
ext = os.path.splitext(cfile)[-1]
if ext in ('.ini', '.cfg'):
ftype = 'ini'
elif ext in ('.yaml', '.yml'):
ftype = 'yaml'
else:
raise AnsibleOptionsError("Unsupported configuration file extension for %s: %s" % (cfile, to_native(ext)))
return ftype
# FIXME: can move to module_utils for use for ini plugins also?
def get_ini_config_value(p, entry):
''' returns the value of last ini entry found '''
value = None
if p is not None:
try:
value = p.get(entry.get('section', 'defaults'), entry.get('key', ''), raw=True)
except Exception: # FIXME: actually report issues here
pass
return value
def find_ini_config_file(warnings=None):
''' Load INI Config File in order (first found is used): ENV, CWD, HOME, /etc/ansible '''
# FIXME: eventually deprecate ini configs
if warnings is None:
# Note: In this case, warnings does nothing
warnings = set()
# A value that can never be a valid path so that we can tell if ANSIBLE_CONFIG was set later
# We can't use None because we could set path to None.
SENTINEL = object
potential_paths = []
# Environment setting
path_from_env = os.getenv("ANSIBLE_CONFIG", SENTINEL)
if path_from_env is not SENTINEL:
path_from_env = unfrackpath(path_from_env, follow=False)
if os.path.isdir(to_bytes(path_from_env)):
path_from_env = os.path.join(path_from_env, "ansible.cfg")
potential_paths.append(path_from_env)
# Current working directory
warn_cmd_public = False
try:
cwd = os.getcwd()
perms = os.stat(cwd)
cwd_cfg = os.path.join(cwd, "ansible.cfg")
if perms.st_mode & stat.S_IWOTH:
# Working directory is world writable so we'll skip it.
# Still have to look for a file here, though, so that we know if we have to warn
if os.path.exists(cwd_cfg):
warn_cmd_public = True
else:
potential_paths.append(cwd_cfg)
except OSError:
# If we can't access cwd, we'll simply skip it as a possible config source
pass
# Per user location
potential_paths.append(unfrackpath("~/.ansible.cfg", follow=False))
# System location
potential_paths.append("/etc/ansible/ansible.cfg")
for path in potential_paths:
b_path = to_bytes(path)
if os.path.exists(b_path) and os.access(b_path, os.R_OK):
break
else:
path = None
# Emit a warning if all the following are true:
# * We did not use a config from ANSIBLE_CONFIG
# * There's an ansible.cfg in the current working directory that we skipped
if path_from_env != path and warn_cmd_public:
warnings.add(u"Ansible is being run in a world writable directory (%s),"
u" ignoring it as an ansible.cfg source."
u" For more information see"
u" https://docs.ansible.com/ansible/devel/reference_appendices/config.html#cfg-in-world-writable-dir"
% to_text(cwd))
return path
class ConfigManager(object):
DEPRECATED = []
WARNINGS = set()
def __init__(self, conf_file=None, defs_file=None):
self._base_defs = {}
self._plugins = {}
self._parsers = {}
self._config_file = conf_file
self.data = ConfigData()
self._base_defs = self._read_config_yaml_file(defs_file or ('%s/base.yml' % os.path.dirname(__file__)))
if self._config_file is None:
# set config using ini
self._config_file = find_ini_config_file(self.WARNINGS)
# consume configuration
if self._config_file:
# initialize parser and read config
self._parse_config_file()
# update constants
self.update_config_data()
try:
self.update_module_defaults_groups()
except Exception as e:
# Since this is a 2.7 preview feature, we want to have it fail as gracefully as possible when there are issues.
sys.stderr.write('Could not load module_defaults_groups: %s: %s\n\n' % (type(e).__name__, e))
self.module_defaults_groups = {}
def _read_config_yaml_file(self, yml_file):
# TODO: handle relative paths as relative to the directory containing the current playbook instead of CWD
# Currently this is only used with absolute paths to the `ansible/config` directory
yml_file = to_bytes(yml_file)
if os.path.exists(yml_file):
with open(yml_file, 'rb') as config_def:
return yaml_load(config_def, Loader=SafeLoader) or {}
raise AnsibleError(
"Missing base YAML definition file (bad install?): %s" % to_native(yml_file))
def _parse_config_file(self, cfile=None):
''' return flat configuration settings from file(s) '''
# TODO: take list of files with merge/nomerge
if cfile is None:
cfile = self._config_file
ftype = get_config_type(cfile)
if cfile is not None:
if ftype == 'ini':
self._parsers[cfile] = configparser.ConfigParser()
with open(to_bytes(cfile), 'rb') as f:
try:
cfg_text = to_text(f.read(), errors='surrogate_or_strict')
except UnicodeError as e:
raise AnsibleOptionsError("Error reading config file(%s) because the config file was not utf8 encoded: %s" % (cfile, to_native(e)))
try:
if PY3:
self._parsers[cfile].read_string(cfg_text)
else:
cfg_file = io.StringIO(cfg_text)
self._parsers[cfile].readfp(cfg_file)
except configparser.Error as e:
raise AnsibleOptionsError("Error reading config file (%s): %s" % (cfile, to_native(e)))
# FIXME: this should eventually handle yaml config files
# elif ftype == 'yaml':
# with open(cfile, 'rb') as config_stream:
# self._parsers[cfile] = yaml.safe_load(config_stream)
else:
raise AnsibleOptionsError("Unsupported configuration file type: %s" % to_native(ftype))
def _find_yaml_config_files(self):
''' Load YAML Config Files in order, check merge flags, keep origin of settings'''
pass
def get_plugin_options(self, plugin_type, name, keys=None, variables=None, direct=None):
options = {}
defs = self.get_configuration_definitions(plugin_type, name)
for option in defs:
options[option] = self.get_config_value(option, plugin_type=plugin_type, plugin_name=name, keys=keys, variables=variables, direct=direct)
return options
def get_plugin_vars(self, plugin_type, name):
pvars = []
for pdef in self.get_configuration_definitions(plugin_type, name).values():
if 'vars' in pdef and pdef['vars']:
for var_entry in pdef['vars']:
pvars.append(var_entry['name'])
return pvars
def get_configuration_definition(self, name, plugin_type=None, plugin_name=None):
ret = {}
if plugin_type is None:
ret = self._base_defs.get(name, None)
elif plugin_name is None:
ret = self._plugins.get(plugin_type, {}).get(name, None)
else:
ret = self._plugins.get(plugin_type, {}).get(plugin_name, {}).get(name, None)
return ret
def get_configuration_definitions(self, plugin_type=None, name=None):
''' just list the possible settings, either base, for a whole plugin type or for a specific plugin '''
ret = {}
if plugin_type is None:
ret = self._base_defs
elif name is None:
ret = self._plugins.get(plugin_type, {})
else:
ret = self._plugins.get(plugin_type, {}).get(name, {})
return ret
def _loop_entries(self, container, entry_list):
''' repeat code for value entry assignment '''
value = None
origin = None
for entry in entry_list:
name = entry.get('name')
try:
temp_value = container.get(name, None)
except UnicodeEncodeError:
self.WARNINGS.add(u'value for config entry {0} contains invalid characters, ignoring...'.format(to_text(name)))
continue
if temp_value is not None: # only set if entry is defined in container
# inline vault variables should be converted to a text string
if isinstance(temp_value, AnsibleVaultEncryptedUnicode):
temp_value = to_text(temp_value, errors='surrogate_or_strict')
value = temp_value
origin = name
# deal with deprecation of setting source, if used
if 'deprecated' in entry:
self.DEPRECATED.append((entry['name'], entry['deprecated']))
return value, origin
def get_config_value(self, config, cfile=None, plugin_type=None, plugin_name=None, keys=None, variables=None, direct=None):
''' wrapper '''
try:
value, _drop = self.get_config_value_and_origin(config, cfile=cfile, plugin_type=plugin_type, plugin_name=plugin_name,
keys=keys, variables=variables, direct=direct)
except AnsibleError:
raise
except Exception as e:
raise AnsibleError("Unhandled exception when retrieving %s:\n%s" % (config, to_native(e)), orig_exc=e)
return value
def get_config_value_and_origin(self, config, cfile=None, plugin_type=None, plugin_name=None, keys=None, variables=None, direct=None):
''' Given a config key figure out the actual value and report on the origin of the settings '''
if cfile is None:
# use default config
cfile = self._config_file
# Note: sources that are lists listed in low to high precedence (last one wins)
value = None
origin = None
defs = self.get_configuration_definitions(plugin_type, plugin_name)
if config in defs:
# direct setting via plugin arguments, can set to None so we bypass rest of processing/defaults
direct_aliases = []
if direct:
direct_aliases = [direct[alias] for alias in defs[config].get('aliases', []) if alias in direct]
if direct and config in direct:
value = direct[config]
origin = 'Direct'
elif direct and direct_aliases:
value = direct_aliases[0]
origin = 'Direct'
else:
# Use 'variable overrides' if present, highest precedence, but only present when querying running play
if variables and defs[config].get('vars'):
value, origin = self._loop_entries(variables, defs[config]['vars'])
origin = 'var: %s' % origin
# use playbook keywords if you have em
if value is None and keys and config in keys:
value, origin = keys[config], 'keyword'
origin = 'keyword: %s' % origin
# env vars are next precedence
if value is None and defs[config].get('env'):
value, origin = self._loop_entries(py3compat.environ, defs[config]['env'])
origin = 'env: %s' % origin
# try config file entries next, if we have one
if self._parsers.get(cfile, None) is None:
self._parse_config_file(cfile)
if value is None and cfile is not None:
ftype = get_config_type(cfile)
if ftype and defs[config].get(ftype):
if ftype == 'ini':
# load from ini config
try: # FIXME: generalize _loop_entries to allow for files also, most of this code is dupe
for ini_entry in defs[config]['ini']:
temp_value = get_ini_config_value(self._parsers[cfile], ini_entry)
if temp_value is not None:
value = temp_value
origin = cfile
if 'deprecated' in ini_entry:
self.DEPRECATED.append(('[%s]%s' % (ini_entry['section'], ini_entry['key']), ini_entry['deprecated']))
except Exception as e:
sys.stderr.write("Error while loading ini config %s: %s" % (cfile, to_native(e)))
elif ftype == 'yaml':
# FIXME: implement, also , break down key from defs (. notation???)
origin = cfile
# set default if we got here w/o a value
if value is None:
if defs[config].get('required', False):
if not plugin_type or config not in INTERNAL_DEFS.get(plugin_type, {}):
raise AnsibleError("No setting was provided for required configuration %s" %
to_native(_get_entry(plugin_type, plugin_name, config)))
else:
value = defs[config].get('default')
origin = 'default'
# skip typing as this is a templated default that will be resolved later in constants, which has needed vars
if plugin_type is None and isinstance(value, string_types) and (value.startswith('{{') and value.endswith('}}')):
return value, origin
# ensure correct type, can raise exceptions on mismatched types
try:
value = ensure_type(value, defs[config].get('type'), origin=origin)
except ValueError as e:
if origin.startswith('env:') and value == '':
# this is empty env var for non string so we can set to default
origin = 'default'
value = ensure_type(defs[config].get('default'), defs[config].get('type'), origin=origin)
else:
raise AnsibleOptionsError('Invalid type for configuration option %s: %s' %
(to_native(_get_entry(plugin_type, plugin_name, config)), to_native(e)))
# deal with deprecation of the setting
if 'deprecated' in defs[config] and origin != 'default':
self.DEPRECATED.append((config, defs[config].get('deprecated')))
else:
raise AnsibleError('Requested entry (%s) was not defined in configuration.' % to_native(_get_entry(plugin_type, plugin_name, config)))
return value, origin
def initialize_plugin_configuration_definitions(self, plugin_type, name, defs):
if plugin_type not in self._plugins:
self._plugins[plugin_type] = {}
self._plugins[plugin_type][name] = defs
def update_module_defaults_groups(self):
defaults_config = self._read_config_yaml_file(
'%s/module_defaults.yml' % os.path.join(os.path.dirname(__file__))
)
if defaults_config.get('version') not in ('1', '1.0', 1, 1.0):
raise AnsibleError('module_defaults.yml has an invalid version "%s" for configuration. Could be a bad install.' % defaults_config.get('version'))
self.module_defaults_groups = defaults_config.get('groupings', {})
def update_config_data(self, defs=None, configfile=None):
''' really: update constants '''
if defs is None:
defs = self._base_defs
if configfile is None:
configfile = self._config_file
if not isinstance(defs, dict):
raise AnsibleOptionsError("Invalid configuration definition type: %s for %s" % (type(defs), defs))
# update the constant for config file
self.data.update_setting(Setting('CONFIG_FILE', configfile, '', 'string'))
origin = None
# env and config defs can have several entries, ordered in list from lowest to highest precedence
for config in defs:
if not isinstance(defs[config], dict):
raise AnsibleOptionsError("Invalid configuration definition '%s': type is %s" % (to_native(config), type(defs[config])))
# get value and origin
try:
value, origin = self.get_config_value_and_origin(config, configfile)
except Exception as e:
# Printing the problem here because, in the current code:
# (1) we can't reach the error handler for AnsibleError before we
# hit a different error due to lack of working config.
# (2) We don't have access to display yet because display depends on config
# being properly loaded.
#
# If we start getting double errors printed from this section of code, then the
# above problem #1 has been fixed. Revamp this to be more like the try: except
# in get_config_value() at that time.
sys.stderr.write("Unhandled error:\n %s\n\n" % traceback.format_exc())
raise AnsibleError("Invalid settings supplied for %s: %s\n" % (config, to_native(e)), orig_exc=e)
# set the constant
self.data.update_setting(Setting(config, value, origin, defs[config].get('type', 'string')))
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,672 |
win_dns_record: Add documentation for multi-valued entries
|
##### SUMMARY
#67628 exposed an area in which `win_dns_record` could be improved, as far as _documenting_ how to add multiple values to a given `A` (or other) record type.
Although the `value` attribute is clearly documented as taking a _list_, **none of the examples actually show that** --- leading to the mistaken impression in the case of #67628 that there was no existing way to do that. A few quick examples would go a long way.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
`win_dns_record`
##### ANSIBLE VERSION
Affects all versions that have this module (ie 2.8 and up)
##### CONFIGURATION
n/a
##### OS / ENVIRONMENT
Windows 2012+ targets only
##### STEPS TO REPRODUCE
n/a
##### Musings
Should give examples for both `A` and `MX` records, since those are (probably) the most common multi-valued record types.
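For reference, a documentation example along these lines (host names and addresses below are made up) would make the list form obvious. An `MX` example would additionally depend on the module accepting `MX` as a `type`, which the current choices (`A`, `AAAA`, `CNAME`, `PTR`) do not include:
```yaml
# Hypothetical doc example: a round-robin A record created by passing
# several addresses as a list via the documented `values` alias.
- name: Create a multi-valued A record
  win_dns_record:
    name: "web"
    type: "A"
    values:
      - 192.0.2.10
      - 192.0.2.11
      - 192.0.2.12
    zone: "example.com"
```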
|
https://github.com/ansible/ansible/issues/67672
|
https://github.com/ansible/ansible/pull/67744
|
1b5c69deefe7fe5e14b37f93b8a2aab90e6663be
|
e9f22a22693d1cc2ca4da02490e9a3589cb9fe36
| 2020-02-22T23:09:13Z |
python
| 2020-02-25T21:03:03Z |
lib/ansible/modules/windows/win_dns_record.py
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright: (c) 2019, Hitachi ID Systems, Inc.
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
# This is a windows documentation stub. The actual code lives in the .ps1
# file of the same name.
ANSIBLE_METADATA = {'metadata_version': '1.1',
'status': ['preview'],
'supported_by': 'community'}
DOCUMENTATION = r'''
---
module: win_dns_record
version_added: "2.8"
short_description: Manage Windows Server DNS records
description:
- Manage DNS records within an existing Windows Server DNS zone.
author: John Nelson (@johnboy2)
requirements:
- This module requires Windows 8, Server 2012, or newer.
options:
name:
description:
- The name of the record.
required: yes
type: str
state:
description:
- Whether the record should exist or not.
choices: [ absent, present ]
default: present
type: str
ttl:
description:
- The "time to live" of the record, in seconds.
- Ignored when C(state=absent).
- Valid range is 1 - 31557600.
- Note that an Active Directory forest can specify a minimum TTL, and will
dynamically "round up" other values to that minimum.
default: 3600
type: int
type:
description:
- The type of DNS record to manage.
choices: [ A, AAAA, CNAME, PTR ]
required: yes
type: str
value:
description:
- The value(s) to specify. Required when C(state=present).
- When C(type=PTR) only the partial part of the IP should be given.
aliases: [ values ]
type: list
zone:
description:
- The name of the zone to manage (eg C(example.com)).
- The zone must already exist.
required: yes
type: str
computer_name:
description:
- Specifies a DNS server.
- You can specify an IP address or any value that resolves to an IP
address, such as a fully qualified domain name (FQDN), host name, or
NETBIOS name.
type: str
'''
EXAMPLES = r'''
- name: Create database server alias
win_dns_record:
name: "db1"
type: "CNAME"
value: "cgyl1404p.amer.example.com"
zone: "amer.example.com"
- name: PTR example
win_dns_record:
name: "1.1.1"
type: "PTR"
value: "db1"
zone: "10.in-addr.arpa"
- name: Remove static record
win_dns_record:
name: "db1"
type: "A"
state: absent
zone: "amer.example.com"
'''
RETURN = r'''
'''
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,172 |
Add CI platform: freebsd/12.1
|
##### SUMMARY
Replace the `freebsd/12.0` platform in the test matrix with `freebsd/12.1`.
FreeBSD 12.1 was released in November, 2019.
##### ISSUE TYPE
Feature Idea
##### COMPONENT NAME
shippable.yml
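The change itself is mechanical; a sketch of the two places it would probably touch (assuming FreeBSD 12.1 keeps the same Python layout as 12.0):
```
# shippable.yml (each freebsd/12.0 matrix entry becomes, e.g.):
    - env: T=freebsd/12.1/1

# test/lib/ansible_test/_data/completion/remote.txt (updated line):
freebsd/12.1 python=3.6,2.7 python_dir=/usr/local/bin
```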
|
https://github.com/ansible/ansible/issues/67172
|
https://github.com/ansible/ansible/pull/67659
|
e0eee3c37e8e802113d5c3c41681b1cf4af7b0e9
|
7a42354021272dccf9037352c7ce645a9f082c4f
| 2020-02-06T17:31:07Z |
python
| 2020-02-28T22:12:55Z |
shippable.yml
|
language: python
env:
matrix:
- T=none
matrix:
exclude:
- env: T=none
include:
- env: T=sanity/1
- env: T=sanity/2
- env: T=sanity/3
- env: T=sanity/4
- env: T=sanity/5
- env: T=units/2.6/1
- env: T=units/2.7/1
- env: T=units/3.5/1
- env: T=units/3.6/1
- env: T=units/3.7/1
- env: T=units/3.8/1
- env: T=units/2.6/2
- env: T=units/2.7/2
- env: T=units/3.5/2
- env: T=units/3.6/2
- env: T=units/3.7/2
- env: T=units/3.8/2
- env: T=windows/2012/1
- env: T=windows/2012-R2/1
- env: T=windows/2016/1
- env: T=windows/2019/1
- env: T=windows/2012/2
- env: T=windows/2012-R2/2
- env: T=windows/2016/2
- env: T=windows/2019/2
- env: T=windows/2012/3
- env: T=windows/2012-R2/3
- env: T=windows/2016/3
- env: T=windows/2019/3
- env: T=windows/2012/4
- env: T=windows/2012-R2/4
- env: T=windows/2016/4
- env: T=windows/2019/4
- env: T=windows/2012/5
- env: T=windows/2012-R2/5
- env: T=windows/2016/5
- env: T=windows/2019/5
- env: T=windows/2012/6
- env: T=windows/2012-R2/6
- env: T=windows/2016/6
- env: T=windows/2019/6
- env: T=windows/2012/7
- env: T=windows/2012-R2/7
- env: T=windows/2016/7
- env: T=windows/2019/7
- env: T=i/windows/2012
- env: T=i/windows/2012-R2
- env: T=i/windows/2016
- env: T=i/windows/2019
- env: T=ios/csr1000v//1
- env: T=vyos/1.1.8/2.7/1
- env: T=vyos/1.1.8/3.6/1
- env: T=i/ios/csr1000v//1
- env: T=i/vyos/1.1.8/2.7/1
- env: T=i/vyos/1.1.8/3.6/1
- env: T=aix/7.2/1
- env: T=osx/10.11/1
- env: T=rhel/7.6/1
- env: T=rhel/8.1/1
- env: T=freebsd/11.1/1
- env: T=freebsd/12.0/1
- env: T=linux/centos6/1
- env: T=linux/centos7/1
- env: T=linux/centos8/1
- env: T=linux/fedora30/1
- env: T=linux/fedora31/1
- env: T=linux/opensuse15py2/1
- env: T=linux/opensuse15/1
- env: T=linux/ubuntu1604/1
- env: T=linux/ubuntu1804/1
- env: T=aix/7.2/2
- env: T=osx/10.11/2
- env: T=rhel/7.6/2
- env: T=rhel/8.1/2
- env: T=freebsd/11.1/2
- env: T=freebsd/12.0/2
- env: T=linux/centos6/2
- env: T=linux/centos7/2
- env: T=linux/centos8/2
- env: T=linux/fedora30/2
- env: T=linux/fedora31/2
- env: T=linux/opensuse15py2/2
- env: T=linux/opensuse15/2
- env: T=linux/ubuntu1604/2
- env: T=linux/ubuntu1804/2
- env: T=aix/7.2/3
- env: T=osx/10.11/3
- env: T=rhel/7.6/3
- env: T=rhel/8.1/3
- env: T=freebsd/11.1/3
- env: T=freebsd/12.0/3
- env: T=linux/centos6/3
- env: T=linux/centos7/3
- env: T=linux/centos8/3
- env: T=linux/fedora30/3
- env: T=linux/fedora31/3
- env: T=linux/opensuse15py2/3
- env: T=linux/opensuse15/3
- env: T=linux/ubuntu1604/3
- env: T=linux/ubuntu1804/3
- env: T=aix/7.2/4
- env: T=osx/10.11/4
- env: T=rhel/7.6/4
- env: T=rhel/8.1/4
- env: T=freebsd/11.1/4
- env: T=freebsd/12.0/4
- env: T=linux/centos6/4
- env: T=linux/centos7/4
- env: T=linux/centos8/4
- env: T=linux/fedora30/4
- env: T=linux/fedora31/4
- env: T=linux/opensuse15py2/4
- env: T=linux/opensuse15/4
- env: T=linux/ubuntu1604/4
- env: T=linux/ubuntu1804/4
- env: T=aix/7.2/5
- env: T=osx/10.11/5
- env: T=rhel/7.6/5
- env: T=rhel/8.1/5
- env: T=freebsd/11.1/5
- env: T=freebsd/12.0/5
- env: T=linux/centos6/5
- env: T=linux/centos7/5
- env: T=linux/centos8/5
- env: T=linux/fedora30/5
- env: T=linux/fedora31/5
- env: T=linux/opensuse15py2/5
- env: T=linux/opensuse15/5
- env: T=linux/ubuntu1604/5
- env: T=linux/ubuntu1804/5
- env: T=aws/2.7/1
- env: T=aws/3.6/1
- env: T=aws/2.7/2
- env: T=aws/3.6/2
- env: T=aws/2.7/3
- env: T=aws/3.6/3
- env: T=aws/2.7/4
- env: T=aws/3.6/4
- env: T=i/aws/2.7/1
- env: T=i/aws/3.6/1
- env: T=azure/2.7/1
- env: T=azure/3.6/1
- env: T=azure/2.7/2
- env: T=azure/3.6/2
- env: T=azure/2.7/3
- env: T=azure/3.6/3
- env: T=azure/2.7/4
- env: T=azure/3.6/4
- env: T=azure/2.7/5
- env: T=azure/3.6/5
- env: T=azure/2.7/6
- env: T=azure/3.6/6
- env: T=azure/2.7/7
- env: T=azure/3.6/7
- env: T=azure/2.7/8
- env: T=azure/3.6/8
- env: T=azure/2.7/9
- env: T=azure/3.6/9
- env: T=azure/2.7/10
- env: T=azure/3.6/10
- env: T=azure/2.7/11
- env: T=azure/3.6/11
- env: T=i/azure/2.7/1
- env: T=i/azure/3.6/1
- env: T=vcenter/2.7/1
- env: T=vcenter/3.6/1
- env: T=vcenter/2.7/2
- env: T=vcenter/3.6/2
- env: T=i/vcenter//1
- env: T=cs/2.7/1
- env: T=cs/3.6/1
- env: T=cs/2.7/2
- env: T=cs/3.6/2
- env: T=i/cs//1
- env: T=tower/2.7/1
- env: T=tower/3.6/1
- env: T=i/tower//1
- env: T=cloud/2.7/1
- env: T=cloud/3.6/1
- env: T=i/cloud//1
- env: T=hcloud/2.7/1
- env: T=hcloud/3.6/1
- env: T=hcloud/2.7/2
- env: T=hcloud/3.6/2
- env: T=i/hcloud//1
branches:
except:
- "*-patch-*"
- "revert-*-*"
build:
ci:
- test/utils/shippable/timing.sh test/utils/shippable/shippable.sh $T
integrations:
notifications:
- integrationName: email
type: email
on_success: never
on_failure: never
on_start: never
on_pull_request: never
- integrationName: irc
type: irc
recipients:
- "chat.freenode.net#ansible-notices"
on_success: change
on_failure: always
on_start: never
on_pull_request: always
- integrationName: slack
type: slack
recipients:
- "#shippable"
on_success: change
on_failure: always
on_start: never
on_pull_request: never
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,172 |
Add CI platform: freebsd/12.1
|
##### SUMMARY
Replace the `freebsd/12.0` platform in the test matrix with `freebsd/12.1`.
FreeBSD 12.1 was released in November, 2019.
##### ISSUE TYPE
Feature Idea
##### COMPONENT NAME
shippable.yml
|
https://github.com/ansible/ansible/issues/67172
|
https://github.com/ansible/ansible/pull/67659
|
e0eee3c37e8e802113d5c3c41681b1cf4af7b0e9
|
7a42354021272dccf9037352c7ce645a9f082c4f
| 2020-02-06T17:31:07Z |
python
| 2020-02-28T22:12:55Z |
test/lib/ansible_test/_data/completion/remote.txt
|
freebsd/11.1 python=2.7,3.6 python_dir=/usr/local/bin
freebsd/12.0 python=3.6,2.7 python_dir=/usr/local/bin
osx/10.11 python=2.7 python_dir=/usr/local/bin
rhel/7.6 python=2.7
rhel/8.1 python=3.6
aix/7.2 python=2.7 httptester=disabled temp-unicode=disabled pip-check=disabled
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 66,135 |
gitlab_project_variable: support hidden/protected variables
|
##### SUMMARY
gitlab_project_variable: support hidden/protected variables
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
gitlab_project_variable
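A hypothetical playbook syntax for this (the option names are an assumption; the PR may settle on different ones) could extend the existing `vars` dict so each variable optionally takes `masked`/`protected` flags:
```yaml
# Hypothetical syntax sketch: per-variable dict form with masked/protected
# flags, next to the existing flat key/value form.
- name: Set a masked and protected CI/CD variable
  gitlab_project_variable:
    api_url: https://gitlab.com
    api_token: secret_access_token
    project: markuman/dotfiles
    vars:
      SECRET_ACCESS_KEY:
        value: 321cba
        masked: true
        protected: true
```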
|
https://github.com/ansible/ansible/issues/66135
|
https://github.com/ansible/ansible/pull/67461
|
7a42354021272dccf9037352c7ce645a9f082c4f
|
8ab304af4450c883700b144bcc03c395f205860f
| 2019-12-30T15:57:13Z |
python
| 2020-02-28T22:17:03Z |
changelogs/fragments/67461-gitlab-project-variable-masked-protected.yml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 66,135 |
gitlab_project_variable: support hidden/protected variables
|
##### SUMMARY
gitlab_project_variable: support hidden/protected variables
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
gitlab_project_variable
|
https://github.com/ansible/ansible/issues/66135
|
https://github.com/ansible/ansible/pull/67461
|
7a42354021272dccf9037352c7ce645a9f082c4f
|
8ab304af4450c883700b144bcc03c395f205860f
| 2019-12-30T15:57:13Z |
python
| 2020-02-28T22:17:03Z |
lib/ansible/modules/source_control/gitlab/gitlab_project_variable.py
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright: (c) 2019, Markus Bergholz ([email protected])
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
ANSIBLE_METADATA = {'metadata_version': '1.1',
'status': ['preview'],
'supported_by': 'community'}
DOCUMENTATION = '''
module: gitlab_project_variable
short_description: Creates/updates/deletes GitLab Projects Variables
description:
- When a project variable does not exist, it will be created.
- When a project variable does exist, its value will be updated when the values are different.
- Variables which exist in the GitLab project but are not listed in the playbook either
stay untouched (I(purge) is C(false)) or will be deleted (I(purge) is C(true)).
version_added: "2.9"
author:
- "Markus Bergholz (@markuman)"
requirements:
- python >= 2.7
- python-gitlab python module
extends_documentation_fragment:
- auth_basic
options:
state:
description:
- Create or delete project variable.
- Possible values are present and absent.
default: present
type: str
choices: ["present", "absent"]
api_token:
description:
- GitLab access token with API permissions.
required: true
type: str
project:
description:
- The path and name of the project.
required: true
type: str
purge:
description:
- When set to C(true), all variables which are not listed in the task will be deleted.
default: false
type: bool
vars:
description:
- A dictionary of key-value pairs.
default: {}
type: dict
'''
EXAMPLES = '''
- name: Set or update some CI/CD variables
gitlab_project_variable:
api_url: https://gitlab.com
api_token: secret_access_token
project: markuman/dotfiles
purge: false
vars:
ACCESS_KEY_ID: abc123
SECRET_ACCESS_KEY: 321cba
- name: Delete one variable
gitlab_project_variable:
api_url: https://gitlab.com
api_token: secret_access_token
project: markuman/dotfiles
state: absent
vars:
ACCESS_KEY_ID: abc123
'''
RETURN = '''
project_variable:
description: Four lists of the variable names which were added, updated, removed or left untouched.
returned: always
type: dict
contains:
added:
description: A list of variables which were created.
returned: always
type: list
sample: "['ACCESS_KEY_ID', 'SECRET_ACCESS_KEY']"
untouched:
description: A list of variables which already existed and were left untouched.
returned: always
type: list
sample: "['ACCESS_KEY_ID', 'SECRET_ACCESS_KEY']"
removed:
description: A list of variables which were deleted.
returned: always
type: list
sample: "['ACCESS_KEY_ID', 'SECRET_ACCESS_KEY']"
updated:
description: A list of variables whose values were changed.
returned: always
type: list
sample: "['ACCESS_KEY_ID', 'SECRET_ACCESS_KEY']"
'''
import traceback
from ansible.module_utils.basic import AnsibleModule, missing_required_lib
from ansible.module_utils._text import to_native
from ansible.module_utils.api import basic_auth_argument_spec
GITLAB_IMP_ERR = None
try:
import gitlab
HAS_GITLAB_PACKAGE = True
except Exception:
GITLAB_IMP_ERR = traceback.format_exc()
HAS_GITLAB_PACKAGE = False
from ansible.module_utils.gitlab import gitlabAuthentication
class GitlabProjectVariables(object):
def __init__(self, module, gitlab_instance):
self.repo = gitlab_instance
self.project = self.get_project(module.params['project'])
self._module = module
def get_project(self, project_name):
return self.repo.projects.get(project_name)
def list_all_project_variables(self):
return self.project.variables.list()
def create_variable(self, key, value):
if self._module.check_mode:
return
return self.project.variables.create({"key": key, "value": value})
def update_variable(self, var, value):
if var.value == value:
return False
if self._module.check_mode:
return True
var.value = value
var.save()
return True
def delete_variable(self, key):
if self._module.check_mode:
return
return self.project.variables.delete(key)
def native_python_main(this_gitlab, purge, var_list, state):
change = False
return_value = dict(added=list(), updated=list(), removed=list(), untouched=list())
gitlab_keys = this_gitlab.list_all_project_variables()
existing_variables = [x.get_id() for x in gitlab_keys]
for key in var_list:
if key in existing_variables:
index = existing_variables.index(key)
existing_variables[index] = None
if state == 'present':
single_change = this_gitlab.update_variable(
gitlab_keys[index], var_list[key])
change = single_change or change
if single_change:
return_value['updated'].append(key)
else:
return_value['untouched'].append(key)
elif state == 'absent':
this_gitlab.delete_variable(key)
change = True
return_value['removed'].append(key)
elif key not in existing_variables and state == 'present':
this_gitlab.create_variable(key, var_list[key])
change = True
return_value['added'].append(key)
existing_variables = list(filter(None, existing_variables))
if purge:
for item in existing_variables:
this_gitlab.delete_variable(item)
change = True
return_value['removed'].append(item)
else:
return_value['untouched'].extend(existing_variables)
return change, return_value
def main():
argument_spec = basic_auth_argument_spec()
argument_spec.update(
api_token=dict(type='str', required=True, no_log=True),
project=dict(type='str', required=True),
purge=dict(type='bool', required=False, default=False),
vars=dict(type='dict', required=False, default=dict(), no_log=True),
state=dict(type='str', default="present", choices=["absent", "present"])
)
module = AnsibleModule(
argument_spec=argument_spec,
mutually_exclusive=[
['api_username', 'api_token'],
['api_password', 'api_token'],
],
required_together=[
['api_username', 'api_password'],
],
required_one_of=[
['api_username', 'api_token']
],
supports_check_mode=True
)
purge = module.params['purge']
var_list = module.params['vars']
state = module.params['state']
if not HAS_GITLAB_PACKAGE:
module.fail_json(msg=missing_required_lib("python-gitlab"), exception=GITLAB_IMP_ERR)
gitlab_instance = gitlabAuthentication(module)
this_gitlab = GitlabProjectVariables(module=module, gitlab_instance=gitlab_instance)
change, return_value = native_python_main(this_gitlab, purge, var_list, state)
module.exit_json(changed=change, project_variable=return_value)
if __name__ == '__main__':
main()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 66,135 |
gitlab_project_variable: support hidden/protected variables
|
##### SUMMARY
gitlab_project_variable: support hidden/protected variables
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
gitlab_project_variable
|
https://github.com/ansible/ansible/issues/66135
|
https://github.com/ansible/ansible/pull/67461
|
7a42354021272dccf9037352c7ce645a9f082c4f
|
8ab304af4450c883700b144bcc03c395f205860f
| 2019-12-30T15:57:13Z |
python
| 2020-02-28T22:17:03Z |
test/integration/targets/gitlab_project_variable/tasks/main.yml
|
- name: Install required libs
pip:
name: python-gitlab
state: present
- name: purge all variables for check_mode test
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
purge: True
- name: add a variable value in check_mode
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
vars:
ACCESS_KEY_ID: checkmode
check_mode: yes
register: gitlab_project_variable_state
- name: check_mode state must be changed
assert:
that:
- gitlab_project_variable_state is changed
- name: apply add value from check_mode test
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
vars:
ACCESS_KEY_ID: checkmode
register: gitlab_project_variable_state
- name: state must be changed
assert:
that:
- gitlab_project_variable_state is changed
- name: change a variable value in check_mode again
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
vars:
ACCESS_KEY_ID: checkmode
check_mode: yes
register: gitlab_project_variable_state
- name: check_mode state must not be changed
assert:
that:
- gitlab_project_variable_state is not changed
- name: apply again the value change from check_mode test
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
vars:
ACCESS_KEY_ID: checkmode
register: gitlab_project_variable_state
- name: state must not be changed
assert:
that:
- gitlab_project_variable_state is not changed
- name: purge all variables at the beginning
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
purge: True
- name: set two test variables
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
vars:
ACCESS_KEY_ID: abc123
SECRET_ACCESS_KEY: 321cba
register: gitlab_project_variable_state
- name: set two test variables state must be changed
assert:
that:
- gitlab_project_variable_state is changed
- gitlab_project_variable_state.project_variable.added|length == 2
- gitlab_project_variable_state.project_variable.untouched|length == 0
- gitlab_project_variable_state.project_variable.removed|length == 0
- gitlab_project_variable_state.project_variable.updated|length == 0
- name: re-set two test variables
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
vars:
ACCESS_KEY_ID: abc123
SECRET_ACCESS_KEY: 321cba
register: gitlab_project_variable_state
- name: re-set two test variables state must not be changed
assert:
that:
- gitlab_project_variable_state is not changed
- gitlab_project_variable_state.project_variable.added|length == 0
- gitlab_project_variable_state.project_variable.untouched|length == 2
- gitlab_project_variable_state.project_variable.removed|length == 0
- gitlab_project_variable_state.project_variable.updated|length == 0
- name: edit one variable
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
vars:
ACCESS_KEY_ID: changed
purge: False
register: gitlab_project_variable_state
- name: edit one variable state must be changed
assert:
that:
- gitlab_project_variable_state.changed
- gitlab_project_variable_state.project_variable.added|length == 0
- gitlab_project_variable_state.project_variable.untouched|length == 1
- gitlab_project_variable_state.project_variable.removed|length == 0
- gitlab_project_variable_state.project_variable.updated|length == 1
- gitlab_project_variable_state.project_variable.updated[0] == "ACCESS_KEY_ID"
- name: append one variable
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
vars:
some: value
purge: False
register: gitlab_project_variable_state
- name: append one variable state must be changed
assert:
that:
- gitlab_project_variable_state.changed
- gitlab_project_variable_state.project_variable.added|length == 1
- gitlab_project_variable_state.project_variable.untouched|length == 2
- gitlab_project_variable_state.project_variable.removed|length == 0
- gitlab_project_variable_state.project_variable.updated|length == 0
- gitlab_project_variable_state.project_variable.added[0] == "some"
- name: re-set all variables
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
vars:
ACCESS_KEY_ID: changed
SECRET_ACCESS_KEY: 321cba
some: value
register: gitlab_project_variable_state
- name: re-set all variables state must not be changed
assert:
that:
- not gitlab_project_variable_state.changed
- gitlab_project_variable_state.project_variable.added|length == 0
- gitlab_project_variable_state.project_variable.untouched|length == 3
- gitlab_project_variable_state.project_variable.removed|length == 0
- gitlab_project_variable_state.project_variable.updated|length == 0
- name: set one variable and purge all others
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
vars:
some: value
purge: True
register: gitlab_project_variable_state
- name: set one variable and purge all others state must be changed
assert:
that:
- gitlab_project_variable_state.changed
- gitlab_project_variable_state.project_variable.added|length == 0
- gitlab_project_variable_state.project_variable.untouched|length == 1
- gitlab_project_variable_state.project_variable.removed|length == 2
- gitlab_project_variable_state.project_variable.updated|length == 0
- name: only one variable is left
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
vars:
some: value
purge: False
register: gitlab_project_variable_state
- name: only one variable is left state must not be changed
assert:
that:
- not gitlab_project_variable_state.changed
- gitlab_project_variable_state.project_variable.added|length == 0
- gitlab_project_variable_state.project_variable.untouched|length == 1
- gitlab_project_variable_state.project_variable.removed|length == 0
- gitlab_project_variable_state.project_variable.updated|length == 0
- gitlab_project_variable_state.project_variable.untouched[0] == "some"
- name: delete the last left variable
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
state: absent
vars:
some: value
register: gitlab_project_variable_state
- name: no variable is left state must be changed
assert:
that:
- gitlab_project_variable_state.changed
- gitlab_project_variable_state.project_variable.added|length == 0
- gitlab_project_variable_state.project_variable.untouched|length == 0
- gitlab_project_variable_state.project_variable.removed|length == 1
- gitlab_project_variable_state.project_variable.updated|length == 0
- gitlab_project_variable_state.project_variable.removed[0] == "some"
- name: check that no variables are left
gitlab_project_variable:
api_url: "{{ gitlab_host }}"
api_token: "{{ gitlab_login_token }}"
project: "{{ gitlab_project_name }}"
purge: True
register: gitlab_project_variable_state
- name: check that no variables are left and state must not be changed
assert:
that:
- not gitlab_project_variable_state.changed
- gitlab_project_variable_state.project_variable.added|length == 0
- gitlab_project_variable_state.project_variable.untouched|length == 0
- gitlab_project_variable_state.project_variable.removed|length == 0
- gitlab_project_variable_state.project_variable.updated|length == 0
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,795 |
win_unzip path traversal with specially crafted archive
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
CVE-2020-1737
<!--- Explain the problem briefly below -->
A specially crafted zip archive could result in path traversal in the `win_unzip` module.
The `Extract-Zip` function doesn't check if the extracted path belongs to the destination folder.
A possible solution is to [check destination path](https://docs.microsoft.com/en-us/dotnet/api/system.io.compression.ziparchive?view=netframework-4.8).
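To make the suggested check concrete, here is a minimal Python sketch of the idea (the function name and the use of Python's `zipfile` are illustrative assumptions; the module and the eventual fix are PowerShell): resolve each entry's target path against the destination and refuse anything that escapes it.
```python
import os
import zipfile

def extract_without_traversal(zip_path, dest):
    """Extract zip_path into dest, rejecting entries that resolve outside dest."""
    dest = os.path.realpath(dest)
    with zipfile.ZipFile(zip_path) as archive:
        for name in archive.namelist():
            target = os.path.realpath(os.path.join(dest, name))
            # Entries such as "..\\..\\evil.txt" resolve outside the destination folder
            if os.path.commonpath([dest, target]) != dest:
                raise ValueError("blocked path traversal entry: %r" % name)
        archive.extractall(dest)
```
Checking the resolved path rather than the raw entry name also rejects entries that smuggle in absolute paths instead of `..` sequences.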
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
`lib/ansible/modules/windows/win_unzip.ps1`
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
2.10
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
default
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
```
|
https://github.com/ansible/ansible/issues/67795
|
https://github.com/ansible/ansible/pull/67799
|
8ab304af4450c883700b144bcc03c395f205860f
|
d30c57ab22db24f6901166fcc3155667bdd3443f
| 2020-02-26T19:55:42Z |
python
| 2020-02-28T22:56:21Z |
changelogs/fragments/win-unzip-check-extraction-path.yml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,795 |
win_unzip path traversal with specially crafted archive
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
CVE-2020-1737
<!--- Explain the problem briefly below -->
A specially crafted zip archive could result in path traversal in the `win_unzip` module.
The `Extract-Zip` function doesn't check if the extracted path belongs to the destination folder.
A possible solution is to [check destination path](https://docs.microsoft.com/en-us/dotnet/api/system.io.compression.ziparchive?view=netframework-4.8).
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
`lib/ansible/modules/windows/win_unzip.ps1`
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
2.10
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
default
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
```
|
https://github.com/ansible/ansible/issues/67795
|
https://github.com/ansible/ansible/pull/67799
|
8ab304af4450c883700b144bcc03c395f205860f
|
d30c57ab22db24f6901166fcc3155667bdd3443f
| 2020-02-26T19:55:42Z |
python
| 2020-02-28T22:56:21Z |
lib/ansible/modules/windows/win_unzip.ps1
|
#!powershell
# Copyright: (c) 2015, Phil Schwartz <[email protected]>
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
#Requires -Module Ansible.ModuleUtils.Legacy
# TODO: This module is not idempotent (it will always unzip and report change)
$ErrorActionPreference = "Stop"
$pcx_extensions = @('.bz2', '.gz', '.msu', '.tar', '.zip')
$params = Parse-Args $args -supports_check_mode $true
$check_mode = Get-AnsibleParam -obj $params -name "_ansible_check_mode" -type "bool" -default $false
$src = Get-AnsibleParam -obj $params -name "src" -type "path" -failifempty $true
$dest = Get-AnsibleParam -obj $params -name "dest" -type "path" -failifempty $true
$creates = Get-AnsibleParam -obj $params -name "creates" -type "path"
$recurse = Get-AnsibleParam -obj $params -name "recurse" -type "bool" -default $false
$delete_archive = Get-AnsibleParam -obj $params -name "delete_archive" -type "bool" -default $false -aliases 'rm'
# Fixes a fail error message (when the task actually succeeds) for a
# "Convert-ToJson: The converted JSON string is in bad format"
# This happens when JSON is parsing a string that ends with a "\",
# which is possible when specifying a directory to download to.
# This catches that possible error, before assigning the JSON $result
$result = @{
changed = $false
dest = $dest -replace '\$',''
removed = $false
src = $src -replace '\$',''
}
Function Extract-Zip($src, $dest) {
$archive = [System.IO.Compression.ZipFile]::Open($src, [System.IO.Compression.ZipArchiveMode]::Read, [System.Text.Encoding]::UTF8)
foreach ($entry in $archive.Entries) {
$archive_name = $entry.FullName
$entry_target_path = [System.IO.Path]::Combine($dest, $archive_name)
$entry_dir = [System.IO.Path]::GetDirectoryName($entry_target_path)
if (-not (Test-Path -LiteralPath $entry_dir)) {
New-Item -Path $entry_dir -ItemType Directory -WhatIf:$check_mode | Out-Null
$result.changed = $true
}
if ((-not ($entry_target_path.EndsWith("\") -or $entry_target_path.EndsWith("/"))) -and (-not $check_mode)) {
[System.IO.Compression.ZipFileExtensions]::ExtractToFile($entry, $entry_target_path, $true)
}
$result.changed = $true
}
$archive.Dispose()
}
Function Extract-ZipLegacy($src, $dest) {
# [System.IO.Compression.ZipFile] was only added in .net 4.5, this is used
# when .net is older than that.
$shell = New-Object -ComObject Shell.Application
$zip = $shell.NameSpace([IO.Path]::GetFullPath($src))
$dest_path = $shell.NameSpace([IO.Path]::GetFullPath($dest))
$shell = New-Object -ComObject Shell.Application
if (-not $check_mode) {
# https://msdn.microsoft.com/en-us/library/windows/desktop/bb787866.aspx
# From Folder.CopyHere documentation, 1044 means:
# - 1024: do not display a user interface if an error occurs
# - 16: respond with "yes to all" for any dialog box that is displayed
# - 4: do not display a progress dialog box
$dest_path.CopyHere($zip.Items(), 1044)
}
$result.changed = $true
}
If ($creates -and (Test-Path -LiteralPath $creates)) {
$result.skipped = $true
$result.msg = "The file or directory '$creates' already exists."
Exit-Json -obj $result
}
If (-Not (Test-Path -LiteralPath $src)) {
Fail-Json -obj $result -message "File '$src' does not exist."
}
$ext = [System.IO.Path]::GetExtension($src)
If (-Not (Test-Path -LiteralPath $dest -PathType Container)){
Try{
New-Item -ItemType "directory" -path $dest -WhatIf:$check_mode | out-null
} Catch {
Fail-Json -obj $result -message "Error creating '$dest' directory! Msg: $($_.Exception.Message)"
}
}
If ($ext -eq ".zip" -And $recurse -eq $false) {
# TODO: PS v5 supports zip extraction, use that if available
$use_legacy = $false
try {
# determines if .net 4.5 is available, if this fails we need to fall
# back to the legacy COM Shell.Application to extract the zip
Add-Type -AssemblyName System.IO.Compression.FileSystem | Out-Null
Add-Type -AssemblyName System.IO.Compression | Out-Null
} catch {
$use_legacy = $true
}
if ($use_legacy) {
try {
Extract-ZipLegacy -src $src -dest $dest
} catch {
Fail-Json -obj $result -message "Error unzipping '$src' to '$dest'!. Method: COM Shell.Application, Exception: $($_.Exception.Message)"
}
} else {
try {
Extract-Zip -src $src -dest $dest
} catch {
Fail-Json -obj $result -message "Error unzipping '$src' to '$dest'!. Method: System.IO.Compression.ZipFile, Exception: $($_.Exception.Message)"
}
}
} Else {
# Check if PSCX is installed
$list = Get-Module -ListAvailable
If (-Not ($list -match "PSCX")) {
Fail-Json -obj $result -message "PowerShellCommunityExtensions PowerShell Module (PSCX) is required for non-'.zip' compressed archive types."
} Else {
$result.pscx_status = "present"
}
Try {
Import-Module PSCX
}
Catch {
Fail-Json $result "Error importing module PSCX"
}
Try {
Expand-Archive -Path $src -OutputPath $dest -Force -WhatIf:$check_mode
} Catch {
Fail-Json -obj $result -message "Error expanding '$src' to '$dest'! Msg: $($_.Exception.Message)"
}
If ($recurse) {
Get-ChildItem -LiteralPath $dest -recurse | Where-Object {$pcx_extensions -contains $_.extension} | ForEach-Object {
Try {
Expand-Archive $_.FullName -OutputPath $dest -Force -WhatIf:$check_mode
} Catch {
Fail-Json -obj $result -message "Error recursively expanding '$src' to '$dest'! Msg: $($_.Exception.Message)"
}
If ($delete_archive) {
Remove-Item -LiteralPath $_.FullName -Force -WhatIf:$check_mode
$result.removed = $true
}
}
}
$result.changed = $true
}
If ($delete_archive){
try {
Remove-Item -LiteralPath $src -Recurse -Force -WhatIf:$check_mode
} catch {
Fail-Json -obj $result -message "failed to delete archive at '$src': $($_.Exception.Message)"
}
$result.removed = $true
}
Exit-Json $result
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,795 |
win_unzip path traversal with specially crafted archive
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
CVE-2020-1737
<!--- Explain the problem briefly below -->
A specially crafted zip archive could result in path traversal in the `win_unzip` module.
The `Extract-Zip` function doesn't check if the extracted path belongs to the destination folder.
A possible solution is to [check destination path](https://docs.microsoft.com/en-us/dotnet/api/system.io.compression.ziparchive?view=netframework-4.8).
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
`lib/ansible/modules/windows/win_unzip.ps1`
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
2.10
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
default
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
```
|
https://github.com/ansible/ansible/issues/67795
|
https://github.com/ansible/ansible/pull/67799
|
8ab304af4450c883700b144bcc03c395f205860f
|
d30c57ab22db24f6901166fcc3155667bdd3443f
| 2020-02-26T19:55:42Z |
python
| 2020-02-28T22:56:21Z |
test/integration/targets/win_unzip/files/create_crafty_zip_files.py
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,795 |
win_unzip path traversal with specially crafted archive
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
CVE-2020-1737
<!--- Explain the problem briefly below -->
A specially crafted zip archive could result in path traversal in the `win_unzip` module.
The `Extract-Zip` function doesn't check if the extracted path belongs to the destination folder.
A possible solution is to [check destination path](https://docs.microsoft.com/en-us/dotnet/api/system.io.compression.ziparchive?view=netframework-4.8).
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
`lib/ansible/modules/windows/win_unzip.ps1`
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
2.10
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
default
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
```
|
https://github.com/ansible/ansible/issues/67795
|
https://github.com/ansible/ansible/pull/67799
|
8ab304af4450c883700b144bcc03c395f205860f
|
d30c57ab22db24f6901166fcc3155667bdd3443f
| 2020-02-26T19:55:42Z |
python
| 2020-02-28T22:56:21Z |
test/integration/targets/win_unzip/tasks/main.yml
|
---
- name: create test directory
win_file:
path: '{{ win_unzip_dir }}\output'
state: directory
- name: create local zip file with non-ascii chars
script: create_zip.py {{ output_dir + '/win_unzip.zip' | quote }}
delegate_to: localhost
- name: copy across zip to Windows host
win_copy:
src: '{{ output_dir }}/win_unzip.zip'
dest: '{{ win_unzip_dir }}\win_unzip.zip'
- name: unarchive zip (check)
win_unzip:
src: '{{ win_unzip_dir }}\win_unzip.zip'
dest: '{{ win_unzip_dir }}\output'
register: unzip_check
check_mode: yes
- name: get result of unarchive zip (check)
win_stat:
path: '{{ win_unzip_dir }}\output\café.txt'
register: unzip_actual_check
- name: assert result of unarchive zip (check)
assert:
that:
- unzip_check is changed
- not unzip_check.removed
- not unzip_actual_check.stat.exists
- name: unarchive zip
win_unzip:
src: '{{ win_unzip_dir }}\win_unzip.zip'
dest: '{{ win_unzip_dir }}\output'
register: unzip
- name: get result of unarchive zip
slurp:
path: '{{ win_unzip_dir }}\output\café.txt'
register: unzip_actual
- name: assert result of unarchive zip
assert:
that:
- unzip is changed
- not unzip.removed
- unzip_actual.content | b64decode == 'café.txt'
# Module is not idempotent, will always change without creates
- name: unarchive zip again without creates
win_unzip:
src: '{{ win_unzip_dir }}\win_unzip.zip'
dest: '{{ win_unzip_dir }}\output'
register: unzip_again
- name: assert unarchive zip again without creates
assert:
that:
- unzip_again is changed
- not unzip_again.removed
- name: unarchive zip with creates
win_unzip:
src: '{{ win_unzip_dir }}\win_unzip.zip'
dest: '{{ win_unzip_dir }}\output'
creates: '{{ win_unzip_dir }}\output\café.txt'
register: unzip_again_creates
- name: assert unarchive zip with creates
assert:
that:
- not unzip_again_creates is changed
- not unzip_again_creates.removed
- name: unarchive zip with delete (check)
win_unzip:
src: '{{ win_unzip_dir }}\win_unzip.zip'
dest: '{{ win_unzip_dir }}\output'
delete_archive: yes
register: unzip_delete_check
check_mode: yes
- name: get result of unarchive zip with delete (check)
win_stat:
path: '{{ win_unzip_dir }}\win_unzip.zip'
register: unzip_delete_actual_check
- name: assert unarchive zip with delete (check)
assert:
that:
- unzip_delete_check is changed
- unzip_delete_check.removed
- unzip_delete_actual_check.stat.exists
- name: unarchive zip with delete
win_unzip:
src: '{{ win_unzip_dir }}\win_unzip.zip'
dest: '{{ win_unzip_dir }}\output'
delete_archive: yes
register: unzip_delete
- name: get result of unarchive zip with delete
win_stat:
path: '{{ win_unzip_dir }}\win_unzip.zip'
register: unzip_delete_actual
- name: assert unarchive zip with delete
assert:
that:
- unzip_delete is changed
- unzip_delete.removed
- not unzip_delete_actual.stat.exists
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,273 |
win_timezone cannot set dstoff timezones
|
##### SUMMARY
When attempting to use the win_timezone module to set a timezone to "Eastern Standard Time_dstoff" an error displays saying it is not supported.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
win_timezone
##### ANSIBLE VERSION
ansible 2.9.3
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/home/zinkj/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.17 (default, Nov 7 2019, 10:07:09) [GCC 7.4.0]
##### CONFIGURATION
(Empty)
##### OS / ENVIRONMENT
Target OS: Win10 Enterprise Build 1909
##### STEPS TO REPRODUCE
```
- name: Set TimeZone
win_timezone:
timezone: 'Eastern Standard Time_dstoff'
```
##### EXPECTED RESULTS
Timezone is changed.
##### ACTUAL RESULTS
Timezone fails to change.
```
null: TASK [bootstrap : Set TimeZone] ************************************************
null: task path: /mnt/c/regfarm/framework/revitfarm/setup/node/ansible/roles/bootstrap/tasks/bootstrap_tasks.yml:1
null: Monday 10 February 2020 08:52:14 -0500 (0:00:09.675) 0:00:09.751 *******
null: Monday 10 February 2020 08:52:14 -0500 (0:00:09.675) 0:00:09.750 *******
null: Using module file /usr/lib/python2.7/dist-packages/ansible/modules/windows/win_timezone.ps1
null: Pipelining is enabled.
null: <127.0.0.1> ESTABLISH SSH CONNECTION FOR USER: <sensitive>
null: <127.0.0.1> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o Port=56707 -o 'IdentityFile="/tmp/ansible-key277440362"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey
-o PasswordAuthentication=no -o 'User="<sensitive>"' -o ConnectTimeout=1800 -o ControlPath=/home/zinkj/.ansible/cp/d221cde98b 127.0.0.1 'chcp.com 65001 > $null ; PowerShell -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -EncodedCommand UABvAHcAZQByAFMAaABlAGwAbAAgAC0ATgBvAFAAcgBvAGYAaQBsAGUAIAAtAE4AbwBuAEkAbgB0AGUAcgBhAGMAdABpAHYAZQAgAC0ARQB4AGUAYwB1AHQAaQBvAG4AUABvAGwAaQBjAHkAIABVAG4AcgBlAHMAdAByAGkAYwB0AGUAZAAgAC0ARQBuAGMAbwBkAGUAZABDAG8AbQBtAGEAbgBkACAASgBnAEIAagBBAEcAZwBBAFkAdwBCAHcAQQBDADQAQQBZAHcAQgB2AEEARwAwAEEASQBBAEEAMgBBAEQAVQBBAE0AQQBBAHcAQQBEAEUAQQBJAEEAQQArAEEAQwBBAEEASgBBAEIAdQBBAEgAVQBBAGIAQQBCAHMAQQBBAG8AQQBKAEEAQgBsAEEASABnAEEAWgBRAEIAagBBAEYAOABBAGQAdwBCAHkAQQBHAEUAQQBjAEEAQgB3AEEARwBVAEEAYwBnAEIAZgBBAEgATQBBAGQAQQBCAHkAQQBDAEEAQQBQAFEAQQBnAEEAQwBRAEEAYQBRAEIAdQBBAEgAQQBBAGQAUQBCADAAQQBDAEEAQQBmAEEAQQBnAEEARQA4AEEAZABRAEIAMABBAEMAMABBAFUAdwBCADAAQQBIAEkAQQBhAFEAQgB1AEEARwBjAEEAQwBnAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBBAGcAQQBEADAAQQBJAEEAQQBrAEEARwBVAEEAZQBBAEIAbABBAEcATQBBAFgAdwBCADMAQQBIAEkAQQBZAFEAQgB3AEEASABBAEEAWgBRAEIAeQBBAEYAOABBAGMAdwBCADAAQQBIAEkAQQBMAGcAQgBUAEEASABBAEEAYgBBAEIAcABBAEgAUQBBAEsAQQBCAEEAQQBDAGcAQQBJAGcAQgBnAEEARABBAEEAWQBBAEEAdwBBAEcAQQBBAE0AQQBCAGcAQQBEAEEAQQBJAGcAQQBwAEEAQwB3AEEASQBBAEEAeQBBAEMAdwBBAEkAQQBCAGIAQQBGAE0AQQBkAEEAQgB5AEEARwBrAEEAYgBnAEIAbgBBAEYATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBQAEEASABBAEEAZABBAEIAcABBAEcAOABBAGIAZwBCAHoAQQBGADAAQQBPAGcAQQA2AEEARgBJAEEAWgBRAEIAdABBAEcAOABBAGQAZwBCAGwAQQBFAFUAQQBiAFEAQgB3AEEASABRAEEAZQBRAEIARgBBAEcANABBAGQAQQBCAHkAQQBHAGsAQQBaAFEAQgB6AEEAQwBrAEEAQwBnAEIASgBBAEcAWQBBAEkAQQBBAG8AQQBDADAAQQBiAGcAQgB2AEEASABRAEEASQBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBBAHUAQQBFAHcAQQBaAFEAQgB1AEEARwBjAEEAZABBAEIAbwBBAEMAQQBBAEwAUQBCAGwAQQBIAEUAQQBJAEEAQQB5AEEAQwBrAEEASQBBAEIANwBBAEMAQQBBAGQAQQBCAG8AQQBIAEkAQQBiAHcAQgAzAEEAQwBBAEEASQBnAEIAcABBAEcANABBAGQAZwBCAGgAQQBHAHcAQQBhAFEAQgBrAEEAQwBBAEEAYwBBAEIAaABBAEgAawBBAGIAQQBCAHYAQQBHAEUAQQBaAEEAQQBpAEEAQwBBAEEAZgBRAEEASwBBAEYATQBBAFoAUQBCADAAQQBDADAAQQBWAGcAQgBoAEEASABJAEEAYQBRAEIAaABBAEcASQBBAGIAQQBCAGwAQQBDAEEAQQBMAFEAQgBPAEEARwBFAEEAYgBRAEIAbABBAEMAQQBBAGEAZwBCAHoAQQBHADgAQQBiAGcAQgBmAEEASABJAEEAWQBRAEIAMwBBAEMAQQBBAEwAUQBCAFcAQQBHAEUAQQBiAEEAQgAxAEEARwBVAEEASQBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBCAGIAQQBEAEUAQQBYAFEAQQBLAEEAQwBRAEEAWgBRAEIANABBAEcAVQBBAFkAdwBCAGYAQQBIAGMAQQBjAGcAQgBoAEEASABBAEEAYwBBAEIAbABBAEgASQBBAEkAQQBBADkAQQBDAEEAQQBXAHcAQgBUAEEARwBNAEEAYwBnAEIAcABBAEgAQQBBAGQAQQBCAEMAQQBHAHcAQQBiAHcAQgBqAEEARwBzAEEAWABRAEEANgBBAEQAbwBBAFEAdwBCAHkAQQBHAFUAQQBZAFEAQgAwAEEARwBVAEEASwBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBCAGIAQQBEAEEAQQBYAFEAQQBwAEEAQQBvAEEASgBnAEEAawBBAEcAVQBBAGUAQQBCAGwAQQBHAE0AQQBYAHcAQgAzAEEASABJAEEAWQBRAEIAdwBBAEgAQQBBAFoAUQBCAHkAQQBBAD0APQA='
null: <127.0.0.1> (0, '{"changed":false,"msg":"The specified timezone: Eastern Standard Time_dstoff isn\\u0027t supported on the machine.","timezone":"Eastern Standard Time_dstoff","previous_timezone":"Eastern Standard Time","failed":true}\r\n', 'OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 19: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 610\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\n#< CLIXML\r\n<Objs Version="1.1.0.1" xmlns="http://schemas.microsoft.com/powershell/2004/04"><Obj S="progress" RefId="0"><TN RefId="0"><T>System.Management.Automation.PSCustomObject</T><T>System.Object</T></TN><MS><I64 N="SourceId">1</I64><PR N="Record"><AV>Preparing modules for first use.</AV><AI>0</AI><Nil /><PI>-1</PI><PC>-1</PC><T>Completed</T><SR>-1</SR><SD> </SD></PR></MS></Obj></Objs>debug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
null: fatal: [default]: FAILED! => {
null: "changed": false,
null: "msg": "The specified timezone: Eastern Standard Time_dstoff isn't supported on the machine.",
null: "timezone": "Eastern Standard Time_dstoff"
null: }
```
##### ADDITIONAL INFO
I confirmed both locally and through SSH that I can use tzutil /s "Eastern Standard Time_dstoff" and the change is accepted. Only through Ansible does it not accept the timezone change.
There is also no issue changing timezone for non-dstoff timezones:
```
null:
null: TASK [bootstrap : Set TimeZone] ************************************************
null: Monday 10 February 2020 09:01:46 -0500 (0:00:08.965) 0:00:09.037 *******
null: Monday 10 February 2020 09:01:46 -0500 (0:00:08.965) 0:00:09.037 *******
null: changed: [default] => {"changed": true, "previous_timezone": "Central Standard Time", "timezone": "Eastern Standard Time"}
null:
```
Working around the issue for now with:
```
- name: Set TimeZone
win_command: 'tzutil /s "Eastern Standard Time_dstoff"'
```
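The rejection comes from the module comparing the requested name against the `tzutil.exe /l` listing, which only contains the base timezone IDs and not their `_dstoff` variants, even though `tzutil /s` accepts them. Below is a minimal Python sketch of one way the check could be relaxed (stripping the suffix before the lookup is an assumption about the fix, the function and variable names are illustrative, and the real module does this in PowerShell):
```python
DSTOFF_SUFFIX = "_dstoff"

def timezone_supported(requested, tzutil_listing):
    """Accept 'Eastern Standard Time_dstoff' when 'Eastern Standard Time'
    appears in the `tzutil.exe /l` listing."""
    base = requested[:-len(DSTOFF_SUFFIX)] if requested.endswith(DSTOFF_SUFFIX) else requested
    return base in tzutil_listing

listing = ["Central Standard Time", "Eastern Standard Time"]  # sample of tzutil output
assert timezone_supported("Eastern Standard Time_dstoff", listing)
assert not timezone_supported("Nonexistent Time_dstoff", listing)
```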
|
https://github.com/ansible/ansible/issues/67273
|
https://github.com/ansible/ansible/pull/67892
|
64a28641586e384f1c35d5f573735f3e5045db20
|
2e38f80f9e5d6b46c5648e19bcb18b69dbc64762
| 2020-02-10T14:02:20Z |
python
| 2020-03-01T22:02:38Z |
changelogs/fragments/win_timezone-Allow-dstoff.yml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,273 |
win_timezone cannot set dstoff timezones
|
##### SUMMARY
When attempting to use the win_timezone module to set a timezone to "Eastern Standard Time_dstoff" an error displays saying it is not supported.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
win_timezone
##### ANSIBLE VERSION
ansible 2.9.3
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/home/zinkj/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.17 (default, Nov 7 2019, 10:07:09) [GCC 7.4.0]
##### CONFIGURATION
(Empty)
##### OS / ENVIRONMENT
Target OS: Win10 Enterprise Build 1909
##### STEPS TO REPRODUCE
```
- name: Set TimeZone
win_timezone:
timezone: 'Eastern Standard Time_dstoff'
```
##### EXPECTED RESULTS
Timezone is changed.
##### ACTUAL RESULTS
Timezone fails to change.
```
null: TASK [bootstrap : Set TimeZone] ************************************************
null: task path: /mnt/c/regfarm/framework/revitfarm/setup/node/ansible/roles/bootstrap/tasks/bootstrap_tasks.yml:1
null: Monday 10 February 2020 08:52:14 -0500 (0:00:09.675) 0:00:09.751 *******
null: Monday 10 February 2020 08:52:14 -0500 (0:00:09.675) 0:00:09.750 *******
null: Using module file /usr/lib/python2.7/dist-packages/ansible/modules/windows/win_timezone.ps1
null: Pipelining is enabled.
null: <127.0.0.1> ESTABLISH SSH CONNECTION FOR USER: <sensitive>
null: <127.0.0.1> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o Port=56707 -o 'IdentityFile="/tmp/ansible-key277440362"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey
-o PasswordAuthentication=no -o 'User="<sensitive>"' -o ConnectTimeout=1800 -o ControlPath=/home/zinkj/.ansible/cp/d221cde98b 127.0.0.1 'chcp.com 65001 > $null ; PowerShell -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -EncodedCommand UABvAHcAZQByAFMAaABlAGwAbAAgAC0ATgBvAFAAcgBvAGYAaQBsAGUAIAAtAE4AbwBuAEkAbgB0AGUAcgBhAGMAdABpAHYAZQAgAC0ARQB4AGUAYwB1AHQAaQBvAG4AUABvAGwAaQBjAHkAIABVAG4AcgBlAHMAdAByAGkAYwB0AGUAZAAgAC0ARQBuAGMAbwBkAGUAZABDAG8AbQBtAGEAbgBkACAASgBnAEIAagBBAEcAZwBBAFkAdwBCAHcAQQBDADQAQQBZAHcAQgB2AEEARwAwAEEASQBBAEEAMgBBAEQAVQBBAE0AQQBBAHcAQQBEAEUAQQBJAEEAQQArAEEAQwBBAEEASgBBAEIAdQBBAEgAVQBBAGIAQQBCAHMAQQBBAG8AQQBKAEEAQgBsAEEASABnAEEAWgBRAEIAagBBAEYAOABBAGQAdwBCAHkAQQBHAEUAQQBjAEEAQgB3AEEARwBVAEEAYwBnAEIAZgBBAEgATQBBAGQAQQBCAHkAQQBDAEEAQQBQAFEAQQBnAEEAQwBRAEEAYQBRAEIAdQBBAEgAQQBBAGQAUQBCADAAQQBDAEEAQQBmAEEAQQBnAEEARQA4AEEAZABRAEIAMABBAEMAMABBAFUAdwBCADAAQQBIAEkAQQBhAFEAQgB1AEEARwBjAEEAQwBnAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBBAGcAQQBEADAAQQBJAEEAQQBrAEEARwBVAEEAZQBBAEIAbABBAEcATQBBAFgAdwBCADMAQQBIAEkAQQBZAFEAQgB3AEEASABBAEEAWgBRAEIAeQBBAEYAOABBAGMAdwBCADAAQQBIAEkAQQBMAGcAQgBUAEEASABBAEEAYgBBAEIAcABBAEgAUQBBAEsAQQBCAEEAQQBDAGcAQQBJAGcAQgBnAEEARABBAEEAWQBBAEEAdwBBAEcAQQBBAE0AQQBCAGcAQQBEAEEAQQBJAGcAQQBwAEEAQwB3AEEASQBBAEEAeQBBAEMAdwBBAEkAQQBCAGIAQQBGAE0AQQBkAEEAQgB5AEEARwBrAEEAYgBnAEIAbgBBAEYATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBQAEEASABBAEEAZABBAEIAcABBAEcAOABBAGIAZwBCAHoAQQBGADAAQQBPAGcAQQA2AEEARgBJAEEAWgBRAEIAdABBAEcAOABBAGQAZwBCAGwAQQBFAFUAQQBiAFEAQgB3AEEASABRAEEAZQBRAEIARgBBAEcANABBAGQAQQBCAHkAQQBHAGsAQQBaAFEAQgB6AEEAQwBrAEEAQwBnAEIASgBBAEcAWQBBAEkAQQBBAG8AQQBDADAAQQBiAGcAQgB2AEEASABRAEEASQBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBBAHUAQQBFAHcAQQBaAFEAQgB1AEEARwBjAEEAZABBAEIAbwBBAEMAQQBBAEwAUQBCAGwAQQBIAEUAQQBJAEEAQQB5AEEAQwBrAEEASQBBAEIANwBBAEMAQQBBAGQAQQBCAG8AQQBIAEkAQQBiAHcAQgAzAEEAQwBBAEEASQBnAEIAcABBAEcANABBAGQAZwBCAGgAQQBHAHcAQQBhAFEAQgBrAEEAQwBBAEEAYwBBAEIAaABBAEgAawBBAGIAQQBCAHYAQQBHAEUAQQBaAEEAQQBpAEEAQwBBAEEAZgBRAEEASwBBAEYATQBBAFoAUQBCADAAQQBDADAAQQBWAGcAQgBoAEEASABJAEEAYQBRAEIAaABBAEcASQBBAGIAQQBCAGwAQQBDAEEAQQBMAFEAQgBPAEEARwBFAEEAYgBRAEIAbABBAEMAQQBBAGEAZwBCAHoAQQBHADgAQQBiAGcAQgBmAEEASABJAEEAWQBRAEIAMwBBAEMAQQBBAEwAUQBCAFcAQQBHAEUAQQBiAEEAQgAxAEEARwBVAEEASQBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBCAGIAQQBEAEUAQQBYAFEAQQBLAEEAQwBRAEEAWgBRAEIANABBAEcAVQBBAFkAdwBCAGYAQQBIAGMAQQBjAGcAQgBoAEEASABBAEEAYwBBAEIAbABBAEgASQBBAEkAQQBBADkAQQBDAEEAQQBXAHcAQgBUAEEARwBNAEEAYwBnAEIAcABBAEgAQQBBAGQAQQBCAEMAQQBHAHcAQQBiAHcAQgBqAEEARwBzAEEAWABRAEEANgBBAEQAbwBBAFEAdwBCAHkAQQBHAFUAQQBZAFEAQgAwAEEARwBVAEEASwBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBCAGIAQQBEAEEAQQBYAFEAQQBwAEEAQQBvAEEASgBnAEEAawBBAEcAVQBBAGUAQQBCAGwAQQBHAE0AQQBYAHcAQgAzAEEASABJAEEAWQBRAEIAdwBBAEgAQQBBAFoAUQBCAHkAQQBBAD0APQA='
null: <127.0.0.1> (0, '{"changed":false,"msg":"The specified timezone: Eastern Standard Time_dstoff isn\\u0027t supported on the machine.","timezone":"Eastern Standard Time_dstoff","previous_timezone":"Eastern Standard Time","failed":true}\r\n', 'OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 19: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 610\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\n#< CLIXML\r\n<Objs Version="1.1.0.1" xmlns="http://schemas.microsoft.com/powershell/2004/04"><Obj S="progress" RefId="0"><TN RefId="0"><T>System.Management.Automation.PSCustomObject</T><T>System.Object</T></TN><MS><I64 N="SourceId">1</I64><PR N="Record"><AV>Preparing modules for first use.</AV><AI>0</AI><Nil /><PI>-1</PI><PC>-1</PC><T>Completed</T><SR>-1</SR><SD> </SD></PR></MS></Obj></Objs>debug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
null: fatal: [default]: FAILED! => {
null: "changed": false,
null: "msg": "The specified timezone: Eastern Standard Time_dstoff isn't supported on the machine.",
null: "timezone": "Eastern Standard Time_dstoff"
null: }
```
##### ADDITIONAL INFO
I confirmed both locally and through SSH that I can use tzutil /s "Eastern Standard Time_dstoff" and the change is accepted. Only through Ansible does it not accept the timezone change.
There is also no issue changing timezone for non-dstoff timezones:
```
null:
null: TASK [bootstrap : Set TimeZone] ************************************************
null: Monday 10 February 2020 09:01:46 -0500 (0:00:08.965) 0:00:09.037 *******
null: Monday 10 February 2020 09:01:46 -0500 (0:00:08.965) 0:00:09.037 *******
null: changed: [default] => {"changed": true, "previous_timezone": "Central Standard Time", "timezone": "Eastern Standard Time"}
null:
```
Working around the issue for now with:
```
- name: Set TimeZone
win_command: 'tzutil /s "Eastern Standard Time_dstoff"'
```
|
https://github.com/ansible/ansible/issues/67273
|
https://github.com/ansible/ansible/pull/67892
|
64a28641586e384f1c35d5f573735f3e5045db20
|
2e38f80f9e5d6b46c5648e19bcb18b69dbc64762
| 2020-02-10T14:02:20Z |
python
| 2020-03-01T22:02:38Z |
lib/ansible/modules/windows/win_timezone.ps1
|
#!powershell
# Copyright: (c) 2015, Phil Schwartz <[email protected]>
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
#Requires -Module Ansible.ModuleUtils.Legacy
$params = Parse-Args $args -supports_check_mode $true
$check_mode = Get-AnsibleParam -obj $params -name "_ansible_check_mode" -type "bool" -default $false
$diff_support = Get-AnsibleParam -obj $params -name "_ansible_diff" -type "bool" -default $false
$timezone = Get-AnsibleParam -obj $params -name "timezone" -type "str" -failifempty $true
$result = @{
changed = $false
previous_timezone = $timezone
timezone = $timezone
}
Try {
# Get the current timezone set
$result.previous_timezone = $(tzutil.exe /g)
If ($LASTEXITCODE -ne 0) {
Throw "An error occurred when getting the current machine's timezone setting."
}
if ( $result.previous_timezone -eq $timezone ) {
Exit-Json $result "Timezone '$timezone' is already set on this machine"
} Else {
# Check that timezone is listed as an available timezone to the machine
$tzList = $(tzutil.exe /l)
If ($LASTEXITCODE -ne 0) {
Throw "An error occurred when listing the available timezones."
}
$tzExists = $false
ForEach ($tz in $tzList) {
If ( $tz -eq $timezone ) {
$tzExists = $true
break
}
}
if (-not $tzExists) {
Fail-Json $result "The specified timezone: $timezone isn't supported on the machine."
}
if ($check_mode) {
$result.changed = $true
} else {
tzutil.exe /s "$timezone"
if ($LASTEXITCODE -ne 0) {
Throw "An error occurred when setting the specified timezone with tzutil."
}
$new_timezone = $(tzutil.exe /g)
if ($LASTEXITCODE -ne 0) {
Throw "An error occurred when getting the current machine's timezone setting."
}
if ($timezone -eq $new_timezone) {
$result.changed = $true
}
}
if ($diff_support) {
$result.diff = @{
before = "$($result.previous_timezone)`n"
after = "$timezone`n"
}
}
}
} Catch {
Fail-Json $result "Error setting timezone to: $timezone."
}
Exit-Json $result
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,273 |
win_timezone cannot set dstoff timezones
|
##### SUMMARY
When attempting to use the win_timezone module to set a timezone to "Eastern Standard Time_dstoff" an error displays saying it is not supported.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
win_timezone
##### ANSIBLE VERSION
ansible 2.9.3
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/home/zinkj/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.17 (default, Nov 7 2019, 10:07:09) [GCC 7.4.0]
##### CONFIGURATION
(Empty)
##### OS / ENVIRONMENT
Target OS: Win10 Enterprise Build 1909
##### STEPS TO REPRODUCE
```
- name: Set TimeZone
win_timezone:
timezone: 'Eastern Standard Time_dstoff'
```
##### EXPECTED RESULTS
Timezone is changed.
##### ACTUAL RESULTS
Timezone fails to change.
```
null: TASK [bootstrap : Set TimeZone] ************************************************
null: task path: /mnt/c/regfarm/framework/revitfarm/setup/node/ansible/roles/bootstrap/tasks/bootstrap_tasks.yml:1
null: Monday 10 February 2020 08:52:14 -0500 (0:00:09.675) 0:00:09.751 *******
null: Monday 10 February 2020 08:52:14 -0500 (0:00:09.675) 0:00:09.750 *******
null: Using module file /usr/lib/python2.7/dist-packages/ansible/modules/windows/win_timezone.ps1
null: Pipelining is enabled.
null: <127.0.0.1> ESTABLISH SSH CONNECTION FOR USER: <sensitive>
null: <127.0.0.1> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o Port=56707 -o 'IdentityFile="/tmp/ansible-key277440362"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey
-o PasswordAuthentication=no -o 'User="<sensitive>"' -o ConnectTimeout=1800 -o ControlPath=/home/zinkj/.ansible/cp/d221cde98b 127.0.0.1 'chcp.com 65001 > $null ; PowerShell -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -EncodedCommand UABvAHcAZQByAFMAaABlAGwAbAAgAC0ATgBvAFAAcgBvAGYAaQBsAGUAIAAtAE4AbwBuAEkAbgB0AGUAcgBhAGMAdABpAHYAZQAgAC0ARQB4AGUAYwB1AHQAaQBvAG4AUABvAGwAaQBjAHkAIABVAG4AcgBlAHMAdAByAGkAYwB0AGUAZAAgAC0ARQBuAGMAbwBkAGUAZABDAG8AbQBtAGEAbgBkACAASgBnAEIAagBBAEcAZwBBAFkAdwBCAHcAQQBDADQAQQBZAHcAQgB2AEEARwAwAEEASQBBAEEAMgBBAEQAVQBBAE0AQQBBAHcAQQBEAEUAQQBJAEEAQQArAEEAQwBBAEEASgBBAEIAdQBBAEgAVQBBAGIAQQBCAHMAQQBBAG8AQQBKAEEAQgBsAEEASABnAEEAWgBRAEIAagBBAEYAOABBAGQAdwBCAHkAQQBHAEUAQQBjAEEAQgB3AEEARwBVAEEAYwBnAEIAZgBBAEgATQBBAGQAQQBCAHkAQQBDAEEAQQBQAFEAQQBnAEEAQwBRAEEAYQBRAEIAdQBBAEgAQQBBAGQAUQBCADAAQQBDAEEAQQBmAEEAQQBnAEEARQA4AEEAZABRAEIAMABBAEMAMABBAFUAdwBCADAAQQBIAEkAQQBhAFEAQgB1AEEARwBjAEEAQwBnAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBBAGcAQQBEADAAQQBJAEEAQQBrAEEARwBVAEEAZQBBAEIAbABBAEcATQBBAFgAdwBCADMAQQBIAEkAQQBZAFEAQgB3AEEASABBAEEAWgBRAEIAeQBBAEYAOABBAGMAdwBCADAAQQBIAEkAQQBMAGcAQgBUAEEASABBAEEAYgBBAEIAcABBAEgAUQBBAEsAQQBCAEEAQQBDAGcAQQBJAGcAQgBnAEEARABBAEEAWQBBAEEAdwBBAEcAQQBBAE0AQQBCAGcAQQBEAEEAQQBJAGcAQQBwAEEAQwB3AEEASQBBAEEAeQBBAEMAdwBBAEkAQQBCAGIAQQBGAE0AQQBkAEEAQgB5AEEARwBrAEEAYgBnAEIAbgBBAEYATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBQAEEASABBAEEAZABBAEIAcABBAEcAOABBAGIAZwBCAHoAQQBGADAAQQBPAGcAQQA2AEEARgBJAEEAWgBRAEIAdABBAEcAOABBAGQAZwBCAGwAQQBFAFUAQQBiAFEAQgB3AEEASABRAEEAZQBRAEIARgBBAEcANABBAGQAQQBCAHkAQQBHAGsAQQBaAFEAQgB6AEEAQwBrAEEAQwBnAEIASgBBAEcAWQBBAEkAQQBBAG8AQQBDADAAQQBiAGcAQgB2AEEASABRAEEASQBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBBAHUAQQBFAHcAQQBaAFEAQgB1AEEARwBjAEEAZABBAEIAbwBBAEMAQQBBAEwAUQBCAGwAQQBIAEUAQQBJAEEAQQB5AEEAQwBrAEEASQBBAEIANwBBAEMAQQBBAGQAQQBCAG8AQQBIAEkAQQBiAHcAQgAzAEEAQwBBAEEASQBnAEIAcABBAEcANABBAGQAZwBCAGgAQQBHAHcAQQBhAFEAQgBrAEEAQwBBAEEAYwBBAEIAaABBAEgAawBBAGIAQQBCAHYAQQBHAEUAQQBaAEEAQQBpAEEAQwBBAEEAZgBRAEEASwBBAEYATQBBAFoAUQBCADAAQQBDADAAQQBWAGcAQgBoAEEASABJAEEAYQBRAEIAaABBAEcASQBBAGIAQQBCAGwAQQBDAEEAQQBMAFEAQgBPAEEARwBFAEEAYgBRAEIAbABBAEMAQQBBAGEAZwBCAHoAQQBHADgAQQBiAGcAQgBmAEEASABJAEEAWQBRAEIAMwBBAEMAQQBBAEwAUQBCAFcAQQBHAEUAQQBiAEEAQgAxAEEARwBVAEEASQBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBCAGIAQQBEAEUAQQBYAFEAQQBLAEEAQwBRAEEAWgBRAEIANABBAEcAVQBBAFkAdwBCAGYAQQBIAGMAQQBjAGcAQgBoAEEASABBAEEAYwBBAEIAbABBAEgASQBBAEkAQQBBADkAQQBDAEEAQQBXAHcAQgBUAEEARwBNAEEAYwBnAEIAcABBAEgAQQBBAGQAQQBCAEMAQQBHAHcAQQBiAHcAQgBqAEEARwBzAEEAWABRAEEANgBBAEQAbwBBAFEAdwBCAHkAQQBHAFUAQQBZAFEAQgAwAEEARwBVAEEASwBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBCAGIAQQBEAEEAQQBYAFEAQQBwAEEAQQBvAEEASgBnAEEAawBBAEcAVQBBAGUAQQBCAGwAQQBHAE0AQQBYAHcAQgAzAEEASABJAEEAWQBRAEIAdwBBAEgAQQBBAFoAUQBCAHkAQQBBAD0APQA='
null: <127.0.0.1> (0, '{"changed":false,"msg":"The specified timezone: Eastern Standard Time_dstoff isn\\u0027t supported on the machine.","timezone":"Eastern Standard Time_dstoff","previous_timezone":"Eastern Standard Time","failed":true}\r\n', 'OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 19: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 610\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\n#< CLIXML\r\n<Objs Version="1.1.0.1" xmlns="http://schemas.microsoft.com/powershell/2004/04"><Obj S="progress" RefId="0"><TN RefId="0"><T>System.Management.Automation.PSCustomObject</T><T>System.Object</T></TN><MS><I64 N="SourceId">1</I64><PR N="Record"><AV>Preparing modules for first use.</AV><AI>0</AI><Nil /><PI>-1</PI><PC>-1</PC><T>Completed</T><SR>-1</SR><SD> </SD></PR></MS></Obj></Objs>debug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
null: fatal: [default]: FAILED! => {
null: "changed": false,
null: "msg": "The specified timezone: Eastern Standard Time_dstoff isn't supported on the machine.",
null: "timezone": "Eastern Standard Time_dstoff"
null: }
```
##### ADDITIONAL INFO
I confirmed both locally and through SSH that I can use tzutil /s "Eastern Standard Time_dstoff" and the change is accepted. Only through Ansible does it not accept the timezone change.
There is also no issue changing timezone for non-dstoff timezones:
```
null:
null: TASK [bootstrap : Set TimeZone] ************************************************
null: Monday 10 February 2020 09:01:46 -0500 (0:00:08.965) 0:00:09.037 *******
null: Monday 10 February 2020 09:01:46 -0500 (0:00:08.965) 0:00:09.037 *******
null: changed: [default] => {"changed": true, "previous_timezone": "Central Standard Time", "timezone": "Eastern Standard Time"}
null:
```
Working around the issue for now with:
```
- name: Set TimeZone
win_command: 'tzutil /s "Eastern Standard Time_dstoff"'
```
|
https://github.com/ansible/ansible/issues/67273
|
https://github.com/ansible/ansible/pull/67892
|
64a28641586e384f1c35d5f573735f3e5045db20
|
2e38f80f9e5d6b46c5648e19bcb18b69dbc64762
| 2020-02-10T14:02:20Z |
python
| 2020-03-01T22:02:38Z |
lib/ansible/modules/windows/win_timezone.py
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright: (c) 2015, Phil Schwartz <[email protected]>
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
ANSIBLE_METADATA = {'metadata_version': '1.1',
'status': ['preview'],
'supported_by': 'community'}
DOCUMENTATION = r'''
---
module: win_timezone
version_added: '2.1'
short_description: Sets Windows machine timezone
description:
- Sets machine time to the specified timezone.
options:
timezone:
description:
- Timezone to set to.
- 'Example: Central Standard Time'
type: str
required: yes
notes:
- The module will check if the provided timezone is supported on the machine.
- A list of possible timezones is available from C(tzutil.exe /l) and from
U(https://msdn.microsoft.com/en-us/library/ms912391.aspx)
- If running on Server 2008 the hotfix
U(https://support.microsoft.com/en-us/help/2556308/tzutil-command-line-tool-is-added-to-windows-vista-and-to-windows-server-2008)
needs to be installed to be able to run this module.
seealso:
- module: win_region
author:
- Phil Schwartz (@schwartzmx)
'''
EXAMPLES = r'''
- name: Set timezone to 'Romance Standard Time' (GMT+01:00)
win_timezone:
timezone: Romance Standard Time
- name: Set timezone to 'GMT Standard Time' (GMT)
win_timezone:
timezone: GMT Standard Time
- name: Set timezone to 'Central Standard Time' (GMT-06:00)
win_timezone:
timezone: Central Standard Time
'''
RETURN = r'''
previous_timezone:
description: The previous timezone if it was changed, otherwise the existing timezone.
returned: success
type: str
sample: Central Standard Time
timezone:
description: The current timezone (possibly changed).
returned: success
type: str
sample: Central Standard Time
'''
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,273 |
win_timezone cannot set dstoff timezones
|
##### SUMMARY
When attempting to use the win_timezone module to set the timezone to "Eastern Standard Time_dstoff", an error is displayed saying it is not supported.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
win_timezone
##### ANSIBLE VERSION
ansible 2.9.3
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/home/zinkj/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.17 (default, Nov 7 2019, 10:07:09) [GCC 7.4.0]
##### CONFIGURATION
(Empty)
##### OS / ENVIRONMENT
Target OS: Win10 Enterprise Build 1909
##### STEPS TO REPRODUCE
```
- name: Set TimeZone
win_timezone:
timezone: 'Eastern Standard Time_dstoff'
```
##### EXPECTED RESULTS
Timezone is changed.
##### ACTUAL RESULTS
Timezone fails to change.
```
null: TASK [bootstrap : Set TimeZone] ************************************************
null: task path: /mnt/c/regfarm/framework/revitfarm/setup/node/ansible/roles/bootstrap/tasks/bootstrap_tasks.yml:1
null: Monday 10 February 2020 08:52:14 -0500 (0:00:09.675) 0:00:09.751 *******
null: Monday 10 February 2020 08:52:14 -0500 (0:00:09.675) 0:00:09.750 *******
null: Using module file /usr/lib/python2.7/dist-packages/ansible/modules/windows/win_timezone.ps1
null: Pipelining is enabled.
null: <127.0.0.1> ESTABLISH SSH CONNECTION FOR USER: <sensitive>
null: <127.0.0.1> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o Port=56707 -o 'IdentityFile="/tmp/ansible-key277440362"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey
-o PasswordAuthentication=no -o 'User="<sensitive>"' -o ConnectTimeout=1800 -o ControlPath=/home/zinkj/.ansible/cp/d221cde98b 127.0.0.1 'chcp.com 65001 > $null ; PowerShell -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -EncodedCommand UABvAHcAZQByAFMAaABlAGwAbAAgAC0ATgBvAFAAcgBvAGYAaQBsAGUAIAAtAE4AbwBuAEkAbgB0AGUAcgBhAGMAdABpAHYAZQAgAC0ARQB4AGUAYwB1AHQAaQBvAG4AUABvAGwAaQBjAHkAIABVAG4AcgBlAHMAdAByAGkAYwB0AGUAZAAgAC0ARQBuAGMAbwBkAGUAZABDAG8AbQBtAGEAbgBkACAASgBnAEIAagBBAEcAZwBBAFkAdwBCAHcAQQBDADQAQQBZAHcAQgB2AEEARwAwAEEASQBBAEEAMgBBAEQAVQBBAE0AQQBBAHcAQQBEAEUAQQBJAEEAQQArAEEAQwBBAEEASgBBAEIAdQBBAEgAVQBBAGIAQQBCAHMAQQBBAG8AQQBKAEEAQgBsAEEASABnAEEAWgBRAEIAagBBAEYAOABBAGQAdwBCAHkAQQBHAEUAQQBjAEEAQgB3AEEARwBVAEEAYwBnAEIAZgBBAEgATQBBAGQAQQBCAHkAQQBDAEEAQQBQAFEAQQBnAEEAQwBRAEEAYQBRAEIAdQBBAEgAQQBBAGQAUQBCADAAQQBDAEEAQQBmAEEAQQBnAEEARQA4AEEAZABRAEIAMABBAEMAMABBAFUAdwBCADAAQQBIAEkAQQBhAFEAQgB1AEEARwBjAEEAQwBnAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBBAGcAQQBEADAAQQBJAEEAQQBrAEEARwBVAEEAZQBBAEIAbABBAEcATQBBAFgAdwBCADMAQQBIAEkAQQBZAFEAQgB3AEEASABBAEEAWgBRAEIAeQBBAEYAOABBAGMAdwBCADAAQQBIAEkAQQBMAGcAQgBUAEEASABBAEEAYgBBAEIAcABBAEgAUQBBAEsAQQBCAEEAQQBDAGcAQQBJAGcAQgBnAEEARABBAEEAWQBBAEEAdwBBAEcAQQBBAE0AQQBCAGcAQQBEAEEAQQBJAGcAQQBwAEEAQwB3AEEASQBBAEEAeQBBAEMAdwBBAEkAQQBCAGIAQQBGAE0AQQBkAEEAQgB5AEEARwBrAEEAYgBnAEIAbgBBAEYATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBQAEEASABBAEEAZABBAEIAcABBAEcAOABBAGIAZwBCAHoAQQBGADAAQQBPAGcAQQA2AEEARgBJAEEAWgBRAEIAdABBAEcAOABBAGQAZwBCAGwAQQBFAFUAQQBiAFEAQgB3AEEASABRAEEAZQBRAEIARgBBAEcANABBAGQAQQBCAHkAQQBHAGsAQQBaAFEAQgB6AEEAQwBrAEEAQwBnAEIASgBBAEcAWQBBAEkAQQBBAG8AQQBDADAAQQBiAGcAQgB2AEEASABRAEEASQBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBBAHUAQQBFAHcAQQBaAFEAQgB1AEEARwBjAEEAZABBAEIAbwBBAEMAQQBBAEwAUQBCAGwAQQBIAEUAQQBJAEEAQQB5AEEAQwBrAEEASQBBAEIANwBBAEMAQQBBAGQAQQBCAG8AQQBIAEkAQQBiAHcAQgAzAEEAQwBBAEEASQBnAEIAcABBAEcANABBAGQAZwBCAGgAQQBHAHcAQQBhAFEAQgBrAEEAQwBBAEEAYwBBAEIAaABBAEgAawBBAGIAQQBCAHYAQQBHAEUAQQBaAEEAQQBpAEEAQwBBAEEAZgBRAEEASwBBAEYATQBBAFoAUQBCADAAQQBDADAAQQBWAGcAQgBoAEEASABJAEEAYQBRAEIAaABBAEcASQBBAGIAQQBCAGwAQQBDAEEAQQBMAFEAQgBPAEEARwBFAEEAYgBRAEIAbABBAEMAQQBBAGEAZwBCAHoAQQBHADgAQQBiAGcAQgBmAEEASABJAEEAWQBRAEIAMwBBAEMAQQBBAEwAUQBCAFcAQQBHAEUAQQBiAEEAQgAxAEEARwBVAEEASQBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBCAGIAQQBEAEUAQQBYAFEAQQBLAEEAQwBRAEEAWgBRAEIANABBAEcAVQBBAFkAdwBCAGYAQQBIAGMAQQBjAGcAQgBoAEEASABBAEEAYwBBAEIAbABBAEgASQBBAEkAQQBBADkAQQBDAEEAQQBXAHcAQgBUAEEARwBNAEEAYwBnAEIAcABBAEgAQQBBAGQAQQBCAEMAQQBHAHcAQQBiAHcAQgBqAEEARwBzAEEAWABRAEEANgBBAEQAbwBBAFEAdwBCAHkAQQBHAFUAQQBZAFEAQgAwAEEARwBVAEEASwBBAEEAawBBAEgATQBBAGMAQQBCAHMAQQBHAGsAQQBkAEEAQgBmAEEASABBAEEAWQBRAEIAeQBBAEgAUQBBAGMAdwBCAGIAQQBEAEEAQQBYAFEAQQBwAEEAQQBvAEEASgBnAEEAawBBAEcAVQBBAGUAQQBCAGwAQQBHAE0AQQBYAHcAQgAzAEEASABJAEEAWQBRAEIAdwBBAEgAQQBBAFoAUQBCAHkAQQBBAD0APQA='
null: <127.0.0.1> (0, '{"changed":false,"msg":"The specified timezone: Eastern Standard Time_dstoff isn\\u0027t supported on the machine.","timezone":"Eastern Standard Time_dstoff","previous_timezone":"Eastern Standard Time","failed":true}\r\n', 'OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 19: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 610\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\n#< CLIXML\r\n<Objs Version="1.1.0.1" xmlns="http://schemas.microsoft.com/powershell/2004/04"><Obj S="progress" RefId="0"><TN RefId="0"><T>System.Management.Automation.PSCustomObject</T><T>System.Object</T></TN><MS><I64 N="SourceId">1</I64><PR N="Record"><AV>Preparing modules for first use.</AV><AI>0</AI><Nil /><PI>-1</PI><PC>-1</PC><T>Completed</T><SR>-1</SR><SD> </SD></PR></MS></Obj></Objs>debug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
null: fatal: [default]: FAILED! => {
null: "changed": false,
null: "msg": "The specified timezone: Eastern Standard Time_dstoff isn't supported on the machine.",
null: "timezone": "Eastern Standard Time_dstoff"
null: }
```
##### ADDITIONAL INFO
I confirmed both locally and through SSH that I can run tzutil /s "Eastern Standard Time_dstoff" and the change is accepted. Only through Ansible is the timezone change rejected.
There is also no issue changing the timezone to a non-dstoff timezone:
```
null:
null: TASK [bootstrap : Set TimeZone] ************************************************
null: Monday 10 February 2020 09:01:46 -0500 (0:00:08.965) 0:00:09.037 *******
null: Monday 10 February 2020 09:01:46 -0500 (0:00:08.965) 0:00:09.037 *******
null: changed: [default] => {"changed": true, "previous_timezone": "Central Standard Time", "timezone": "Eastern Standard Time"}
null:
```
Working around the issue for now with:
```
- name: Set TimeZone
win_command: 'tzutil /s "Eastern Standard Time_dstoff"'
```
|
https://github.com/ansible/ansible/issues/67273
|
https://github.com/ansible/ansible/pull/67892
|
64a28641586e384f1c35d5f573735f3e5045db20
|
2e38f80f9e5d6b46c5648e19bcb18b69dbc64762
| 2020-02-10T14:02:20Z |
python
| 2020-03-01T22:02:38Z |
test/integration/targets/win_timezone/tasks/tests.yml
|
# NOTE: Set to a known starting value, store original
- name: Change starting timezone to GMT
win_timezone:
timezone: GMT Standard Time
register: original
# NOTE: We don't know if it changed, we don't care
- name: Test GMT timezone
assert:
that:
- original.timezone == 'GMT Standard Time'
- name: Change timezone to GMT+1
win_timezone:
timezone: Romance Standard Time
register: romance
- name: Test GMT+1 timezone
assert:
that:
- romance is changed
- romance.previous_timezone == 'GMT Standard Time'
- romance.timezone == 'Romance Standard Time'
when: not in_check_mode
- name: Test GMT+1 timezone
assert:
that:
- romance is changed
- romance.previous_timezone == original.timezone
- romance.timezone == 'Romance Standard Time'
when: in_check_mode
- name: Change timezone to GMT+1 again
win_timezone:
timezone: Romance Standard Time
register: romance
- name: Test GMT+1 timezone
assert:
that:
- romance is not changed
- romance.previous_timezone == 'Romance Standard Time'
- romance.timezone == 'Romance Standard Time'
when: not in_check_mode
- name: Test GMT+1 timezone
assert:
that:
- romance is changed
- romance.previous_timezone == original.timezone
- romance.timezone == 'Romance Standard Time'
when: in_check_mode
- name: Change timezone to GMT-6
win_timezone:
timezone: Central Standard Time
register: central
- name: Test GMT-6 timezone
assert:
that:
- central is changed
- central.previous_timezone == 'Romance Standard Time'
- central.timezone == 'Central Standard Time'
when: not in_check_mode
- name: Test GMT-6 timezone
assert:
that:
- central is changed
- central.previous_timezone == original.timezone
- central.timezone == 'Central Standard Time'
when: in_check_mode
- name: Change timezone to GMT+666
win_timezone:
timezone: Dag's Standard Time
register: dag
ignore_errors: yes
- name: Test GMT+666 timezone
assert:
that:
- dag is failed
- name: Restore original timezone
win_timezone:
timezone: '{{ original.timezone }}'
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 64,905 |
Add --pre flag for ansible-galaxy client
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
We publish a semver pre-release collection (vyos.vyos:0.0.1-dev58) to Ansible Galaxy; however, by default it seems the ansible-galaxy client will not be able to find it. You must specify the version number directly to fetch it.
https://galaxy.ansible.com/vyos/vyos
EDIT:
This is working as expected, but it would be nice to include:
--pre Include pre-release and development versions. By default, ansible-galaxy only finds stable versions.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ansible-galaxy
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.1
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
```
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
$ ansible-galaxy collection install vyos.vyos -p /tmp/11123
[WARNING]: The specified collections path '/tmp/11123' is not part of the configured Ansible collections paths '/home/pabelanger/.ansible/collections:/usr/share/ansible/collections'. The
installed collection won't be picked up in an Ansible run.
Process install dependency map
ERROR! Cannot meet requirement * for dependency vyos.vyos from source 'https://galaxy.ansible.com/api/'. Available versions before last requirement added:
Requirements from:
base - 'vyos.vyos:*'
(venv) [pabelanger@localhost openstack]$ ansible-galaxy collection install vyos.vyos:0.0.1-dev57 -p /tmp/11123
[WARNING]: The specified collections path '/tmp/11123' is not part of the configured Ansible collections paths '/home/pabelanger/.ansible/collections:/usr/share/ansible/collections'. The
installed collection won't be picked up in an Ansible run.
Process install dependency map
Starting collection install process
Installing 'vyos.vyos:0.0.1-dev57' to '/tmp/11123/ansible_collections/vyos/vyos'
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
vyos.vyos collection to be found and installed
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
```
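For context, what the reporter hit is standard semver behaviour: versions with a pre-release component (such as 0.0.1-dev57) are excluded when resolving a wildcard requirement, and the linked PR adds an opt-in flag to include them. Below is a minimal sketch of that filtering idea, assuming semver-style version strings and a hypothetical `is_prerelease` helper; it is not the actual galaxy client code.
```python
# Illustrative only, not the ansible-galaxy implementation.
def is_prerelease(version):
    # Semver: anything after a '-' in the core version is a pre-release tag,
    # e.g. "0.0.1-dev57". Build metadata after '+' is split off first.
    core = version.split("+", 1)[0]
    return "-" in core


def candidate_versions(available, allow_pre_release=False):
    """Return the versions eligible for a '*' requirement."""
    if allow_pre_release:
        return list(available)
    return [v for v in available if not is_prerelease(v)]


versions = ["0.0.1-dev57", "0.0.1-dev58", "0.1.0"]
print(candidate_versions(versions))                          # ['0.1.0']
print(candidate_versions(versions, allow_pre_release=True))  # all three
```
With a flag like the requested `--pre` wired through to that filter, `ansible-galaxy collection install vyos.vyos --pre` would pick up the dev builds without naming a version explicitly.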
|
https://github.com/ansible/ansible/issues/64905
|
https://github.com/ansible/ansible/pull/68258
|
ed9de94ad92dcc07ea3863808e0f4b00f2402cea
|
d3ec31f8d5683926aa6a05bb573d9929a6266fac
| 2019-11-15T17:10:49Z |
python
| 2020-03-23T21:04:07Z |
changelogs/fragments/64905-semver.yml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 64,905 |
Add --pre flag for ansible-galaxy client
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
We publish a semver pre-release collection (vyos.vyos:0.0.1-dev58) to Ansible Galaxy; however, by default it seems the ansible-galaxy client will not be able to find it. You must specify the version number directly to fetch it.
https://galaxy.ansible.com/vyos/vyos
EDIT:
This is working as expected, but it would be nice to include:
--pre Include pre-release and development versions. By default, ansible-galaxy only finds stable versions.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ansible-galaxy
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.1
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
```
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
$ ansible-galaxy collection install vyos.vyos -p /tmp/11123
[WARNING]: The specified collections path '/tmp/11123' is not part of the configured Ansible collections paths '/home/pabelanger/.ansible/collections:/usr/share/ansible/collections'. The
installed collection won't be picked up in an Ansible run.
Process install dependency map
ERROR! Cannot meet requirement * for dependency vyos.vyos from source 'https://galaxy.ansible.com/api/'. Available versions before last requirement added:
Requirements from:
base - 'vyos.vyos:*'
(venv) [pabelanger@localhost openstack]$ ansible-galaxy collection install vyos.vyos:0.0.1-dev57 -p /tmp/11123
[WARNING]: The specified collections path '/tmp/11123' is not part of the configured Ansible collections paths '/home/pabelanger/.ansible/collections:/usr/share/ansible/collections'. The
installed collection won't be picked up in an Ansible run.
Process install dependency map
Starting collection install process
Installing 'vyos.vyos:0.0.1-dev57' to '/tmp/11123/ansible_collections/vyos/vyos'
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
vyos.vyos collection to be found and installed
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
```
|
https://github.com/ansible/ansible/issues/64905
|
https://github.com/ansible/ansible/pull/68258
|
ed9de94ad92dcc07ea3863808e0f4b00f2402cea
|
d3ec31f8d5683926aa6a05bb573d9929a6266fac
| 2019-11-15T17:10:49Z |
python
| 2020-03-23T21:04:07Z |
lib/ansible/cli/galaxy.py
|
# Copyright: (c) 2013, James Cammarata <[email protected]>
# Copyright: (c) 2018, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import os.path
import re
import shutil
import textwrap
import time
import yaml
from jinja2 import BaseLoader, Environment, FileSystemLoader
from yaml.error import YAMLError
import ansible.constants as C
from ansible import context
from ansible.cli import CLI
from ansible.cli.arguments import option_helpers as opt_help
from ansible.errors import AnsibleError, AnsibleOptionsError
from ansible.galaxy import Galaxy, get_collections_galaxy_meta_info
from ansible.galaxy.api import GalaxyAPI
from ansible.galaxy.collection import (
build_collection,
CollectionRequirement,
find_existing_collections,
install_collections,
publish_collection,
validate_collection_name,
validate_collection_path,
verify_collections
)
from ansible.galaxy.login import GalaxyLogin
from ansible.galaxy.role import GalaxyRole
from ansible.galaxy.token import BasicAuthToken, GalaxyToken, KeycloakToken, NoTokenSentinel
from ansible.module_utils.ansible_release import __version__ as ansible_version
from ansible.module_utils.common.collections import is_iterable
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.module_utils import six
from ansible.parsing.yaml.loader import AnsibleLoader
from ansible.playbook.role.requirement import RoleRequirement
from ansible.utils.display import Display
from ansible.utils.plugin_docs import get_versioned_doclink
display = Display()
urlparse = six.moves.urllib.parse.urlparse
def _display_header(path, h1, h2, w1=10, w2=7):
display.display('\n# {0}\n{1:{cwidth}} {2:{vwidth}}\n{3} {4}\n'.format(
path,
h1,
h2,
'-' * max([len(h1), w1]), # Make sure that the number of dashes is at least the width of the header
'-' * max([len(h2), w2]),
cwidth=w1,
vwidth=w2,
))
def _display_role(gr):
install_info = gr.install_info
version = None
if install_info:
version = install_info.get("version", None)
if not version:
version = "(unknown version)"
display.display("- %s, %s" % (gr.name, version))
def _display_collection(collection, cwidth=10, vwidth=7, min_cwidth=10, min_vwidth=7):
display.display('{fqcn:{cwidth}} {version:{vwidth}}'.format(
fqcn=to_text(collection),
version=collection.latest_version,
cwidth=max(cwidth, min_cwidth), # Make sure the width isn't smaller than the header
vwidth=max(vwidth, min_vwidth)
))
def _get_collection_widths(collections):
if is_iterable(collections):
fqcn_set = set(to_text(c) for c in collections)
version_set = set(to_text(c.latest_version) for c in collections)
else:
fqcn_set = set([to_text(collections)])
version_set = set([collections.latest_version])
fqcn_length = len(max(fqcn_set, key=len))
version_length = len(max(version_set, key=len))
return fqcn_length, version_length
class GalaxyCLI(CLI):
'''command to manage Ansible roles in shared repositories, the default of which is Ansible Galaxy *https://galaxy.ansible.com*.'''
SKIP_INFO_KEYS = ("name", "description", "readme_html", "related", "summary_fields", "average_aw_composite", "average_aw_score", "url")
def __init__(self, args):
# Inject role into sys.argv[1] as a backwards compatibility step
if len(args) > 1 and args[1] not in ['-h', '--help', '--version'] and 'role' not in args and 'collection' not in args:
# TODO: Should we add a warning here and eventually deprecate the implicit role subcommand choice
# Remove this in Ansible 2.13 when we also remove -v as an option on the root parser for ansible-galaxy.
idx = 2 if args[1].startswith('-v') else 1
args.insert(idx, 'role')
self.api_servers = []
self.galaxy = None
super(GalaxyCLI, self).__init__(args)
def init_parser(self):
''' create an options parser for bin/ansible '''
super(GalaxyCLI, self).init_parser(
desc="Perform various Role and Collection related operations.",
)
# Common arguments that apply to more than 1 action
common = opt_help.argparse.ArgumentParser(add_help=False)
common.add_argument('-s', '--server', dest='api_server', help='The Galaxy API server URL')
common.add_argument('--token', '--api-key', dest='api_key',
help='The Ansible Galaxy API key which can be found at '
'https://galaxy.ansible.com/me/preferences. You can also use ansible-galaxy login to '
'retrieve this key or set the token for the GALAXY_SERVER_LIST entry.')
common.add_argument('-c', '--ignore-certs', action='store_true', dest='ignore_certs',
default=C.GALAXY_IGNORE_CERTS, help='Ignore SSL certificate validation errors.')
opt_help.add_verbosity_options(common)
force = opt_help.argparse.ArgumentParser(add_help=False)
force.add_argument('-f', '--force', dest='force', action='store_true', default=False,
help='Force overwriting an existing role or collection')
github = opt_help.argparse.ArgumentParser(add_help=False)
github.add_argument('github_user', help='GitHub username')
github.add_argument('github_repo', help='GitHub repository')
offline = opt_help.argparse.ArgumentParser(add_help=False)
offline.add_argument('--offline', dest='offline', default=False, action='store_true',
help="Don't query the galaxy API when creating roles")
default_roles_path = C.config.get_configuration_definition('DEFAULT_ROLES_PATH').get('default', '')
roles_path = opt_help.argparse.ArgumentParser(add_help=False)
roles_path.add_argument('-p', '--roles-path', dest='roles_path', type=opt_help.unfrack_path(pathsep=True),
default=C.DEFAULT_ROLES_PATH, action=opt_help.PrependListAction,
help='The path to the directory containing your roles. The default is the first '
'writable one configured via DEFAULT_ROLES_PATH: %s ' % default_roles_path)
collections_path = opt_help.argparse.ArgumentParser(add_help=False)
collections_path.add_argument('-p', '--collection-path', dest='collections_path', type=opt_help.unfrack_path(pathsep=True),
default=C.COLLECTIONS_PATHS, action=opt_help.PrependListAction,
help="One or more directories to search for collections in addition "
"to the default COLLECTIONS_PATHS. Separate multiple paths "
"with '{0}'.".format(os.path.pathsep))
# Add sub parser for the Galaxy role type (role or collection)
type_parser = self.parser.add_subparsers(metavar='TYPE', dest='type')
type_parser.required = True
# Add sub parser for the Galaxy collection actions
collection = type_parser.add_parser('collection', help='Manage an Ansible Galaxy collection.')
collection_parser = collection.add_subparsers(metavar='COLLECTION_ACTION', dest='action')
collection_parser.required = True
self.add_init_options(collection_parser, parents=[common, force])
self.add_build_options(collection_parser, parents=[common, force])
self.add_publish_options(collection_parser, parents=[common])
self.add_install_options(collection_parser, parents=[common, force])
self.add_list_options(collection_parser, parents=[common, collections_path])
self.add_verify_options(collection_parser, parents=[common, collections_path])
# Add sub parser for the Galaxy role actions
role = type_parser.add_parser('role', help='Manage an Ansible Galaxy role.')
role_parser = role.add_subparsers(metavar='ROLE_ACTION', dest='action')
role_parser.required = True
self.add_init_options(role_parser, parents=[common, force, offline])
self.add_remove_options(role_parser, parents=[common, roles_path])
self.add_delete_options(role_parser, parents=[common, github])
self.add_list_options(role_parser, parents=[common, roles_path])
self.add_search_options(role_parser, parents=[common])
self.add_import_options(role_parser, parents=[common, github])
self.add_setup_options(role_parser, parents=[common, roles_path])
self.add_login_options(role_parser, parents=[common])
self.add_info_options(role_parser, parents=[common, roles_path, offline])
self.add_install_options(role_parser, parents=[common, force, roles_path])
def add_init_options(self, parser, parents=None):
galaxy_type = 'collection' if parser.metavar == 'COLLECTION_ACTION' else 'role'
init_parser = parser.add_parser('init', parents=parents,
help='Initialize new {0} with the base structure of a '
'{0}.'.format(galaxy_type))
init_parser.set_defaults(func=self.execute_init)
init_parser.add_argument('--init-path', dest='init_path', default='./',
help='The path in which the skeleton {0} will be created. The default is the '
'current working directory.'.format(galaxy_type))
init_parser.add_argument('--{0}-skeleton'.format(galaxy_type), dest='{0}_skeleton'.format(galaxy_type),
default=C.GALAXY_ROLE_SKELETON,
help='The path to a {0} skeleton that the new {0} should be based '
'upon.'.format(galaxy_type))
obj_name_kwargs = {}
if galaxy_type == 'collection':
obj_name_kwargs['type'] = validate_collection_name
init_parser.add_argument('{0}_name'.format(galaxy_type), help='{0} name'.format(galaxy_type.capitalize()),
**obj_name_kwargs)
if galaxy_type == 'role':
init_parser.add_argument('--type', dest='role_type', action='store', default='default',
help="Initialize using an alternate role type. Valid types include: 'container', "
"'apb' and 'network'.")
def add_remove_options(self, parser, parents=None):
remove_parser = parser.add_parser('remove', parents=parents, help='Delete roles from roles_path.')
remove_parser.set_defaults(func=self.execute_remove)
remove_parser.add_argument('args', help='Role(s)', metavar='role', nargs='+')
def add_delete_options(self, parser, parents=None):
delete_parser = parser.add_parser('delete', parents=parents,
help='Removes the role from Galaxy. It does not remove or alter the actual '
'GitHub repository.')
delete_parser.set_defaults(func=self.execute_delete)
def add_list_options(self, parser, parents=None):
galaxy_type = 'role'
if parser.metavar == 'COLLECTION_ACTION':
galaxy_type = 'collection'
list_parser = parser.add_parser('list', parents=parents,
help='Show the name and version of each {0} installed in the {0}s_path.'.format(galaxy_type))
list_parser.set_defaults(func=self.execute_list)
list_parser.add_argument(galaxy_type, help=galaxy_type.capitalize(), nargs='?', metavar=galaxy_type)
def add_search_options(self, parser, parents=None):
search_parser = parser.add_parser('search', parents=parents,
help='Search the Galaxy database by tags, platforms, author and multiple '
'keywords.')
search_parser.set_defaults(func=self.execute_search)
search_parser.add_argument('--platforms', dest='platforms', help='list of OS platforms to filter by')
search_parser.add_argument('--galaxy-tags', dest='galaxy_tags', help='list of galaxy tags to filter by')
search_parser.add_argument('--author', dest='author', help='GitHub username')
search_parser.add_argument('args', help='Search terms', metavar='searchterm', nargs='*')
def add_import_options(self, parser, parents=None):
import_parser = parser.add_parser('import', parents=parents, help='Import a role')
import_parser.set_defaults(func=self.execute_import)
import_parser.add_argument('--no-wait', dest='wait', action='store_false', default=True,
help="Don't wait for import results.")
import_parser.add_argument('--branch', dest='reference',
help='The name of a branch to import. Defaults to the repository\'s default branch '
'(usually master)')
import_parser.add_argument('--role-name', dest='role_name',
help='The name the role should have, if different than the repo name')
import_parser.add_argument('--status', dest='check_status', action='store_true', default=False,
help='Check the status of the most recent import request for given github_'
'user/github_repo.')
def add_setup_options(self, parser, parents=None):
setup_parser = parser.add_parser('setup', parents=parents,
help='Manage the integration between Galaxy and the given source.')
setup_parser.set_defaults(func=self.execute_setup)
setup_parser.add_argument('--remove', dest='remove_id', default=None,
help='Remove the integration matching the provided ID value. Use --list to see '
'ID values.')
setup_parser.add_argument('--list', dest="setup_list", action='store_true', default=False,
help='List all of your integrations.')
setup_parser.add_argument('source', help='Source')
setup_parser.add_argument('github_user', help='GitHub username')
setup_parser.add_argument('github_repo', help='GitHub repository')
setup_parser.add_argument('secret', help='Secret')
def add_login_options(self, parser, parents=None):
login_parser = parser.add_parser('login', parents=parents,
help="Login to api.github.com server in order to use ansible-galaxy role sub "
"command such as 'import', 'delete', 'publish', and 'setup'")
login_parser.set_defaults(func=self.execute_login)
login_parser.add_argument('--github-token', dest='token', default=None,
help='Identify with github token rather than username and password.')
def add_info_options(self, parser, parents=None):
info_parser = parser.add_parser('info', parents=parents, help='View more details about a specific role.')
info_parser.set_defaults(func=self.execute_info)
info_parser.add_argument('args', nargs='+', help='role', metavar='role_name[,version]')
def add_verify_options(self, parser, parents=None):
galaxy_type = 'collection'
verify_parser = parser.add_parser('verify', parents=parents, help='Compare checksums with the collection(s) '
'found on the server and the installed copy. This does not verify dependencies.')
verify_parser.set_defaults(func=self.execute_verify)
verify_parser.add_argument('args', metavar='{0}_name'.format(galaxy_type), nargs='*', help='The collection(s) name or '
'path/url to a tar.gz collection artifact. This is mutually exclusive with --requirements-file.')
verify_parser.add_argument('-i', '--ignore-errors', dest='ignore_errors', action='store_true', default=False,
help='Ignore errors during verification and continue with the next specified collection.')
verify_parser.add_argument('-r', '--requirements-file', dest='requirements',
help='A file containing a list of collections to be verified.')
def add_install_options(self, parser, parents=None):
galaxy_type = 'collection' if parser.metavar == 'COLLECTION_ACTION' else 'role'
args_kwargs = {}
if galaxy_type == 'collection':
args_kwargs['help'] = 'The collection(s) name or path/url to a tar.gz collection artifact. This is ' \
'mutually exclusive with --requirements-file.'
ignore_errors_help = 'Ignore errors during installation and continue with the next specified ' \
'collection. This will not ignore dependency conflict errors.'
else:
args_kwargs['help'] = 'Role name, URL or tar file'
ignore_errors_help = 'Ignore errors and continue with the next specified role.'
install_parser = parser.add_parser('install', parents=parents,
help='Install {0}(s) from file(s), URL(s) or Ansible '
'Galaxy'.format(galaxy_type))
install_parser.set_defaults(func=self.execute_install)
install_parser.add_argument('args', metavar='{0}_name'.format(galaxy_type), nargs='*', **args_kwargs)
install_parser.add_argument('-i', '--ignore-errors', dest='ignore_errors', action='store_true', default=False,
help=ignore_errors_help)
install_exclusive = install_parser.add_mutually_exclusive_group()
install_exclusive.add_argument('-n', '--no-deps', dest='no_deps', action='store_true', default=False,
help="Don't download {0}s listed as dependencies.".format(galaxy_type))
install_exclusive.add_argument('--force-with-deps', dest='force_with_deps', action='store_true', default=False,
help="Force overwriting an existing {0} and its "
"dependencies.".format(galaxy_type))
if galaxy_type == 'collection':
install_parser.add_argument('-p', '--collections-path', dest='collections_path',
default=C.COLLECTIONS_PATHS[0],
help='The path to the directory containing your collections.')
install_parser.add_argument('-r', '--requirements-file', dest='requirements',
help='A file containing a list of collections to be installed.')
else:
install_parser.add_argument('-r', '--role-file', dest='role_file',
help='A file containing a list of roles to be imported.')
install_parser.add_argument('-g', '--keep-scm-meta', dest='keep_scm_meta', action='store_true',
default=False,
help='Use tar instead of the scm archive option when packaging the role.')
def add_build_options(self, parser, parents=None):
build_parser = parser.add_parser('build', parents=parents,
help='Build an Ansible collection artifact that can be published to Ansible '
'Galaxy.')
build_parser.set_defaults(func=self.execute_build)
build_parser.add_argument('args', metavar='collection', nargs='*', default=('.',),
help='Path to the collection(s) directory to build. This should be the directory '
'that contains the galaxy.yml file. The default is the current working '
'directory.')
build_parser.add_argument('--output-path', dest='output_path', default='./',
help='The path in which the collection is built. The default is the current '
'working directory.')
def add_publish_options(self, parser, parents=None):
publish_parser = parser.add_parser('publish', parents=parents,
help='Publish a collection artifact to Ansible Galaxy.')
publish_parser.set_defaults(func=self.execute_publish)
publish_parser.add_argument('args', metavar='collection_path',
help='The path to the collection tarball to publish.')
publish_parser.add_argument('--no-wait', dest='wait', action='store_false', default=True,
help="Don't wait for import validation results.")
publish_parser.add_argument('--import-timeout', dest='import_timeout', type=int, default=0,
help="The time to wait for the collection import process to finish.")
def post_process_args(self, options):
options = super(GalaxyCLI, self).post_process_args(options)
display.verbosity = options.verbosity
return options
def run(self):
super(GalaxyCLI, self).run()
self.galaxy = Galaxy()
def server_config_def(section, key, required):
return {
'description': 'The %s of the %s Galaxy server' % (key, section),
'ini': [
{
'section': 'galaxy_server.%s' % section,
'key': key,
}
],
'env': [
{'name': 'ANSIBLE_GALAXY_SERVER_%s_%s' % (section.upper(), key.upper())},
],
'required': required,
}
server_def = [('url', True), ('username', False), ('password', False), ('token', False),
('auth_url', False)]
config_servers = []
# Need to filter out empty strings or non truthy values as an empty server list env var is equal to [''].
server_list = [s for s in C.GALAXY_SERVER_LIST or [] if s]
for server_key in server_list:
# Config definitions are looked up dynamically based on the C.GALAXY_SERVER_LIST entry. We look up the
# section [galaxy_server.<server>] for the values url, username, password, and token.
config_dict = dict((k, server_config_def(server_key, k, req)) for k, req in server_def)
defs = AnsibleLoader(yaml.safe_dump(config_dict)).get_single_data()
C.config.initialize_plugin_configuration_definitions('galaxy_server', server_key, defs)
server_options = C.config.get_plugin_options('galaxy_server', server_key)
# auth_url is used to create the token, but not directly by GalaxyAPI, so
# it doesn't need to be passed as kwarg to GalaxyApi
auth_url = server_options.pop('auth_url', None)
token_val = server_options['token'] or NoTokenSentinel
username = server_options['username']
# default case if no auth info is provided.
server_options['token'] = None
if username:
server_options['token'] = BasicAuthToken(username,
server_options['password'])
else:
if token_val:
if auth_url:
server_options['token'] = KeycloakToken(access_token=token_val,
auth_url=auth_url,
validate_certs=not context.CLIARGS['ignore_certs'])
else:
# The galaxy v1 / github / django / 'Token'
server_options['token'] = GalaxyToken(token=token_val)
config_servers.append(GalaxyAPI(self.galaxy, server_key, **server_options))
cmd_server = context.CLIARGS['api_server']
cmd_token = GalaxyToken(token=context.CLIARGS['api_key'])
if cmd_server:
# Cmd args take precedence over the config entry but first check if the arg was a name and use that config
# entry, otherwise create a new API entry for the server specified.
config_server = next((s for s in config_servers if s.name == cmd_server), None)
if config_server:
self.api_servers.append(config_server)
else:
self.api_servers.append(GalaxyAPI(self.galaxy, 'cmd_arg', cmd_server, token=cmd_token))
else:
self.api_servers = config_servers
# Default to C.GALAXY_SERVER if no servers were defined
if len(self.api_servers) == 0:
self.api_servers.append(GalaxyAPI(self.galaxy, 'default', C.GALAXY_SERVER, token=cmd_token))
context.CLIARGS['func']()
@property
def api(self):
return self.api_servers[0]
def _parse_requirements_file(self, requirements_file, allow_old_format=True):
"""
Parses an Ansible requirements.yml file and returns all the roles and/or collections defined in it. There are 2
requirements file formats:
# v1 (roles only)
- src: The source of the role, required if include is not set. Can be Galaxy role name, URL to a SCM repo or tarball.
name: Downloads the role to the specified name, defaults to the name from Galaxy, or the name of the repo if src is a URL.
scm: If src is a URL, specify the SCM. Only git or hg are supported and it defaults to git.
version: The version of the role to download. Can also be tag, commit, or branch name and defaults to master.
include: Path to additional requirements.yml files.
# v2 (roles and collections)
---
roles:
# Same as v1 format just under the roles key
collections:
- namespace.collection
- name: namespace.collection
version: version identifier, multiple identifiers are separated by ','
source: the URL or a predefined source name that relates to C.GALAXY_SERVER_LIST
:param requirements_file: The path to the requirements file.
:param allow_old_format: Will fail if a v1 requirements file is found and this is set to False.
:return: a dict containing the roles and collections found in the requirements file.
"""
requirements = {
'roles': [],
'collections': [],
}
b_requirements_file = to_bytes(requirements_file, errors='surrogate_or_strict')
if not os.path.exists(b_requirements_file):
raise AnsibleError("The requirements file '%s' does not exist." % to_native(requirements_file))
display.vvv("Reading requirement file at '%s'" % requirements_file)
with open(b_requirements_file, 'rb') as req_obj:
try:
file_requirements = yaml.safe_load(req_obj)
except YAMLError as err:
raise AnsibleError(
"Failed to parse the requirements yml at '%s' with the following error:\n%s"
% (to_native(requirements_file), to_native(err)))
if file_requirements is None:
raise AnsibleError("No requirements found in file '%s'" % to_native(requirements_file))
def parse_role_req(requirement):
if "include" not in requirement:
role = RoleRequirement.role_yaml_parse(requirement)
display.vvv("found role %s in yaml file" % to_text(role))
if "name" not in role and "src" not in role:
raise AnsibleError("Must specify name or src for role")
return [GalaxyRole(self.galaxy, self.api, **role)]
else:
b_include_path = to_bytes(requirement["include"], errors="surrogate_or_strict")
if not os.path.isfile(b_include_path):
raise AnsibleError("Failed to find include requirements file '%s' in '%s'"
% (to_native(b_include_path), to_native(requirements_file)))
with open(b_include_path, 'rb') as f_include:
try:
return [GalaxyRole(self.galaxy, self.api, **r) for r in
(RoleRequirement.role_yaml_parse(i) for i in yaml.safe_load(f_include))]
except Exception as e:
raise AnsibleError("Unable to load data from include requirements file: %s %s"
% (to_native(requirements_file), to_native(e)))
if isinstance(file_requirements, list):
# Older format that contains only roles
if not allow_old_format:
raise AnsibleError("Expecting requirements file to be a dict with the key 'collections' that contains "
"a list of collections to install")
for role_req in file_requirements:
requirements['roles'] += parse_role_req(role_req)
else:
# Newer format with a collections and/or roles key
extra_keys = set(file_requirements.keys()).difference(set(['roles', 'collections']))
if extra_keys:
raise AnsibleError("Expecting only 'roles' and/or 'collections' as base keys in the requirements "
"file. Found: %s" % (to_native(", ".join(extra_keys))))
for role_req in file_requirements.get('roles', []):
requirements['roles'] += parse_role_req(role_req)
for collection_req in file_requirements.get('collections', []):
if isinstance(collection_req, dict):
req_name = collection_req.get('name', None)
if req_name is None:
raise AnsibleError("Collections requirement entry should contain the key name.")
req_version = collection_req.get('version', '*')
req_source = collection_req.get('source', None)
if req_source:
# Try and match up the requirement source with our list of Galaxy API servers defined in the
# config, otherwise create a server with that URL without any auth.
req_source = next(iter([a for a in self.api_servers if req_source in [a.name, a.api_server]]),
GalaxyAPI(self.galaxy, "explicit_requirement_%s" % req_name, req_source))
requirements['collections'].append((req_name, req_version, req_source))
else:
requirements['collections'].append((collection_req, '*', None))
return requirements
@staticmethod
def exit_without_ignore(rc=1):
"""
Exits with the specified return code unless the
option --ignore-errors was specified
"""
if not context.CLIARGS['ignore_errors']:
raise AnsibleError('- you can use --ignore-errors to skip failed roles and finish processing the list.')
@staticmethod
def _display_role_info(role_info):
text = [u"", u"Role: %s" % to_text(role_info['name'])]
text.append(u"\tdescription: %s" % role_info.get('description', ''))
for k in sorted(role_info.keys()):
if k in GalaxyCLI.SKIP_INFO_KEYS:
continue
if isinstance(role_info[k], dict):
text.append(u"\t%s:" % (k))
for key in sorted(role_info[k].keys()):
if key in GalaxyCLI.SKIP_INFO_KEYS:
continue
text.append(u"\t\t%s: %s" % (key, role_info[k][key]))
else:
text.append(u"\t%s: %s" % (k, role_info[k]))
return u'\n'.join(text)
@staticmethod
def _resolve_path(path):
return os.path.abspath(os.path.expanduser(os.path.expandvars(path)))
@staticmethod
def _get_skeleton_galaxy_yml(template_path, inject_data):
with open(to_bytes(template_path, errors='surrogate_or_strict'), 'rb') as template_obj:
meta_template = to_text(template_obj.read(), errors='surrogate_or_strict')
galaxy_meta = get_collections_galaxy_meta_info()
required_config = []
optional_config = []
for meta_entry in galaxy_meta:
config_list = required_config if meta_entry.get('required', False) else optional_config
value = inject_data.get(meta_entry['key'], None)
if not value:
meta_type = meta_entry.get('type', 'str')
if meta_type == 'str':
value = ''
elif meta_type == 'list':
value = []
elif meta_type == 'dict':
value = {}
meta_entry['value'] = value
config_list.append(meta_entry)
link_pattern = re.compile(r"L\(([^)]+),\s+([^)]+)\)")
const_pattern = re.compile(r"C\(([^)]+)\)")
def comment_ify(v):
if isinstance(v, list):
v = ". ".join([l.rstrip('.') for l in v])
v = link_pattern.sub(r"\1 <\2>", v)
v = const_pattern.sub(r"'\1'", v)
return textwrap.fill(v, width=117, initial_indent="# ", subsequent_indent="# ", break_on_hyphens=False)
def to_yaml(v):
return yaml.safe_dump(v, default_flow_style=False).rstrip()
env = Environment(loader=BaseLoader)
env.filters['comment_ify'] = comment_ify
env.filters['to_yaml'] = to_yaml
template = env.from_string(meta_template)
meta_value = template.render({'required_config': required_config, 'optional_config': optional_config})
return meta_value
def _require_one_of_collections_requirements(self, collections, requirements_file):
if collections and requirements_file:
raise AnsibleError("The positional collection_name arg and --requirements-file are mutually exclusive.")
elif not collections and not requirements_file:
raise AnsibleError("You must specify a collection name or a requirements file.")
elif requirements_file:
requirements_file = GalaxyCLI._resolve_path(requirements_file)
requirements = self._parse_requirements_file(requirements_file, allow_old_format=False)['collections']
else:
requirements = []
for collection_input in collections:
requirement = None
if os.path.isfile(to_bytes(collection_input, errors='surrogate_or_strict')) or \
urlparse(collection_input).scheme.lower() in ['http', 'https']:
# Arg is a file path or URL to a collection
name = collection_input
else:
name, dummy, requirement = collection_input.partition(':')
requirements.append((name, requirement or '*', None))
return requirements
############################
# execute actions
############################
def execute_role(self):
"""
Perform the action on an Ansible Galaxy role. Must be combined with a further action like delete/install/init
as listed below.
"""
# To satisfy doc build
pass
def execute_collection(self):
"""
Perform the action on an Ansible Galaxy collection. Must be combined with a further action like init/install as
listed below.
"""
# To satisfy doc build
pass
def execute_build(self):
"""
Build an Ansible Galaxy collection artifact that can be stored in a central repository like Ansible Galaxy.
By default, this command builds from the current working directory. You can optionally pass in the
collection input path (where the ``galaxy.yml`` file is).
"""
force = context.CLIARGS['force']
output_path = GalaxyCLI._resolve_path(context.CLIARGS['output_path'])
b_output_path = to_bytes(output_path, errors='surrogate_or_strict')
if not os.path.exists(b_output_path):
os.makedirs(b_output_path)
elif os.path.isfile(b_output_path):
raise AnsibleError("- the output collection directory %s is a file - aborting" % to_native(output_path))
for collection_path in context.CLIARGS['args']:
collection_path = GalaxyCLI._resolve_path(collection_path)
build_collection(collection_path, output_path, force)
def execute_init(self):
"""
Creates the skeleton framework of a role or collection that complies with the Galaxy metadata format.
Requires a role or collection name. The collection name must be in the format ``<namespace>.<collection>``.
"""
galaxy_type = context.CLIARGS['type']
init_path = context.CLIARGS['init_path']
force = context.CLIARGS['force']
obj_skeleton = context.CLIARGS['{0}_skeleton'.format(galaxy_type)]
obj_name = context.CLIARGS['{0}_name'.format(galaxy_type)]
inject_data = dict(
description='your {0} description'.format(galaxy_type),
ansible_plugin_list_dir=get_versioned_doclink('plugins/plugins.html'),
)
if galaxy_type == 'role':
inject_data.update(dict(
author='your name',
company='your company (optional)',
license='license (GPL-2.0-or-later, MIT, etc)',
role_name=obj_name,
role_type=context.CLIARGS['role_type'],
issue_tracker_url='http://example.com/issue/tracker',
repository_url='http://example.com/repository',
documentation_url='http://docs.example.com',
homepage_url='http://example.com',
min_ansible_version=ansible_version[:3], # x.y
))
obj_path = os.path.join(init_path, obj_name)
elif galaxy_type == 'collection':
namespace, collection_name = obj_name.split('.', 1)
inject_data.update(dict(
namespace=namespace,
collection_name=collection_name,
version='1.0.0',
readme='README.md',
authors=['your name <[email protected]>'],
license=['GPL-2.0-or-later'],
repository='http://example.com/repository',
documentation='http://docs.example.com',
homepage='http://example.com',
issues='http://example.com/issue/tracker',
build_ignore=[],
))
obj_path = os.path.join(init_path, namespace, collection_name)
b_obj_path = to_bytes(obj_path, errors='surrogate_or_strict')
if os.path.exists(b_obj_path):
if os.path.isfile(obj_path):
raise AnsibleError("- the path %s already exists, but is a file - aborting" % to_native(obj_path))
elif not force:
raise AnsibleError("- the directory %s already exists. "
"You can use --force to re-initialize this directory,\n"
"however it will reset any main.yml files that may have\n"
"been modified there already." % to_native(obj_path))
if obj_skeleton is not None:
own_skeleton = False
skeleton_ignore_expressions = C.GALAXY_ROLE_SKELETON_IGNORE
else:
own_skeleton = True
obj_skeleton = self.galaxy.default_role_skeleton_path
skeleton_ignore_expressions = ['^.*/.git_keep$']
obj_skeleton = os.path.expanduser(obj_skeleton)
skeleton_ignore_re = [re.compile(x) for x in skeleton_ignore_expressions]
if not os.path.exists(obj_skeleton):
raise AnsibleError("- the skeleton path '{0}' does not exist, cannot init {1}".format(
to_native(obj_skeleton), galaxy_type)
)
template_env = Environment(loader=FileSystemLoader(obj_skeleton))
# create role directory
if not os.path.exists(b_obj_path):
os.makedirs(b_obj_path)
for root, dirs, files in os.walk(obj_skeleton, topdown=True):
rel_root = os.path.relpath(root, obj_skeleton)
rel_dirs = rel_root.split(os.sep)
rel_root_dir = rel_dirs[0]
if galaxy_type == 'collection':
# A collection can contain templates in playbooks/*/templates and roles/*/templates
in_templates_dir = rel_root_dir in ['playbooks', 'roles'] and 'templates' in rel_dirs
else:
in_templates_dir = rel_root_dir == 'templates'
dirs[:] = [d for d in dirs if not any(r.match(d) for r in skeleton_ignore_re)]
for f in files:
filename, ext = os.path.splitext(f)
if any(r.match(os.path.join(rel_root, f)) for r in skeleton_ignore_re):
continue
if galaxy_type == 'collection' and own_skeleton and rel_root == '.' and f == 'galaxy.yml.j2':
# Special use case for galaxy.yml.j2 in our own default collection skeleton. We build the options
# dynamically which requires special options to be set.
# The templated data's keys must match the key name but the inject data contains collection_name
# instead of name. We just make a copy and change the key back to name for this file.
template_data = inject_data.copy()
template_data['name'] = template_data.pop('collection_name')
meta_value = GalaxyCLI._get_skeleton_galaxy_yml(os.path.join(root, rel_root, f), template_data)
b_dest_file = to_bytes(os.path.join(obj_path, rel_root, filename), errors='surrogate_or_strict')
with open(b_dest_file, 'wb') as galaxy_obj:
galaxy_obj.write(to_bytes(meta_value, errors='surrogate_or_strict'))
elif ext == ".j2" and not in_templates_dir:
src_template = os.path.join(rel_root, f)
dest_file = os.path.join(obj_path, rel_root, filename)
template_env.get_template(src_template).stream(inject_data).dump(dest_file, encoding='utf-8')
else:
f_rel_path = os.path.relpath(os.path.join(root, f), obj_skeleton)
shutil.copyfile(os.path.join(root, f), os.path.join(obj_path, f_rel_path))
for d in dirs:
b_dir_path = to_bytes(os.path.join(obj_path, rel_root, d), errors='surrogate_or_strict')
if not os.path.exists(b_dir_path):
os.makedirs(b_dir_path)
display.display("- %s %s was created successfully" % (galaxy_type.title(), obj_name))
def execute_info(self):
"""
prints out detailed information about an installed role as well as info available from the galaxy API.
"""
roles_path = context.CLIARGS['roles_path']
data = ''
for role in context.CLIARGS['args']:
role_info = {'path': roles_path}
gr = GalaxyRole(self.galaxy, self.api, role)
install_info = gr.install_info
if install_info:
if 'version' in install_info:
install_info['installed_version'] = install_info['version']
del install_info['version']
role_info.update(install_info)
remote_data = False
if not context.CLIARGS['offline']:
remote_data = self.api.lookup_role_by_name(role, False)
if remote_data:
role_info.update(remote_data)
if gr.metadata:
role_info.update(gr.metadata)
req = RoleRequirement()
role_spec = req.role_yaml_parse({'role': role})
if role_spec:
role_info.update(role_spec)
data = self._display_role_info(role_info)
# FIXME: This is broken in both 1.9 and 2.0 as
# _display_role_info() always returns something
if not data:
data = u"\n- the role %s was not found" % role
self.pager(data)
def execute_verify(self):
collections = context.CLIARGS['args']
search_paths = context.CLIARGS['collections_path']
ignore_certs = context.CLIARGS['ignore_certs']
ignore_errors = context.CLIARGS['ignore_errors']
requirements_file = context.CLIARGS['requirements']
requirements = self._require_one_of_collections_requirements(collections, requirements_file)
resolved_paths = [validate_collection_path(GalaxyCLI._resolve_path(path)) for path in search_paths]
verify_collections(requirements, resolved_paths, self.api_servers, (not ignore_certs), ignore_errors)
return 0
def execute_install(self):
"""
Install one or more roles(``ansible-galaxy role install``), or one or more collections(``ansible-galaxy collection install``).
You can pass in a list (roles or collections) or use the file
option listed below (these are mutually exclusive). If you pass in a list, it
can be a name (which will be downloaded via the galaxy API and github), or it can be a local tar archive file.
"""
if context.CLIARGS['type'] == 'collection':
collections = context.CLIARGS['args']
force = context.CLIARGS['force']
output_path = context.CLIARGS['collections_path']
ignore_certs = context.CLIARGS['ignore_certs']
ignore_errors = context.CLIARGS['ignore_errors']
requirements_file = context.CLIARGS['requirements']
no_deps = context.CLIARGS['no_deps']
force_deps = context.CLIARGS['force_with_deps']
if collections and requirements_file:
raise AnsibleError("The positional collection_name arg and --requirements-file are mutually exclusive.")
elif not collections and not requirements_file:
raise AnsibleError("You must specify a collection name or a requirements file.")
if requirements_file:
requirements_file = GalaxyCLI._resolve_path(requirements_file)
requirements = self._require_one_of_collections_requirements(collections, requirements_file)
output_path = GalaxyCLI._resolve_path(output_path)
collections_path = C.COLLECTIONS_PATHS
if len([p for p in collections_path if p.startswith(output_path)]) == 0:
display.warning("The specified collections path '%s' is not part of the configured Ansible "
"collections paths '%s'. The installed collection won't be picked up in an Ansible "
"run." % (to_text(output_path), to_text(":".join(collections_path))))
output_path = validate_collection_path(output_path)
b_output_path = to_bytes(output_path, errors='surrogate_or_strict')
if not os.path.exists(b_output_path):
os.makedirs(b_output_path)
install_collections(requirements, output_path, self.api_servers, (not ignore_certs), ignore_errors,
no_deps, force, force_deps)
return 0
role_file = context.CLIARGS['role_file']
if not context.CLIARGS['args'] and role_file is None:
# the user needs to specify one of either --role-file or specify a single user/role name
raise AnsibleOptionsError("- you must specify a user/role name or a roles file")
no_deps = context.CLIARGS['no_deps']
force_deps = context.CLIARGS['force_with_deps']
force = context.CLIARGS['force'] or force_deps
roles_left = []
if role_file:
if not (role_file.endswith('.yaml') or role_file.endswith('.yml')):
raise AnsibleError("Invalid role requirements file, it must end with a .yml or .yaml extension")
roles_left = self._parse_requirements_file(role_file)['roles']
else:
# roles were specified directly, so we'll just go out grab them
# (and their dependencies, unless the user doesn't want us to).
for rname in context.CLIARGS['args']:
role = RoleRequirement.role_yaml_parse(rname.strip())
roles_left.append(GalaxyRole(self.galaxy, self.api, **role))
for role in roles_left:
# only process roles in roles files when names matches if given
if role_file and context.CLIARGS['args'] and role.name not in context.CLIARGS['args']:
display.vvv('Skipping role %s' % role.name)
continue
display.vvv('Processing role %s ' % role.name)
# query the galaxy API for the role data
if role.install_info is not None:
if role.install_info['version'] != role.version or force:
if force:
display.display('- changing role %s from %s to %s' %
(role.name, role.install_info['version'], role.version or "unspecified"))
role.remove()
else:
display.warning('- %s (%s) is already installed - use --force to change version to %s' %
(role.name, role.install_info['version'], role.version or "unspecified"))
continue
else:
if not force:
display.display('- %s is already installed, skipping.' % str(role))
continue
try:
installed = role.install()
except AnsibleError as e:
display.warning(u"- %s was NOT installed successfully: %s " % (role.name, to_text(e)))
self.exit_without_ignore()
continue
# install dependencies, if we want them
if not no_deps and installed:
if not role.metadata:
display.warning("Meta file %s is empty. Skipping dependencies." % role.path)
else:
role_dependencies = role.metadata.get('dependencies') or []
for dep in role_dependencies:
display.debug('Installing dep %s' % dep)
dep_req = RoleRequirement()
dep_info = dep_req.role_yaml_parse(dep)
dep_role = GalaxyRole(self.galaxy, self.api, **dep_info)
if '.' not in dep_role.name and '.' not in dep_role.src and dep_role.scm is None:
# we know we can skip this, as it's not going to
# be found on galaxy.ansible.com
continue
if dep_role.install_info is None:
if dep_role not in roles_left:
display.display('- adding dependency: %s' % to_text(dep_role))
roles_left.append(dep_role)
else:
display.display('- dependency %s already pending installation.' % dep_role.name)
else:
if dep_role.install_info['version'] != dep_role.version:
if force_deps:
display.display('- changing dependent role %s from %s to %s' %
(dep_role.name, dep_role.install_info['version'], dep_role.version or "unspecified"))
dep_role.remove()
roles_left.append(dep_role)
else:
display.warning('- dependency %s (%s) from role %s differs from already installed version (%s), skipping' %
(to_text(dep_role), dep_role.version, role.name, dep_role.install_info['version']))
else:
if force_deps:
roles_left.append(dep_role)
else:
display.display('- dependency %s is already installed, skipping.' % dep_role.name)
if not installed:
display.warning("- %s was NOT installed successfully." % role.name)
self.exit_without_ignore()
return 0
def execute_remove(self):
"""
removes the list of roles passed as arguments from the local system.
"""
if not context.CLIARGS['args']:
raise AnsibleOptionsError('- you must specify at least one role to remove.')
for role_name in context.CLIARGS['args']:
role = GalaxyRole(self.galaxy, self.api, role_name)
try:
if role.remove():
display.display('- successfully removed %s' % role_name)
else:
display.display('- %s is not installed, skipping.' % role_name)
except Exception as e:
raise AnsibleError("Failed to remove role %s: %s" % (role_name, to_native(e)))
return 0
def execute_list(self):
"""
List installed collections or roles
"""
if context.CLIARGS['type'] == 'role':
self.execute_list_role()
elif context.CLIARGS['type'] == 'collection':
self.execute_list_collection()
def execute_list_role(self):
"""
List all roles installed on the local system or a specific role
"""
path_found = False
role_found = False
warnings = []
roles_search_paths = context.CLIARGS['roles_path']
role_name = context.CLIARGS['role']
for path in roles_search_paths:
role_path = GalaxyCLI._resolve_path(path)
if os.path.isdir(path):
path_found = True
else:
warnings.append("- the configured path {0} does not exist.".format(path))
continue
if role_name:
# show the requested role, if it exists
gr = GalaxyRole(self.galaxy, self.api, role_name, path=os.path.join(role_path, role_name))
if os.path.isdir(gr.path):
role_found = True
display.display('# %s' % os.path.dirname(gr.path))
_display_role(gr)
break
warnings.append("- the role %s was not found" % role_name)
else:
if not os.path.exists(role_path):
warnings.append("- the configured path %s does not exist." % role_path)
continue
if not os.path.isdir(role_path):
warnings.append("- the configured path %s, exists, but it is not a directory." % role_path)
continue
display.display('# %s' % role_path)
path_files = os.listdir(role_path)
for path_file in path_files:
gr = GalaxyRole(self.galaxy, self.api, path_file, path=path)
if gr.metadata:
_display_role(gr)
# Do not warn if the role was found in any of the search paths
if role_found and role_name:
warnings = []
for w in warnings:
display.warning(w)
if not path_found:
raise AnsibleOptionsError("- None of the provided paths were usable. Please specify a valid path with --{0}s-path".format(context.CLIARGS['type']))
return 0
def execute_list_collection(self):
"""
List all collections installed on the local system
"""
collections_search_paths = set(context.CLIARGS['collections_path'])
collection_name = context.CLIARGS['collection']
default_collections_path = C.config.get_configuration_definition('COLLECTIONS_PATHS').get('default')
warnings = []
path_found = False
collection_found = False
for path in collections_search_paths:
collection_path = GalaxyCLI._resolve_path(path)
if not os.path.exists(path):
if path in default_collections_path:
# don't warn for missing default paths
continue
warnings.append("- the configured path {0} does not exist.".format(collection_path))
continue
if not os.path.isdir(collection_path):
warnings.append("- the configured path {0}, exists, but it is not a directory.".format(collection_path))
continue
path_found = True
if collection_name:
# list a specific collection
validate_collection_name(collection_name)
namespace, collection = collection_name.split('.')
collection_path = validate_collection_path(collection_path)
b_collection_path = to_bytes(os.path.join(collection_path, namespace, collection), errors='surrogate_or_strict')
if not os.path.exists(b_collection_path):
warnings.append("- unable to find {0} in collection paths".format(collection_name))
continue
if not os.path.isdir(collection_path):
warnings.append("- the configured path {0}, exists, but it is not a directory.".format(collection_path))
continue
collection_found = True
collection = CollectionRequirement.from_path(b_collection_path, False)
fqcn_width, version_width = _get_collection_widths(collection)
_display_header(collection_path, 'Collection', 'Version', fqcn_width, version_width)
_display_collection(collection, fqcn_width, version_width)
else:
# list all collections
collection_path = validate_collection_path(path)
if os.path.isdir(collection_path):
display.vvv("Searching {0} for collections".format(collection_path))
collections = find_existing_collections(collection_path)
else:
# There was no 'ansible_collections/' directory in the path, so there
# are no collections here.
display.vvv("No 'ansible_collections' directory found at {0}".format(collection_path))
continue
if not collections:
display.vvv("No collections found at {0}".format(collection_path))
continue
# Display header
fqcn_width, version_width = _get_collection_widths(collections)
_display_header(collection_path, 'Collection', 'Version', fqcn_width, version_width)
# Sort collections by the namespace and name
collections.sort(key=to_text)
for collection in collections:
_display_collection(collection, fqcn_width, version_width)
# Do not warn if the specific collection was found in any of the search paths
if collection_found and collection_name:
warnings = []
for w in warnings:
display.warning(w)
if not path_found:
raise AnsibleOptionsError("- None of the provided paths were usable. Please specify a valid path with --{0}s-path".format(context.CLIARGS['type']))
return 0
def execute_publish(self):
"""
Publish a collection into Ansible Galaxy. Requires the path to the collection tarball to publish.
"""
collection_path = GalaxyCLI._resolve_path(context.CLIARGS['args'])
wait = context.CLIARGS['wait']
timeout = context.CLIARGS['import_timeout']
publish_collection(collection_path, self.api, wait, timeout)
def execute_search(self):
''' searches for roles on the Ansible Galaxy server'''
page_size = 1000
search = None
if context.CLIARGS['args']:
search = '+'.join(context.CLIARGS['args'])
if not search and not context.CLIARGS['platforms'] and not context.CLIARGS['galaxy_tags'] and not context.CLIARGS['author']:
raise AnsibleError("Invalid query. At least one search term, platform, galaxy tag or author must be provided.")
response = self.api.search_roles(search, platforms=context.CLIARGS['platforms'],
tags=context.CLIARGS['galaxy_tags'], author=context.CLIARGS['author'], page_size=page_size)
if response['count'] == 0:
display.display("No roles match your search.", color=C.COLOR_ERROR)
return True
data = [u'']
if response['count'] > page_size:
data.append(u"Found %d roles matching your search. Showing first %s." % (response['count'], page_size))
else:
data.append(u"Found %d roles matching your search:" % response['count'])
max_len = []
for role in response['results']:
max_len.append(len(role['username'] + '.' + role['name']))
name_len = max(max_len)
format_str = u" %%-%ds %%s" % name_len
data.append(u'')
data.append(format_str % (u"Name", u"Description"))
data.append(format_str % (u"----", u"-----------"))
for role in response['results']:
data.append(format_str % (u'%s.%s' % (role['username'], role['name']), role['description']))
data = u'\n'.join(data)
self.pager(data)
return True
def execute_login(self):
"""
verify the user's identity via Github and retrieve an auth token from Ansible Galaxy.
"""
# Authenticate with github and retrieve a token
if context.CLIARGS['token'] is None:
if C.GALAXY_TOKEN:
github_token = C.GALAXY_TOKEN
else:
login = GalaxyLogin(self.galaxy)
github_token = login.create_github_token()
else:
github_token = context.CLIARGS['token']
galaxy_response = self.api.authenticate(github_token)
if context.CLIARGS['token'] is None and C.GALAXY_TOKEN is None:
# Remove the token we created
login.remove_github_token()
# Store the Galaxy token
token = GalaxyToken()
token.set(galaxy_response['token'])
display.display("Successfully logged into Galaxy as %s" % galaxy_response['username'])
return 0
def execute_import(self):
""" used to import a role into Ansible Galaxy """
colors = {
'INFO': 'normal',
'WARNING': C.COLOR_WARN,
'ERROR': C.COLOR_ERROR,
'SUCCESS': C.COLOR_OK,
'FAILED': C.COLOR_ERROR,
}
github_user = to_text(context.CLIARGS['github_user'], errors='surrogate_or_strict')
github_repo = to_text(context.CLIARGS['github_repo'], errors='surrogate_or_strict')
if context.CLIARGS['check_status']:
task = self.api.get_import_task(github_user=github_user, github_repo=github_repo)
else:
# Submit an import request
task = self.api.create_import_task(github_user, github_repo,
reference=context.CLIARGS['reference'],
role_name=context.CLIARGS['role_name'])
if len(task) > 1:
# found multiple roles associated with github_user/github_repo
display.display("WARNING: More than one Galaxy role associated with Github repo %s/%s." % (github_user, github_repo),
color='yellow')
display.display("The following Galaxy roles are being updated:" + u'\n', color=C.COLOR_CHANGED)
for t in task:
display.display('%s.%s' % (t['summary_fields']['role']['namespace'], t['summary_fields']['role']['name']), color=C.COLOR_CHANGED)
display.display(u'\nTo properly namespace this role, remove each of the above and re-import %s/%s from scratch' % (github_user, github_repo),
color=C.COLOR_CHANGED)
return 0
# found a single role as expected
display.display("Successfully submitted import request %d" % task[0]['id'])
if not context.CLIARGS['wait']:
display.display("Role name: %s" % task[0]['summary_fields']['role']['name'])
display.display("Repo: %s/%s" % (task[0]['github_user'], task[0]['github_repo']))
if context.CLIARGS['check_status'] or context.CLIARGS['wait']:
# Get the status of the import
msg_list = []
finished = False
while not finished:
task = self.api.get_import_task(task_id=task[0]['id'])
for msg in task[0]['summary_fields']['task_messages']:
if msg['id'] not in msg_list:
display.display(msg['message_text'], color=colors[msg['message_type']])
msg_list.append(msg['id'])
if task[0]['state'] in ['SUCCESS', 'FAILED']:
finished = True
else:
time.sleep(10)
return 0
def execute_setup(self):
""" Setup an integration from Github or Travis for Ansible Galaxy roles"""
if context.CLIARGS['setup_list']:
# List existing integration secrets
secrets = self.api.list_secrets()
if len(secrets) == 0:
# None found
display.display("No integrations found.")
return 0
display.display(u'\n' + "ID Source Repo", color=C.COLOR_OK)
display.display("---------- ---------- ----------", color=C.COLOR_OK)
for secret in secrets:
display.display("%-10s %-10s %s/%s" % (secret['id'], secret['source'], secret['github_user'],
secret['github_repo']), color=C.COLOR_OK)
return 0
if context.CLIARGS['remove_id']:
# Remove a secret
self.api.remove_secret(context.CLIARGS['remove_id'])
display.display("Secret removed. Integrations using this secret will not longer work.", color=C.COLOR_OK)
return 0
source = context.CLIARGS['source']
github_user = context.CLIARGS['github_user']
github_repo = context.CLIARGS['github_repo']
secret = context.CLIARGS['secret']
resp = self.api.add_secret(source, github_user, github_repo, secret)
display.display("Added integration for %s %s/%s" % (resp['source'], resp['github_user'], resp['github_repo']))
return 0
def execute_delete(self):
""" Delete a role from Ansible Galaxy. """
github_user = context.CLIARGS['github_user']
github_repo = context.CLIARGS['github_repo']
resp = self.api.delete_role(github_user, github_repo)
if len(resp['deleted_roles']) > 1:
display.display("Deleted the following roles:")
display.display("ID User Name")
display.display("------ --------------- ----------")
for role in resp['deleted_roles']:
display.display("%-8s %-15s %s" % (role.id, role.namespace, role.name))
display.display(resp['status'])
return True
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 64,905 |
Add --pre flag for ansible-galaxy client
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
We publish a semver collection (vyos.vyos:0.0.1-dev58) to Ansible Galaxy; however, by default it seems the ansible-galaxy client will not be able to find it. You must specify the version number directly to fetch it.
https://galaxy.ansible.com/vyos/vyos
EDIT:
This is working as expected, but it would be nice to include:
--pre Include pre-release and development versions. By default, ansible-galaxy only finds stable versions.
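For context, below is a minimal sketch of the kind of stable-only filtering that makes a pre-release invisible unless it is pinned explicitly; the regex, helper name and version list are illustrative assumptions, not code from the actual client:
```python
import re

# Strict x.y.z pattern, similar in spirit to distutils' StrictVersion: pre-release
# suffixes such as '-dev58' do not match and are filtered out by default.
STABLE_VERSION_RE = re.compile(r'^\d+\.\d+\.\d+$')

def visible_versions(versions, include_pre=False):
    """Return the versions a stable-only client would consider."""
    if include_pre:
        return list(versions)
    return [v for v in versions if STABLE_VERSION_RE.match(v)]

available = ['0.0.1-dev57', '0.0.1-dev58']            # hypothetical Galaxy response
print(visible_versions(available))                    # [] -> "Cannot meet requirement *"
print(visible_versions(available, include_pre=True))  # what a --pre style flag would expose
```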
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ansible-galaxy
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.1
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
```
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
$ ansible-galaxy collection install vyos.vyos -p /tmp/11123
[WARNING]: The specified collections path '/tmp/11123' is not part of the configured Ansible collections paths '/home/pabelanger/.ansible/collections:/usr/share/ansible/collections'. The
installed collection won't be picked up in an Ansible run.
Process install dependency map
ERROR! Cannot meet requirement * for dependency vyos.vyos from source 'https://galaxy.ansible.com/api/'. Available versions before last requirement added:
Requirements from:
base - 'vyos.vyos:*'
(venv) [pabelanger@localhost openstack]$ ansible-galaxy collection install vyos.vyos:0.0.1-dev57 -p /tmp/11123
[WARNING]: The specified collections path '/tmp/11123' is not part of the configured Ansible collections paths '/home/pabelanger/.ansible/collections:/usr/share/ansible/collections'. The
installed collection won't be picked up in an Ansible run.
Process install dependency map
Starting collection install process
Installing 'vyos.vyos:0.0.1-dev57' to '/tmp/11123/ansible_collections/vyos/vyos'
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
vyos.vyos collection to be found and installed
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
```
|
https://github.com/ansible/ansible/issues/64905
|
https://github.com/ansible/ansible/pull/68258
|
ed9de94ad92dcc07ea3863808e0f4b00f2402cea
|
d3ec31f8d5683926aa6a05bb573d9929a6266fac
| 2019-11-15T17:10:49Z |
python
| 2020-03-23T21:04:07Z |
lib/ansible/galaxy/collection.py
|
# Copyright: (c) 2019, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import fnmatch
import json
import operator
import os
import shutil
import sys
import tarfile
import tempfile
import threading
import time
import yaml
from collections import namedtuple
from contextlib import contextmanager
from distutils.version import LooseVersion, StrictVersion
from hashlib import sha256
from io import BytesIO
from yaml.error import YAMLError
try:
import queue
except ImportError:
import Queue as queue # Python 2
import ansible.constants as C
from ansible.errors import AnsibleError
from ansible.galaxy import get_collections_galaxy_meta_info
from ansible.galaxy.api import CollectionVersionMetadata, GalaxyError
from ansible.galaxy.user_agent import user_agent
from ansible.module_utils import six
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.utils.collection_loader import AnsibleCollectionRef
from ansible.utils.display import Display
from ansible.utils.hashing import secure_hash, secure_hash_s
from ansible.module_utils.urls import open_url
urlparse = six.moves.urllib.parse.urlparse
urllib_error = six.moves.urllib.error
display = Display()
MANIFEST_FORMAT = 1
ModifiedContent = namedtuple('ModifiedContent', ['filename', 'expected', 'installed'])
class CollectionRequirement:
_FILE_MAPPING = [(b'MANIFEST.json', 'manifest_file'), (b'FILES.json', 'files_file')]
def __init__(self, namespace, name, b_path, api, versions, requirement, force, parent=None, metadata=None,
files=None, skip=False):
"""
Represents a collection requirement, the versions that are available to be installed as well as any
dependencies the collection has.
:param namespace: The collection namespace.
:param name: The collection name.
:param b_path: Byte str of the path to the collection tarball if it has already been downloaded.
:param api: The GalaxyAPI to use if the collection is from Galaxy.
:param versions: A list of versions of the collection that are available.
:param requirement: The version requirement string used to verify the list of versions fit the requirements.
:param force: Whether the force flag applied to the collection.
:param parent: The name of the parent the collection is a dependency of.
:param metadata: The galaxy.api.CollectionVersionMetadata that has already been retrieved from the Galaxy
server.
:param files: The files that exist inside the collection. This is based on the FILES.json file inside the
collection artifact.
:param skip: Whether to skip installing the collection. Should be set if the collection is already installed
and force is not set.
"""
self.namespace = namespace
self.name = name
self.b_path = b_path
self.api = api
self.versions = set(versions)
self.force = force
self.skip = skip
self.required_by = []
self._metadata = metadata
self._files = files
self.add_requirement(parent, requirement)
def __str__(self):
return to_native("%s.%s" % (self.namespace, self.name))
def __unicode__(self):
return u"%s.%s" % (self.namespace, self.name)
@property
def metadata(self):
self._get_metadata()
return self._metadata
@property
def latest_version(self):
try:
return max([v for v in self.versions if v != '*'], key=LooseVersion)
except ValueError: # ValueError: max() arg is an empty sequence
return '*'
@property
def dependencies(self):
if not self._metadata:
if len(self.versions) > 1:
return {}
self._get_metadata()
dependencies = self._metadata.dependencies
if dependencies is None:
return {}
return dependencies
def add_requirement(self, parent, requirement):
self.required_by.append((parent, requirement))
new_versions = set(v for v in self.versions if self._meets_requirements(v, requirement, parent))
if len(new_versions) == 0:
if self.skip:
force_flag = '--force-with-deps' if parent else '--force'
version = self.latest_version if self.latest_version != '*' else 'unknown'
msg = "Cannot meet requirement %s:%s as it is already installed at version '%s'. Use %s to overwrite" \
% (to_text(self), requirement, version, force_flag)
raise AnsibleError(msg)
elif parent is None:
msg = "Cannot meet requirement %s for dependency %s" % (requirement, to_text(self))
else:
msg = "Cannot meet dependency requirement '%s:%s' for collection %s" \
% (to_text(self), requirement, parent)
collection_source = to_text(self.b_path, nonstring='passthru') or self.api.api_server
req_by = "\n".join(
"\t%s - '%s:%s'" % (to_text(p) if p else 'base', to_text(self), r)
for p, r in self.required_by
)
versions = ", ".join(sorted(self.versions, key=LooseVersion))
raise AnsibleError(
"%s from source '%s'. Available versions before last requirement added: %s\nRequirements from:\n%s"
% (msg, collection_source, versions, req_by)
)
self.versions = new_versions
def install(self, path, b_temp_path):
if self.skip:
display.display("Skipping '%s' as it is already installed" % to_text(self))
return
# Install if it is not
collection_path = os.path.join(path, self.namespace, self.name)
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
display.display("Installing '%s:%s' to '%s'" % (to_text(self), self.latest_version, collection_path))
if self.b_path is None:
download_url = self._metadata.download_url
artifact_hash = self._metadata.artifact_sha256
headers = {}
self.api._add_auth_token(headers, download_url, required=False)
self.b_path = _download_file(download_url, b_temp_path, artifact_hash, self.api.validate_certs,
headers=headers)
if os.path.exists(b_collection_path):
shutil.rmtree(b_collection_path)
os.makedirs(b_collection_path)
with tarfile.open(self.b_path, mode='r') as collection_tar:
files_member_obj = collection_tar.getmember('FILES.json')
with _tarfile_extract(collection_tar, files_member_obj) as files_obj:
files = json.loads(to_text(files_obj.read(), errors='surrogate_or_strict'))
_extract_tar_file(collection_tar, 'MANIFEST.json', b_collection_path, b_temp_path)
_extract_tar_file(collection_tar, 'FILES.json', b_collection_path, b_temp_path)
for file_info in files['files']:
file_name = file_info['name']
if file_name == '.':
continue
if file_info['ftype'] == 'file':
_extract_tar_file(collection_tar, file_name, b_collection_path, b_temp_path,
expected_hash=file_info['chksum_sha256'])
else:
os.makedirs(os.path.join(b_collection_path, to_bytes(file_name, errors='surrogate_or_strict')))
def set_latest_version(self):
self.versions = set([self.latest_version])
self._get_metadata()
def verify(self, remote_collection, path, b_temp_tar_path):
if not self.skip:
display.display("'%s' has not been installed, nothing to verify" % (to_text(self)))
return
collection_path = os.path.join(path, self.namespace, self.name)
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
display.vvv("Verifying '%s:%s'." % (to_text(self), self.latest_version))
display.vvv("Installed collection found at '%s'" % collection_path)
display.vvv("Remote collection found at '%s'" % remote_collection.metadata.download_url)
# Compare installed version versus requirement version
if self.latest_version != remote_collection.latest_version:
err = "%s has the version '%s' but is being compared to '%s'" % (to_text(self), self.latest_version, remote_collection.latest_version)
display.display(err)
return
modified_content = []
# Verify the manifest hash matches before verifying the file manifest
expected_hash = _get_tar_file_hash(b_temp_tar_path, 'MANIFEST.json')
self._verify_file_hash(b_collection_path, 'MANIFEST.json', expected_hash, modified_content)
manifest = _get_json_from_tar_file(b_temp_tar_path, 'MANIFEST.json')
# Use the manifest to verify the file manifest checksum
file_manifest_data = manifest['file_manifest_file']
file_manifest_filename = file_manifest_data['name']
expected_hash = file_manifest_data['chksum_%s' % file_manifest_data['chksum_type']]
# Verify the file manifest before using it to verify individual files
self._verify_file_hash(b_collection_path, file_manifest_filename, expected_hash, modified_content)
file_manifest = _get_json_from_tar_file(b_temp_tar_path, file_manifest_filename)
# Use the file manifest to verify individual file checksums
for manifest_data in file_manifest['files']:
if manifest_data['ftype'] == 'file':
expected_hash = manifest_data['chksum_%s' % manifest_data['chksum_type']]
self._verify_file_hash(b_collection_path, manifest_data['name'], expected_hash, modified_content)
if modified_content:
display.display("Collection %s contains modified content in the following files:" % to_text(self))
display.display(to_text(self))
display.vvv(to_text(self.b_path))
for content_change in modified_content:
display.display(' %s' % content_change.filename)
display.vvv(" Expected: %s\n Found: %s" % (content_change.expected, content_change.installed))
else:
display.vvv("Successfully verified that checksums for '%s:%s' match the remote collection" % (to_text(self), self.latest_version))
def _verify_file_hash(self, b_path, filename, expected_hash, error_queue):
b_file_path = to_bytes(os.path.join(to_text(b_path), filename), errors='surrogate_or_strict')
if not os.path.isfile(b_file_path):
actual_hash = None
else:
with open(b_file_path, mode='rb') as file_object:
actual_hash = _consume_file(file_object)
if expected_hash != actual_hash:
error_queue.append(ModifiedContent(filename=filename, expected=expected_hash, installed=actual_hash))
def _get_metadata(self):
if self._metadata:
return
self._metadata = self.api.get_collection_version_metadata(self.namespace, self.name, self.latest_version)
def _meets_requirements(self, version, requirements, parent):
"""
Supported version identifiers are '==', '!=', '>', '>=', '<', '<=', and '*'. Multiple requirements are delimited by ','.
"""
op_map = {
'!=': operator.ne,
'==': operator.eq,
'=': operator.eq,
'>=': operator.ge,
'>': operator.gt,
'<=': operator.le,
'<': operator.lt,
}
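# Illustrative note (added comment, not in the original source): a requirement string such as
# '>=1.0.0,<2.0.0,!=1.5.0' is split on ',' below and every clause must hold for a
# candidate version to be accepted.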
for req in list(requirements.split(',')):
op_pos = 2 if len(req) > 1 and req[1] == '=' else 1
op = op_map.get(req[:op_pos])
requirement = req[op_pos:]
if not op:
requirement = req
op = operator.eq
# In the case we are checking a new requirement on a base requirement (parent != None) we can't accept
# version as '*' (unknown version) unless the requirement is also '*'.
if parent and version == '*' and requirement != '*':
display.warning("Failed to validate the collection requirement '%s:%s' for %s when the existing "
"install does not have a version set, the collection may not work."
% (to_text(self), req, parent))
continue
elif requirement == '*' or version == '*':
continue
if not op(LooseVersion(version), LooseVersion(requirement)):
break
else:
return True
# The loop was broken early, it does not meet all the requirements
return False
@staticmethod
def from_tar(b_path, force, parent=None):
if not tarfile.is_tarfile(b_path):
raise AnsibleError("Collection artifact at '%s' is not a valid tar file." % to_native(b_path))
info = {}
with tarfile.open(b_path, mode='r') as collection_tar:
for b_member_name, property_name in CollectionRequirement._FILE_MAPPING:
n_member_name = to_native(b_member_name)
try:
member = collection_tar.getmember(n_member_name)
except KeyError:
raise AnsibleError("Collection at '%s' does not contain the required file %s."
% (to_native(b_path), n_member_name))
with _tarfile_extract(collection_tar, member) as member_obj:
try:
info[property_name] = json.loads(to_text(member_obj.read(), errors='surrogate_or_strict'))
except ValueError:
raise AnsibleError("Collection tar file member %s does not contain a valid json string."
% n_member_name)
meta = info['manifest_file']['collection_info']
files = info['files_file']['files']
namespace = meta['namespace']
name = meta['name']
version = meta['version']
meta = CollectionVersionMetadata(namespace, name, version, None, None, meta['dependencies'])
return CollectionRequirement(namespace, name, b_path, None, [version], version, force, parent=parent,
metadata=meta, files=files)
@staticmethod
def from_path(b_path, force, parent=None):
info = {}
for b_file_name, property_name in CollectionRequirement._FILE_MAPPING:
b_file_path = os.path.join(b_path, b_file_name)
if not os.path.exists(b_file_path):
continue
with open(b_file_path, 'rb') as file_obj:
try:
info[property_name] = json.loads(to_text(file_obj.read(), errors='surrogate_or_strict'))
except ValueError:
raise AnsibleError("Collection file at '%s' does not contain a valid json string."
% to_native(b_file_path))
if 'manifest_file' in info:
manifest = info['manifest_file']['collection_info']
namespace = manifest['namespace']
name = manifest['name']
version = to_text(manifest['version'], errors='surrogate_or_strict')
if not hasattr(LooseVersion(version), 'version'):
display.warning("Collection at '%s' does not have a valid version set, falling back to '*'. Found "
"version: '%s'" % (to_text(b_path), version))
version = '*'
dependencies = manifest['dependencies']
else:
display.warning("Collection at '%s' does not have a MANIFEST.json file, cannot detect version."
% to_text(b_path))
parent_dir, name = os.path.split(to_text(b_path, errors='surrogate_or_strict'))
namespace = os.path.split(parent_dir)[1]
version = '*'
dependencies = {}
meta = CollectionVersionMetadata(namespace, name, version, None, None, dependencies)
files = info.get('files_file', {}).get('files', {})
return CollectionRequirement(namespace, name, b_path, None, [version], version, force, parent=parent,
metadata=meta, files=files, skip=True)
@staticmethod
def from_name(collection, apis, requirement, force, parent=None):
namespace, name = collection.split('.', 1)
galaxy_meta = None
for api in apis:
try:
if not (requirement == '*' or requirement.startswith('<') or requirement.startswith('>') or
requirement.startswith('!=')):
if requirement.startswith('='):
requirement = requirement.lstrip('=')
resp = api.get_collection_version_metadata(namespace, name, requirement)
galaxy_meta = resp
versions = [resp.version]
else:
resp = api.get_collection_versions(namespace, name)
# Galaxy supports semver but ansible-galaxy does not. We ignore any versions that don't match
# StrictVersion (x.y.z) and only support pre-releases if an explicit version was set (done above).
versions = [v for v in resp if StrictVersion.version_re.match(v)]
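# Illustrative note (added comment, not in the original source): '1.2.3' matches the
# x.y.z pattern and is kept, while a pre-release like '0.0.1-dev58' does not match and
# is dropped, so it can only be installed by requesting that exact version.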
except GalaxyError as err:
if err.http_code == 404:
display.vvv("Collection '%s' is not available from server %s %s"
% (collection, api.name, api.api_server))
continue
raise
display.vvv("Collection '%s' obtained from server %s %s" % (collection, api.name, api.api_server))
break
else:
raise AnsibleError("Failed to find collection %s:%s" % (collection, requirement))
req = CollectionRequirement(namespace, name, None, api, versions, requirement, force, parent=parent,
metadata=galaxy_meta)
return req
def build_collection(collection_path, output_path, force):
"""
Creates the Ansible collection artifact in a .tar.gz file.
:param collection_path: The path to the collection to build. This should be the directory that contains the
galaxy.yml file.
:param output_path: The path to create the collection build artifact. This should be a directory.
:param force: Whether to overwrite an existing collection build artifact or fail.
:return: The path to the collection build artifact.
"""
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
b_galaxy_path = os.path.join(b_collection_path, b'galaxy.yml')
if not os.path.exists(b_galaxy_path):
raise AnsibleError("The collection galaxy.yml path '%s' does not exist." % to_native(b_galaxy_path))
collection_meta = _get_galaxy_yml(b_galaxy_path)
file_manifest = _build_files_manifest(b_collection_path, collection_meta['namespace'], collection_meta['name'],
collection_meta['build_ignore'])
collection_manifest = _build_manifest(**collection_meta)
collection_output = os.path.join(output_path, "%s-%s-%s.tar.gz" % (collection_meta['namespace'],
collection_meta['name'],
collection_meta['version']))
b_collection_output = to_bytes(collection_output, errors='surrogate_or_strict')
if os.path.exists(b_collection_output):
if os.path.isdir(b_collection_output):
raise AnsibleError("The output collection artifact '%s' already exists, "
"but is a directory - aborting" % to_native(collection_output))
elif not force:
raise AnsibleError("The file '%s' already exists. You can use --force to re-create "
"the collection artifact." % to_native(collection_output))
_build_collection_tar(b_collection_path, b_collection_output, collection_manifest, file_manifest)
def publish_collection(collection_path, api, wait, timeout):
"""
Publish an Ansible collection tarball into an Ansible Galaxy server.
:param collection_path: The path to the collection tarball to publish.
:param api: A GalaxyAPI to publish the collection to.
:param wait: Whether to wait until the import process is complete.
:param timeout: The time in seconds to wait for the import process to finish, 0 is indefinite.
"""
import_uri = api.publish_collection(collection_path)
if wait:
# Galaxy returns a url fragment which differs between v2 and v3. The second to last entry is
# always the task_id, though.
# v2: {"task": "https://galaxy-dev.ansible.com/api/v2/collection-imports/35573/"}
# v3: {"task": "/api/automation-hub/v3/imports/collections/838d1308-a8f4-402c-95cb-7823f3806cd8/"}
task_id = None
for path_segment in reversed(import_uri.split('/')):
if path_segment:
task_id = path_segment
break
if not task_id:
raise AnsibleError("Publishing the collection did not return valid task info. Cannot wait for task status. Returned task info: '%s'" % import_uri)
display.display("Collection has been published to the Galaxy server %s %s" % (api.name, api.api_server))
with _display_progress():
api.wait_import_task(task_id, timeout)
display.display("Collection has been successfully published and imported to the Galaxy server %s %s"
% (api.name, api.api_server))
else:
display.display("Collection has been pushed to the Galaxy server %s %s, not waiting until import has "
"completed due to --no-wait being set. Import task results can be found at %s"
% (api.name, api.api_server, import_uri))
def install_collections(collections, output_path, apis, validate_certs, ignore_errors, no_deps, force, force_deps):
"""
Install Ansible collections to the path specified.
:param collections: The collections to install, should be a list of tuples with (name, requirement, Galaxy server).
:param output_path: The path to install the collections to.
:param apis: A list of GalaxyAPIs to query when searching for a collection.
:param validate_certs: Whether to validate the certificates if downloading a tarball.
:param ignore_errors: Whether to ignore any errors when installing the collection.
:param no_deps: Ignore any collection dependencies and only install the base requirements.
:param force: Re-install a collection if it has already been installed.
:param force_deps: Re-install a collection as well as its dependencies if they have already been installed.
"""
existing_collections = find_existing_collections(output_path)
with _tempdir() as b_temp_path:
display.display("Process install dependency map")
with _display_progress():
dependency_map = _build_dependency_map(collections, existing_collections, b_temp_path, apis,
validate_certs, force, force_deps, no_deps)
display.display("Starting collection install process")
with _display_progress():
for collection in dependency_map.values():
try:
collection.install(output_path, b_temp_path)
except AnsibleError as err:
if ignore_errors:
display.warning("Failed to install collection %s but skipping due to --ignore-errors being set. "
"Error: %s" % (to_text(collection), to_text(err)))
else:
raise
def validate_collection_name(name):
"""
Validates that a collection name, as entered by the user or read from a requirements file, fits the expected format.
:param name: The input name with optional range specifier split by ':'.
:return: The input value, required for argparse validation.
"""
collection, dummy, dummy = name.partition(':')
if AnsibleCollectionRef.is_valid_collection_name(collection):
return name
raise AnsibleError("Invalid collection name '%s', "
"name must be in the format <namespace>.<collection>. \n"
"Please make sure namespace and collection name contains "
"characters from [a-zA-Z0-9_] only." % name)
def validate_collection_path(collection_path):
""" Ensure a given path ends with 'ansible_collections'
:param collection_path: The path that should end in 'ansible_collections'
:return: collection_path with 'ansible_collections' appended if it was not already the final component.
"""
if os.path.split(collection_path)[1] != 'ansible_collections':
return os.path.join(collection_path, 'ansible_collections')
return collection_path
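# Illustrative example (added comment): validate_collection_path('/usr/share/ansible/collections')
# returns '/usr/share/ansible/collections/ansible_collections'; a path already ending in
# 'ansible_collections' is returned unchanged.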
def verify_collections(collections, search_paths, apis, validate_certs, ignore_errors):
with _display_progress():
with _tempdir() as b_temp_path:
for collection in collections:
try:
local_collection = None
b_collection = to_bytes(collection[0], errors='surrogate_or_strict')
if os.path.isfile(b_collection) or urlparse(collection[0]).scheme.lower() in ['http', 'https'] or len(collection[0].split('.')) != 2:
raise AnsibleError(message="'%s' is not a valid collection name. The format namespace.name is expected." % collection[0])
collection_name = collection[0]
namespace, name = collection_name.split('.')
collection_version = collection[1]
# Verify local collection exists before downloading it from a galaxy server
for search_path in search_paths:
b_search_path = to_bytes(os.path.join(search_path, namespace, name), errors='surrogate_or_strict')
if os.path.isdir(b_search_path):
local_collection = CollectionRequirement.from_path(b_search_path, False)
break
if local_collection is None:
raise AnsibleError(message='Collection %s is not installed in any of the collection paths.' % collection_name)
# Download collection on a galaxy server for comparison
try:
remote_collection = CollectionRequirement.from_name(collection_name, apis, collection_version, False, parent=None)
except AnsibleError as e:
if e.message == 'Failed to find collection %s:%s' % (collection[0], collection[1]):
raise AnsibleError('Failed to find remote collection %s:%s on any of the galaxy servers' % (collection[0], collection[1]))
raise
download_url = remote_collection.metadata.download_url
headers = {}
remote_collection.api._add_auth_token(headers, download_url, required=False)
b_temp_tar_path = _download_file(download_url, b_temp_path, None, validate_certs, headers=headers)
local_collection.verify(remote_collection, search_path, b_temp_tar_path)
except AnsibleError as err:
if ignore_errors:
display.warning("Failed to verify collection %s but skipping due to --ignore-errors being set. "
"Error: %s" % (collection[0], to_text(err)))
else:
raise
@contextmanager
def _tempdir():
b_temp_path = tempfile.mkdtemp(dir=to_bytes(C.DEFAULT_LOCAL_TMP, errors='surrogate_or_strict'))
yield b_temp_path
shutil.rmtree(b_temp_path)
@contextmanager
def _tarfile_extract(tar, member):
tar_obj = tar.extractfile(member)
yield tar_obj
tar_obj.close()
@contextmanager
def _display_progress():
config_display = C.GALAXY_DISPLAY_PROGRESS
display_wheel = sys.stdout.isatty() if config_display is None else config_display
if not display_wheel:
yield
return
def progress(display_queue, actual_display):
actual_display.debug("Starting display_progress display thread")
t = threading.current_thread()
while True:
for c in "|/-\\":
actual_display.display(c + "\b", newline=False)
time.sleep(0.1)
# Display a message from the main thread
while True:
try:
method, args, kwargs = display_queue.get(block=False, timeout=0.1)
except queue.Empty:
break
else:
func = getattr(actual_display, method)
func(*args, **kwargs)
if getattr(t, "finish", False):
actual_display.debug("Received end signal for display_progress display thread")
return
class DisplayThread(object):
def __init__(self, display_queue):
self.display_queue = display_queue
def __getattr__(self, attr):
def call_display(*args, **kwargs):
self.display_queue.put((attr, args, kwargs))
return call_display
# Temporarily override the global display class with our own, which adds the calls to a queue for the thread to call.
global display
old_display = display
try:
display_queue = queue.Queue()
display = DisplayThread(display_queue)
t = threading.Thread(target=progress, args=(display_queue, old_display))
t.daemon = True
t.start()
try:
yield
finally:
t.finish = True
t.join()
except Exception:
# The exception is re-raised so we can be sure the thread is finished and no longer using the display
raise
finally:
display = old_display
def _get_galaxy_yml(b_galaxy_yml_path):
meta_info = get_collections_galaxy_meta_info()
mandatory_keys = set()
string_keys = set()
list_keys = set()
dict_keys = set()
for info in meta_info:
if info.get('required', False):
mandatory_keys.add(info['key'])
key_list_type = {
'str': string_keys,
'list': list_keys,
'dict': dict_keys,
}[info.get('type', 'str')]
key_list_type.add(info['key'])
all_keys = frozenset(list(mandatory_keys) + list(string_keys) + list(list_keys) + list(dict_keys))
try:
with open(b_galaxy_yml_path, 'rb') as g_yaml:
galaxy_yml = yaml.safe_load(g_yaml)
except YAMLError as err:
raise AnsibleError("Failed to parse the galaxy.yml at '%s' with the following error:\n%s"
% (to_native(b_galaxy_yml_path), to_native(err)))
set_keys = set(galaxy_yml.keys())
missing_keys = mandatory_keys.difference(set_keys)
if missing_keys:
raise AnsibleError("The collection galaxy.yml at '%s' is missing the following mandatory keys: %s"
% (to_native(b_galaxy_yml_path), ", ".join(sorted(missing_keys))))
extra_keys = set_keys.difference(all_keys)
if len(extra_keys) > 0:
display.warning("Found unknown keys in collection galaxy.yml at '%s': %s"
% (to_text(b_galaxy_yml_path), ", ".join(extra_keys)))
# Add the defaults if they have not been set
for optional_string in string_keys:
if optional_string not in galaxy_yml:
galaxy_yml[optional_string] = None
for optional_list in list_keys:
list_val = galaxy_yml.get(optional_list, None)
if list_val is None:
galaxy_yml[optional_list] = []
elif not isinstance(list_val, list):
galaxy_yml[optional_list] = [list_val]
for optional_dict in dict_keys:
if optional_dict not in galaxy_yml:
galaxy_yml[optional_dict] = {}
# 'license' is a builtin name in Python; to avoid confusion we just rename it to license_ids
galaxy_yml['license_ids'] = galaxy_yml['license']
del galaxy_yml['license']
return galaxy_yml
def _build_files_manifest(b_collection_path, namespace, name, ignore_patterns):
# We always ignore .pyc and .retry files as well as some well known version control directories. The ignore
# patterns can be extended by the build_ignore key in galaxy.yml
b_ignore_patterns = [
b'galaxy.yml',
b'*.pyc',
b'*.retry',
b'tests/output', # Ignore ansible-test result output directory.
to_bytes('{0}-{1}-*.tar.gz'.format(namespace, name)), # Ignores previously built artifacts in the root dir.
]
b_ignore_patterns += [to_bytes(p) for p in ignore_patterns]
b_ignore_dirs = frozenset([b'CVS', b'.bzr', b'.hg', b'.git', b'.svn', b'__pycache__', b'.tox'])
entry_template = {
'name': None,
'ftype': None,
'chksum_type': None,
'chksum_sha256': None,
'format': MANIFEST_FORMAT
}
manifest = {
'files': [
{
'name': '.',
'ftype': 'dir',
'chksum_type': None,
'chksum_sha256': None,
'format': MANIFEST_FORMAT,
},
],
'format': MANIFEST_FORMAT,
}
def _walk(b_path, b_top_level_dir):
for b_item in os.listdir(b_path):
b_abs_path = os.path.join(b_path, b_item)
b_rel_base_dir = b'' if b_path == b_top_level_dir else b_path[len(b_top_level_dir) + 1:]
b_rel_path = os.path.join(b_rel_base_dir, b_item)
rel_path = to_text(b_rel_path, errors='surrogate_or_strict')
if os.path.isdir(b_abs_path):
if any(b_item == b_path for b_path in b_ignore_dirs) or \
any(fnmatch.fnmatch(b_rel_path, b_pattern) for b_pattern in b_ignore_patterns):
display.vvv("Skipping '%s' for collection build" % to_text(b_abs_path))
continue
if os.path.islink(b_abs_path):
b_link_target = os.path.realpath(b_abs_path)
if not b_link_target.startswith(b_top_level_dir):
display.warning("Skipping '%s' as it is a symbolic link to a directory outside the collection"
% to_text(b_abs_path))
continue
manifest_entry = entry_template.copy()
manifest_entry['name'] = rel_path
manifest_entry['ftype'] = 'dir'
manifest['files'].append(manifest_entry)
_walk(b_abs_path, b_top_level_dir)
else:
if any(fnmatch.fnmatch(b_rel_path, b_pattern) for b_pattern in b_ignore_patterns):
display.vvv("Skipping '%s' for collection build" % to_text(b_abs_path))
continue
manifest_entry = entry_template.copy()
manifest_entry['name'] = rel_path
manifest_entry['ftype'] = 'file'
manifest_entry['chksum_type'] = 'sha256'
manifest_entry['chksum_sha256'] = secure_hash(b_abs_path, hash_func=sha256)
manifest['files'].append(manifest_entry)
_walk(b_collection_path, b_collection_path)
return manifest
def _build_manifest(namespace, name, version, authors, readme, tags, description, license_ids, license_file,
dependencies, repository, documentation, homepage, issues, **kwargs):
manifest = {
'collection_info': {
'namespace': namespace,
'name': name,
'version': version,
'authors': authors,
'readme': readme,
'tags': tags,
'description': description,
'license': license_ids,
'license_file': license_file if license_file else None, # Handle galaxy.yml having an empty string (None)
'dependencies': dependencies,
'repository': repository,
'documentation': documentation,
'homepage': homepage,
'issues': issues,
},
'file_manifest_file': {
'name': 'FILES.json',
'ftype': 'file',
'chksum_type': 'sha256',
'chksum_sha256': None, # Filled out in _build_collection_tar
'format': MANIFEST_FORMAT
},
'format': MANIFEST_FORMAT,
}
return manifest
def _build_collection_tar(b_collection_path, b_tar_path, collection_manifest, file_manifest):
files_manifest_json = to_bytes(json.dumps(file_manifest, indent=True), errors='surrogate_or_strict')
collection_manifest['file_manifest_file']['chksum_sha256'] = secure_hash_s(files_manifest_json, hash_func=sha256)
collection_manifest_json = to_bytes(json.dumps(collection_manifest, indent=True), errors='surrogate_or_strict')
with _tempdir() as b_temp_path:
b_tar_filepath = os.path.join(b_temp_path, os.path.basename(b_tar_path))
with tarfile.open(b_tar_filepath, mode='w:gz') as tar_file:
# Add the MANIFEST.json and FILES.json file to the archive
for name, b in [('MANIFEST.json', collection_manifest_json), ('FILES.json', files_manifest_json)]:
b_io = BytesIO(b)
tar_info = tarfile.TarInfo(name)
tar_info.size = len(b)
tar_info.mtime = time.time()
tar_info.mode = 0o0644
tar_file.addfile(tarinfo=tar_info, fileobj=b_io)
for file_info in file_manifest['files']:
if file_info['name'] == '.':
continue
# arcname expects a native string, cannot be bytes
filename = to_native(file_info['name'], errors='surrogate_or_strict')
b_src_path = os.path.join(b_collection_path, to_bytes(filename, errors='surrogate_or_strict'))
def reset_stat(tarinfo):
tarinfo.mode = 0o0755 if tarinfo.isdir() else 0o0644
tarinfo.uid = tarinfo.gid = 0
tarinfo.uname = tarinfo.gname = ''
return tarinfo
tar_file.add(os.path.realpath(b_src_path), arcname=filename, recursive=False, filter=reset_stat)
shutil.copy(b_tar_filepath, b_tar_path)
collection_name = "%s.%s" % (collection_manifest['collection_info']['namespace'],
collection_manifest['collection_info']['name'])
display.display('Created collection for %s at %s' % (collection_name, to_text(b_tar_path)))
def find_existing_collections(path):
collections = []
b_path = to_bytes(path, errors='surrogate_or_strict')
for b_namespace in os.listdir(b_path):
b_namespace_path = os.path.join(b_path, b_namespace)
if os.path.isfile(b_namespace_path):
continue
for b_collection in os.listdir(b_namespace_path):
b_collection_path = os.path.join(b_namespace_path, b_collection)
if os.path.isdir(b_collection_path):
req = CollectionRequirement.from_path(b_collection_path, False)
display.vvv("Found installed collection %s:%s at '%s'" % (to_text(req), req.latest_version,
to_text(b_collection_path)))
collections.append(req)
return collections
def _build_dependency_map(collections, existing_collections, b_temp_path, apis, validate_certs, force, force_deps,
no_deps):
dependency_map = {}
# First build the dependency map on the actual requirements
for name, version, source in collections:
_get_collection_info(dependency_map, existing_collections, name, version, source, b_temp_path, apis,
validate_certs, (force or force_deps))
checked_parents = set([to_text(c) for c in dependency_map.values() if c.skip])
while len(dependency_map) != len(checked_parents):
while not no_deps: # Only parse dependencies if no_deps was not set
parents_to_check = set(dependency_map.keys()).difference(checked_parents)
deps_exhausted = True
for parent in parents_to_check:
parent_info = dependency_map[parent]
if parent_info.dependencies:
deps_exhausted = False
for dep_name, dep_requirement in parent_info.dependencies.items():
_get_collection_info(dependency_map, existing_collections, dep_name, dep_requirement,
parent_info.api, b_temp_path, apis, validate_certs, force_deps,
parent=parent)
checked_parents.add(parent)
# No extra dependencies were resolved, exit loop
if deps_exhausted:
break
# Now we have resolved the deps to our best extent, now select the latest version for collections with
# multiple versions found and go from there
deps_not_checked = set(dependency_map.keys()).difference(checked_parents)
for collection in deps_not_checked:
dependency_map[collection].set_latest_version()
if no_deps or len(dependency_map[collection].dependencies) == 0:
checked_parents.add(collection)
return dependency_map
def _get_collection_info(dep_map, existing_collections, collection, requirement, source, b_temp_path, apis,
validate_certs, force, parent=None):
dep_msg = ""
if parent:
dep_msg = " - as dependency of %s" % parent
display.vvv("Processing requirement collection '%s'%s" % (to_text(collection), dep_msg))
b_tar_path = None
if os.path.isfile(to_bytes(collection, errors='surrogate_or_strict')):
display.vvvv("Collection requirement '%s' is a tar artifact" % to_text(collection))
b_tar_path = to_bytes(collection, errors='surrogate_or_strict')
elif urlparse(collection).scheme.lower() in ['http', 'https']:
display.vvvv("Collection requirement '%s' is a URL to a tar artifact" % collection)
try:
b_tar_path = _download_file(collection, b_temp_path, None, validate_certs)
except urllib_error.URLError as err:
raise AnsibleError("Failed to download collection tar from '%s': %s"
% (to_native(collection), to_native(err)))
if b_tar_path:
req = CollectionRequirement.from_tar(b_tar_path, force, parent=parent)
collection_name = to_text(req)
if collection_name in dep_map:
collection_info = dep_map[collection_name]
collection_info.add_requirement(None, req.latest_version)
else:
collection_info = req
else:
validate_collection_name(collection)
display.vvvv("Collection requirement '%s' is the name of a collection" % collection)
if collection in dep_map:
collection_info = dep_map[collection]
collection_info.add_requirement(parent, requirement)
else:
apis = [source] if source else apis
collection_info = CollectionRequirement.from_name(collection, apis, requirement, force, parent=parent)
existing = [c for c in existing_collections if to_text(c) == to_text(collection_info)]
if existing and not collection_info.force:
# Test that the installed collection fits the requirement
existing[0].add_requirement(parent, requirement)
collection_info = existing[0]
dep_map[to_text(collection_info)] = collection_info
def _download_file(url, b_path, expected_hash, validate_certs, headers=None):
urlsplit = os.path.splitext(to_text(url.rsplit('/', 1)[1]))
b_file_name = to_bytes(urlsplit[0], errors='surrogate_or_strict')
b_file_ext = to_bytes(urlsplit[1], errors='surrogate_or_strict')
b_file_path = tempfile.NamedTemporaryFile(dir=b_path, prefix=b_file_name, suffix=b_file_ext, delete=False).name
display.vvv("Downloading %s to %s" % (url, to_text(b_path)))
# Galaxy redirects downloads to S3, which rejects the request if an Authorization header is attached, so don't forward that header on the redirect
resp = open_url(to_native(url, errors='surrogate_or_strict'), validate_certs=validate_certs, headers=headers,
unredirected_headers=['Authorization'], http_agent=user_agent())
with open(b_file_path, 'wb') as download_file:
actual_hash = _consume_file(resp, download_file)
if expected_hash:
display.vvvv("Validating downloaded file hash %s with expected hash %s" % (actual_hash, expected_hash))
if expected_hash != actual_hash:
raise AnsibleError("Mismatch artifact hash with downloaded file")
return b_file_path
def _extract_tar_file(tar, filename, b_dest, b_temp_path, expected_hash=None):
with _get_tar_file_member(tar, filename) as tar_obj:
with tempfile.NamedTemporaryFile(dir=b_temp_path, delete=False) as tmpfile_obj:
actual_hash = _consume_file(tar_obj, tmpfile_obj)
if expected_hash and actual_hash != expected_hash:
raise AnsibleError("Checksum mismatch for '%s' inside collection at '%s'"
% (to_native(filename, errors='surrogate_or_strict'), to_native(tar.name)))
b_dest_filepath = os.path.join(b_dest, to_bytes(filename, errors='surrogate_or_strict'))
b_parent_dir = os.path.split(b_dest_filepath)[0]
if not os.path.exists(b_parent_dir):
# Seems like Galaxy does not validate if all file entries have a corresponding dir ftype entry. This check
# makes sure we create the parent directory even if it wasn't set in the metadata.
os.makedirs(b_parent_dir)
shutil.move(to_bytes(tmpfile_obj.name, errors='surrogate_or_strict'), b_dest_filepath)
def _get_tar_file_member(tar, filename):
n_filename = to_native(filename, errors='surrogate_or_strict')
try:
member = tar.getmember(n_filename)
except KeyError:
raise AnsibleError("Collection tar at '%s' does not contain the expected file '%s'." % (
to_native(tar.name),
n_filename))
return _tarfile_extract(tar, member)
def _get_json_from_tar_file(b_path, filename):
file_contents = ''
with tarfile.open(b_path, mode='r') as collection_tar:
with _get_tar_file_member(collection_tar, filename) as tar_obj:
bufsize = 65536
data = tar_obj.read(bufsize)
while data:
file_contents += to_text(data)
data = tar_obj.read(bufsize)
return json.loads(file_contents)
def _get_tar_file_hash(b_path, filename):
with tarfile.open(b_path, mode='r') as collection_tar:
with _get_tar_file_member(collection_tar, filename) as tar_obj:
return _consume_file(tar_obj)
def _consume_file(read_from, write_to=None):
bufsize = 65536
sha256_digest = sha256()
data = read_from.read(bufsize)
while data:
if write_to is not None:
write_to.write(data)
write_to.flush()
sha256_digest.update(data)
data = read_from.read(bufsize)
return sha256_digest.hexdigest()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 64,905 |
Add --pre flag for ansible-galaxy client
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
We publish a semver collection (vyos.vyos:0.0.1-dev58) to Ansible Galaxy; however, by default it seems the ansible-galaxy client will not be able to find it. You must specify the version number directly to fetch it.
https://galaxy.ansible.com/vyos/vyos
EDIT:
This is working as expected, but it would be nice to include:
--pre Include pre-release and development versions. By default, ansible-galaxy only finds stable versions.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ansible-galaxy
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.1
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
```
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
$ ansible-galaxy collection install vyos.vyos -p /tmp/11123
[WARNING]: The specified collections path '/tmp/11123' is not part of the configured Ansible collections paths '/home/pabelanger/.ansible/collections:/usr/share/ansible/collections'. The
installed collection won't be picked up in an Ansible run.
Process install dependency map
ERROR! Cannot meet requirement * for dependency vyos.vyos from source 'https://galaxy.ansible.com/api/'. Available versions before last requirement added:
Requirements from:
base - 'vyos.vyos:*'
(venv) [pabelanger@localhost openstack]$ ansible-galaxy collection install vyos.vyos:0.0.1-dev57 -p /tmp/11123
[WARNING]: The specified collections path '/tmp/11123' is not part of the configured Ansible collections paths '/home/pabelanger/.ansible/collections:/usr/share/ansible/collections'. The
installed collection won't be picked up in an Ansible run.
Process install dependency map
Starting collection install process
Installing 'vyos.vyos:0.0.1-dev57' to '/tmp/11123/ansible_collections/vyos/vyos'
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
vyos.vyos collection to be found and installed
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
```
|
https://github.com/ansible/ansible/issues/64905
|
https://github.com/ansible/ansible/pull/68258
|
ed9de94ad92dcc07ea3863808e0f4b00f2402cea
|
d3ec31f8d5683926aa6a05bb573d9929a6266fac
| 2019-11-15T17:10:49Z |
python
| 2020-03-23T21:04:07Z |
lib/ansible/utils/version.py
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 64,905 |
Add --pre flag for ansible-galaxy client
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
We publish a semver collection (vyos.vyos:0.0.1-dev58) to Ansible Galaxy; however, by default the ansible-galaxy client will not be able to find it. You must specify the version number directly to fetch it.
https://galaxy.ansible.com/vyos/vyos
EDIT:
This is working as expected, but it would be nice to include:
--pre Include pre-release and development versions. By default, ansible-galaxy only finds stable versions.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ansible-galaxy
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.1
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
```
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
$ ansible-galaxy collection install vyos.vyos -p /tmp/11123
[WARNING]: The specified collections path '/tmp/11123' is not part of the configured Ansible collections paths '/home/pabelanger/.ansible/collections:/usr/share/ansible/collections'. The
installed collection won't be picked up in an Ansible run.
Process install dependency map
ERROR! Cannot meet requirement * for dependency vyos.vyos from source 'https://galaxy.ansible.com/api/'. Available versions before last requirement added:
Requirements from:
base - 'vyos.vyos:*'
(venv) [pabelanger@localhost openstack]$ ansible-galaxy collection install vyos.vyos:0.0.1-dev57 -p /tmp/11123
[WARNING]: The specified collections path '/tmp/11123' is not part of the configured Ansible collections paths '/home/pabelanger/.ansible/collections:/usr/share/ansible/collections'. The
installed collection won't be picked up in an Ansible run.
Process install dependency map
Starting collection install process
Installing 'vyos.vyos:0.0.1-dev57' to '/tmp/11123/ansible_collections/vyos/vyos'
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
vyos.vyos collection to be found and installed
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
```
|
https://github.com/ansible/ansible/issues/64905
|
https://github.com/ansible/ansible/pull/68258
|
ed9de94ad92dcc07ea3863808e0f4b00f2402cea
|
d3ec31f8d5683926aa6a05bb573d9929a6266fac
| 2019-11-15T17:10:49Z |
python
| 2020-03-23T21:04:07Z |
test/integration/targets/ansible-galaxy-collection/tasks/install.yml
|
---
- name: create test collection install directory - {{ test_name }}
file:
path: '{{ galaxy_dir }}/ansible_collections'
state: directory
- name: install simple collection with implicit path - {{ test_name }}
command: ansible-galaxy collection install namespace1.name1 -s '{{ test_server }}'
environment:
ANSIBLE_COLLECTIONS_PATHS: '{{ galaxy_dir }}/ansible_collections'
register: install_normal
- name: get installed files of install simple collection with implicit path - {{ test_name }}
find:
path: '{{ galaxy_dir }}/ansible_collections/namespace1/name1'
file_type: file
register: install_normal_files
- name: get the manifest of install simple collection with implicit path - {{ test_name }}
slurp:
path: '{{ galaxy_dir }}/ansible_collections/namespace1/name1/MANIFEST.json'
register: install_normal_manifest
- name: assert install simple collection with implicit path - {{ test_name }}
assert:
that:
- '"Installing ''namespace1.name1:1.0.9'' to" in install_normal.stdout'
- install_normal_files.files | length == 3
- install_normal_files.files[0].path | basename in ['MANIFEST.json', 'FILES.json', 'README.md']
- install_normal_files.files[1].path | basename in ['MANIFEST.json', 'FILES.json', 'README.md']
- install_normal_files.files[2].path | basename in ['MANIFEST.json', 'FILES.json', 'README.md']
- (install_normal_manifest.content | b64decode | from_json).collection_info.version == '1.0.9'
- name: install existing without --force - {{ test_name }}
command: ansible-galaxy collection install namespace1.name1 -s '{{ test_server }}'
environment:
ANSIBLE_COLLECTIONS_PATHS: '{{ galaxy_dir }}/ansible_collections'
register: install_existing_no_force
- name: assert install existing without --force - {{ test_name }}
assert:
that:
- '"Skipping ''namespace1.name1'' as it is already installed" in install_existing_no_force.stdout'
- name: install existing with --force - {{ test_name }}
command: ansible-galaxy collection install namespace1.name1 -s '{{ test_server }}' --force
environment:
ANSIBLE_COLLECTIONS_PATH: '{{ galaxy_dir }}/ansible_collections'
register: install_existing_force
- name: assert install existing with --force - {{ test_name }}
assert:
that:
- '"Installing ''namespace1.name1:1.0.9'' to" in install_existing_force.stdout'
- name: remove test installed collection - {{ test_name }}
file:
path: '{{ galaxy_dir }}/ansible_collections/namespace1'
state: absent
- name: install pre-release as explicit version to custom dir - {{ test_name }}
command: ansible-galaxy collection install 'namespace1.name1:1.1.0-beta.1' -s '{{ test_server }}' -p '{{ galaxy_dir }}/ansible_collections'
register: install_prerelease
- name: get result of install pre-release as explicit version to custom dir - {{ test_name }}
slurp:
path: '{{ galaxy_dir }}/ansible_collections/namespace1/name1/MANIFEST.json'
register: install_prerelease_actual
- name: assert install pre-release as explicit version to custom dir - {{ test_name }}
assert:
that:
- '"Installing ''namespace1.name1:1.1.0-beta.1'' to" in install_prerelease.stdout'
- (install_prerelease_actual.content | b64decode | from_json).collection_info.version == '1.1.0-beta.1'
- name: install multiple collections with dependencies - {{ test_name }}
command: ansible-galaxy collection install parent_dep.parent_collection namespace2.name -s {{ test_name }}
args:
chdir: '{{ galaxy_dir }}/ansible_collections'
environment:
ANSIBLE_COLLECTIONS_PATHS: '{{ galaxy_dir }}/ansible_collections'
ANSIBLE_CONFIG: '{{ galaxy_dir }}/ansible.cfg'
register: install_multiple_with_dep
- name: get result of install multiple collections with dependencies - {{ test_name }}
slurp:
path: '{{ galaxy_dir }}/ansible_collections/{{ collection.namespace }}/{{ collection.name }}/MANIFEST.json'
register: install_multiple_with_dep_actual
loop_control:
loop_var: collection
loop:
- namespace: namespace2
name: name
- namespace: parent_dep
name: parent_collection
- namespace: child_dep
name: child_collection
- namespace: child_dep
name: child_dep2
- name: assert install multiple collections with dependencies - {{ test_name }}
assert:
that:
- (install_multiple_with_dep_actual.results[0].content | b64decode | from_json).collection_info.version == '1.0.0'
- (install_multiple_with_dep_actual.results[1].content | b64decode | from_json).collection_info.version == '1.0.0'
- (install_multiple_with_dep_actual.results[2].content | b64decode | from_json).collection_info.version == '0.9.9'
- (install_multiple_with_dep_actual.results[3].content | b64decode | from_json).collection_info.version == '1.2.2'
- name: expect failure with dep resolution failure
command: ansible-galaxy collection install fail_namespace.fail_collection -s {{ test_server }}
register: fail_dep_mismatch
failed_when: '"Cannot meet dependency requirement ''fail_dep2.name:<0.0.5'' for collection fail_namespace.fail_collection" not in fail_dep_mismatch.stderr'
- name: download a collection for an offline install - {{ test_name }}
get_url:
url: '{{ test_server }}custom/collections/namespace3-name-1.0.0.tar.gz'
dest: '{{ galaxy_dir }}/namespace3.tar.gz'
- name: install a collection from a tarball - {{ test_name }}
command: ansible-galaxy collection install '{{ galaxy_dir }}/namespace3.tar.gz'
register: install_tarball
environment:
ANSIBLE_COLLECTIONS_PATHS: '{{ galaxy_dir }}/ansible_collections'
- name: get result of install collection from a tarball - {{ test_name }}
slurp:
path: '{{ galaxy_dir }}/ansible_collections/namespace3/name/MANIFEST.json'
register: install_tarball_actual
- name: assert install a collection from a tarball - {{ test_name }}
assert:
that:
- '"Installing ''namespace3.name:1.0.0'' to" in install_tarball.stdout'
- (install_tarball_actual.content | b64decode | from_json).collection_info.version == '1.0.0'
- name: install a collection from a URI - {{ test_name }}
command: ansible-galaxy collection install '{{ test_server }}custom/collections/namespace4-name-1.0.0.tar.gz'
register: install_uri
environment:
ANSIBLE_COLLECTIONS_PATHS: '{{ galaxy_dir }}/ansible_collections'
- name: get result of install collection from a URI - {{ test_name }}
slurp:
path: '{{ galaxy_dir }}/ansible_collections/namespace4/name/MANIFEST.json'
register: install_uri_actual
- name: assert install a collection from a URI - {{ test_name }}
assert:
that:
- '"Installing ''namespace4.name:1.0.0'' to" in install_uri.stdout'
- (install_uri_actual.content | b64decode | from_json).collection_info.version == '1.0.0'
- name: fail to install a collection with an undefined URL - {{ test_name }}
command: ansible-galaxy collection install namespace5.name
register: fail_undefined_server
failed_when: '"No setting was provided for required configuration plugin_type: galaxy_server plugin: undefined" not in fail_undefined_server.stderr'
environment:
ANSIBLE_GALAXY_SERVER_LIST: undefined
- name: install a collection with an empty server list - {{ test_name }}
command: ansible-galaxy collection install namespace5.name -s '{{ test_server }}'
register: install_empty_server_list
environment:
ANSIBLE_COLLECTIONS_PATHS: '{{ galaxy_dir }}/ansible_collections'
ANSIBLE_GALAXY_SERVER_LIST: ''
- name: get result of a collection with an empty server list - {{ test_name }}
slurp:
path: '{{ galaxy_dir }}/ansible_collections/namespace5/name/MANIFEST.json'
register: install_empty_server_list_actual
- name: assert install a collection with an empty server list - {{ test_name }}
assert:
that:
- '"Installing ''namespace5.name:1.0.0'' to" in install_empty_server_list.stdout'
- (install_empty_server_list_actual.content | b64decode | from_json).collection_info.version == '1.0.0'
- name: remove test collection install directory - {{ test_name }}
file:
path: '{{ galaxy_dir }}/ansible_collections'
state: absent
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 64,905 |
Add --pre flag for ansible-galaxy client
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
We publish a semver collection (vyos.vyos:0.0.1-dev58) to Ansible Galaxy; however, by default the ansible-galaxy client will not be able to find it. You must specify the version number directly to fetch it.
https://galaxy.ansible.com/vyos/vyos
EDIT:
This is working as expected, but it would be nice to include:
--pre Include pre-release and development versions. By default, ansible-galaxy only finds stable versions.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ansible-galaxy
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.1
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
```
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
$ ansible-galaxy collection install vyos.vyos -p /tmp/11123
[WARNING]: The specified collections path '/tmp/11123' is not part of the configured Ansible collections paths '/home/pabelanger/.ansible/collections:/usr/share/ansible/collections'. The
installed collection won't be picked up in an Ansible run.
Process install dependency map
ERROR! Cannot meet requirement * for dependency vyos.vyos from source 'https://galaxy.ansible.com/api/'. Available versions before last requirement added:
Requirements from:
base - 'vyos.vyos:*'
(venv) [pabelanger@localhost openstack]$ ansible-galaxy collection install vyos.vyos:0.0.1-dev57 -p /tmp/11123
[WARNING]: The specified collections path '/tmp/11123' is not part of the configured Ansible collections paths '/home/pabelanger/.ansible/collections:/usr/share/ansible/collections'. The
installed collection won't be picked up in an Ansible run.
Process install dependency map
Starting collection install process
Installing 'vyos.vyos:0.0.1-dev57' to '/tmp/11123/ansible_collections/vyos/vyos'
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
vyos.vyos collection to be found and installed
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
```
|
https://github.com/ansible/ansible/issues/64905
|
https://github.com/ansible/ansible/pull/68258
|
ed9de94ad92dcc07ea3863808e0f4b00f2402cea
|
d3ec31f8d5683926aa6a05bb573d9929a6266fac
| 2019-11-15T17:10:49Z |
python
| 2020-03-23T21:04:07Z |
test/units/galaxy/test_collection_install.py
|
# -*- coding: utf-8 -*-
# Copyright: (c) 2019, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import copy
import json
import os
import pytest
import re
import shutil
import tarfile
import yaml
from io import BytesIO, StringIO
from units.compat.mock import MagicMock
import ansible.module_utils.six.moves.urllib.error as urllib_error
from ansible import context
from ansible.cli.galaxy import GalaxyCLI
from ansible.errors import AnsibleError
from ansible.galaxy import collection, api
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.utils import context_objects as co
from ansible.utils.display import Display
def call_galaxy_cli(args):
orig = co.GlobalCLIArgs._Singleton__instance
co.GlobalCLIArgs._Singleton__instance = None
try:
GalaxyCLI(args=['ansible-galaxy', 'collection'] + args).run()
finally:
co.GlobalCLIArgs._Singleton__instance = orig
def artifact_json(namespace, name, version, dependencies, server):
json_str = json.dumps({
'artifact': {
'filename': '%s-%s-%s.tar.gz' % (namespace, name, version),
'sha256': '2d76f3b8c4bab1072848107fb3914c345f71a12a1722f25c08f5d3f51f4ab5fd',
'size': 1234,
},
'download_url': '%s/download/%s-%s-%s.tar.gz' % (server, namespace, name, version),
'metadata': {
'namespace': namespace,
'name': name,
'dependencies': dependencies,
},
'version': version
})
return to_text(json_str)
def artifact_versions_json(namespace, name, versions, galaxy_api, available_api_versions=None):
results = []
available_api_versions = available_api_versions or {}
api_version = 'v2'
if 'v3' in available_api_versions:
api_version = 'v3'
for version in versions:
results.append({
'href': '%s/api/%s/%s/%s/versions/%s/' % (galaxy_api.api_server, api_version, namespace, name, version),
'version': version,
})
if api_version == 'v2':
json_str = json.dumps({
'count': len(versions),
'next': None,
'previous': None,
'results': results
})
if api_version == 'v3':
response = {'meta': {'count': len(versions)},
'data': results,
'links': {'first': None,
'last': None,
'next': None,
'previous': None},
}
json_str = json.dumps(response)
return to_text(json_str)
def error_json(galaxy_api, errors_to_return=None, available_api_versions=None):
errors_to_return = errors_to_return or []
available_api_versions = available_api_versions or {}
response = {}
api_version = 'v2'
if 'v3' in available_api_versions:
api_version = 'v3'
if api_version == 'v2':
assert len(errors_to_return) <= 1
if errors_to_return:
response = errors_to_return[0]
if api_version == 'v3':
response['errors'] = errors_to_return
json_str = json.dumps(response)
return to_text(json_str)
@pytest.fixture(autouse='function')
def reset_cli_args():
co.GlobalCLIArgs._Singleton__instance = None
yield
co.GlobalCLIArgs._Singleton__instance = None
@pytest.fixture()
def collection_artifact(request, tmp_path_factory):
test_dir = to_text(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
namespace = 'ansible_namespace'
collection = 'collection'
skeleton_path = os.path.join(os.path.dirname(os.path.split(__file__)[0]), 'cli', 'test_data', 'collection_skeleton')
collection_path = os.path.join(test_dir, namespace, collection)
call_galaxy_cli(['init', '%s.%s' % (namespace, collection), '-c', '--init-path', test_dir,
'--collection-skeleton', skeleton_path])
dependencies = getattr(request, 'param', None)
if dependencies:
galaxy_yml = os.path.join(collection_path, 'galaxy.yml')
with open(galaxy_yml, 'rb+') as galaxy_obj:
existing_yaml = yaml.safe_load(galaxy_obj)
existing_yaml['dependencies'] = dependencies
galaxy_obj.seek(0)
galaxy_obj.write(to_bytes(yaml.safe_dump(existing_yaml)))
galaxy_obj.truncate()
call_galaxy_cli(['build', collection_path, '--output-path', test_dir])
collection_tar = os.path.join(test_dir, '%s-%s-0.1.0.tar.gz' % (namespace, collection))
return to_bytes(collection_path), to_bytes(collection_tar)
@pytest.fixture()
def galaxy_server():
context.CLIARGS._store = {'ignore_certs': False}
galaxy_api = api.GalaxyAPI(None, 'test_server', 'https://galaxy.ansible.com')
return galaxy_api
def test_build_requirement_from_path(collection_artifact):
actual = collection.CollectionRequirement.from_path(collection_artifact[0], True)
assert actual.namespace == u'ansible_namespace'
assert actual.name == u'collection'
assert actual.b_path == collection_artifact[0]
assert actual.api is None
assert actual.skip is True
assert actual.versions == set([u'*'])
assert actual.latest_version == u'*'
assert actual.dependencies == {}
@pytest.mark.parametrize('version', ['1.1.1', 1.1, 1])
def test_build_requirement_from_path_with_manifest(version, collection_artifact):
manifest_path = os.path.join(collection_artifact[0], b'MANIFEST.json')
manifest_value = json.dumps({
'collection_info': {
'namespace': 'namespace',
'name': 'name',
'version': version,
'dependencies': {
'ansible_namespace.collection': '*'
}
}
})
with open(manifest_path, 'wb') as manifest_obj:
manifest_obj.write(to_bytes(manifest_value))
actual = collection.CollectionRequirement.from_path(collection_artifact[0], True)
# While the folder name suggests a different collection, we treat MANIFEST.json as the source of truth.
assert actual.namespace == u'namespace'
assert actual.name == u'name'
assert actual.b_path == collection_artifact[0]
assert actual.api is None
assert actual.skip is True
assert actual.versions == set([to_text(version)])
assert actual.latest_version == to_text(version)
assert actual.dependencies == {'ansible_namespace.collection': '*'}
def test_build_requirement_from_path_invalid_manifest(collection_artifact):
manifest_path = os.path.join(collection_artifact[0], b'MANIFEST.json')
with open(manifest_path, 'wb') as manifest_obj:
manifest_obj.write(b"not json")
expected = "Collection file at '%s' does not contain a valid json string." % to_native(manifest_path)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_path(collection_artifact[0], True)
def test_build_requirement_from_path_no_version(collection_artifact, monkeypatch):
manifest_path = os.path.join(collection_artifact[0], b'MANIFEST.json')
manifest_value = json.dumps({
'collection_info': {
'namespace': 'namespace',
'name': 'name',
'version': '',
'dependencies': {}
}
})
with open(manifest_path, 'wb') as manifest_obj:
manifest_obj.write(to_bytes(manifest_value))
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
actual = collection.CollectionRequirement.from_path(collection_artifact[0], True)
# While the folder name suggests a different collection, we treat MANIFEST.json as the source of truth.
assert actual.namespace == u'namespace'
assert actual.name == u'name'
assert actual.b_path == collection_artifact[0]
assert actual.api is None
assert actual.skip is True
assert actual.versions == set(['*'])
assert actual.latest_version == u'*'
assert actual.dependencies == {}
assert mock_display.call_count == 1
actual_warn = ' '.join(mock_display.mock_calls[0][1][0].split('\n'))
expected_warn = "Collection at '%s' does not have a valid version set, falling back to '*'. Found version: ''" \
% to_text(collection_artifact[0])
assert expected_warn in actual_warn
def test_build_requirement_from_tar(collection_artifact):
actual = collection.CollectionRequirement.from_tar(collection_artifact[1], True, True)
assert actual.namespace == u'ansible_namespace'
assert actual.name == u'collection'
assert actual.b_path == collection_artifact[1]
assert actual.api is None
assert actual.skip is False
assert actual.versions == set([u'0.1.0'])
assert actual.latest_version == u'0.1.0'
assert actual.dependencies == {}
def test_build_requirement_from_tar_fail_not_tar(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
test_file = os.path.join(test_dir, b'fake.tar.gz')
with open(test_file, 'wb') as test_obj:
test_obj.write(b"\x00\x01\x02\x03")
expected = "Collection artifact at '%s' is not a valid tar file." % to_native(test_file)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(test_file, True, True)
def test_build_requirement_from_tar_no_manifest(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
json_data = to_bytes(json.dumps(
{
'files': [],
'format': 1,
}
))
tar_path = os.path.join(test_dir, b'ansible-collections.tar.gz')
with tarfile.open(tar_path, 'w:gz') as tfile:
b_io = BytesIO(json_data)
tar_info = tarfile.TarInfo('FILES.json')
tar_info.size = len(json_data)
tar_info.mode = 0o0644
tfile.addfile(tarinfo=tar_info, fileobj=b_io)
expected = "Collection at '%s' does not contain the required file MANIFEST.json." % to_native(tar_path)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(tar_path, True, True)
def test_build_requirement_from_tar_no_files(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
json_data = to_bytes(json.dumps(
{
'collection_info': {},
}
))
tar_path = os.path.join(test_dir, b'ansible-collections.tar.gz')
with tarfile.open(tar_path, 'w:gz') as tfile:
b_io = BytesIO(json_data)
tar_info = tarfile.TarInfo('MANIFEST.json')
tar_info.size = len(json_data)
tar_info.mode = 0o0644
tfile.addfile(tarinfo=tar_info, fileobj=b_io)
expected = "Collection at '%s' does not contain the required file FILES.json." % to_native(tar_path)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(tar_path, True, True)
def test_build_requirement_from_tar_invalid_manifest(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
json_data = b"not a json"
tar_path = os.path.join(test_dir, b'ansible-collections.tar.gz')
with tarfile.open(tar_path, 'w:gz') as tfile:
b_io = BytesIO(json_data)
tar_info = tarfile.TarInfo('MANIFEST.json')
tar_info.size = len(json_data)
tar_info.mode = 0o0644
tfile.addfile(tarinfo=tar_info, fileobj=b_io)
expected = "Collection tar file member MANIFEST.json does not contain a valid json string."
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(tar_path, True, True)
def test_build_requirement_from_name(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['2.1.9', '2.1.10']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '*', True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.1.9', u'2.1.10'])
assert actual.latest_version == u'2.1.10'
assert actual.dependencies == {}
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
def test_build_requirement_from_name_with_prerelease(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['1.0.1', '2.0.1-beta.1', '2.0.1']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '*', True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'1.0.1', u'2.0.1'])
assert actual.latest_version == u'2.0.1'
assert actual.dependencies == {}
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
def test_build_requirment_from_name_with_prerelease_explicit(galaxy_server, monkeypatch):
mock_get_info = MagicMock()
mock_get_info.return_value = api.CollectionVersionMetadata('namespace', 'collection', '2.0.1-beta.1', None, None,
{})
monkeypatch.setattr(galaxy_server, 'get_collection_version_metadata', mock_get_info)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '2.0.1-beta.1', True,
True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.1-beta.1'])
assert actual.latest_version == u'2.0.1-beta.1'
assert actual.dependencies == {}
assert mock_get_info.call_count == 1
assert mock_get_info.mock_calls[0][1] == ('namespace', 'collection', '2.0.1-beta.1')
def test_build_requirement_from_name_second_server(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['1.0.1', '1.0.2', '1.0.3']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
broken_server = copy.copy(galaxy_server)
broken_server.api_server = 'https://broken.com/'
mock_404 = MagicMock()
mock_404.side_effect = api.GalaxyError(urllib_error.HTTPError('https://galaxy.server.com', 404, 'msg', {},
StringIO()), "custom msg")
monkeypatch.setattr(broken_server, 'get_collection_versions', mock_404)
actual = collection.CollectionRequirement.from_name('namespace.collection', [broken_server, galaxy_server],
'>1.0.1', False, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
# assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'1.0.2', u'1.0.3'])
assert actual.latest_version == u'1.0.3'
assert actual.dependencies == {}
assert mock_404.call_count == 1
assert mock_404.mock_calls[0][1] == ('namespace', 'collection')
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
def test_build_requirement_from_name_missing(galaxy_server, monkeypatch):
mock_open = MagicMock()
mock_open.side_effect = api.GalaxyError(urllib_error.HTTPError('https://galaxy.server.com', 404, 'msg', {},
StringIO()), "")
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_open)
expected = "Failed to find collection namespace.collection:*"
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server, galaxy_server], '*', False,
True)
def test_build_requirement_from_name_401_unauthorized(galaxy_server, monkeypatch):
mock_open = MagicMock()
mock_open.side_effect = api.GalaxyError(urllib_error.HTTPError('https://galaxy.server.com', 401, 'msg', {},
StringIO()), "error")
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_open)
expected = "error (HTTP Code: 401, Message: msg)"
with pytest.raises(api.GalaxyError, match=re.escape(expected)):
collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server, galaxy_server], '*', False)
def test_build_requirement_from_name_single_version(galaxy_server, monkeypatch):
mock_get_info = MagicMock()
mock_get_info.return_value = api.CollectionVersionMetadata('namespace', 'collection', '2.0.0', None, None,
{})
monkeypatch.setattr(galaxy_server, 'get_collection_version_metadata', mock_get_info)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '2.0.0', True,
True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.0'])
assert actual.latest_version == u'2.0.0'
assert actual.dependencies == {}
assert mock_get_info.call_count == 1
assert mock_get_info.mock_calls[0][1] == ('namespace', 'collection', '2.0.0')
def test_build_requirement_from_name_multiple_versions_one_match(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['2.0.0', '2.0.1', '2.0.2']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
mock_get_info = MagicMock()
mock_get_info.return_value = api.CollectionVersionMetadata('namespace', 'collection', '2.0.1', None, None,
{})
monkeypatch.setattr(galaxy_server, 'get_collection_version_metadata', mock_get_info)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '>=2.0.1,<2.0.2',
True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.1'])
assert actual.latest_version == u'2.0.1'
assert actual.dependencies == {}
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
assert mock_get_info.call_count == 1
assert mock_get_info.mock_calls[0][1] == ('namespace', 'collection', '2.0.1')
def test_build_requirement_from_name_multiple_version_results(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['2.0.0', '2.0.1', '2.0.2', '2.0.3', '2.0.4', '2.0.5']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '!=2.0.2',
True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.0', u'2.0.1', u'2.0.3', u'2.0.4', u'2.0.5'])
assert actual.latest_version == u'2.0.5'
assert actual.dependencies == {}
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
@pytest.mark.parametrize('versions, requirement, expected_filter, expected_latest', [
[['1.0.0', '1.0.1'], '*', ['1.0.0', '1.0.1'], '1.0.1'],
[['1.0.0', '1.0.5', '1.1.0'], '>1.0.0,<1.1.0', ['1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '>1.0.0,<=1.0.5', ['1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '>=1.1.0', ['1.1.0'], '1.1.0'],
[['1.0.0', '1.0.5', '1.1.0'], '!=1.1.0', ['1.0.0', '1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '==1.0.5', ['1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '1.0.5', ['1.0.5'], '1.0.5'],
[['1.0.0', '2.0.0', '3.0.0'], '>=2', ['2.0.0', '3.0.0'], '3.0.0'],
])
def test_add_collection_requirements(versions, requirement, expected_filter, expected_latest):
req = collection.CollectionRequirement('namespace', 'name', None, 'https://galaxy.com', versions, requirement,
False)
assert req.versions == set(expected_filter)
assert req.latest_version == expected_latest
def test_add_collection_requirement_to_unknown_installed_version(monkeypatch):
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
req = collection.CollectionRequirement('namespace', 'name', None, 'https://galaxy.com', ['*'], '*', False,
skip=True)
req.add_requirement('parent.collection', '1.0.0')
assert req.latest_version == '*'
assert mock_display.call_count == 1
actual_warn = ' '.join(mock_display.mock_calls[0][1][0].split('\n'))
assert "Failed to validate the collection requirement 'namespace.name:1.0.0' for parent.collection" in actual_warn
def test_add_collection_wildcard_requirement_to_unknown_installed_version():
req = collection.CollectionRequirement('namespace', 'name', None, 'https://galaxy.com', ['*'], '*', False,
skip=True)
req.add_requirement(str(req), '*')
assert req.versions == set('*')
assert req.latest_version == '*'
def test_add_collection_requirement_with_conflict(galaxy_server):
expected = "Cannot meet requirement ==1.0.2 for dependency namespace.name from source '%s'. Available versions " \
"before last requirement added: 1.0.0, 1.0.1\n" \
"Requirements from:\n" \
"\tbase - 'namespace.name:==1.0.2'" % galaxy_server.api_server
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement('namespace', 'name', None, galaxy_server, ['1.0.0', '1.0.1'], '==1.0.2',
False)
def test_add_requirement_to_existing_collection_with_conflict(galaxy_server):
req = collection.CollectionRequirement('namespace', 'name', None, galaxy_server, ['1.0.0', '1.0.1'], '*', False)
expected = "Cannot meet dependency requirement 'namespace.name:1.0.2' for collection namespace.collection2 from " \
"source '%s'. Available versions before last requirement added: 1.0.0, 1.0.1\n" \
"Requirements from:\n" \
"\tbase - 'namespace.name:*'\n" \
"\tnamespace.collection2 - 'namespace.name:1.0.2'" % galaxy_server.api_server
with pytest.raises(AnsibleError, match=re.escape(expected)):
req.add_requirement('namespace.collection2', '1.0.2')
def test_add_requirement_to_installed_collection_with_conflict():
source = 'https://galaxy.ansible.com'
req = collection.CollectionRequirement('namespace', 'name', None, source, ['1.0.0', '1.0.1'], '*', False,
skip=True)
expected = "Cannot meet requirement namespace.name:1.0.2 as it is already installed at version '1.0.1'. " \
"Use --force to overwrite"
with pytest.raises(AnsibleError, match=re.escape(expected)):
req.add_requirement(None, '1.0.2')
def test_add_requirement_to_installed_collection_with_conflict_as_dep():
source = 'https://galaxy.ansible.com'
req = collection.CollectionRequirement('namespace', 'name', None, source, ['1.0.0', '1.0.1'], '*', False,
skip=True)
expected = "Cannot meet requirement namespace.name:1.0.2 as it is already installed at version '1.0.1'. " \
"Use --force-with-deps to overwrite"
with pytest.raises(AnsibleError, match=re.escape(expected)):
req.add_requirement('namespace.collection2', '1.0.2')
def test_install_skipped_collection(monkeypatch):
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
req = collection.CollectionRequirement('namespace', 'name', None, 'source', ['1.0.0'], '*', False, skip=True)
req.install(None, None)
assert mock_display.call_count == 1
assert mock_display.mock_calls[0][1][0] == "Skipping 'namespace.name' as it is already installed"
def test_install_collection(collection_artifact, monkeypatch):
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
collection_tar = collection_artifact[1]
output_path = os.path.join(os.path.split(collection_tar)[0], b'output')
collection_path = os.path.join(output_path, b'ansible_namespace', b'collection')
os.makedirs(os.path.join(collection_path, b'delete_me')) # Create a folder to verify the install cleans out the dir
temp_path = os.path.join(os.path.split(collection_tar)[0], b'temp')
os.makedirs(temp_path)
req = collection.CollectionRequirement.from_tar(collection_tar, True, True)
req.install(to_text(output_path), temp_path)
# Ensure the temp directory is empty, nothing is left behind
assert os.listdir(temp_path) == []
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
assert mock_display.call_count == 1
assert mock_display.mock_calls[0][1][0] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" \
% to_text(collection_path)
def test_install_collection_with_download(galaxy_server, collection_artifact, monkeypatch):
collection_tar = collection_artifact[1]
output_path = os.path.join(os.path.split(collection_tar)[0], b'output')
collection_path = os.path.join(output_path, b'ansible_namespace', b'collection')
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
mock_download = MagicMock()
mock_download.return_value = collection_tar
monkeypatch.setattr(collection, '_download_file', mock_download)
monkeypatch.setattr(galaxy_server, '_available_api_versions', {'v2': 'v2/'})
temp_path = os.path.join(os.path.split(collection_tar)[0], b'temp')
os.makedirs(temp_path)
meta = api.CollectionVersionMetadata('ansible_namespace', 'collection', '0.1.0', 'https://downloadme.com',
'myhash', {})
req = collection.CollectionRequirement('ansible_namespace', 'collection', None, galaxy_server,
['0.1.0'], '*', False, metadata=meta)
req.install(to_text(output_path), temp_path)
# Ensure the temp directory is empty, nothing is left behind
assert os.listdir(temp_path) == []
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
assert mock_display.call_count == 1
assert mock_display.mock_calls[0][1][0] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" \
% to_text(collection_path)
assert mock_download.call_count == 1
assert mock_download.mock_calls[0][1][0] == 'https://downloadme.com'
assert mock_download.mock_calls[0][1][1] == temp_path
assert mock_download.mock_calls[0][1][2] == 'myhash'
assert mock_download.mock_calls[0][1][3] is True
def test_install_collections_from_tar(collection_artifact, monkeypatch):
collection_path, collection_tar = collection_artifact
temp_path = os.path.split(collection_tar)[0]
shutil.rmtree(collection_path)
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
[u'https://galaxy.ansible.com'], True, False, False, False, False)
assert os.path.isdir(collection_path)
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
with open(os.path.join(collection_path, b'MANIFEST.json'), 'rb') as manifest_obj:
actual_manifest = json.loads(to_text(manifest_obj.read()))
assert actual_manifest['collection_info']['namespace'] == 'ansible_namespace'
assert actual_manifest['collection_info']['name'] == 'collection'
assert actual_manifest['collection_info']['version'] == '0.1.0'
# Filter out the progress cursor display calls.
display_msgs = [m[1][0] for m in mock_display.mock_calls if 'newline' not in m[2] and len(m[1]) == 1]
assert len(display_msgs) == 3
assert display_msgs[0] == "Process install dependency map"
assert display_msgs[1] == "Starting collection install process"
assert display_msgs[2] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" % to_text(collection_path)
def test_install_collections_existing_without_force(collection_artifact, monkeypatch):
collection_path, collection_tar = collection_artifact
temp_path = os.path.split(collection_tar)[0]
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
# If we don't delete collection_path it will think the original build skeleton is installed so we expect a skip
collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
[u'https://galaxy.ansible.com'], True, False, False, False, False)
assert os.path.isdir(collection_path)
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'README.md', b'docs', b'galaxy.yml', b'playbooks', b'plugins', b'roles']
# Filter out the progress cursor display calls.
display_msgs = [m[1][0] for m in mock_display.mock_calls if 'newline' not in m[2] and len(m[1]) == 1]
assert len(display_msgs) == 4
    # Msg1 is the warning about the missing MANIFEST.json; we cannot really check the message as it has line breaks which vary based
# on the path size
assert display_msgs[1] == "Process install dependency map"
assert display_msgs[2] == "Starting collection install process"
assert display_msgs[3] == "Skipping 'ansible_namespace.collection' as it is already installed"
# Makes sure we don't get stuck in some recursive loop
@pytest.mark.parametrize('collection_artifact', [
{'ansible_namespace.collection': '>=0.0.1'},
], indirect=True)
def test_install_collection_with_circular_dependency(collection_artifact, monkeypatch):
collection_path, collection_tar = collection_artifact
temp_path = os.path.split(collection_tar)[0]
shutil.rmtree(collection_path)
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
[u'https://galaxy.ansible.com'], True, False, False, False, False)
assert os.path.isdir(collection_path)
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
with open(os.path.join(collection_path, b'MANIFEST.json'), 'rb') as manifest_obj:
actual_manifest = json.loads(to_text(manifest_obj.read()))
assert actual_manifest['collection_info']['namespace'] == 'ansible_namespace'
assert actual_manifest['collection_info']['name'] == 'collection'
assert actual_manifest['collection_info']['version'] == '0.1.0'
# Filter out the progress cursor display calls.
display_msgs = [m[1][0] for m in mock_display.mock_calls if 'newline' not in m[2] and len(m[1]) == 1]
assert len(display_msgs) == 3
assert display_msgs[0] == "Process install dependency map"
assert display_msgs[1] == "Starting collection install process"
assert display_msgs[2] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" % to_text(collection_path)
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 64,905 |
Add --pre flag for ansible-galaxy client
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
We publish a semver collection (vyos.vyos:0.0.1-dev58) to Ansible Galaxy; however, by default the ansible-galaxy client will not be able to find it. You must specify the version number directly to fetch it.
https://galaxy.ansible.com/vyos/vyos
EDIT:
This is working as expected, but it would be nice to include:
--pre Include pre-release and development versions. By default, ansible-galaxy only finds stable versions.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ansible-galaxy
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.1
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
```
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
$ ansible-galaxy collection install vyos.vyos -p /tmp/11123
[WARNING]: The specified collections path '/tmp/11123' is not part of the configured Ansible collections paths '/home/pabelanger/.ansible/collections:/usr/share/ansible/collections'. The
installed collection won't be picked up in an Ansible run.
Process install dependency map
ERROR! Cannot meet requirement * for dependency vyos.vyos from source 'https://galaxy.ansible.com/api/'. Available versions before last requirement added:
Requirements from:
base - 'vyos.vyos:*'
(venv) [pabelanger@localhost openstack]$ ansible-galaxy collection install vyos.vyos:0.0.1-dev57 -p /tmp/11123
[WARNING]: The specified collections path '/tmp/11123' is not part of the configured Ansible collections paths '/home/pabelanger/.ansible/collections:/usr/share/ansible/collections'. The
installed collection won't be picked up in an Ansible run.
Process install dependency map
Starting collection install process
Installing 'vyos.vyos:0.0.1-dev57' to '/tmp/11123/ansible_collections/vyos/vyos'
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
vyos.vyos collection to be found and installed
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
```
|
https://github.com/ansible/ansible/issues/64905
|
https://github.com/ansible/ansible/pull/68258
|
ed9de94ad92dcc07ea3863808e0f4b00f2402cea
|
d3ec31f8d5683926aa6a05bb573d9929a6266fac
| 2019-11-15T17:10:49Z |
python
| 2020-03-23T21:04:07Z |
test/units/utils/test_version.py
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,415 |
`ansible-galaxy collection install` creates all files with mode 0600
|
##### SUMMARY
When installing a collection, all files are created with mode 0600. This is caused by the way the collection .tar.gz file is extracted: https://github.com/ansible/ansible/blob/d3ec31f8d5683926aa6a05bb573d9929a6266fac/lib/ansible/galaxy/collection.py#L1076-L1090 The file created by `tempfile.NamedTemporaryFile()` seems to have mode `0600` and is simply moved to the final destination without a following `os.chmod`.
(See also this discussion: https://github.com/ansible-collections/community.general/pull/29#discussion_r396147440)
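A rough stdlib-only reproduction of the mechanism described above (paths are placeholders; permission values assume a POSIX system and a typical umask):
```python
# NamedTemporaryFile creates its backing file via mkstemp, i.e. with mode 0600,
# and shutil.move preserves that mode, so without a follow-up os.chmod the
# installed file stays 0600.
import os
import shutil
import stat
import tempfile

with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"extracted collection member")

dest = os.path.join(tempfile.mkdtemp(), "module.py")
shutil.move(tmp.name, dest)
print(oct(stat.S_IMODE(os.stat(dest).st_mode)))  # 0o600

# One possible remedy: re-apply the process umask after the move.
current_umask = os.umask(0)
os.umask(current_umask)
os.chmod(dest, 0o666 & ~current_umask)
print(oct(stat.S_IMODE(os.stat(dest).st_mode)))  # e.g. 0o644 with a 022 umask
```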
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
lib/ansible/galaxy/collection.py
##### ANSIBLE VERSION
```
2.9
2.10
```
|
https://github.com/ansible/ansible/issues/68415
|
https://github.com/ansible/ansible/pull/68418
|
a9d2ceafe429171c0e2ad007058b88bae57c74ce
|
127d54b3630c65043ec12c4af2024f8ef0bc6d09
| 2020-03-23T21:15:27Z |
python
| 2020-03-24T22:08:23Z |
changelogs/fragments/collection-install-mode.yaml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,415 |
`ansible-galaxy collection install` creates all files with mode 0600
|
##### SUMMARY
When installing a collection, all files are created with mode 0600. This is caused by the way the collection .tar.gz file is extracted: https://github.com/ansible/ansible/blob/d3ec31f8d5683926aa6a05bb573d9929a6266fac/lib/ansible/galaxy/collection.py#L1076-L1090 The file created by `tempfile.NamedTemporaryFile()` seems to have mode `0600` and is simply moved to the final destination without a following `os.chmod`.
(See also this discussion: https://github.com/ansible-collections/community.general/pull/29#discussion_r396147440)
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
lib/ansible/galaxy/collection.py
##### ANSIBLE VERSION
```
2.9
2.10
```
|
https://github.com/ansible/ansible/issues/68415
|
https://github.com/ansible/ansible/pull/68418
|
a9d2ceafe429171c0e2ad007058b88bae57c74ce
|
127d54b3630c65043ec12c4af2024f8ef0bc6d09
| 2020-03-23T21:15:27Z |
python
| 2020-03-24T22:08:23Z |
lib/ansible/galaxy/collection.py
|
# Copyright: (c) 2019, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import fnmatch
import json
import operator
import os
import shutil
import sys
import tarfile
import tempfile
import threading
import time
import yaml
from collections import namedtuple
from contextlib import contextmanager
from distutils.version import LooseVersion
from hashlib import sha256
from io import BytesIO
from yaml.error import YAMLError
try:
import queue
except ImportError:
import Queue as queue # Python 2
import ansible.constants as C
from ansible.errors import AnsibleError
from ansible.galaxy import get_collections_galaxy_meta_info
from ansible.galaxy.api import CollectionVersionMetadata, GalaxyError
from ansible.galaxy.user_agent import user_agent
from ansible.module_utils import six
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.utils.collection_loader import AnsibleCollectionRef
from ansible.utils.display import Display
from ansible.utils.hashing import secure_hash, secure_hash_s
from ansible.utils.version import SemanticVersion
from ansible.module_utils.urls import open_url
urlparse = six.moves.urllib.parse.urlparse
urllib_error = six.moves.urllib.error
display = Display()
MANIFEST_FORMAT = 1
ModifiedContent = namedtuple('ModifiedContent', ['filename', 'expected', 'installed'])
class CollectionRequirement:
_FILE_MAPPING = [(b'MANIFEST.json', 'manifest_file'), (b'FILES.json', 'files_file')]
def __init__(self, namespace, name, b_path, api, versions, requirement, force, parent=None, metadata=None,
files=None, skip=False, allow_pre_releases=False):
"""
Represents a collection requirement, the versions that are available to be installed as well as any
dependencies the collection has.
:param namespace: The collection namespace.
:param name: The collection name.
:param b_path: Byte str of the path to the collection tarball if it has already been downloaded.
:param api: The GalaxyAPI to use if the collection is from Galaxy.
:param versions: A list of versions of the collection that are available.
:param requirement: The version requirement string used to verify the list of versions fit the requirements.
        :param force: Whether the force flag was applied to the collection.
:param parent: The name of the parent the collection is a dependency of.
:param metadata: The galaxy.api.CollectionVersionMetadata that has already been retrieved from the Galaxy
server.
:param files: The files that exist inside the collection. This is based on the FILES.json file inside the
collection artifact.
:param skip: Whether to skip installing the collection. Should be set if the collection is already installed
and force is not set.
        :param allow_pre_releases: Whether to allow pre-release versions of collections when selecting versions.
"""
self.namespace = namespace
self.name = name
self.b_path = b_path
self.api = api
self._versions = set(versions)
self.force = force
self.skip = skip
self.required_by = []
self.allow_pre_releases = allow_pre_releases
self._metadata = metadata
self._files = files
self.add_requirement(parent, requirement)
def __str__(self):
return to_native("%s.%s" % (self.namespace, self.name))
def __unicode__(self):
return u"%s.%s" % (self.namespace, self.name)
@property
def metadata(self):
self._get_metadata()
return self._metadata
@property
def versions(self):
if self.allow_pre_releases:
return self._versions
return set(v for v in self._versions if v == '*' or not SemanticVersion(v).is_prerelease)
@versions.setter
def versions(self, value):
self._versions = set(value)
@property
def pre_releases(self):
return set(v for v in self._versions if SemanticVersion(v).is_prerelease)
@property
def latest_version(self):
try:
return max([v for v in self.versions if v != '*'], key=SemanticVersion)
except ValueError: # ValueError: max() arg is an empty sequence
return '*'
@property
def dependencies(self):
if not self._metadata:
if len(self.versions) > 1:
return {}
self._get_metadata()
dependencies = self._metadata.dependencies
if dependencies is None:
return {}
return dependencies
def add_requirement(self, parent, requirement):
self.required_by.append((parent, requirement))
new_versions = set(v for v in self.versions if self._meets_requirements(v, requirement, parent))
if len(new_versions) == 0:
if self.skip:
force_flag = '--force-with-deps' if parent else '--force'
version = self.latest_version if self.latest_version != '*' else 'unknown'
msg = "Cannot meet requirement %s:%s as it is already installed at version '%s'. Use %s to overwrite" \
% (to_text(self), requirement, version, force_flag)
raise AnsibleError(msg)
elif parent is None:
msg = "Cannot meet requirement %s for dependency %s" % (requirement, to_text(self))
else:
msg = "Cannot meet dependency requirement '%s:%s' for collection %s" \
% (to_text(self), requirement, parent)
collection_source = to_text(self.b_path, nonstring='passthru') or self.api.api_server
req_by = "\n".join(
"\t%s - '%s:%s'" % (to_text(p) if p else 'base', to_text(self), r)
for p, r in self.required_by
)
versions = ", ".join(sorted(self.versions, key=SemanticVersion))
if not self.versions and self.pre_releases:
pre_release_msg = (
'\nThis collection only contains pre-releases. Utilize `--pre` to install pre-releases, or '
'explicitly provide the pre-release version.'
)
else:
pre_release_msg = ''
raise AnsibleError(
"%s from source '%s'. Available versions before last requirement added: %s\nRequirements from:\n%s%s"
% (msg, collection_source, versions, req_by, pre_release_msg)
)
self.versions = new_versions
def install(self, path, b_temp_path):
if self.skip:
display.display("Skipping '%s' as it is already installed" % to_text(self))
return
# Install if it is not
collection_path = os.path.join(path, self.namespace, self.name)
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
display.display("Installing '%s:%s' to '%s'" % (to_text(self), self.latest_version, collection_path))
if self.b_path is None:
download_url = self._metadata.download_url
artifact_hash = self._metadata.artifact_sha256
headers = {}
self.api._add_auth_token(headers, download_url, required=False)
self.b_path = _download_file(download_url, b_temp_path, artifact_hash, self.api.validate_certs,
headers=headers)
if os.path.exists(b_collection_path):
shutil.rmtree(b_collection_path)
os.makedirs(b_collection_path)
with tarfile.open(self.b_path, mode='r') as collection_tar:
files_member_obj = collection_tar.getmember('FILES.json')
with _tarfile_extract(collection_tar, files_member_obj) as files_obj:
files = json.loads(to_text(files_obj.read(), errors='surrogate_or_strict'))
_extract_tar_file(collection_tar, 'MANIFEST.json', b_collection_path, b_temp_path)
_extract_tar_file(collection_tar, 'FILES.json', b_collection_path, b_temp_path)
for file_info in files['files']:
file_name = file_info['name']
if file_name == '.':
continue
if file_info['ftype'] == 'file':
_extract_tar_file(collection_tar, file_name, b_collection_path, b_temp_path,
expected_hash=file_info['chksum_sha256'])
else:
os.makedirs(os.path.join(b_collection_path, to_bytes(file_name, errors='surrogate_or_strict')))
def set_latest_version(self):
self.versions = set([self.latest_version])
self._get_metadata()
def verify(self, remote_collection, path, b_temp_tar_path):
if not self.skip:
display.display("'%s' has not been installed, nothing to verify" % (to_text(self)))
return
collection_path = os.path.join(path, self.namespace, self.name)
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
display.vvv("Verifying '%s:%s'." % (to_text(self), self.latest_version))
display.vvv("Installed collection found at '%s'" % collection_path)
display.vvv("Remote collection found at '%s'" % remote_collection.metadata.download_url)
# Compare installed version versus requirement version
if self.latest_version != remote_collection.latest_version:
err = "%s has the version '%s' but is being compared to '%s'" % (to_text(self), self.latest_version, remote_collection.latest_version)
display.display(err)
return
modified_content = []
# Verify the manifest hash matches before verifying the file manifest
expected_hash = _get_tar_file_hash(b_temp_tar_path, 'MANIFEST.json')
self._verify_file_hash(b_collection_path, 'MANIFEST.json', expected_hash, modified_content)
manifest = _get_json_from_tar_file(b_temp_tar_path, 'MANIFEST.json')
# Use the manifest to verify the file manifest checksum
file_manifest_data = manifest['file_manifest_file']
file_manifest_filename = file_manifest_data['name']
expected_hash = file_manifest_data['chksum_%s' % file_manifest_data['chksum_type']]
# Verify the file manifest before using it to verify individual files
self._verify_file_hash(b_collection_path, file_manifest_filename, expected_hash, modified_content)
file_manifest = _get_json_from_tar_file(b_temp_tar_path, file_manifest_filename)
# Use the file manifest to verify individual file checksums
for manifest_data in file_manifest['files']:
if manifest_data['ftype'] == 'file':
expected_hash = manifest_data['chksum_%s' % manifest_data['chksum_type']]
self._verify_file_hash(b_collection_path, manifest_data['name'], expected_hash, modified_content)
if modified_content:
display.display("Collection %s contains modified content in the following files:" % to_text(self))
display.display(to_text(self))
display.vvv(to_text(self.b_path))
for content_change in modified_content:
display.display(' %s' % content_change.filename)
display.vvv(" Expected: %s\n Found: %s" % (content_change.expected, content_change.installed))
else:
display.vvv("Successfully verified that checksums for '%s:%s' match the remote collection" % (to_text(self), self.latest_version))
def _verify_file_hash(self, b_path, filename, expected_hash, error_queue):
b_file_path = to_bytes(os.path.join(to_text(b_path), filename), errors='surrogate_or_strict')
if not os.path.isfile(b_file_path):
actual_hash = None
else:
with open(b_file_path, mode='rb') as file_object:
actual_hash = _consume_file(file_object)
if expected_hash != actual_hash:
error_queue.append(ModifiedContent(filename=filename, expected=expected_hash, installed=actual_hash))
def _get_metadata(self):
if self._metadata:
return
self._metadata = self.api.get_collection_version_metadata(self.namespace, self.name, self.latest_version)
def _meets_requirements(self, version, requirements, parent):
"""
        Supported version identifiers are '==', '!=', '>', '>=', '<', '<=' and '*'. Multiple requirements are delimited by ','.
"""
op_map = {
'!=': operator.ne,
'==': operator.eq,
'=': operator.eq,
'>=': operator.ge,
'>': operator.gt,
'<=': operator.le,
'<': operator.lt,
}
for req in list(requirements.split(',')):
op_pos = 2 if len(req) > 1 and req[1] == '=' else 1
op = op_map.get(req[:op_pos])
requirement = req[op_pos:]
if not op:
requirement = req
op = operator.eq
# In the case we are checking a new requirement on a base requirement (parent != None) we can't accept
# version as '*' (unknown version) unless the requirement is also '*'.
if parent and version == '*' and requirement != '*':
display.warning("Failed to validate the collection requirement '%s:%s' for %s when the existing "
"install does not have a version set, the collection may not work."
% (to_text(self), req, parent))
continue
elif requirement == '*' or version == '*':
continue
if not op(SemanticVersion(version), SemanticVersion.from_loose_version(LooseVersion(requirement))):
break
else:
return True
# The loop was broken early, it does not meet all the requirements
return False
@staticmethod
def from_tar(b_path, force, parent=None):
if not tarfile.is_tarfile(b_path):
raise AnsibleError("Collection artifact at '%s' is not a valid tar file." % to_native(b_path))
info = {}
with tarfile.open(b_path, mode='r') as collection_tar:
for b_member_name, property_name in CollectionRequirement._FILE_MAPPING:
n_member_name = to_native(b_member_name)
try:
member = collection_tar.getmember(n_member_name)
except KeyError:
raise AnsibleError("Collection at '%s' does not contain the required file %s."
% (to_native(b_path), n_member_name))
with _tarfile_extract(collection_tar, member) as member_obj:
try:
info[property_name] = json.loads(to_text(member_obj.read(), errors='surrogate_or_strict'))
except ValueError:
raise AnsibleError("Collection tar file member %s does not contain a valid json string."
% n_member_name)
meta = info['manifest_file']['collection_info']
files = info['files_file']['files']
namespace = meta['namespace']
name = meta['name']
version = meta['version']
meta = CollectionVersionMetadata(namespace, name, version, None, None, meta['dependencies'])
if SemanticVersion(version).is_prerelease:
allow_pre_release = True
else:
allow_pre_release = False
return CollectionRequirement(namespace, name, b_path, None, [version], version, force, parent=parent,
metadata=meta, files=files, allow_pre_releases=allow_pre_release)
@staticmethod
def from_path(b_path, force, parent=None):
info = {}
for b_file_name, property_name in CollectionRequirement._FILE_MAPPING:
b_file_path = os.path.join(b_path, b_file_name)
if not os.path.exists(b_file_path):
continue
with open(b_file_path, 'rb') as file_obj:
try:
info[property_name] = json.loads(to_text(file_obj.read(), errors='surrogate_or_strict'))
except ValueError:
raise AnsibleError("Collection file at '%s' does not contain a valid json string."
% to_native(b_file_path))
allow_pre_release = False
if 'manifest_file' in info:
manifest = info['manifest_file']['collection_info']
namespace = manifest['namespace']
name = manifest['name']
version = to_text(manifest['version'], errors='surrogate_or_strict')
try:
_v = SemanticVersion()
_v.parse(version)
if _v.is_prerelease:
allow_pre_release = True
except ValueError:
display.warning("Collection at '%s' does not have a valid version set, falling back to '*'. Found "
"version: '%s'" % (to_text(b_path), version))
version = '*'
dependencies = manifest['dependencies']
else:
display.warning("Collection at '%s' does not have a MANIFEST.json file, cannot detect version."
% to_text(b_path))
parent_dir, name = os.path.split(to_text(b_path, errors='surrogate_or_strict'))
namespace = os.path.split(parent_dir)[1]
version = '*'
dependencies = {}
meta = CollectionVersionMetadata(namespace, name, version, None, None, dependencies)
files = info.get('files_file', {}).get('files', {})
return CollectionRequirement(namespace, name, b_path, None, [version], version, force, parent=parent,
metadata=meta, files=files, skip=True, allow_pre_releases=allow_pre_release)
@staticmethod
def from_name(collection, apis, requirement, force, parent=None, allow_pre_release=False):
namespace, name = collection.split('.', 1)
galaxy_meta = None
for api in apis:
try:
if not (requirement == '*' or requirement.startswith('<') or requirement.startswith('>') or
requirement.startswith('!=')):
# Exact requirement
allow_pre_release = True
if requirement.startswith('='):
requirement = requirement.lstrip('=')
resp = api.get_collection_version_metadata(namespace, name, requirement)
galaxy_meta = resp
versions = [resp.version]
else:
versions = api.get_collection_versions(namespace, name)
except GalaxyError as err:
if err.http_code == 404:
display.vvv("Collection '%s' is not available from server %s %s"
% (collection, api.name, api.api_server))
continue
raise
display.vvv("Collection '%s' obtained from server %s %s" % (collection, api.name, api.api_server))
break
else:
raise AnsibleError("Failed to find collection %s:%s" % (collection, requirement))
req = CollectionRequirement(namespace, name, None, api, versions, requirement, force, parent=parent,
metadata=galaxy_meta, allow_pre_releases=allow_pre_release)
return req
def build_collection(collection_path, output_path, force):
"""
Creates the Ansible collection artifact in a .tar.gz file.
:param collection_path: The path to the collection to build. This should be the directory that contains the
galaxy.yml file.
:param output_path: The path to create the collection build artifact. This should be a directory.
:param force: Whether to overwrite an existing collection build artifact or fail.
:return: The path to the collection build artifact.
"""
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
b_galaxy_path = os.path.join(b_collection_path, b'galaxy.yml')
if not os.path.exists(b_galaxy_path):
raise AnsibleError("The collection galaxy.yml path '%s' does not exist." % to_native(b_galaxy_path))
collection_meta = _get_galaxy_yml(b_galaxy_path)
file_manifest = _build_files_manifest(b_collection_path, collection_meta['namespace'], collection_meta['name'],
collection_meta['build_ignore'])
collection_manifest = _build_manifest(**collection_meta)
collection_output = os.path.join(output_path, "%s-%s-%s.tar.gz" % (collection_meta['namespace'],
collection_meta['name'],
collection_meta['version']))
b_collection_output = to_bytes(collection_output, errors='surrogate_or_strict')
if os.path.exists(b_collection_output):
if os.path.isdir(b_collection_output):
raise AnsibleError("The output collection artifact '%s' already exists, "
"but is a directory - aborting" % to_native(collection_output))
elif not force:
raise AnsibleError("The file '%s' already exists. You can use --force to re-create "
"the collection artifact." % to_native(collection_output))
_build_collection_tar(b_collection_path, b_collection_output, collection_manifest, file_manifest)
def publish_collection(collection_path, api, wait, timeout):
"""
Publish an Ansible collection tarball into an Ansible Galaxy server.
:param collection_path: The path to the collection tarball to publish.
:param api: A GalaxyAPI to publish the collection to.
:param wait: Whether to wait until the import process is complete.
:param timeout: The time in seconds to wait for the import process to finish, 0 is indefinite.
"""
import_uri = api.publish_collection(collection_path)
if wait:
# Galaxy returns a url fragment which differs between v2 and v3. The second to last entry is
# always the task_id, though.
# v2: {"task": "https://galaxy-dev.ansible.com/api/v2/collection-imports/35573/"}
# v3: {"task": "/api/automation-hub/v3/imports/collections/838d1308-a8f4-402c-95cb-7823f3806cd8/"}
task_id = None
for path_segment in reversed(import_uri.split('/')):
if path_segment:
task_id = path_segment
break
if not task_id:
raise AnsibleError("Publishing the collection did not return valid task info. Cannot wait for task status. Returned task info: '%s'" % import_uri)
display.display("Collection has been published to the Galaxy server %s %s" % (api.name, api.api_server))
with _display_progress():
api.wait_import_task(task_id, timeout)
display.display("Collection has been successfully published and imported to the Galaxy server %s %s"
% (api.name, api.api_server))
else:
display.display("Collection has been pushed to the Galaxy server %s %s, not waiting until import has "
"completed due to --no-wait being set. Import task results can be found at %s"
% (api.name, api.api_server, import_uri))
def install_collections(collections, output_path, apis, validate_certs, ignore_errors, no_deps, force, force_deps,
allow_pre_release=False):
"""
Install Ansible collections to the path specified.
:param collections: The collections to install, should be a list of tuples with (name, requirement, Galaxy server).
:param output_path: The path to install the collections to.
:param apis: A list of GalaxyAPIs to query when searching for a collection.
:param validate_certs: Whether to validate the certificates if downloading a tarball.
:param ignore_errors: Whether to ignore any errors when installing the collection.
:param no_deps: Ignore any collection dependencies and only install the base requirements.
:param force: Re-install a collection if it has already been installed.
    :param force_deps: Re-install a collection as well as its dependencies if they have already been installed.
    :param allow_pre_release: Whether pre-release versions are acceptable when resolving collection versions.
"""
existing_collections = find_existing_collections(output_path)
with _tempdir() as b_temp_path:
display.display("Process install dependency map")
with _display_progress():
dependency_map = _build_dependency_map(collections, existing_collections, b_temp_path, apis,
validate_certs, force, force_deps, no_deps,
allow_pre_release=allow_pre_release)
display.display("Starting collection install process")
with _display_progress():
for collection in dependency_map.values():
try:
collection.install(output_path, b_temp_path)
except AnsibleError as err:
if ignore_errors:
display.warning("Failed to install collection %s but skipping due to --ignore-errors being set. "
"Error: %s" % (to_text(collection), to_text(err)))
else:
raise
def validate_collection_name(name):
"""
    Validates that a collection name, as given by the user or a requirements file, fits the expected format.
:param name: The input name with optional range specifier split by ':'.
:return: The input value, required for argparse validation.
"""
collection, dummy, dummy = name.partition(':')
if AnsibleCollectionRef.is_valid_collection_name(collection):
return name
raise AnsibleError("Invalid collection name '%s', "
"name must be in the format <namespace>.<collection>. \n"
"Please make sure namespace and collection name contains "
"characters from [a-zA-Z0-9_] only." % name)
def validate_collection_path(collection_path):
""" Ensure a given path ends with 'ansible_collections'
:param collection_path: The path that should end in 'ansible_collections'
:return: collection_path ending in 'ansible_collections' if it does not already.
"""
if os.path.split(collection_path)[1] != 'ansible_collections':
return os.path.join(collection_path, 'ansible_collections')
return collection_path
def verify_collections(collections, search_paths, apis, validate_certs, ignore_errors, allow_pre_release=False):
with _display_progress():
with _tempdir() as b_temp_path:
for collection in collections:
try:
local_collection = None
b_collection = to_bytes(collection[0], errors='surrogate_or_strict')
if os.path.isfile(b_collection) or urlparse(collection[0]).scheme.lower() in ['http', 'https'] or len(collection[0].split('.')) != 2:
raise AnsibleError(message="'%s' is not a valid collection name. The format namespace.name is expected." % collection[0])
collection_name = collection[0]
namespace, name = collection_name.split('.')
collection_version = collection[1]
# Verify local collection exists before downloading it from a galaxy server
for search_path in search_paths:
b_search_path = to_bytes(os.path.join(search_path, namespace, name), errors='surrogate_or_strict')
if os.path.isdir(b_search_path):
local_collection = CollectionRequirement.from_path(b_search_path, False)
break
if local_collection is None:
raise AnsibleError(message='Collection %s is not installed in any of the collection paths.' % collection_name)
# Download collection on a galaxy server for comparison
try:
remote_collection = CollectionRequirement.from_name(collection_name, apis, collection_version, False, parent=None,
allow_pre_release=allow_pre_release)
except AnsibleError as e:
if e.message == 'Failed to find collection %s:%s' % (collection[0], collection[1]):
raise AnsibleError('Failed to find remote collection %s:%s on any of the galaxy servers' % (collection[0], collection[1]))
raise
download_url = remote_collection.metadata.download_url
headers = {}
remote_collection.api._add_auth_token(headers, download_url, required=False)
b_temp_tar_path = _download_file(download_url, b_temp_path, None, validate_certs, headers=headers)
local_collection.verify(remote_collection, search_path, b_temp_tar_path)
except AnsibleError as err:
if ignore_errors:
display.warning("Failed to verify collection %s but skipping due to --ignore-errors being set. "
"Error: %s" % (collection[0], to_text(err)))
else:
raise
@contextmanager
def _tempdir():
b_temp_path = tempfile.mkdtemp(dir=to_bytes(C.DEFAULT_LOCAL_TMP, errors='surrogate_or_strict'))
yield b_temp_path
shutil.rmtree(b_temp_path)
@contextmanager
def _tarfile_extract(tar, member):
tar_obj = tar.extractfile(member)
yield tar_obj
tar_obj.close()
@contextmanager
def _display_progress():
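    # Show a spinning wheel on a background thread while the wrapped block runs; any display calls made inside the
    # context are proxied through a queue so the worker thread can interleave them with the spinner output.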
config_display = C.GALAXY_DISPLAY_PROGRESS
display_wheel = sys.stdout.isatty() if config_display is None else config_display
if not display_wheel:
yield
return
def progress(display_queue, actual_display):
actual_display.debug("Starting display_progress display thread")
t = threading.current_thread()
while True:
for c in "|/-\\":
actual_display.display(c + "\b", newline=False)
time.sleep(0.1)
# Display a message from the main thread
while True:
try:
method, args, kwargs = display_queue.get(block=False, timeout=0.1)
except queue.Empty:
break
else:
func = getattr(actual_display, method)
func(*args, **kwargs)
if getattr(t, "finish", False):
actual_display.debug("Received end signal for display_progress display thread")
return
class DisplayThread(object):
def __init__(self, display_queue):
self.display_queue = display_queue
def __getattr__(self, attr):
def call_display(*args, **kwargs):
self.display_queue.put((attr, args, kwargs))
return call_display
    # Temporarily override the global display class with our own, which adds the calls to a queue for the progress thread to replay.
global display
old_display = display
try:
display_queue = queue.Queue()
display = DisplayThread(display_queue)
t = threading.Thread(target=progress, args=(display_queue, old_display))
t.daemon = True
t.start()
try:
yield
finally:
t.finish = True
t.join()
except Exception:
        # The exception is re-raised so we can be sure the thread is finished and is no longer using the display
raise
finally:
display = old_display
def _get_galaxy_yml(b_galaxy_yml_path):
meta_info = get_collections_galaxy_meta_info()
mandatory_keys = set()
string_keys = set()
list_keys = set()
dict_keys = set()
for info in meta_info:
if info.get('required', False):
mandatory_keys.add(info['key'])
key_list_type = {
'str': string_keys,
'list': list_keys,
'dict': dict_keys,
}[info.get('type', 'str')]
key_list_type.add(info['key'])
all_keys = frozenset(list(mandatory_keys) + list(string_keys) + list(list_keys) + list(dict_keys))
try:
with open(b_galaxy_yml_path, 'rb') as g_yaml:
galaxy_yml = yaml.safe_load(g_yaml)
except YAMLError as err:
raise AnsibleError("Failed to parse the galaxy.yml at '%s' with the following error:\n%s"
% (to_native(b_galaxy_yml_path), to_native(err)))
set_keys = set(galaxy_yml.keys())
missing_keys = mandatory_keys.difference(set_keys)
if missing_keys:
raise AnsibleError("The collection galaxy.yml at '%s' is missing the following mandatory keys: %s"
% (to_native(b_galaxy_yml_path), ", ".join(sorted(missing_keys))))
extra_keys = set_keys.difference(all_keys)
if len(extra_keys) > 0:
display.warning("Found unknown keys in collection galaxy.yml at '%s': %s"
% (to_text(b_galaxy_yml_path), ", ".join(extra_keys)))
# Add the defaults if they have not been set
for optional_string in string_keys:
if optional_string not in galaxy_yml:
galaxy_yml[optional_string] = None
for optional_list in list_keys:
list_val = galaxy_yml.get(optional_list, None)
if list_val is None:
galaxy_yml[optional_list] = []
elif not isinstance(list_val, list):
galaxy_yml[optional_list] = [list_val]
for optional_dict in dict_keys:
if optional_dict not in galaxy_yml:
galaxy_yml[optional_dict] = {}
# license is a builtin var in Python, to avoid confusion we just rename it to license_ids
galaxy_yml['license_ids'] = galaxy_yml['license']
del galaxy_yml['license']
return galaxy_yml
def _build_files_manifest(b_collection_path, namespace, name, ignore_patterns):
# We always ignore .pyc and .retry files as well as some well known version control directories. The ignore
# patterns can be extended by the build_ignore key in galaxy.yml
b_ignore_patterns = [
b'galaxy.yml',
b'*.pyc',
b'*.retry',
b'tests/output', # Ignore ansible-test result output directory.
to_bytes('{0}-{1}-*.tar.gz'.format(namespace, name)), # Ignores previously built artifacts in the root dir.
]
b_ignore_patterns += [to_bytes(p) for p in ignore_patterns]
b_ignore_dirs = frozenset([b'CVS', b'.bzr', b'.hg', b'.git', b'.svn', b'__pycache__', b'.tox'])
entry_template = {
'name': None,
'ftype': None,
'chksum_type': None,
'chksum_sha256': None,
'format': MANIFEST_FORMAT
}
manifest = {
'files': [
{
'name': '.',
'ftype': 'dir',
'chksum_type': None,
'chksum_sha256': None,
'format': MANIFEST_FORMAT,
},
],
'format': MANIFEST_FORMAT,
}
def _walk(b_path, b_top_level_dir):
for b_item in os.listdir(b_path):
b_abs_path = os.path.join(b_path, b_item)
b_rel_base_dir = b'' if b_path == b_top_level_dir else b_path[len(b_top_level_dir) + 1:]
b_rel_path = os.path.join(b_rel_base_dir, b_item)
rel_path = to_text(b_rel_path, errors='surrogate_or_strict')
if os.path.isdir(b_abs_path):
if any(b_item == b_path for b_path in b_ignore_dirs) or \
any(fnmatch.fnmatch(b_rel_path, b_pattern) for b_pattern in b_ignore_patterns):
display.vvv("Skipping '%s' for collection build" % to_text(b_abs_path))
continue
if os.path.islink(b_abs_path):
b_link_target = os.path.realpath(b_abs_path)
if not b_link_target.startswith(b_top_level_dir):
display.warning("Skipping '%s' as it is a symbolic link to a directory outside the collection"
% to_text(b_abs_path))
continue
manifest_entry = entry_template.copy()
manifest_entry['name'] = rel_path
manifest_entry['ftype'] = 'dir'
manifest['files'].append(manifest_entry)
_walk(b_abs_path, b_top_level_dir)
else:
if any(fnmatch.fnmatch(b_rel_path, b_pattern) for b_pattern in b_ignore_patterns):
display.vvv("Skipping '%s' for collection build" % to_text(b_abs_path))
continue
manifest_entry = entry_template.copy()
manifest_entry['name'] = rel_path
manifest_entry['ftype'] = 'file'
manifest_entry['chksum_type'] = 'sha256'
manifest_entry['chksum_sha256'] = secure_hash(b_abs_path, hash_func=sha256)
manifest['files'].append(manifest_entry)
_walk(b_collection_path, b_collection_path)
return manifest
def _build_manifest(namespace, name, version, authors, readme, tags, description, license_ids, license_file,
dependencies, repository, documentation, homepage, issues, **kwargs):
manifest = {
'collection_info': {
'namespace': namespace,
'name': name,
'version': version,
'authors': authors,
'readme': readme,
'tags': tags,
'description': description,
'license': license_ids,
'license_file': license_file if license_file else None, # Handle galaxy.yml having an empty string (None)
'dependencies': dependencies,
'repository': repository,
'documentation': documentation,
'homepage': homepage,
'issues': issues,
},
'file_manifest_file': {
'name': 'FILES.json',
'ftype': 'file',
'chksum_type': 'sha256',
'chksum_sha256': None, # Filled out in _build_collection_tar
'format': MANIFEST_FORMAT
},
'format': MANIFEST_FORMAT,
}
return manifest
def _build_collection_tar(b_collection_path, b_tar_path, collection_manifest, file_manifest):
files_manifest_json = to_bytes(json.dumps(file_manifest, indent=True), errors='surrogate_or_strict')
collection_manifest['file_manifest_file']['chksum_sha256'] = secure_hash_s(files_manifest_json, hash_func=sha256)
collection_manifest_json = to_bytes(json.dumps(collection_manifest, indent=True), errors='surrogate_or_strict')
with _tempdir() as b_temp_path:
b_tar_filepath = os.path.join(b_temp_path, os.path.basename(b_tar_path))
with tarfile.open(b_tar_filepath, mode='w:gz') as tar_file:
# Add the MANIFEST.json and FILES.json file to the archive
for name, b in [('MANIFEST.json', collection_manifest_json), ('FILES.json', files_manifest_json)]:
b_io = BytesIO(b)
tar_info = tarfile.TarInfo(name)
tar_info.size = len(b)
tar_info.mtime = time.time()
tar_info.mode = 0o0644
tar_file.addfile(tarinfo=tar_info, fileobj=b_io)
for file_info in file_manifest['files']:
if file_info['name'] == '.':
continue
# arcname expects a native string, cannot be bytes
filename = to_native(file_info['name'], errors='surrogate_or_strict')
b_src_path = os.path.join(b_collection_path, to_bytes(filename, errors='surrogate_or_strict'))
def reset_stat(tarinfo):
tarinfo.mode = 0o0755 if tarinfo.isdir() else 0o0644
tarinfo.uid = tarinfo.gid = 0
tarinfo.uname = tarinfo.gname = ''
return tarinfo
tar_file.add(os.path.realpath(b_src_path), arcname=filename, recursive=False, filter=reset_stat)
shutil.copy(b_tar_filepath, b_tar_path)
collection_name = "%s.%s" % (collection_manifest['collection_info']['namespace'],
collection_manifest['collection_info']['name'])
display.display('Created collection for %s at %s' % (collection_name, to_text(b_tar_path)))
def find_existing_collections(path):
collections = []
b_path = to_bytes(path, errors='surrogate_or_strict')
for b_namespace in os.listdir(b_path):
b_namespace_path = os.path.join(b_path, b_namespace)
if os.path.isfile(b_namespace_path):
continue
for b_collection in os.listdir(b_namespace_path):
b_collection_path = os.path.join(b_namespace_path, b_collection)
if os.path.isdir(b_collection_path):
req = CollectionRequirement.from_path(b_collection_path, False)
display.vvv("Found installed collection %s:%s at '%s'" % (to_text(req), req.latest_version,
to_text(b_collection_path)))
collections.append(req)
return collections
def _build_dependency_map(collections, existing_collections, b_temp_path, apis, validate_certs, force, force_deps,
no_deps, allow_pre_release=False):
dependency_map = {}
# First build the dependency map on the actual requirements
for name, version, source in collections:
_get_collection_info(dependency_map, existing_collections, name, version, source, b_temp_path, apis,
validate_certs, (force or force_deps), allow_pre_release=allow_pre_release)
checked_parents = set([to_text(c) for c in dependency_map.values() if c.skip])
while len(dependency_map) != len(checked_parents):
while not no_deps: # Only parse dependencies if no_deps was not set
parents_to_check = set(dependency_map.keys()).difference(checked_parents)
deps_exhausted = True
for parent in parents_to_check:
parent_info = dependency_map[parent]
if parent_info.dependencies:
deps_exhausted = False
for dep_name, dep_requirement in parent_info.dependencies.items():
_get_collection_info(dependency_map, existing_collections, dep_name, dep_requirement,
parent_info.api, b_temp_path, apis, validate_certs, force_deps,
parent=parent, allow_pre_release=allow_pre_release)
checked_parents.add(parent)
# No extra dependencies were resolved, exit loop
if deps_exhausted:
break
# Now we have resolved the deps to our best extent, now select the latest version for collections with
# multiple versions found and go from there
deps_not_checked = set(dependency_map.keys()).difference(checked_parents)
for collection in deps_not_checked:
dependency_map[collection].set_latest_version()
if no_deps or len(dependency_map[collection].dependencies) == 0:
checked_parents.add(collection)
return dependency_map
def _get_collection_info(dep_map, existing_collections, collection, requirement, source, b_temp_path, apis,
validate_certs, force, parent=None, allow_pre_release=False):
dep_msg = ""
if parent:
dep_msg = " - as dependency of %s" % parent
display.vvv("Processing requirement collection '%s'%s" % (to_text(collection), dep_msg))
b_tar_path = None
if os.path.isfile(to_bytes(collection, errors='surrogate_or_strict')):
display.vvvv("Collection requirement '%s' is a tar artifact" % to_text(collection))
b_tar_path = to_bytes(collection, errors='surrogate_or_strict')
elif urlparse(collection).scheme.lower() in ['http', 'https']:
display.vvvv("Collection requirement '%s' is a URL to a tar artifact" % collection)
try:
b_tar_path = _download_file(collection, b_temp_path, None, validate_certs)
except urllib_error.URLError as err:
raise AnsibleError("Failed to download collection tar from '%s': %s"
% (to_native(collection), to_native(err)))
if b_tar_path:
req = CollectionRequirement.from_tar(b_tar_path, force, parent=parent)
collection_name = to_text(req)
if collection_name in dep_map:
collection_info = dep_map[collection_name]
collection_info.add_requirement(None, req.latest_version)
else:
collection_info = req
else:
validate_collection_name(collection)
display.vvvv("Collection requirement '%s' is the name of a collection" % collection)
if collection in dep_map:
collection_info = dep_map[collection]
collection_info.add_requirement(parent, requirement)
else:
apis = [source] if source else apis
collection_info = CollectionRequirement.from_name(collection, apis, requirement, force, parent=parent,
allow_pre_release=allow_pre_release)
existing = [c for c in existing_collections if to_text(c) == to_text(collection_info)]
if existing and not collection_info.force:
# Test that the installed collection fits the requirement
existing[0].add_requirement(parent, requirement)
collection_info = existing[0]
dep_map[to_text(collection_info)] = collection_info
def _download_file(url, b_path, expected_hash, validate_certs, headers=None):
urlsplit = os.path.splitext(to_text(url.rsplit('/', 1)[1]))
b_file_name = to_bytes(urlsplit[0], errors='surrogate_or_strict')
b_file_ext = to_bytes(urlsplit[1], errors='surrogate_or_strict')
b_file_path = tempfile.NamedTemporaryFile(dir=b_path, prefix=b_file_name, suffix=b_file_ext, delete=False).name
display.vvv("Downloading %s to %s" % (url, to_text(b_path)))
    # Galaxy redirects downloads to S3, which rejects the request if an Authorization header is attached, so do not forward that header across the redirect
resp = open_url(to_native(url, errors='surrogate_or_strict'), validate_certs=validate_certs, headers=headers,
unredirected_headers=['Authorization'], http_agent=user_agent())
with open(b_file_path, 'wb') as download_file:
actual_hash = _consume_file(resp, download_file)
if expected_hash:
display.vvvv("Validating downloaded file hash %s with expected hash %s" % (actual_hash, expected_hash))
if expected_hash != actual_hash:
raise AnsibleError("Mismatch artifact hash with downloaded file")
return b_file_path
def _extract_tar_file(tar, filename, b_dest, b_temp_path, expected_hash=None):
with _get_tar_file_member(tar, filename) as tar_obj:
with tempfile.NamedTemporaryFile(dir=b_temp_path, delete=False) as tmpfile_obj:
actual_hash = _consume_file(tar_obj, tmpfile_obj)
if expected_hash and actual_hash != expected_hash:
raise AnsibleError("Checksum mismatch for '%s' inside collection at '%s'"
% (to_native(filename, errors='surrogate_or_strict'), to_native(tar.name)))
b_dest_filepath = os.path.join(b_dest, to_bytes(filename, errors='surrogate_or_strict'))
b_parent_dir = os.path.split(b_dest_filepath)[0]
if not os.path.exists(b_parent_dir):
# Seems like Galaxy does not validate if all file entries have a corresponding dir ftype entry. This check
# makes sure we create the parent directory even if it wasn't set in the metadata.
os.makedirs(b_parent_dir)
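        # The temporary file created by NamedTemporaryFile above is created with mode 0600 and is moved into place
        # without a subsequent chmod, so the extracted file keeps that restrictive mode.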
shutil.move(to_bytes(tmpfile_obj.name, errors='surrogate_or_strict'), b_dest_filepath)
def _get_tar_file_member(tar, filename):
n_filename = to_native(filename, errors='surrogate_or_strict')
try:
member = tar.getmember(n_filename)
except KeyError:
raise AnsibleError("Collection tar at '%s' does not contain the expected file '%s'." % (
to_native(tar.name),
n_filename))
return _tarfile_extract(tar, member)
def _get_json_from_tar_file(b_path, filename):
file_contents = ''
with tarfile.open(b_path, mode='r') as collection_tar:
with _get_tar_file_member(collection_tar, filename) as tar_obj:
bufsize = 65536
data = tar_obj.read(bufsize)
while data:
file_contents += to_text(data)
data = tar_obj.read(bufsize)
return json.loads(file_contents)
def _get_tar_file_hash(b_path, filename):
with tarfile.open(b_path, mode='r') as collection_tar:
with _get_tar_file_member(collection_tar, filename) as tar_obj:
return _consume_file(tar_obj)
def _consume_file(read_from, write_to=None):
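    # Stream the source in 64 KiB chunks, optionally mirroring the bytes to write_to, and return the SHA-256 hex digest.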
bufsize = 65536
sha256_digest = sha256()
data = read_from.read(bufsize)
while data:
if write_to is not None:
write_to.write(data)
write_to.flush()
sha256_digest.update(data)
data = read_from.read(bufsize)
return sha256_digest.hexdigest()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,415 |
`ansible-galaxy collection install` creates all files with mode 0600
|
##### SUMMARY
When installing a collection, all files are created with mode 0600. This is caused by the way the collection .tar.gz file is extracted:
https://github.com/ansible/ansible/blob/d3ec31f8d5683926aa6a05bb573d9929a6266fac/lib/ansible/galaxy/collection.py#L1076-L1090
The file created by `tempfile.NamedTemporaryFile()` seems to have mode `0600` and is simply moved to the final destination without a following `os.chmod`.
(See also this discussion: https://github.com/ansible-collections/community.general/pull/29#discussion_r396147440)
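One possible direction is sketched below (an illustration only, not necessarily the change made in the linked PR; the helper name `_reset_extracted_mode` and the `0o644`/`0o755` defaults are assumptions): reset the mode right after the `shutil.move()` call, optionally honouring the execute bit recorded in the tar member.
```python
import os
import stat


def _reset_extracted_mode(dest_path, tar_member):
    """Replace NamedTemporaryFile's default 0600 mode with something sane.

    Regular files become rw-r--r-- (0644); if the tar member carried an owner
    execute bit, the file is promoted to rwxr-xr-x (0755) instead.
    """
    new_mode = 0o644
    if stat.S_IMODE(tar_member.mode) & stat.S_IXUSR:
        new_mode = 0o755
    os.chmod(dest_path, new_mode)


# Hypothetical usage inside _extract_tar_file(), after the existing shutil.move():
#   member = tar.getmember(to_native(filename, errors='surrogate_or_strict'))
#   shutil.move(to_bytes(tmpfile_obj.name, errors='surrogate_or_strict'), b_dest_filepath)
#   _reset_extracted_mode(b_dest_filepath, member)
```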
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
lib/ansible/galaxy/collection.py
##### ANSIBLE VERSION
```
2.9
2.10
```
|
https://github.com/ansible/ansible/issues/68415
|
https://github.com/ansible/ansible/pull/68418
|
a9d2ceafe429171c0e2ad007058b88bae57c74ce
|
127d54b3630c65043ec12c4af2024f8ef0bc6d09
| 2020-03-23T21:15:27Z |
python
| 2020-03-24T22:08:23Z |
test/units/cli/test_galaxy.py
|
# -*- coding: utf-8 -*-
# (c) 2016, Adrian Likins <[email protected]>
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import ansible
import json
import os
import pytest
import shutil
import tarfile
import tempfile
import yaml
import ansible.constants as C
from ansible import context
from ansible.cli.galaxy import GalaxyCLI
from ansible.galaxy.api import GalaxyAPI
from ansible.errors import AnsibleError
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.utils import context_objects as co
from units.compat import unittest
from units.compat.mock import patch, MagicMock
@pytest.fixture(autouse='function')
def reset_cli_args():
co.GlobalCLIArgs._Singleton__instance = None
yield
co.GlobalCLIArgs._Singleton__instance = None
class TestGalaxy(unittest.TestCase):
@classmethod
def setUpClass(cls):
'''creating prerequisites for installing a role; setUpClass occurs ONCE whereas setUp occurs with every method tested.'''
# class data for easy viewing: role_dir, role_tar, role_name, role_req, role_path
cls.temp_dir = tempfile.mkdtemp(prefix='ansible-test_galaxy-')
os.chdir(cls.temp_dir)
if os.path.exists("./delete_me"):
shutil.rmtree("./delete_me")
# creating framework for a role
gc = GalaxyCLI(args=["ansible-galaxy", "init", "--offline", "delete_me"])
gc.run()
cls.role_dir = "./delete_me"
cls.role_name = "delete_me"
# making a temp dir for role installation
cls.role_path = os.path.join(tempfile.mkdtemp(), "roles")
if not os.path.isdir(cls.role_path):
os.makedirs(cls.role_path)
# creating a tar file name for class data
cls.role_tar = './delete_me.tar.gz'
cls.makeTar(cls.role_tar, cls.role_dir)
# creating a temp file with installation requirements
cls.role_req = './delete_me_requirements.yml'
fd = open(cls.role_req, "w")
fd.write("- 'src': '%s'\n 'name': '%s'\n 'path': '%s'" % (cls.role_tar, cls.role_name, cls.role_path))
fd.close()
@classmethod
def makeTar(cls, output_file, source_dir):
''' used for making a tarfile from a role directory '''
# adding directory into a tar file
try:
tar = tarfile.open(output_file, "w:gz")
tar.add(source_dir, arcname=os.path.basename(source_dir))
        except AttributeError:  # tarfile obj. has no attribute __exit__ prior to Python 2.7
pass
finally: # ensuring closure of tarfile obj
tar.close()
@classmethod
def tearDownClass(cls):
'''After tests are finished removes things created in setUpClass'''
# deleting the temp role directory
if os.path.exists(cls.role_dir):
shutil.rmtree(cls.role_dir)
if os.path.exists(cls.role_req):
os.remove(cls.role_req)
if os.path.exists(cls.role_tar):
os.remove(cls.role_tar)
if os.path.isdir(cls.role_path):
shutil.rmtree(cls.role_path)
os.chdir('/')
shutil.rmtree(cls.temp_dir)
def setUp(self):
# Reset the stored command line args
co.GlobalCLIArgs._Singleton__instance = None
self.default_args = ['ansible-galaxy']
def tearDown(self):
# Reset the stored command line args
co.GlobalCLIArgs._Singleton__instance = None
def test_init(self):
galaxy_cli = GalaxyCLI(args=self.default_args)
self.assertTrue(isinstance(galaxy_cli, GalaxyCLI))
def test_display_min(self):
gc = GalaxyCLI(args=self.default_args)
role_info = {'name': 'some_role_name'}
display_result = gc._display_role_info(role_info)
self.assertTrue(display_result.find('some_role_name') > -1)
def test_display_galaxy_info(self):
gc = GalaxyCLI(args=self.default_args)
galaxy_info = {}
role_info = {'name': 'some_role_name',
'galaxy_info': galaxy_info}
display_result = gc._display_role_info(role_info)
if display_result.find('\n\tgalaxy_info:') == -1:
self.fail('Expected galaxy_info to be indented once')
def test_run(self):
''' verifies that the GalaxyCLI object's api is created and that execute() is called. '''
gc = GalaxyCLI(args=["ansible-galaxy", "install", "--ignore-errors", "imaginary_role"])
gc.parse()
with patch.object(ansible.cli.CLI, "run", return_value=None) as mock_run:
gc.run()
# testing
self.assertIsInstance(gc.galaxy, ansible.galaxy.Galaxy)
self.assertEqual(mock_run.call_count, 1)
self.assertTrue(isinstance(gc.api, ansible.galaxy.api.GalaxyAPI))
def test_execute_remove(self):
# installing role
gc = GalaxyCLI(args=["ansible-galaxy", "install", "-p", self.role_path, "-r", self.role_req, '--force'])
gc.run()
# location where the role was installed
role_file = os.path.join(self.role_path, self.role_name)
# removing role
# Have to reset the arguments in the context object manually since we're doing the
# equivalent of running the command line program twice
co.GlobalCLIArgs._Singleton__instance = None
gc = GalaxyCLI(args=["ansible-galaxy", "remove", role_file, self.role_name])
gc.run()
# testing role was removed
removed_role = not os.path.exists(role_file)
self.assertTrue(removed_role)
def test_exit_without_ignore_without_flag(self):
''' tests that GalaxyCLI exits with the error specified if the --ignore-errors flag is not used '''
gc = GalaxyCLI(args=["ansible-galaxy", "install", "--server=None", "fake_role_name"])
with patch.object(ansible.utils.display.Display, "display", return_value=None) as mocked_display:
# testing that error expected is raised
self.assertRaises(AnsibleError, gc.run)
self.assertTrue(mocked_display.called_once_with("- downloading role 'fake_role_name', owned by "))
def test_exit_without_ignore_with_flag(self):
''' tests that GalaxyCLI exits without the error specified if the --ignore-errors flag is used '''
# testing with --ignore-errors flag
gc = GalaxyCLI(args=["ansible-galaxy", "install", "--server=None", "fake_role_name", "--ignore-errors"])
with patch.object(ansible.utils.display.Display, "display", return_value=None) as mocked_display:
gc.run()
self.assertTrue(mocked_display.called_once_with("- downloading role 'fake_role_name', owned by "))
def test_parse_no_action(self):
''' testing the options parser when no action is given '''
gc = GalaxyCLI(args=["ansible-galaxy", ""])
self.assertRaises(SystemExit, gc.parse)
def test_parse_invalid_action(self):
''' testing the options parser when an invalid action is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "NOT_ACTION"])
self.assertRaises(SystemExit, gc.parse)
def test_parse_delete(self):
''' testing the options parser when the action 'delete' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "delete", "foo", "bar"])
gc.parse()
self.assertEqual(context.CLIARGS['verbosity'], 0)
def test_parse_import(self):
''' testing the options parser when the action 'import' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "import", "foo", "bar"])
gc.parse()
self.assertEqual(context.CLIARGS['wait'], True)
self.assertEqual(context.CLIARGS['reference'], None)
self.assertEqual(context.CLIARGS['check_status'], False)
self.assertEqual(context.CLIARGS['verbosity'], 0)
def test_parse_info(self):
''' testing the options parser when the action 'info' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "info", "foo", "bar"])
gc.parse()
self.assertEqual(context.CLIARGS['offline'], False)
def test_parse_init(self):
''' testing the options parser when the action 'init' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "init", "foo"])
gc.parse()
self.assertEqual(context.CLIARGS['offline'], False)
self.assertEqual(context.CLIARGS['force'], False)
def test_parse_install(self):
''' testing the options parser when the action 'install' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "install"])
gc.parse()
self.assertEqual(context.CLIARGS['ignore_errors'], False)
self.assertEqual(context.CLIARGS['no_deps'], False)
self.assertEqual(context.CLIARGS['role_file'], None)
self.assertEqual(context.CLIARGS['force'], False)
def test_parse_list(self):
''' testing the options parser when the action 'list' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "list"])
gc.parse()
self.assertEqual(context.CLIARGS['verbosity'], 0)
def test_parse_login(self):
''' testing the options parser when the action 'login' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "login"])
gc.parse()
self.assertEqual(context.CLIARGS['verbosity'], 0)
self.assertEqual(context.CLIARGS['token'], None)
def test_parse_remove(self):
''' testing the options parser when the action 'remove' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "remove", "foo"])
gc.parse()
self.assertEqual(context.CLIARGS['verbosity'], 0)
def test_parse_search(self):
        ''' testing the options parser when the action 'search' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "search"])
gc.parse()
self.assertEqual(context.CLIARGS['platforms'], None)
self.assertEqual(context.CLIARGS['galaxy_tags'], None)
self.assertEqual(context.CLIARGS['author'], None)
def test_parse_setup(self):
''' testing the options parser when the action 'setup' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "setup", "source", "github_user", "github_repo", "secret"])
gc.parse()
self.assertEqual(context.CLIARGS['verbosity'], 0)
self.assertEqual(context.CLIARGS['remove_id'], None)
self.assertEqual(context.CLIARGS['setup_list'], False)
class ValidRoleTests(object):
expected_role_dirs = ('defaults', 'files', 'handlers', 'meta', 'tasks', 'templates', 'vars', 'tests')
@classmethod
def setUpRole(cls, role_name, galaxy_args=None, skeleton_path=None, use_explicit_type=False):
if galaxy_args is None:
galaxy_args = []
if skeleton_path is not None:
cls.role_skeleton_path = skeleton_path
galaxy_args += ['--role-skeleton', skeleton_path]
# Make temp directory for testing
cls.test_dir = tempfile.mkdtemp()
if not os.path.isdir(cls.test_dir):
os.makedirs(cls.test_dir)
cls.role_dir = os.path.join(cls.test_dir, role_name)
cls.role_name = role_name
# create role using default skeleton
args = ['ansible-galaxy']
if use_explicit_type:
args += ['role']
args += ['init', '-c', '--offline'] + galaxy_args + ['--init-path', cls.test_dir, cls.role_name]
gc = GalaxyCLI(args=args)
gc.run()
cls.gc = gc
if skeleton_path is None:
cls.role_skeleton_path = gc.galaxy.default_role_skeleton_path
@classmethod
def tearDownClass(cls):
if os.path.isdir(cls.test_dir):
shutil.rmtree(cls.test_dir)
def test_metadata(self):
with open(os.path.join(self.role_dir, 'meta', 'main.yml'), 'r') as mf:
metadata = yaml.safe_load(mf)
self.assertIn('galaxy_info', metadata, msg='unable to find galaxy_info in metadata')
self.assertIn('dependencies', metadata, msg='unable to find dependencies in metadata')
def test_readme(self):
readme_path = os.path.join(self.role_dir, 'README.md')
self.assertTrue(os.path.exists(readme_path), msg='Readme doesn\'t exist')
def test_main_ymls(self):
need_main_ymls = set(self.expected_role_dirs) - set(['meta', 'tests', 'files', 'templates'])
for d in need_main_ymls:
main_yml = os.path.join(self.role_dir, d, 'main.yml')
self.assertTrue(os.path.exists(main_yml))
expected_string = "---\n# {0} file for {1}".format(d, self.role_name)
with open(main_yml, 'r') as f:
self.assertEqual(expected_string, f.read().strip())
def test_role_dirs(self):
for d in self.expected_role_dirs:
self.assertTrue(os.path.isdir(os.path.join(self.role_dir, d)), msg="Expected role subdirectory {0} doesn't exist".format(d))
def test_travis_yml(self):
with open(os.path.join(self.role_dir, '.travis.yml'), 'r') as f:
contents = f.read()
with open(os.path.join(self.role_skeleton_path, '.travis.yml'), 'r') as f:
expected_contents = f.read()
self.assertEqual(expected_contents, contents, msg='.travis.yml does not match expected')
def test_readme_contents(self):
with open(os.path.join(self.role_dir, 'README.md'), 'r') as readme:
contents = readme.read()
with open(os.path.join(self.role_skeleton_path, 'README.md'), 'r') as f:
expected_contents = f.read()
self.assertEqual(expected_contents, contents, msg='README.md does not match expected')
def test_test_yml(self):
with open(os.path.join(self.role_dir, 'tests', 'test.yml'), 'r') as f:
test_playbook = yaml.safe_load(f)
print(test_playbook)
self.assertEqual(len(test_playbook), 1)
self.assertEqual(test_playbook[0]['hosts'], 'localhost')
self.assertEqual(test_playbook[0]['remote_user'], 'root')
self.assertListEqual(test_playbook[0]['roles'], [self.role_name], msg='The list of roles included in the test play doesn\'t match')
class TestGalaxyInitDefault(unittest.TestCase, ValidRoleTests):
@classmethod
def setUpClass(cls):
cls.setUpRole(role_name='delete_me')
def test_metadata_contents(self):
with open(os.path.join(self.role_dir, 'meta', 'main.yml'), 'r') as mf:
metadata = yaml.safe_load(mf)
self.assertEqual(metadata.get('galaxy_info', dict()).get('author'), 'your name', msg='author was not set properly in metadata')
class TestGalaxyInitAPB(unittest.TestCase, ValidRoleTests):
@classmethod
def setUpClass(cls):
cls.setUpRole('delete_me_apb', galaxy_args=['--type=apb'])
def test_metadata_apb_tag(self):
with open(os.path.join(self.role_dir, 'meta', 'main.yml'), 'r') as mf:
metadata = yaml.safe_load(mf)
self.assertIn('apb', metadata.get('galaxy_info', dict()).get('galaxy_tags', []), msg='apb tag not set in role metadata')
def test_metadata_contents(self):
with open(os.path.join(self.role_dir, 'meta', 'main.yml'), 'r') as mf:
metadata = yaml.safe_load(mf)
self.assertEqual(metadata.get('galaxy_info', dict()).get('author'), 'your name', msg='author was not set properly in metadata')
def test_apb_yml(self):
self.assertTrue(os.path.exists(os.path.join(self.role_dir, 'apb.yml')), msg='apb.yml was not created')
def test_test_yml(self):
with open(os.path.join(self.role_dir, 'tests', 'test.yml'), 'r') as f:
test_playbook = yaml.safe_load(f)
print(test_playbook)
self.assertEqual(len(test_playbook), 1)
self.assertEqual(test_playbook[0]['hosts'], 'localhost')
self.assertFalse(test_playbook[0]['gather_facts'])
self.assertEqual(test_playbook[0]['connection'], 'local')
self.assertIsNone(test_playbook[0]['tasks'], msg='We\'re expecting an unset list of tasks in test.yml')
class TestGalaxyInitContainer(unittest.TestCase, ValidRoleTests):
@classmethod
def setUpClass(cls):
cls.setUpRole('delete_me_container', galaxy_args=['--type=container'])
def test_metadata_container_tag(self):
with open(os.path.join(self.role_dir, 'meta', 'main.yml'), 'r') as mf:
metadata = yaml.safe_load(mf)
self.assertIn('container', metadata.get('galaxy_info', dict()).get('galaxy_tags', []), msg='container tag not set in role metadata')
def test_metadata_contents(self):
with open(os.path.join(self.role_dir, 'meta', 'main.yml'), 'r') as mf:
metadata = yaml.safe_load(mf)
self.assertEqual(metadata.get('galaxy_info', dict()).get('author'), 'your name', msg='author was not set properly in metadata')
def test_meta_container_yml(self):
self.assertTrue(os.path.exists(os.path.join(self.role_dir, 'meta', 'container.yml')), msg='container.yml was not created')
def test_test_yml(self):
with open(os.path.join(self.role_dir, 'tests', 'test.yml'), 'r') as f:
test_playbook = yaml.safe_load(f)
print(test_playbook)
self.assertEqual(len(test_playbook), 1)
self.assertEqual(test_playbook[0]['hosts'], 'localhost')
self.assertFalse(test_playbook[0]['gather_facts'])
self.assertEqual(test_playbook[0]['connection'], 'local')
self.assertIsNone(test_playbook[0]['tasks'], msg='We\'re expecting an unset list of tasks in test.yml')
class TestGalaxyInitSkeleton(unittest.TestCase, ValidRoleTests):
@classmethod
def setUpClass(cls):
role_skeleton_path = os.path.join(os.path.split(__file__)[0], 'test_data', 'role_skeleton')
cls.setUpRole('delete_me_skeleton', skeleton_path=role_skeleton_path, use_explicit_type=True)
def test_empty_files_dir(self):
files_dir = os.path.join(self.role_dir, 'files')
self.assertTrue(os.path.isdir(files_dir))
self.assertListEqual(os.listdir(files_dir), [], msg='we expect the files directory to be empty, is ignore working?')
def test_template_ignore_jinja(self):
test_conf_j2 = os.path.join(self.role_dir, 'templates', 'test.conf.j2')
self.assertTrue(os.path.exists(test_conf_j2), msg="The test.conf.j2 template doesn't seem to exist, is it being rendered as test.conf?")
with open(test_conf_j2, 'r') as f:
contents = f.read()
expected_contents = '[defaults]\ntest_key = {{ test_variable }}'
self.assertEqual(expected_contents, contents.strip(), msg="test.conf.j2 doesn't contain what it should, is it being rendered?")
def test_template_ignore_jinja_subfolder(self):
test_conf_j2 = os.path.join(self.role_dir, 'templates', 'subfolder', 'test.conf.j2')
self.assertTrue(os.path.exists(test_conf_j2), msg="The test.conf.j2 template doesn't seem to exist, is it being rendered as test.conf?")
with open(test_conf_j2, 'r') as f:
contents = f.read()
expected_contents = '[defaults]\ntest_key = {{ test_variable }}'
self.assertEqual(expected_contents, contents.strip(), msg="test.conf.j2 doesn't contain what it should, is it being rendered?")
def test_template_ignore_similar_folder(self):
self.assertTrue(os.path.exists(os.path.join(self.role_dir, 'templates_extra', 'templates.txt')))
def test_skeleton_option(self):
self.assertEqual(self.role_skeleton_path, context.CLIARGS['role_skeleton'], msg='Skeleton path was not parsed properly from the command line')
@pytest.mark.parametrize('cli_args, expected', [
(['ansible-galaxy', 'collection', 'init', 'abc.def'], 0),
(['ansible-galaxy', 'collection', 'init', 'abc.def', '-vvv'], 3),
(['ansible-galaxy', '-vv', 'collection', 'init', 'abc.def'], 2),
# Due to our manual parsing we want to verify that -v set in the sub parser takes precedence. This behaviour is
# deprecated and tests should be removed when the code that handles it is removed
(['ansible-galaxy', '-vv', 'collection', 'init', 'abc.def', '-v'], 1),
(['ansible-galaxy', '-vv', 'collection', 'init', 'abc.def', '-vvvv'], 4),
(['ansible-galaxy', '-vvv', 'init', 'name'], 3),
(['ansible-galaxy', '-vvvvv', 'init', '-v', 'name'], 1),
])
def test_verbosity_arguments(cli_args, expected, monkeypatch):
# Mock out the functions so we don't actually execute anything
for func_name in [f for f in dir(GalaxyCLI) if f.startswith("execute_")]:
monkeypatch.setattr(GalaxyCLI, func_name, MagicMock())
cli = GalaxyCLI(args=cli_args)
cli.run()
assert context.CLIARGS['verbosity'] == expected
@pytest.fixture()
def collection_skeleton(request, tmp_path_factory):
name, skeleton_path = request.param
galaxy_args = ['ansible-galaxy', 'collection', 'init', '-c']
if skeleton_path is not None:
galaxy_args += ['--collection-skeleton', skeleton_path]
test_dir = to_text(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections'))
galaxy_args += ['--init-path', test_dir, name]
GalaxyCLI(args=galaxy_args).run()
namespace_name, collection_name = name.split('.', 1)
collection_dir = os.path.join(test_dir, namespace_name, collection_name)
return collection_dir
@pytest.mark.parametrize('collection_skeleton', [
('ansible_test.my_collection', None),
], indirect=True)
def test_collection_default(collection_skeleton):
meta_path = os.path.join(collection_skeleton, 'galaxy.yml')
with open(meta_path, 'r') as galaxy_meta:
metadata = yaml.safe_load(galaxy_meta)
assert metadata['namespace'] == 'ansible_test'
assert metadata['name'] == 'my_collection'
assert metadata['authors'] == ['your name <[email protected]>']
assert metadata['readme'] == 'README.md'
assert metadata['version'] == '1.0.0'
assert metadata['description'] == 'your collection description'
assert metadata['license'] == ['GPL-2.0-or-later']
assert metadata['tags'] == []
assert metadata['dependencies'] == {}
assert metadata['documentation'] == 'http://docs.example.com'
assert metadata['repository'] == 'http://example.com/repository'
assert metadata['homepage'] == 'http://example.com'
assert metadata['issues'] == 'http://example.com/issue/tracker'
for d in ['docs', 'plugins', 'roles']:
assert os.path.isdir(os.path.join(collection_skeleton, d)), \
"Expected collection subdirectory {0} doesn't exist".format(d)
@pytest.mark.parametrize('collection_skeleton', [
('ansible_test.delete_me_skeleton', os.path.join(os.path.split(__file__)[0], 'test_data', 'collection_skeleton')),
], indirect=True)
def test_collection_skeleton(collection_skeleton):
meta_path = os.path.join(collection_skeleton, 'galaxy.yml')
with open(meta_path, 'r') as galaxy_meta:
metadata = yaml.safe_load(galaxy_meta)
assert metadata['namespace'] == 'ansible_test'
assert metadata['name'] == 'delete_me_skeleton'
assert metadata['authors'] == ['Ansible Cow <[email protected]>', 'Tu Cow <[email protected]>']
assert metadata['version'] == '0.1.0'
assert metadata['readme'] == 'README.md'
assert len(metadata) == 5
assert os.path.exists(os.path.join(collection_skeleton, 'README.md'))
# Test empty directories exist and are empty
for empty_dir in ['plugins/action', 'plugins/filter', 'plugins/inventory', 'plugins/lookup',
'plugins/module_utils', 'plugins/modules']:
assert os.listdir(os.path.join(collection_skeleton, empty_dir)) == []
# Test files that don't end with .j2 were not templated
doc_file = os.path.join(collection_skeleton, 'docs', 'My Collection.md')
with open(doc_file, 'r') as f:
doc_contents = f.read()
assert doc_contents.strip() == 'Welcome to my test collection doc for {{ namespace }}.'
# Test files that end with .j2 but are in the templates directory were not templated
for template_dir in ['playbooks/templates', 'playbooks/templates/subfolder',
'roles/common/templates', 'roles/common/templates/subfolder']:
test_conf_j2 = os.path.join(collection_skeleton, template_dir, 'test.conf.j2')
assert os.path.exists(test_conf_j2)
with open(test_conf_j2, 'r') as f:
contents = f.read()
expected_contents = '[defaults]\ntest_key = {{ test_variable }}'
assert expected_contents == contents.strip()
@pytest.fixture()
def collection_artifact(collection_skeleton, tmp_path_factory):
''' Creates a collection artifact tarball that is ready to be published and installed '''
output_dir = to_text(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Output'))
# Because we call GalaxyCLI in collection_skeleton we need to reset the singleton back to None so it uses the new
# args, we reset the original args once it is done.
orig_cli_args = co.GlobalCLIArgs._Singleton__instance
try:
co.GlobalCLIArgs._Singleton__instance = None
galaxy_args = ['ansible-galaxy', 'collection', 'build', collection_skeleton, '--output-path', output_dir]
gc = GalaxyCLI(args=galaxy_args)
gc.run()
yield output_dir
finally:
co.GlobalCLIArgs._Singleton__instance = orig_cli_args
def test_invalid_skeleton_path():
expected = "- the skeleton path '/fake/path' does not exist, cannot init collection"
gc = GalaxyCLI(args=['ansible-galaxy', 'collection', 'init', 'my.collection', '--collection-skeleton',
'/fake/path'])
with pytest.raises(AnsibleError, match=expected):
gc.run()
@pytest.mark.parametrize("name", [
"",
"invalid",
"hypen-ns.collection",
"ns.hyphen-collection",
"ns.collection.weird",
])
def test_invalid_collection_name_init(name):
expected = "Invalid collection name '%s', name must be in the format <namespace>.<collection>" % name
gc = GalaxyCLI(args=['ansible-galaxy', 'collection', 'init', name])
with pytest.raises(AnsibleError, match=expected):
gc.run()
@pytest.mark.parametrize("name, expected", [
("", ""),
("invalid", "invalid"),
("invalid:1.0.0", "invalid"),
("hypen-ns.collection", "hypen-ns.collection"),
("ns.hyphen-collection", "ns.hyphen-collection"),
("ns.collection.weird", "ns.collection.weird"),
])
def test_invalid_collection_name_install(name, expected, tmp_path_factory):
install_path = to_text(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections'))
expected = "Invalid collection name '%s', name must be in the format <namespace>.<collection>" % expected
gc = GalaxyCLI(args=['ansible-galaxy', 'collection', 'install', name, '-p', os.path.join(install_path, 'install')])
with pytest.raises(AnsibleError, match=expected):
gc.run()
@pytest.mark.parametrize('collection_skeleton', [
('ansible_test.build_collection', None),
], indirect=True)
def test_collection_build(collection_artifact):
tar_path = os.path.join(collection_artifact, 'ansible_test-build_collection-1.0.0.tar.gz')
assert tarfile.is_tarfile(tar_path)
with tarfile.open(tar_path, mode='r') as tar:
tar_members = tar.getmembers()
valid_files = ['MANIFEST.json', 'FILES.json', 'roles', 'docs', 'plugins', 'plugins/README.md', 'README.md']
assert len(tar_members) == 7
# Verify the uid and gid is 0 and the correct perms are set
for member in tar_members:
assert member.name in valid_files
assert member.gid == 0
assert member.gname == ''
assert member.uid == 0
assert member.uname == ''
if member.isdir():
assert member.mode == 0o0755
else:
assert member.mode == 0o0644
manifest_file = tar.extractfile(tar_members[0])
try:
manifest = json.loads(to_text(manifest_file.read()))
finally:
manifest_file.close()
coll_info = manifest['collection_info']
file_manifest = manifest['file_manifest_file']
assert manifest['format'] == 1
assert len(manifest.keys()) == 3
assert coll_info['namespace'] == 'ansible_test'
assert coll_info['name'] == 'build_collection'
assert coll_info['version'] == '1.0.0'
assert coll_info['authors'] == ['your name <[email protected]>']
assert coll_info['readme'] == 'README.md'
assert coll_info['tags'] == []
assert coll_info['description'] == 'your collection description'
assert coll_info['license'] == ['GPL-2.0-or-later']
assert coll_info['license_file'] is None
assert coll_info['dependencies'] == {}
assert coll_info['repository'] == 'http://example.com/repository'
assert coll_info['documentation'] == 'http://docs.example.com'
assert coll_info['homepage'] == 'http://example.com'
assert coll_info['issues'] == 'http://example.com/issue/tracker'
assert len(coll_info.keys()) == 14
assert file_manifest['name'] == 'FILES.json'
assert file_manifest['ftype'] == 'file'
assert file_manifest['chksum_type'] == 'sha256'
assert file_manifest['chksum_sha256'] is not None # Order of keys makes it hard to verify the checksum
assert file_manifest['format'] == 1
assert len(file_manifest.keys()) == 5
files_file = tar.extractfile(tar_members[1])
try:
files = json.loads(to_text(files_file.read()))
finally:
files_file.close()
assert len(files['files']) == 6
assert files['format'] == 1
assert len(files.keys()) == 2
valid_files_entries = ['.', 'roles', 'docs', 'plugins', 'plugins/README.md', 'README.md']
for file_entry in files['files']:
assert file_entry['name'] in valid_files_entries
assert file_entry['format'] == 1
if file_entry['name'] == 'plugins/README.md':
assert file_entry['ftype'] == 'file'
assert file_entry['chksum_type'] == 'sha256'
# Can't test the actual checksum as the html link changes based on the version.
assert file_entry['chksum_sha256'] is not None
elif file_entry['name'] == 'README.md':
assert file_entry['ftype'] == 'file'
assert file_entry['chksum_type'] == 'sha256'
assert file_entry['chksum_sha256'] == '45923ca2ece0e8ce31d29e5df9d8b649fe55e2f5b5b61c9724d7cc187bd6ad4a'
else:
assert file_entry['ftype'] == 'dir'
assert file_entry['chksum_type'] is None
assert file_entry['chksum_sha256'] is None
assert len(file_entry.keys()) == 5
@pytest.fixture()
def collection_install(reset_cli_args, tmp_path_factory, monkeypatch):
mock_install = MagicMock()
monkeypatch.setattr(ansible.cli.galaxy, 'install_collections', mock_install)
mock_warning = MagicMock()
monkeypatch.setattr(ansible.utils.display.Display, 'warning', mock_warning)
output_dir = to_text((tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Output')))
yield mock_install, mock_warning, output_dir
def test_collection_install_with_names(collection_install):
mock_install, mock_warning, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', 'namespace2.collection:1.0.1',
'--collections-path', output_dir]
GalaxyCLI(args=galaxy_args).run()
collection_path = os.path.join(output_dir, 'ansible_collections')
assert os.path.isdir(collection_path)
assert mock_warning.call_count == 1
assert "The specified collections path '%s' is not part of the configured Ansible collections path" % output_dir \
in mock_warning.call_args[0][0]
assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.collection', '*', None),
('namespace2.collection', '1.0.1', None)]
assert mock_install.call_args[0][1] == collection_path
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
assert mock_install.call_args[0][3] is True
assert mock_install.call_args[0][4] is False
assert mock_install.call_args[0][5] is False
assert mock_install.call_args[0][6] is False
assert mock_install.call_args[0][7] is False
def test_collection_install_with_requirements_file(collection_install):
mock_install, mock_warning, output_dir = collection_install
requirements_file = os.path.join(output_dir, 'requirements.yml')
with open(requirements_file, 'wb') as req_obj:
req_obj.write(b'''---
collections:
- namespace.coll
- name: namespace2.coll
version: '>2.0.1'
''')
galaxy_args = ['ansible-galaxy', 'collection', 'install', '--requirements-file', requirements_file,
'--collections-path', output_dir]
GalaxyCLI(args=galaxy_args).run()
collection_path = os.path.join(output_dir, 'ansible_collections')
assert os.path.isdir(collection_path)
assert mock_warning.call_count == 1
assert "The specified collections path '%s' is not part of the configured Ansible collections path" % output_dir \
in mock_warning.call_args[0][0]
assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.coll', '*', None),
('namespace2.coll', '>2.0.1', None)]
assert mock_install.call_args[0][1] == collection_path
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
assert mock_install.call_args[0][3] is True
assert mock_install.call_args[0][4] is False
assert mock_install.call_args[0][5] is False
assert mock_install.call_args[0][6] is False
assert mock_install.call_args[0][7] is False
def test_collection_install_with_relative_path(collection_install, monkeypatch):
mock_install = collection_install[0]
mock_req = MagicMock()
mock_req.return_value = {'collections': [('namespace.coll', '*', None)]}
monkeypatch.setattr(ansible.cli.galaxy.GalaxyCLI, '_parse_requirements_file', mock_req)
monkeypatch.setattr(os, 'makedirs', MagicMock())
requirements_file = './requirements.myl'
collections_path = './ansible_collections'
galaxy_args = ['ansible-galaxy', 'collection', 'install', '--requirements-file', requirements_file,
'--collections-path', collections_path]
GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.coll', '*', None)]
assert mock_install.call_args[0][1] == os.path.abspath(collections_path)
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
assert mock_install.call_args[0][3] is True
assert mock_install.call_args[0][4] is False
assert mock_install.call_args[0][5] is False
assert mock_install.call_args[0][6] is False
assert mock_install.call_args[0][7] is False
assert mock_req.call_count == 1
assert mock_req.call_args[0][0] == os.path.abspath(requirements_file)
def test_collection_install_with_unexpanded_path(collection_install, monkeypatch):
mock_install = collection_install[0]
mock_req = MagicMock()
mock_req.return_value = {'collections': [('namespace.coll', '*', None)]}
monkeypatch.setattr(ansible.cli.galaxy.GalaxyCLI, '_parse_requirements_file', mock_req)
monkeypatch.setattr(os, 'makedirs', MagicMock())
requirements_file = '~/requirements.myl'
collections_path = '~/ansible_collections'
galaxy_args = ['ansible-galaxy', 'collection', 'install', '--requirements-file', requirements_file,
'--collections-path', collections_path]
GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.coll', '*', None)]
assert mock_install.call_args[0][1] == os.path.expanduser(os.path.expandvars(collections_path))
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
assert mock_install.call_args[0][3] is True
assert mock_install.call_args[0][4] is False
assert mock_install.call_args[0][5] is False
assert mock_install.call_args[0][6] is False
assert mock_install.call_args[0][7] is False
assert mock_req.call_count == 1
assert mock_req.call_args[0][0] == os.path.expanduser(os.path.expandvars(requirements_file))
def test_collection_install_in_collection_dir(collection_install, monkeypatch):
mock_install, mock_warning, output_dir = collection_install
collections_path = C.COLLECTIONS_PATHS[0]
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', 'namespace2.collection:1.0.1',
'--collections-path', collections_path]
GalaxyCLI(args=galaxy_args).run()
assert mock_warning.call_count == 0
assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.collection', '*', None),
('namespace2.collection', '1.0.1', None)]
assert mock_install.call_args[0][1] == os.path.join(collections_path, 'ansible_collections')
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
assert mock_install.call_args[0][3] is True
assert mock_install.call_args[0][4] is False
assert mock_install.call_args[0][5] is False
assert mock_install.call_args[0][6] is False
assert mock_install.call_args[0][7] is False
def test_collection_install_with_url(collection_install):
mock_install, dummy, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'https://foo/bar/foo-bar-v1.0.0.tar.gz',
'--collections-path', output_dir]
GalaxyCLI(args=galaxy_args).run()
collection_path = os.path.join(output_dir, 'ansible_collections')
assert os.path.isdir(collection_path)
assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('https://foo/bar/foo-bar-v1.0.0.tar.gz', '*', None)]
assert mock_install.call_args[0][1] == collection_path
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
assert mock_install.call_args[0][3] is True
assert mock_install.call_args[0][4] is False
assert mock_install.call_args[0][5] is False
assert mock_install.call_args[0][6] is False
assert mock_install.call_args[0][7] is False
def test_collection_install_name_and_requirements_fail(collection_install):
test_path = collection_install[2]
expected = 'The positional collection_name arg and --requirements-file are mutually exclusive.'
with pytest.raises(AnsibleError, match=expected):
GalaxyCLI(args=['ansible-galaxy', 'collection', 'install', 'namespace.collection', '--collections-path',
test_path, '--requirements-file', test_path]).run()
def test_collection_install_no_name_and_requirements_fail(collection_install):
test_path = collection_install[2]
expected = 'You must specify a collection name or a requirements file.'
with pytest.raises(AnsibleError, match=expected):
GalaxyCLI(args=['ansible-galaxy', 'collection', 'install', '--collections-path', test_path]).run()
def test_collection_install_path_with_ansible_collections(collection_install):
mock_install, mock_warning, output_dir = collection_install
collection_path = os.path.join(output_dir, 'ansible_collections')
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', 'namespace2.collection:1.0.1',
'--collections-path', collection_path]
GalaxyCLI(args=galaxy_args).run()
assert os.path.isdir(collection_path)
assert mock_warning.call_count == 1
assert "The specified collections path '%s' is not part of the configured Ansible collections path" \
% collection_path in mock_warning.call_args[0][0]
assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.collection', '*', None),
('namespace2.collection', '1.0.1', None)]
assert mock_install.call_args[0][1] == collection_path
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
assert mock_install.call_args[0][3] is True
assert mock_install.call_args[0][4] is False
assert mock_install.call_args[0][5] is False
assert mock_install.call_args[0][6] is False
assert mock_install.call_args[0][7] is False
def test_collection_install_ignore_certs(collection_install):
mock_install, mock_warning, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', '--collections-path', output_dir,
'--ignore-certs']
GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_args[0][3] is False
def test_collection_install_force(collection_install):
mock_install, mock_warning, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', '--collections-path', output_dir,
'--force']
GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_args[0][6] is True
def test_collection_install_force_deps(collection_install):
mock_install, mock_warning, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', '--collections-path', output_dir,
'--force-with-deps']
GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_args[0][7] is True
def test_collection_install_no_deps(collection_install):
mock_install, mock_warning, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', '--collections-path', output_dir,
'--no-deps']
GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_args[0][5] is True
def test_collection_install_ignore(collection_install):
mock_install, mock_warning, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', '--collections-path', output_dir,
'--ignore-errors']
GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_args[0][4] is True
def test_collection_install_custom_server(collection_install):
mock_install, mock_warning, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', '--collections-path', output_dir,
'--server', 'https://galaxy-dev.ansible.com']
GalaxyCLI(args=galaxy_args).run()
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy-dev.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
@pytest.fixture()
def requirements_file(request, tmp_path_factory):
content = request.param
test_dir = to_text(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Requirements'))
requirements_file = os.path.join(test_dir, 'requirements.yml')
if content:
with open(requirements_file, 'wb') as req_obj:
req_obj.write(to_bytes(content))
yield requirements_file
@pytest.fixture()
def requirements_cli(monkeypatch):
monkeypatch.setattr(GalaxyCLI, 'execute_install', MagicMock())
cli = GalaxyCLI(args=['ansible-galaxy', 'install'])
cli.run()
return cli
@pytest.mark.parametrize('requirements_file', [None], indirect=True)
def test_parse_requirements_file_that_doesnt_exist(requirements_cli, requirements_file):
expected = "The requirements file '%s' does not exist." % to_native(requirements_file)
with pytest.raises(AnsibleError, match=expected):
requirements_cli._parse_requirements_file(requirements_file)
@pytest.mark.parametrize('requirements_file', ['not a valid yml file: hi: world'], indirect=True)
def test_parse_requirements_file_that_isnt_yaml(requirements_cli, requirements_file):
expected = "Failed to parse the requirements yml at '%s' with the following error" % to_native(requirements_file)
with pytest.raises(AnsibleError, match=expected):
requirements_cli._parse_requirements_file(requirements_file)
@pytest.mark.parametrize('requirements_file', [('''
# Older role based requirements.yml
- galaxy.role
- anotherrole
''')], indirect=True)
def test_parse_requirements_in_older_format_illegal(requirements_cli, requirements_file):
expected = "Expecting requirements file to be a dict with the key 'collections' that contains a list of " \
"collections to install"
with pytest.raises(AnsibleError, match=expected):
requirements_cli._parse_requirements_file(requirements_file, allow_old_format=False)
@pytest.mark.parametrize('requirements_file', ['''
collections:
- version: 1.0.0
'''], indirect=True)
def test_parse_requirements_without_mandatory_name_key(requirements_cli, requirements_file):
expected = "Collections requirement entry should contain the key name."
with pytest.raises(AnsibleError, match=expected):
requirements_cli._parse_requirements_file(requirements_file)
@pytest.mark.parametrize('requirements_file', [('''
collections:
- namespace.collection1
- namespace.collection2
'''), ('''
collections:
- name: namespace.collection1
- name: namespace.collection2
''')], indirect=True)
def test_parse_requirements(requirements_cli, requirements_file):
expected = {
'roles': [],
'collections': [('namespace.collection1', '*', None), ('namespace.collection2', '*', None)]
}
actual = requirements_cli._parse_requirements_file(requirements_file)
assert actual == expected
@pytest.mark.parametrize('requirements_file', ['''
collections:
- name: namespace.collection1
version: ">=1.0.0,<=2.0.0"
source: https://galaxy-dev.ansible.com
- namespace.collection2'''], indirect=True)
def test_parse_requirements_with_extra_info(requirements_cli, requirements_file):
actual = requirements_cli._parse_requirements_file(requirements_file)
assert len(actual['roles']) == 0
assert len(actual['collections']) == 2
assert actual['collections'][0][0] == 'namespace.collection1'
assert actual['collections'][0][1] == '>=1.0.0,<=2.0.0'
assert actual['collections'][0][2].api_server == 'https://galaxy-dev.ansible.com'
assert actual['collections'][0][2].name == 'explicit_requirement_namespace.collection1'
assert actual['collections'][0][2].token is None
assert actual['collections'][0][2].username is None
assert actual['collections'][0][2].password is None
assert actual['collections'][0][2].validate_certs is True
assert actual['collections'][1] == ('namespace.collection2', '*', None)
@pytest.mark.parametrize('requirements_file', ['''
roles:
- username.role_name
- src: username2.role_name2
- src: ssh://github.com/user/repo
scm: git
collections:
- namespace.collection2
'''], indirect=True)
def test_parse_requirements_with_roles_and_collections(requirements_cli, requirements_file):
actual = requirements_cli._parse_requirements_file(requirements_file)
assert len(actual['roles']) == 3
assert actual['roles'][0].name == 'username.role_name'
assert actual['roles'][1].name == 'username2.role_name2'
assert actual['roles'][2].name == 'repo'
assert actual['roles'][2].src == 'ssh://github.com/user/repo'
assert len(actual['collections']) == 1
assert actual['collections'][0] == ('namespace.collection2', '*', None)
@pytest.mark.parametrize('requirements_file', ['''
collections:
- name: namespace.collection
- name: namespace2.collection2
source: https://galaxy-dev.ansible.com/
- name: namespace3.collection3
source: server
'''], indirect=True)
def test_parse_requirements_with_collection_source(requirements_cli, requirements_file):
galaxy_api = GalaxyAPI(requirements_cli.api, 'server', 'https://config-server')
requirements_cli.api_servers.append(galaxy_api)
actual = requirements_cli._parse_requirements_file(requirements_file)
assert actual['roles'] == []
assert len(actual['collections']) == 3
assert actual['collections'][0] == ('namespace.collection', '*', None)
assert actual['collections'][1][0] == 'namespace2.collection2'
assert actual['collections'][1][1] == '*'
assert actual['collections'][1][2].api_server == 'https://galaxy-dev.ansible.com/'
assert actual['collections'][1][2].name == 'explicit_requirement_namespace2.collection2'
assert actual['collections'][1][2].token is None
assert actual['collections'][2] == ('namespace3.collection3', '*', galaxy_api)
@pytest.mark.parametrize('requirements_file', ['''
- username.included_role
- src: https://github.com/user/repo
'''], indirect=True)
def test_parse_requirements_roles_with_include(requirements_cli, requirements_file):
reqs = [
'ansible.role',
{'include': requirements_file},
]
parent_requirements = os.path.join(os.path.dirname(requirements_file), 'parent.yaml')
with open(to_bytes(parent_requirements), 'wb') as req_fd:
req_fd.write(to_bytes(yaml.safe_dump(reqs)))
actual = requirements_cli._parse_requirements_file(parent_requirements)
assert len(actual['roles']) == 3
assert actual['collections'] == []
assert actual['roles'][0].name == 'ansible.role'
assert actual['roles'][1].name == 'username.included_role'
assert actual['roles'][2].name == 'repo'
assert actual['roles'][2].src == 'https://github.com/user/repo'
@pytest.mark.parametrize('requirements_file', ['''
- username.role
- include: missing.yml
'''], indirect=True)
def test_parse_requirements_roles_with_include_missing(requirements_cli, requirements_file):
expected = "Failed to find include requirements file 'missing.yml' in '%s'" % to_native(requirements_file)
with pytest.raises(AnsibleError, match=expected):
requirements_cli._parse_requirements_file(requirements_file)
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,415 |
`ansible-galaxy collection install` creates all files with mode 0600
|
##### SUMMARY
When installing a collection, all files are created with mode 0600. This is caused by the way the collection .tar.gz file is extracted: https://github.com/ansible/ansible/blob/d3ec31f8d5683926aa6a05bb573d9929a6266fac/lib/ansible/galaxy/collection.py#L1076-L1090
The file created by `tempfile.NamedTemporaryFile()` seems to have mode `0600` and is simply moved to the final destination without a following `os.chmod`.
(See also this discussion: https://github.com/ansible-collections/community.general/pull/29#discussion_r396147440)
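For illustration, a minimal sketch of the kind of post-extraction fix this implies; the helper name, its parameters, and the umask handling below are assumptions for the sketch, not the code from the linked lines or from the eventual patch:

```python
import os
import shutil
import stat
import tempfile


def extract_member_with_mode(tar, member, dest_dir):
    """Write a tar member through a temp file, then restore a sensible mode.

    tempfile.NamedTemporaryFile creates its file with mode 0600, so moving it
    into place unchanged leaves the installed file readable only by the owner.
    Derive the final mode from the process umask, keeping the executable bit
    if the archived member had one.
    """
    member_obj = tar.extractfile(member)
    try:
        with tempfile.NamedTemporaryFile(dir=dest_dir, delete=False) as tmp_obj:
            shutil.copyfileobj(member_obj, tmp_obj)
            tmp_path = tmp_obj.name
    finally:
        member_obj.close()

    # Read the current umask without permanently changing it.
    current_umask = os.umask(0)
    os.umask(current_umask)

    base_mode = 0o0755 if member.mode & stat.S_IXUSR else 0o0644
    os.chmod(tmp_path, base_mode & ~current_umask)

    final_path = os.path.join(dest_dir, os.path.basename(member.name))
    shutil.move(tmp_path, final_path)
    return final_path
```

The essential point is only the explicit `os.chmod` between writing the temp file and leaving it at its final destination, since `NamedTemporaryFile` always starts at 0600 regardless of the umask.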
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
lib/ansible/galaxy/collection.py
##### ANSIBLE VERSION
```
2.9
2.10
```
|
https://github.com/ansible/ansible/issues/68415
|
https://github.com/ansible/ansible/pull/68418
|
a9d2ceafe429171c0e2ad007058b88bae57c74ce
|
127d54b3630c65043ec12c4af2024f8ef0bc6d09
| 2020-03-23T21:15:27Z |
python
| 2020-03-24T22:08:23Z |
test/units/galaxy/test_collection_install.py
|
# -*- coding: utf-8 -*-
# Copyright: (c) 2019, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import copy
import json
import os
import pytest
import re
import shutil
import tarfile
import yaml
from io import BytesIO, StringIO
from units.compat.mock import MagicMock
import ansible.module_utils.six.moves.urllib.error as urllib_error
from ansible import context
from ansible.cli.galaxy import GalaxyCLI
from ansible.errors import AnsibleError
from ansible.galaxy import collection, api
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.utils import context_objects as co
from ansible.utils.display import Display
def call_galaxy_cli(args):
orig = co.GlobalCLIArgs._Singleton__instance
co.GlobalCLIArgs._Singleton__instance = None
try:
GalaxyCLI(args=['ansible-galaxy', 'collection'] + args).run()
finally:
co.GlobalCLIArgs._Singleton__instance = orig
def artifact_json(namespace, name, version, dependencies, server):
json_str = json.dumps({
'artifact': {
'filename': '%s-%s-%s.tar.gz' % (namespace, name, version),
'sha256': '2d76f3b8c4bab1072848107fb3914c345f71a12a1722f25c08f5d3f51f4ab5fd',
'size': 1234,
},
'download_url': '%s/download/%s-%s-%s.tar.gz' % (server, namespace, name, version),
'metadata': {
'namespace': namespace,
'name': name,
'dependencies': dependencies,
},
'version': version
})
return to_text(json_str)
def artifact_versions_json(namespace, name, versions, galaxy_api, available_api_versions=None):
results = []
available_api_versions = available_api_versions or {}
api_version = 'v2'
if 'v3' in available_api_versions:
api_version = 'v3'
for version in versions:
results.append({
'href': '%s/api/%s/%s/%s/versions/%s/' % (galaxy_api.api_server, api_version, namespace, name, version),
'version': version,
})
if api_version == 'v2':
json_str = json.dumps({
'count': len(versions),
'next': None,
'previous': None,
'results': results
})
if api_version == 'v3':
response = {'meta': {'count': len(versions)},
'data': results,
'links': {'first': None,
'last': None,
'next': None,
'previous': None},
}
json_str = json.dumps(response)
return to_text(json_str)
def error_json(galaxy_api, errors_to_return=None, available_api_versions=None):
errors_to_return = errors_to_return or []
available_api_versions = available_api_versions or {}
response = {}
api_version = 'v2'
if 'v3' in available_api_versions:
api_version = 'v3'
if api_version == 'v2':
assert len(errors_to_return) <= 1
if errors_to_return:
response = errors_to_return[0]
if api_version == 'v3':
response['errors'] = errors_to_return
json_str = json.dumps(response)
return to_text(json_str)
@pytest.fixture(autouse='function')
def reset_cli_args():
co.GlobalCLIArgs._Singleton__instance = None
yield
co.GlobalCLIArgs._Singleton__instance = None
@pytest.fixture()
def collection_artifact(request, tmp_path_factory):
test_dir = to_text(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
namespace = 'ansible_namespace'
collection = 'collection'
skeleton_path = os.path.join(os.path.dirname(os.path.split(__file__)[0]), 'cli', 'test_data', 'collection_skeleton')
collection_path = os.path.join(test_dir, namespace, collection)
call_galaxy_cli(['init', '%s.%s' % (namespace, collection), '-c', '--init-path', test_dir,
'--collection-skeleton', skeleton_path])
dependencies = getattr(request, 'param', None)
if dependencies:
galaxy_yml = os.path.join(collection_path, 'galaxy.yml')
with open(galaxy_yml, 'rb+') as galaxy_obj:
existing_yaml = yaml.safe_load(galaxy_obj)
existing_yaml['dependencies'] = dependencies
galaxy_obj.seek(0)
galaxy_obj.write(to_bytes(yaml.safe_dump(existing_yaml)))
galaxy_obj.truncate()
call_galaxy_cli(['build', collection_path, '--output-path', test_dir])
collection_tar = os.path.join(test_dir, '%s-%s-0.1.0.tar.gz' % (namespace, collection))
return to_bytes(collection_path), to_bytes(collection_tar)
@pytest.fixture()
def galaxy_server():
context.CLIARGS._store = {'ignore_certs': False}
galaxy_api = api.GalaxyAPI(None, 'test_server', 'https://galaxy.ansible.com')
return galaxy_api
def test_build_requirement_from_path(collection_artifact):
actual = collection.CollectionRequirement.from_path(collection_artifact[0], True)
assert actual.namespace == u'ansible_namespace'
assert actual.name == u'collection'
assert actual.b_path == collection_artifact[0]
assert actual.api is None
assert actual.skip is True
assert actual.versions == set([u'*'])
assert actual.latest_version == u'*'
assert actual.dependencies == {}
@pytest.mark.parametrize('version', ['1.1.1', '1.1.0', '1.0.0'])
def test_build_requirement_from_path_with_manifest(version, collection_artifact):
manifest_path = os.path.join(collection_artifact[0], b'MANIFEST.json')
manifest_value = json.dumps({
'collection_info': {
'namespace': 'namespace',
'name': 'name',
'version': version,
'dependencies': {
'ansible_namespace.collection': '*'
}
}
})
with open(manifest_path, 'wb') as manifest_obj:
manifest_obj.write(to_bytes(manifest_value))
actual = collection.CollectionRequirement.from_path(collection_artifact[0], True)
# While the folder name suggests a different collection, we treat MANIFEST.json as the source of truth.
assert actual.namespace == u'namespace'
assert actual.name == u'name'
assert actual.b_path == collection_artifact[0]
assert actual.api is None
assert actual.skip is True
assert actual.versions == set([to_text(version)])
assert actual.latest_version == to_text(version)
assert actual.dependencies == {'ansible_namespace.collection': '*'}
def test_build_requirement_from_path_invalid_manifest(collection_artifact):
manifest_path = os.path.join(collection_artifact[0], b'MANIFEST.json')
with open(manifest_path, 'wb') as manifest_obj:
manifest_obj.write(b"not json")
expected = "Collection file at '%s' does not contain a valid json string." % to_native(manifest_path)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_path(collection_artifact[0], True)
def test_build_requirement_from_path_no_version(collection_artifact, monkeypatch):
manifest_path = os.path.join(collection_artifact[0], b'MANIFEST.json')
manifest_value = json.dumps({
'collection_info': {
'namespace': 'namespace',
'name': 'name',
'version': '',
'dependencies': {}
}
})
with open(manifest_path, 'wb') as manifest_obj:
manifest_obj.write(to_bytes(manifest_value))
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
actual = collection.CollectionRequirement.from_path(collection_artifact[0], True)
# While the folder name suggests a different collection, we treat MANIFEST.json as the source of truth.
assert actual.namespace == u'namespace'
assert actual.name == u'name'
assert actual.b_path == collection_artifact[0]
assert actual.api is None
assert actual.skip is True
assert actual.versions == set(['*'])
assert actual.latest_version == u'*'
assert actual.dependencies == {}
assert mock_display.call_count == 1
actual_warn = ' '.join(mock_display.mock_calls[0][1][0].split('\n'))
expected_warn = "Collection at '%s' does not have a valid version set, falling back to '*'. Found version: ''" \
% to_text(collection_artifact[0])
assert expected_warn in actual_warn
def test_build_requirement_from_tar(collection_artifact):
actual = collection.CollectionRequirement.from_tar(collection_artifact[1], True, True)
assert actual.namespace == u'ansible_namespace'
assert actual.name == u'collection'
assert actual.b_path == collection_artifact[1]
assert actual.api is None
assert actual.skip is False
assert actual.versions == set([u'0.1.0'])
assert actual.latest_version == u'0.1.0'
assert actual.dependencies == {}
def test_build_requirement_from_tar_fail_not_tar(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
test_file = os.path.join(test_dir, b'fake.tar.gz')
with open(test_file, 'wb') as test_obj:
test_obj.write(b"\x00\x01\x02\x03")
expected = "Collection artifact at '%s' is not a valid tar file." % to_native(test_file)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(test_file, True, True)
def test_build_requirement_from_tar_no_manifest(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
json_data = to_bytes(json.dumps(
{
'files': [],
'format': 1,
}
))
tar_path = os.path.join(test_dir, b'ansible-collections.tar.gz')
with tarfile.open(tar_path, 'w:gz') as tfile:
b_io = BytesIO(json_data)
tar_info = tarfile.TarInfo('FILES.json')
tar_info.size = len(json_data)
tar_info.mode = 0o0644
tfile.addfile(tarinfo=tar_info, fileobj=b_io)
expected = "Collection at '%s' does not contain the required file MANIFEST.json." % to_native(tar_path)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(tar_path, True, True)
def test_build_requirement_from_tar_no_files(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
json_data = to_bytes(json.dumps(
{
'collection_info': {},
}
))
tar_path = os.path.join(test_dir, b'ansible-collections.tar.gz')
with tarfile.open(tar_path, 'w:gz') as tfile:
b_io = BytesIO(json_data)
tar_info = tarfile.TarInfo('MANIFEST.json')
tar_info.size = len(json_data)
tar_info.mode = 0o0644
tfile.addfile(tarinfo=tar_info, fileobj=b_io)
expected = "Collection at '%s' does not contain the required file FILES.json." % to_native(tar_path)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(tar_path, True, True)
def test_build_requirement_from_tar_invalid_manifest(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
json_data = b"not a json"
tar_path = os.path.join(test_dir, b'ansible-collections.tar.gz')
with tarfile.open(tar_path, 'w:gz') as tfile:
b_io = BytesIO(json_data)
tar_info = tarfile.TarInfo('MANIFEST.json')
tar_info.size = len(json_data)
tar_info.mode = 0o0644
tfile.addfile(tarinfo=tar_info, fileobj=b_io)
expected = "Collection tar file member MANIFEST.json does not contain a valid json string."
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(tar_path, True, True)
def test_build_requirement_from_name(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['2.1.9', '2.1.10']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '*', True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.1.9', u'2.1.10'])
assert actual.latest_version == u'2.1.10'
assert actual.dependencies == {}
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
def test_build_requirement_from_name_with_prerelease(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['1.0.1', '2.0.1-beta.1', '2.0.1']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '*', True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'1.0.1', u'2.0.1'])
assert actual.latest_version == u'2.0.1'
assert actual.dependencies == {}
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
def test_build_requirment_from_name_with_prerelease_explicit(galaxy_server, monkeypatch):
mock_get_info = MagicMock()
mock_get_info.return_value = api.CollectionVersionMetadata('namespace', 'collection', '2.0.1-beta.1', None, None,
{})
monkeypatch.setattr(galaxy_server, 'get_collection_version_metadata', mock_get_info)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '2.0.1-beta.1', True,
True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.1-beta.1'])
assert actual.latest_version == u'2.0.1-beta.1'
assert actual.dependencies == {}
assert mock_get_info.call_count == 1
assert mock_get_info.mock_calls[0][1] == ('namespace', 'collection', '2.0.1-beta.1')
def test_build_requirement_from_name_second_server(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['1.0.1', '1.0.2', '1.0.3']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
broken_server = copy.copy(galaxy_server)
broken_server.api_server = 'https://broken.com/'
mock_404 = MagicMock()
mock_404.side_effect = api.GalaxyError(urllib_error.HTTPError('https://galaxy.server.com', 404, 'msg', {},
StringIO()), "custom msg")
monkeypatch.setattr(broken_server, 'get_collection_versions', mock_404)
actual = collection.CollectionRequirement.from_name('namespace.collection', [broken_server, galaxy_server],
'>1.0.1', False, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
# assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'1.0.2', u'1.0.3'])
assert actual.latest_version == u'1.0.3'
assert actual.dependencies == {}
assert mock_404.call_count == 1
assert mock_404.mock_calls[0][1] == ('namespace', 'collection')
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
def test_build_requirement_from_name_missing(galaxy_server, monkeypatch):
mock_open = MagicMock()
mock_open.side_effect = api.GalaxyError(urllib_error.HTTPError('https://galaxy.server.com', 404, 'msg', {},
StringIO()), "")
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_open)
expected = "Failed to find collection namespace.collection:*"
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server, galaxy_server], '*', False,
True)
def test_build_requirement_from_name_401_unauthorized(galaxy_server, monkeypatch):
mock_open = MagicMock()
mock_open.side_effect = api.GalaxyError(urllib_error.HTTPError('https://galaxy.server.com', 401, 'msg', {},
StringIO()), "error")
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_open)
expected = "error (HTTP Code: 401, Message: msg)"
with pytest.raises(api.GalaxyError, match=re.escape(expected)):
collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server, galaxy_server], '*', False)
def test_build_requirement_from_name_single_version(galaxy_server, monkeypatch):
mock_get_info = MagicMock()
mock_get_info.return_value = api.CollectionVersionMetadata('namespace', 'collection', '2.0.0', None, None,
{})
monkeypatch.setattr(galaxy_server, 'get_collection_version_metadata', mock_get_info)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '2.0.0', True,
True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.0'])
assert actual.latest_version == u'2.0.0'
assert actual.dependencies == {}
assert mock_get_info.call_count == 1
assert mock_get_info.mock_calls[0][1] == ('namespace', 'collection', '2.0.0')
def test_build_requirement_from_name_multiple_versions_one_match(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['2.0.0', '2.0.1', '2.0.2']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
mock_get_info = MagicMock()
mock_get_info.return_value = api.CollectionVersionMetadata('namespace', 'collection', '2.0.1', None, None,
{})
monkeypatch.setattr(galaxy_server, 'get_collection_version_metadata', mock_get_info)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '>=2.0.1,<2.0.2',
True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.1'])
assert actual.latest_version == u'2.0.1'
assert actual.dependencies == {}
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
assert mock_get_info.call_count == 1
assert mock_get_info.mock_calls[0][1] == ('namespace', 'collection', '2.0.1')
def test_build_requirement_from_name_multiple_version_results(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['2.0.0', '2.0.1', '2.0.2', '2.0.3', '2.0.4', '2.0.5']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '!=2.0.2',
True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.0', u'2.0.1', u'2.0.3', u'2.0.4', u'2.0.5'])
assert actual.latest_version == u'2.0.5'
assert actual.dependencies == {}
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
@pytest.mark.parametrize('versions, requirement, expected_filter, expected_latest', [
[['1.0.0', '1.0.1'], '*', ['1.0.0', '1.0.1'], '1.0.1'],
[['1.0.0', '1.0.5', '1.1.0'], '>1.0.0,<1.1.0', ['1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '>1.0.0,<=1.0.5', ['1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '>=1.1.0', ['1.1.0'], '1.1.0'],
[['1.0.0', '1.0.5', '1.1.0'], '!=1.1.0', ['1.0.0', '1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '==1.0.5', ['1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '1.0.5', ['1.0.5'], '1.0.5'],
[['1.0.0', '2.0.0', '3.0.0'], '>=2', ['2.0.0', '3.0.0'], '3.0.0'],
])
def test_add_collection_requirements(versions, requirement, expected_filter, expected_latest):
req = collection.CollectionRequirement('namespace', 'name', None, 'https://galaxy.com', versions, requirement,
False)
assert req.versions == set(expected_filter)
assert req.latest_version == expected_latest
def test_add_collection_requirement_to_unknown_installed_version(monkeypatch):
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
req = collection.CollectionRequirement('namespace', 'name', None, 'https://galaxy.com', ['*'], '*', False,
skip=True)
req.add_requirement('parent.collection', '1.0.0')
assert req.latest_version == '*'
assert mock_display.call_count == 1
actual_warn = ' '.join(mock_display.mock_calls[0][1][0].split('\n'))
assert "Failed to validate the collection requirement 'namespace.name:1.0.0' for parent.collection" in actual_warn
def test_add_collection_wildcard_requirement_to_unknown_installed_version():
req = collection.CollectionRequirement('namespace', 'name', None, 'https://galaxy.com', ['*'], '*', False,
skip=True)
req.add_requirement(str(req), '*')
assert req.versions == set('*')
assert req.latest_version == '*'
def test_add_collection_requirement_with_conflict(galaxy_server):
expected = "Cannot meet requirement ==1.0.2 for dependency namespace.name from source '%s'. Available versions " \
"before last requirement added: 1.0.0, 1.0.1\n" \
"Requirements from:\n" \
"\tbase - 'namespace.name:==1.0.2'" % galaxy_server.api_server
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement('namespace', 'name', None, galaxy_server, ['1.0.0', '1.0.1'], '==1.0.2',
False)
def test_add_requirement_to_existing_collection_with_conflict(galaxy_server):
req = collection.CollectionRequirement('namespace', 'name', None, galaxy_server, ['1.0.0', '1.0.1'], '*', False)
expected = "Cannot meet dependency requirement 'namespace.name:1.0.2' for collection namespace.collection2 from " \
"source '%s'. Available versions before last requirement added: 1.0.0, 1.0.1\n" \
"Requirements from:\n" \
"\tbase - 'namespace.name:*'\n" \
"\tnamespace.collection2 - 'namespace.name:1.0.2'" % galaxy_server.api_server
with pytest.raises(AnsibleError, match=re.escape(expected)):
req.add_requirement('namespace.collection2', '1.0.2')
def test_add_requirement_to_installed_collection_with_conflict():
source = 'https://galaxy.ansible.com'
req = collection.CollectionRequirement('namespace', 'name', None, source, ['1.0.0', '1.0.1'], '*', False,
skip=True)
expected = "Cannot meet requirement namespace.name:1.0.2 as it is already installed at version '1.0.1'. " \
"Use --force to overwrite"
with pytest.raises(AnsibleError, match=re.escape(expected)):
req.add_requirement(None, '1.0.2')
def test_add_requirement_to_installed_collection_with_conflict_as_dep():
source = 'https://galaxy.ansible.com'
req = collection.CollectionRequirement('namespace', 'name', None, source, ['1.0.0', '1.0.1'], '*', False,
skip=True)
expected = "Cannot meet requirement namespace.name:1.0.2 as it is already installed at version '1.0.1'. " \
"Use --force-with-deps to overwrite"
with pytest.raises(AnsibleError, match=re.escape(expected)):
req.add_requirement('namespace.collection2', '1.0.2')
def test_install_skipped_collection(monkeypatch):
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
req = collection.CollectionRequirement('namespace', 'name', None, 'source', ['1.0.0'], '*', False, skip=True)
req.install(None, None)
assert mock_display.call_count == 1
assert mock_display.mock_calls[0][1][0] == "Skipping 'namespace.name' as it is already installed"
def test_install_collection(collection_artifact, monkeypatch):
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
collection_tar = collection_artifact[1]
output_path = os.path.join(os.path.split(collection_tar)[0], b'output')
collection_path = os.path.join(output_path, b'ansible_namespace', b'collection')
os.makedirs(os.path.join(collection_path, b'delete_me')) # Create a folder to verify the install cleans out the dir
temp_path = os.path.join(os.path.split(collection_tar)[0], b'temp')
os.makedirs(temp_path)
req = collection.CollectionRequirement.from_tar(collection_tar, True, True)
req.install(to_text(output_path), temp_path)
# Ensure the temp directory is empty, nothing is left behind
assert os.listdir(temp_path) == []
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
assert mock_display.call_count == 1
assert mock_display.mock_calls[0][1][0] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" \
% to_text(collection_path)
def test_install_collection_with_download(galaxy_server, collection_artifact, monkeypatch):
collection_tar = collection_artifact[1]
output_path = os.path.join(os.path.split(collection_tar)[0], b'output')
collection_path = os.path.join(output_path, b'ansible_namespace', b'collection')
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
mock_download = MagicMock()
mock_download.return_value = collection_tar
monkeypatch.setattr(collection, '_download_file', mock_download)
monkeypatch.setattr(galaxy_server, '_available_api_versions', {'v2': 'v2/'})
temp_path = os.path.join(os.path.split(collection_tar)[0], b'temp')
os.makedirs(temp_path)
meta = api.CollectionVersionMetadata('ansible_namespace', 'collection', '0.1.0', 'https://downloadme.com',
'myhash', {})
req = collection.CollectionRequirement('ansible_namespace', 'collection', None, galaxy_server,
['0.1.0'], '*', False, metadata=meta)
req.install(to_text(output_path), temp_path)
# Ensure the temp directory is empty, nothing is left behind
assert os.listdir(temp_path) == []
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
assert mock_display.call_count == 1
assert mock_display.mock_calls[0][1][0] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" \
% to_text(collection_path)
assert mock_download.call_count == 1
assert mock_download.mock_calls[0][1][0] == 'https://downloadme.com'
assert mock_download.mock_calls[0][1][1] == temp_path
assert mock_download.mock_calls[0][1][2] == 'myhash'
assert mock_download.mock_calls[0][1][3] is True
def test_install_collections_from_tar(collection_artifact, monkeypatch):
collection_path, collection_tar = collection_artifact
temp_path = os.path.split(collection_tar)[0]
shutil.rmtree(collection_path)
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
[u'https://galaxy.ansible.com'], True, False, False, False, False)
assert os.path.isdir(collection_path)
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
with open(os.path.join(collection_path, b'MANIFEST.json'), 'rb') as manifest_obj:
actual_manifest = json.loads(to_text(manifest_obj.read()))
assert actual_manifest['collection_info']['namespace'] == 'ansible_namespace'
assert actual_manifest['collection_info']['name'] == 'collection'
assert actual_manifest['collection_info']['version'] == '0.1.0'
# Filter out the progress cursor display calls.
display_msgs = [m[1][0] for m in mock_display.mock_calls if 'newline' not in m[2] and len(m[1]) == 1]
assert len(display_msgs) == 3
assert display_msgs[0] == "Process install dependency map"
assert display_msgs[1] == "Starting collection install process"
assert display_msgs[2] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" % to_text(collection_path)
def test_install_collections_existing_without_force(collection_artifact, monkeypatch):
collection_path, collection_tar = collection_artifact
temp_path = os.path.split(collection_tar)[0]
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
# If we don't delete collection_path it will think the original build skeleton is installed so we expect a skip
collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
[u'https://galaxy.ansible.com'], True, False, False, False, False)
assert os.path.isdir(collection_path)
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'README.md', b'docs', b'galaxy.yml', b'playbooks', b'plugins', b'roles']
# Filter out the progress cursor display calls.
display_msgs = [m[1][0] for m in mock_display.mock_calls if 'newline' not in m[2] and len(m[1]) == 1]
assert len(display_msgs) == 4
    # Msg1 is the warning about the missing MANIFEST.json; we cannot really check the message as its line breaks vary
    # based on the path size
assert display_msgs[1] == "Process install dependency map"
assert display_msgs[2] == "Starting collection install process"
assert display_msgs[3] == "Skipping 'ansible_namespace.collection' as it is already installed"
# Makes sure we don't get stuck in some recursive loop
@pytest.mark.parametrize('collection_artifact', [
{'ansible_namespace.collection': '>=0.0.1'},
], indirect=True)
def test_install_collection_with_circular_dependency(collection_artifact, monkeypatch):
collection_path, collection_tar = collection_artifact
temp_path = os.path.split(collection_tar)[0]
shutil.rmtree(collection_path)
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
[u'https://galaxy.ansible.com'], True, False, False, False, False)
assert os.path.isdir(collection_path)
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
with open(os.path.join(collection_path, b'MANIFEST.json'), 'rb') as manifest_obj:
actual_manifest = json.loads(to_text(manifest_obj.read()))
assert actual_manifest['collection_info']['namespace'] == 'ansible_namespace'
assert actual_manifest['collection_info']['name'] == 'collection'
assert actual_manifest['collection_info']['version'] == '0.1.0'
# Filter out the progress cursor display calls.
display_msgs = [m[1][0] for m in mock_display.mock_calls if 'newline' not in m[2] and len(m[1]) == 1]
assert len(display_msgs) == 3
assert display_msgs[0] == "Process install dependency map"
assert display_msgs[1] == "Starting collection install process"
assert display_msgs[2] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" % to_text(collection_path)
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 66,803 |
Collection tarballs don't preserve file modes
|
##### SUMMARY
If you have executable files in your source code, their executable bits are not kept in the tarball.
Could be related to https://github.com/ansible/ansible/blob/99d7f150873011e7515851db9b44ff486efa9d77/lib/ansible/galaxy/collection.py#L763
This was noticed as I had some auxiliary scripts in my repo.
Need to also consider scripts in `files/` if/when Collections support roles.
This seems to violate the "principle of least surprise".
**Proposal 1: Keep on stripping executable flags**
- Add a line to the output, e.g. `WARNING: tests/coverage.sh: source file has executable flag, though this will be ignored in the generated tar.gz`
- Update the "developing a collection" documentation page
- Update integration tests to ensure that executable mode is always stripped
- Update code to link to this issue

**Proposal 2: Allow executable flags** (a sketch of this approach follows below)
- Update the "developing a collection" documentation page
- Update integration tests to ensure that executable mode is preserved
- Update code to link to this issue
- Add a changelog entry
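If Proposal 2 were adopted, a minimal sketch of how the build step could derive each member's mode from the source file rather than hard-coding it; the helper name and the ownership normalisation are illustrative assumptions, not the current code in `lib/ansible/galaxy/collection.py`:

```python
import os
import stat
import tarfile


def add_file_preserving_exec_bit(tar_file, src_path, arcname):
    """Add src_path to tar_file, using 0755 for executables and 0644 otherwise.

    Checking the source file's executable bit means scripts such as
    tests/coverage.sh keep their mode through build and install, while
    ordinary files still end up with a predictable 0644.
    """
    tarinfo = tar_file.gettarinfo(src_path, arcname=arcname)
    is_executable = os.stat(src_path).st_mode & stat.S_IXUSR
    tarinfo.mode = 0o0755 if is_executable else 0o0644
    # Normalise ownership so the artifact does not leak the builder's uid/gid.
    tarinfo.uid = tarinfo.gid = 0
    tarinfo.uname = tarinfo.gname = ''
    with open(src_path, 'rb') as src_obj:
        tar_file.addfile(tarinfo, src_obj)


# Example: build a tiny artifact containing one script while keeping its mode.
with tarfile.open('example-collection.tar.gz', 'w:gz') as tar_obj:
    add_file_preserving_exec_bit(tar_obj, 'tests/coverage.sh', 'tests/coverage.sh')
```

Whichever proposal is chosen, the integration tests could then assert the expected member modes the same way the unit tests above already do for directories (0755) and regular files (0644).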
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible-galaxy
##### ANSIBLE VERSION
```paste below
2.10
```
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/66803
|
https://github.com/ansible/ansible/pull/68418
|
a9d2ceafe429171c0e2ad007058b88bae57c74ce
|
127d54b3630c65043ec12c4af2024f8ef0bc6d09
| 2020-01-27T11:28:53Z |
python
| 2020-03-24T22:08:23Z |
changelogs/fragments/collection-install-mode.yaml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 66,803 |
Collection tarballs don't preserve file modes
|
##### SUMMARY
If you have executable files in your source code, their executable bits are not kept in the tarball.
Could be related to https://github.com/ansible/ansible/blob/99d7f150873011e7515851db9b44ff486efa9d77/lib/ansible/galaxy/collection.py#L763
This was noticed as I had some auxiliary scripts in my repo.
Need to also consider scripts in `files/` if/when Collections support roles.
This seems to violate the "principle of least surprise".
**Proposal 1: Keep on stripping executable flags**
- Add a line to the output, e.g. `WARNING: tests/coverage.sh: source file has executable flag, though this will be ignored in the generated tar.gz`
- Update the "developing a collection" documentation page
- Update integration tests to ensure that executable mode is always stripped
- Update code to link to this issue

**Proposal 2: Allow executable flags**
- Update the "developing a collection" documentation page
- Update integration tests to ensure that executable mode is preserved
- Update code to link to this issue
- Add a changelog entry
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible-galaxy
##### ANSIBLE VERSION
```paste below
2.10
```
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/66803
|
https://github.com/ansible/ansible/pull/68418
|
a9d2ceafe429171c0e2ad007058b88bae57c74ce
|
127d54b3630c65043ec12c4af2024f8ef0bc6d09
| 2020-01-27T11:28:53Z |
python
| 2020-03-24T22:08:23Z |
lib/ansible/galaxy/collection.py
|
# Copyright: (c) 2019, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import fnmatch
import json
import operator
import os
import shutil
import sys
import tarfile
import tempfile
import threading
import time
import yaml
from collections import namedtuple
from contextlib import contextmanager
from distutils.version import LooseVersion
from hashlib import sha256
from io import BytesIO
from yaml.error import YAMLError
try:
import queue
except ImportError:
import Queue as queue # Python 2
import ansible.constants as C
from ansible.errors import AnsibleError
from ansible.galaxy import get_collections_galaxy_meta_info
from ansible.galaxy.api import CollectionVersionMetadata, GalaxyError
from ansible.galaxy.user_agent import user_agent
from ansible.module_utils import six
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.utils.collection_loader import AnsibleCollectionRef
from ansible.utils.display import Display
from ansible.utils.hashing import secure_hash, secure_hash_s
from ansible.utils.version import SemanticVersion
from ansible.module_utils.urls import open_url
urlparse = six.moves.urllib.parse.urlparse
urllib_error = six.moves.urllib.error
display = Display()
MANIFEST_FORMAT = 1
ModifiedContent = namedtuple('ModifiedContent', ['filename', 'expected', 'installed'])
class CollectionRequirement:
_FILE_MAPPING = [(b'MANIFEST.json', 'manifest_file'), (b'FILES.json', 'files_file')]
def __init__(self, namespace, name, b_path, api, versions, requirement, force, parent=None, metadata=None,
files=None, skip=False, allow_pre_releases=False):
"""
Represents a collection requirement, the versions that are available to be installed as well as any
dependencies the collection has.
:param namespace: The collection namespace.
:param name: The collection name.
:param b_path: Byte str of the path to the collection tarball if it has already been downloaded.
:param api: The GalaxyAPI to use if the collection is from Galaxy.
:param versions: A list of versions of the collection that are available.
:param requirement: The version requirement string used to verify the list of versions fit the requirements.
:param force: Whether the force flag was applied to the collection.
:param parent: The name of the parent the collection is a dependency of.
:param metadata: The galaxy.api.CollectionVersionMetadata that has already been retrieved from the Galaxy
server.
:param files: The files that exist inside the collection. This is based on the FILES.json file inside the
collection artifact.
:param skip: Whether to skip installing the collection. Should be set if the collection is already installed
and force is not set.
:param allow_pre_releases: Whether to allow pre-release versions of collections.
"""
self.namespace = namespace
self.name = name
self.b_path = b_path
self.api = api
self._versions = set(versions)
self.force = force
self.skip = skip
self.required_by = []
self.allow_pre_releases = allow_pre_releases
self._metadata = metadata
self._files = files
self.add_requirement(parent, requirement)
def __str__(self):
return to_native("%s.%s" % (self.namespace, self.name))
def __unicode__(self):
return u"%s.%s" % (self.namespace, self.name)
@property
def metadata(self):
self._get_metadata()
return self._metadata
@property
def versions(self):
if self.allow_pre_releases:
return self._versions
return set(v for v in self._versions if v == '*' or not SemanticVersion(v).is_prerelease)
@versions.setter
def versions(self, value):
self._versions = set(value)
@property
def pre_releases(self):
return set(v for v in self._versions if SemanticVersion(v).is_prerelease)
@property
def latest_version(self):
try:
return max([v for v in self.versions if v != '*'], key=SemanticVersion)
except ValueError: # ValueError: max() arg is an empty sequence
return '*'
@property
def dependencies(self):
if not self._metadata:
if len(self.versions) > 1:
return {}
self._get_metadata()
dependencies = self._metadata.dependencies
if dependencies is None:
return {}
return dependencies
def add_requirement(self, parent, requirement):
self.required_by.append((parent, requirement))
new_versions = set(v for v in self.versions if self._meets_requirements(v, requirement, parent))
if len(new_versions) == 0:
if self.skip:
force_flag = '--force-with-deps' if parent else '--force'
version = self.latest_version if self.latest_version != '*' else 'unknown'
msg = "Cannot meet requirement %s:%s as it is already installed at version '%s'. Use %s to overwrite" \
% (to_text(self), requirement, version, force_flag)
raise AnsibleError(msg)
elif parent is None:
msg = "Cannot meet requirement %s for dependency %s" % (requirement, to_text(self))
else:
msg = "Cannot meet dependency requirement '%s:%s' for collection %s" \
% (to_text(self), requirement, parent)
collection_source = to_text(self.b_path, nonstring='passthru') or self.api.api_server
req_by = "\n".join(
"\t%s - '%s:%s'" % (to_text(p) if p else 'base', to_text(self), r)
for p, r in self.required_by
)
versions = ", ".join(sorted(self.versions, key=SemanticVersion))
if not self.versions and self.pre_releases:
pre_release_msg = (
'\nThis collection only contains pre-releases. Utilize `--pre` to install pre-releases, or '
'explicitly provide the pre-release version.'
)
else:
pre_release_msg = ''
raise AnsibleError(
"%s from source '%s'. Available versions before last requirement added: %s\nRequirements from:\n%s%s"
% (msg, collection_source, versions, req_by, pre_release_msg)
)
self.versions = new_versions
def install(self, path, b_temp_path):
if self.skip:
display.display("Skipping '%s' as it is already installed" % to_text(self))
return
# Install if it is not
collection_path = os.path.join(path, self.namespace, self.name)
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
display.display("Installing '%s:%s' to '%s'" % (to_text(self), self.latest_version, collection_path))
if self.b_path is None:
download_url = self._metadata.download_url
artifact_hash = self._metadata.artifact_sha256
headers = {}
self.api._add_auth_token(headers, download_url, required=False)
self.b_path = _download_file(download_url, b_temp_path, artifact_hash, self.api.validate_certs,
headers=headers)
if os.path.exists(b_collection_path):
shutil.rmtree(b_collection_path)
os.makedirs(b_collection_path)
with tarfile.open(self.b_path, mode='r') as collection_tar:
files_member_obj = collection_tar.getmember('FILES.json')
with _tarfile_extract(collection_tar, files_member_obj) as files_obj:
files = json.loads(to_text(files_obj.read(), errors='surrogate_or_strict'))
_extract_tar_file(collection_tar, 'MANIFEST.json', b_collection_path, b_temp_path)
_extract_tar_file(collection_tar, 'FILES.json', b_collection_path, b_temp_path)
for file_info in files['files']:
file_name = file_info['name']
if file_name == '.':
continue
if file_info['ftype'] == 'file':
_extract_tar_file(collection_tar, file_name, b_collection_path, b_temp_path,
expected_hash=file_info['chksum_sha256'])
else:
os.makedirs(os.path.join(b_collection_path, to_bytes(file_name, errors='surrogate_or_strict')))
def set_latest_version(self):
self.versions = set([self.latest_version])
self._get_metadata()
def verify(self, remote_collection, path, b_temp_tar_path):
if not self.skip:
display.display("'%s' has not been installed, nothing to verify" % (to_text(self)))
return
collection_path = os.path.join(path, self.namespace, self.name)
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
display.vvv("Verifying '%s:%s'." % (to_text(self), self.latest_version))
display.vvv("Installed collection found at '%s'" % collection_path)
display.vvv("Remote collection found at '%s'" % remote_collection.metadata.download_url)
# Compare installed version versus requirement version
if self.latest_version != remote_collection.latest_version:
err = "%s has the version '%s' but is being compared to '%s'" % (to_text(self), self.latest_version, remote_collection.latest_version)
display.display(err)
return
modified_content = []
# Verify the manifest hash matches before verifying the file manifest
expected_hash = _get_tar_file_hash(b_temp_tar_path, 'MANIFEST.json')
self._verify_file_hash(b_collection_path, 'MANIFEST.json', expected_hash, modified_content)
manifest = _get_json_from_tar_file(b_temp_tar_path, 'MANIFEST.json')
# Use the manifest to verify the file manifest checksum
file_manifest_data = manifest['file_manifest_file']
file_manifest_filename = file_manifest_data['name']
expected_hash = file_manifest_data['chksum_%s' % file_manifest_data['chksum_type']]
# Verify the file manifest before using it to verify individual files
self._verify_file_hash(b_collection_path, file_manifest_filename, expected_hash, modified_content)
file_manifest = _get_json_from_tar_file(b_temp_tar_path, file_manifest_filename)
# Use the file manifest to verify individual file checksums
for manifest_data in file_manifest['files']:
if manifest_data['ftype'] == 'file':
expected_hash = manifest_data['chksum_%s' % manifest_data['chksum_type']]
self._verify_file_hash(b_collection_path, manifest_data['name'], expected_hash, modified_content)
if modified_content:
display.display("Collection %s contains modified content in the following files:" % to_text(self))
display.display(to_text(self))
display.vvv(to_text(self.b_path))
for content_change in modified_content:
display.display(' %s' % content_change.filename)
display.vvv(" Expected: %s\n Found: %s" % (content_change.expected, content_change.installed))
else:
display.vvv("Successfully verified that checksums for '%s:%s' match the remote collection" % (to_text(self), self.latest_version))
def _verify_file_hash(self, b_path, filename, expected_hash, error_queue):
b_file_path = to_bytes(os.path.join(to_text(b_path), filename), errors='surrogate_or_strict')
if not os.path.isfile(b_file_path):
actual_hash = None
else:
with open(b_file_path, mode='rb') as file_object:
actual_hash = _consume_file(file_object)
if expected_hash != actual_hash:
error_queue.append(ModifiedContent(filename=filename, expected=expected_hash, installed=actual_hash))
def _get_metadata(self):
if self._metadata:
return
self._metadata = self.api.get_collection_version_metadata(self.namespace, self.name, self.latest_version)
def _meets_requirements(self, version, requirements, parent):
"""
Supported version identifiers are '==', '!=', '>', '>=', '<', '<=' and '*'. Multiple requirements are delimited by ','
"""
op_map = {
'!=': operator.ne,
'==': operator.eq,
'=': operator.eq,
'>=': operator.ge,
'>': operator.gt,
'<=': operator.le,
'<': operator.lt,
}
for req in list(requirements.split(',')):
op_pos = 2 if len(req) > 1 and req[1] == '=' else 1
op = op_map.get(req[:op_pos])
requirement = req[op_pos:]
if not op:
requirement = req
op = operator.eq
# In the case we are checking a new requirement on a base requirement (parent != None) we can't accept
# version as '*' (unknown version) unless the requirement is also '*'.
if parent and version == '*' and requirement != '*':
display.warning("Failed to validate the collection requirement '%s:%s' for %s when the existing "
"install does not have a version set, the collection may not work."
% (to_text(self), req, parent))
continue
elif requirement == '*' or version == '*':
continue
if not op(SemanticVersion(version), SemanticVersion.from_loose_version(LooseVersion(requirement))):
break
else:
return True
# The loop was broken early, it does not meet all the requirements
return False
@staticmethod
def from_tar(b_path, force, parent=None):
if not tarfile.is_tarfile(b_path):
raise AnsibleError("Collection artifact at '%s' is not a valid tar file." % to_native(b_path))
info = {}
with tarfile.open(b_path, mode='r') as collection_tar:
for b_member_name, property_name in CollectionRequirement._FILE_MAPPING:
n_member_name = to_native(b_member_name)
try:
member = collection_tar.getmember(n_member_name)
except KeyError:
raise AnsibleError("Collection at '%s' does not contain the required file %s."
% (to_native(b_path), n_member_name))
with _tarfile_extract(collection_tar, member) as member_obj:
try:
info[property_name] = json.loads(to_text(member_obj.read(), errors='surrogate_or_strict'))
except ValueError:
raise AnsibleError("Collection tar file member %s does not contain a valid json string."
% n_member_name)
meta = info['manifest_file']['collection_info']
files = info['files_file']['files']
namespace = meta['namespace']
name = meta['name']
version = meta['version']
meta = CollectionVersionMetadata(namespace, name, version, None, None, meta['dependencies'])
if SemanticVersion(version).is_prerelease:
allow_pre_release = True
else:
allow_pre_release = False
return CollectionRequirement(namespace, name, b_path, None, [version], version, force, parent=parent,
metadata=meta, files=files, allow_pre_releases=allow_pre_release)
@staticmethod
def from_path(b_path, force, parent=None):
info = {}
for b_file_name, property_name in CollectionRequirement._FILE_MAPPING:
b_file_path = os.path.join(b_path, b_file_name)
if not os.path.exists(b_file_path):
continue
with open(b_file_path, 'rb') as file_obj:
try:
info[property_name] = json.loads(to_text(file_obj.read(), errors='surrogate_or_strict'))
except ValueError:
raise AnsibleError("Collection file at '%s' does not contain a valid json string."
% to_native(b_file_path))
allow_pre_release = False
if 'manifest_file' in info:
manifest = info['manifest_file']['collection_info']
namespace = manifest['namespace']
name = manifest['name']
version = to_text(manifest['version'], errors='surrogate_or_strict')
try:
_v = SemanticVersion()
_v.parse(version)
if _v.is_prerelease:
allow_pre_release = True
except ValueError:
display.warning("Collection at '%s' does not have a valid version set, falling back to '*'. Found "
"version: '%s'" % (to_text(b_path), version))
version = '*'
dependencies = manifest['dependencies']
else:
display.warning("Collection at '%s' does not have a MANIFEST.json file, cannot detect version."
% to_text(b_path))
parent_dir, name = os.path.split(to_text(b_path, errors='surrogate_or_strict'))
namespace = os.path.split(parent_dir)[1]
version = '*'
dependencies = {}
meta = CollectionVersionMetadata(namespace, name, version, None, None, dependencies)
files = info.get('files_file', {}).get('files', {})
return CollectionRequirement(namespace, name, b_path, None, [version], version, force, parent=parent,
metadata=meta, files=files, skip=True, allow_pre_releases=allow_pre_release)
@staticmethod
def from_name(collection, apis, requirement, force, parent=None, allow_pre_release=False):
namespace, name = collection.split('.', 1)
galaxy_meta = None
for api in apis:
try:
if not (requirement == '*' or requirement.startswith('<') or requirement.startswith('>') or
requirement.startswith('!=')):
# Exact requirement
allow_pre_release = True
if requirement.startswith('='):
requirement = requirement.lstrip('=')
resp = api.get_collection_version_metadata(namespace, name, requirement)
galaxy_meta = resp
versions = [resp.version]
else:
versions = api.get_collection_versions(namespace, name)
except GalaxyError as err:
if err.http_code == 404:
display.vvv("Collection '%s' is not available from server %s %s"
% (collection, api.name, api.api_server))
continue
raise
display.vvv("Collection '%s' obtained from server %s %s" % (collection, api.name, api.api_server))
break
else:
raise AnsibleError("Failed to find collection %s:%s" % (collection, requirement))
req = CollectionRequirement(namespace, name, None, api, versions, requirement, force, parent=parent,
metadata=galaxy_meta, allow_pre_releases=allow_pre_release)
return req
def build_collection(collection_path, output_path, force):
"""
Creates the Ansible collection artifact in a .tar.gz file.
:param collection_path: The path to the collection to build. This should be the directory that contains the
galaxy.yml file.
:param output_path: The path to create the collection build artifact. This should be a directory.
:param force: Whether to overwrite an existing collection build artifact or fail.
:return: The path to the collection build artifact.
"""
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
b_galaxy_path = os.path.join(b_collection_path, b'galaxy.yml')
if not os.path.exists(b_galaxy_path):
raise AnsibleError("The collection galaxy.yml path '%s' does not exist." % to_native(b_galaxy_path))
collection_meta = _get_galaxy_yml(b_galaxy_path)
file_manifest = _build_files_manifest(b_collection_path, collection_meta['namespace'], collection_meta['name'],
collection_meta['build_ignore'])
collection_manifest = _build_manifest(**collection_meta)
collection_output = os.path.join(output_path, "%s-%s-%s.tar.gz" % (collection_meta['namespace'],
collection_meta['name'],
collection_meta['version']))
b_collection_output = to_bytes(collection_output, errors='surrogate_or_strict')
if os.path.exists(b_collection_output):
if os.path.isdir(b_collection_output):
raise AnsibleError("The output collection artifact '%s' already exists, "
"but is a directory - aborting" % to_native(collection_output))
elif not force:
raise AnsibleError("The file '%s' already exists. You can use --force to re-create "
"the collection artifact." % to_native(collection_output))
_build_collection_tar(b_collection_path, b_collection_output, collection_manifest, file_manifest)
def publish_collection(collection_path, api, wait, timeout):
"""
Publish an Ansible collection tarball into an Ansible Galaxy server.
:param collection_path: The path to the collection tarball to publish.
:param api: A GalaxyAPI to publish the collection to.
:param wait: Whether to wait until the import process is complete.
:param timeout: The time in seconds to wait for the import process to finish, 0 is indefinite.
"""
import_uri = api.publish_collection(collection_path)
if wait:
# Galaxy returns a url fragment which differs between v2 and v3. The second to last entry is
# always the task_id, though.
# v2: {"task": "https://galaxy-dev.ansible.com/api/v2/collection-imports/35573/"}
# v3: {"task": "/api/automation-hub/v3/imports/collections/838d1308-a8f4-402c-95cb-7823f3806cd8/"}
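# e.g. the v2 URL above yields task_id '35573'; the v3 URL yields the trailing UUID.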
task_id = None
for path_segment in reversed(import_uri.split('/')):
if path_segment:
task_id = path_segment
break
if not task_id:
raise AnsibleError("Publishing the collection did not return valid task info. Cannot wait for task status. Returned task info: '%s'" % import_uri)
display.display("Collection has been published to the Galaxy server %s %s" % (api.name, api.api_server))
with _display_progress():
api.wait_import_task(task_id, timeout)
display.display("Collection has been successfully published and imported to the Galaxy server %s %s"
% (api.name, api.api_server))
else:
display.display("Collection has been pushed to the Galaxy server %s %s, not waiting until import has "
"completed due to --no-wait being set. Import task results can be found at %s"
% (api.name, api.api_server, import_uri))
def install_collections(collections, output_path, apis, validate_certs, ignore_errors, no_deps, force, force_deps,
allow_pre_release=False):
"""
Install Ansible collections to the path specified.
:param collections: The collections to install, should be a list of tuples with (name, requirement, Galaxy server).
:param output_path: The path to install the collections to.
:param apis: A list of GalaxyAPIs to query when searching for a collection.
:param validate_certs: Whether to validate the certificates if downloading a tarball.
:param ignore_errors: Whether to ignore any errors when installing the collection.
:param no_deps: Ignore any collection dependencies and only install the base requirements.
:param force: Re-install a collection if it has already been installed.
:param force_deps: Re-install a collection as well as its dependencies if they have already been installed.
"""
existing_collections = find_existing_collections(output_path)
with _tempdir() as b_temp_path:
display.display("Process install dependency map")
with _display_progress():
dependency_map = _build_dependency_map(collections, existing_collections, b_temp_path, apis,
validate_certs, force, force_deps, no_deps,
allow_pre_release=allow_pre_release)
display.display("Starting collection install process")
with _display_progress():
for collection in dependency_map.values():
try:
collection.install(output_path, b_temp_path)
except AnsibleError as err:
if ignore_errors:
display.warning("Failed to install collection %s but skipping due to --ignore-errors being set. "
"Error: %s" % (to_text(collection), to_text(err)))
else:
raise
def validate_collection_name(name):
"""
Validates that a collection name given by the user or a requirements file fits the expected format.
:param name: The input name with optional range specifier split by ':'.
:return: The input value, required for argparse validation.
"""
collection, dummy, dummy = name.partition(':')
if AnsibleCollectionRef.is_valid_collection_name(collection):
return name
raise AnsibleError("Invalid collection name '%s', "
"name must be in the format <namespace>.<collection>. \n"
"Please make sure namespace and collection name contains "
"characters from [a-zA-Z0-9_] only." % name)
def validate_collection_path(collection_path):
""" Ensure a given path ends with 'ansible_collections'
:param collection_path: The path that should end in 'ansible_collections'
:return: collection_path ending in 'ansible_collections' if it does not already.
"""
if os.path.split(collection_path)[1] != 'ansible_collections':
return os.path.join(collection_path, 'ansible_collections')
return collection_path
def verify_collections(collections, search_paths, apis, validate_certs, ignore_errors, allow_pre_release=False):
with _display_progress():
with _tempdir() as b_temp_path:
for collection in collections:
try:
local_collection = None
b_collection = to_bytes(collection[0], errors='surrogate_or_strict')
if os.path.isfile(b_collection) or urlparse(collection[0]).scheme.lower() in ['http', 'https'] or len(collection[0].split('.')) != 2:
raise AnsibleError(message="'%s' is not a valid collection name. The format namespace.name is expected." % collection[0])
collection_name = collection[0]
namespace, name = collection_name.split('.')
collection_version = collection[1]
# Verify local collection exists before downloading it from a galaxy server
for search_path in search_paths:
b_search_path = to_bytes(os.path.join(search_path, namespace, name), errors='surrogate_or_strict')
if os.path.isdir(b_search_path):
local_collection = CollectionRequirement.from_path(b_search_path, False)
break
if local_collection is None:
raise AnsibleError(message='Collection %s is not installed in any of the collection paths.' % collection_name)
# Download collection on a galaxy server for comparison
try:
remote_collection = CollectionRequirement.from_name(collection_name, apis, collection_version, False, parent=None,
allow_pre_release=allow_pre_release)
except AnsibleError as e:
if e.message == 'Failed to find collection %s:%s' % (collection[0], collection[1]):
raise AnsibleError('Failed to find remote collection %s:%s on any of the galaxy servers' % (collection[0], collection[1]))
raise
download_url = remote_collection.metadata.download_url
headers = {}
remote_collection.api._add_auth_token(headers, download_url, required=False)
b_temp_tar_path = _download_file(download_url, b_temp_path, None, validate_certs, headers=headers)
local_collection.verify(remote_collection, search_path, b_temp_tar_path)
except AnsibleError as err:
if ignore_errors:
display.warning("Failed to verify collection %s but skipping due to --ignore-errors being set. "
"Error: %s" % (collection[0], to_text(err)))
else:
raise
@contextmanager
def _tempdir():
b_temp_path = tempfile.mkdtemp(dir=to_bytes(C.DEFAULT_LOCAL_TMP, errors='surrogate_or_strict'))
yield b_temp_path
shutil.rmtree(b_temp_path)
@contextmanager
def _tarfile_extract(tar, member):
tar_obj = tar.extractfile(member)
yield tar_obj
tar_obj.close()
@contextmanager
def _display_progress():
config_display = C.GALAXY_DISPLAY_PROGRESS
display_wheel = sys.stdout.isatty() if config_display is None else config_display
if not display_wheel:
yield
return
def progress(display_queue, actual_display):
actual_display.debug("Starting display_progress display thread")
t = threading.current_thread()
while True:
for c in "|/-\\":
actual_display.display(c + "\b", newline=False)
time.sleep(0.1)
# Display a message from the main thread
while True:
try:
method, args, kwargs = display_queue.get(block=False, timeout=0.1)
except queue.Empty:
break
else:
func = getattr(actual_display, method)
func(*args, **kwargs)
if getattr(t, "finish", False):
actual_display.debug("Received end signal for display_progress display thread")
return
class DisplayThread(object):
def __init__(self, display_queue):
self.display_queue = display_queue
def __getattr__(self, attr):
def call_display(*args, **kwargs):
self.display_queue.put((attr, args, kwargs))
return call_display
# Temporarily override the global display object with our own, which adds the calls to a queue for the thread to process.
global display
old_display = display
try:
display_queue = queue.Queue()
display = DisplayThread(display_queue)
t = threading.Thread(target=progress, args=(display_queue, old_display))
t.daemon = True
t.start()
try:
yield
finally:
t.finish = True
t.join()
except Exception:
# The exception is re-raised so we can be sure the thread is finished and not using the display anymore
raise
finally:
display = old_display
def _get_galaxy_yml(b_galaxy_yml_path):
meta_info = get_collections_galaxy_meta_info()
mandatory_keys = set()
string_keys = set()
list_keys = set()
dict_keys = set()
for info in meta_info:
if info.get('required', False):
mandatory_keys.add(info['key'])
key_list_type = {
'str': string_keys,
'list': list_keys,
'dict': dict_keys,
}[info.get('type', 'str')]
key_list_type.add(info['key'])
all_keys = frozenset(list(mandatory_keys) + list(string_keys) + list(list_keys) + list(dict_keys))
try:
with open(b_galaxy_yml_path, 'rb') as g_yaml:
galaxy_yml = yaml.safe_load(g_yaml)
except YAMLError as err:
raise AnsibleError("Failed to parse the galaxy.yml at '%s' with the following error:\n%s"
% (to_native(b_galaxy_yml_path), to_native(err)))
set_keys = set(galaxy_yml.keys())
missing_keys = mandatory_keys.difference(set_keys)
if missing_keys:
raise AnsibleError("The collection galaxy.yml at '%s' is missing the following mandatory keys: %s"
% (to_native(b_galaxy_yml_path), ", ".join(sorted(missing_keys))))
extra_keys = set_keys.difference(all_keys)
if len(extra_keys) > 0:
display.warning("Found unknown keys in collection galaxy.yml at '%s': %s"
% (to_text(b_galaxy_yml_path), ", ".join(extra_keys)))
# Add the defaults if they have not been set
for optional_string in string_keys:
if optional_string not in galaxy_yml:
galaxy_yml[optional_string] = None
for optional_list in list_keys:
list_val = galaxy_yml.get(optional_list, None)
if list_val is None:
galaxy_yml[optional_list] = []
elif not isinstance(list_val, list):
galaxy_yml[optional_list] = [list_val]
for optional_dict in dict_keys:
if optional_dict not in galaxy_yml:
galaxy_yml[optional_dict] = {}
# license is a builtin var in Python, to avoid confusion we just rename it to license_ids
galaxy_yml['license_ids'] = galaxy_yml['license']
del galaxy_yml['license']
return galaxy_yml
def _build_files_manifest(b_collection_path, namespace, name, ignore_patterns):
# We always ignore .pyc and .retry files as well as some well known version control directories. The ignore
# patterns can be extended by the build_ignore key in galaxy.yml
b_ignore_patterns = [
b'galaxy.yml',
b'*.pyc',
b'*.retry',
b'tests/output', # Ignore ansible-test result output directory.
to_bytes('{0}-{1}-*.tar.gz'.format(namespace, name)), # Ignores previously built artifacts in the root dir.
]
b_ignore_patterns += [to_bytes(p) for p in ignore_patterns]
b_ignore_dirs = frozenset([b'CVS', b'.bzr', b'.hg', b'.git', b'.svn', b'__pycache__', b'.tox'])
entry_template = {
'name': None,
'ftype': None,
'chksum_type': None,
'chksum_sha256': None,
'format': MANIFEST_FORMAT
}
manifest = {
'files': [
{
'name': '.',
'ftype': 'dir',
'chksum_type': None,
'chksum_sha256': None,
'format': MANIFEST_FORMAT,
},
],
'format': MANIFEST_FORMAT,
}
def _walk(b_path, b_top_level_dir):
for b_item in os.listdir(b_path):
b_abs_path = os.path.join(b_path, b_item)
b_rel_base_dir = b'' if b_path == b_top_level_dir else b_path[len(b_top_level_dir) + 1:]
b_rel_path = os.path.join(b_rel_base_dir, b_item)
rel_path = to_text(b_rel_path, errors='surrogate_or_strict')
if os.path.isdir(b_abs_path):
if any(b_item == b_path for b_path in b_ignore_dirs) or \
any(fnmatch.fnmatch(b_rel_path, b_pattern) for b_pattern in b_ignore_patterns):
display.vvv("Skipping '%s' for collection build" % to_text(b_abs_path))
continue
if os.path.islink(b_abs_path):
b_link_target = os.path.realpath(b_abs_path)
if not b_link_target.startswith(b_top_level_dir):
display.warning("Skipping '%s' as it is a symbolic link to a directory outside the collection"
% to_text(b_abs_path))
continue
manifest_entry = entry_template.copy()
manifest_entry['name'] = rel_path
manifest_entry['ftype'] = 'dir'
manifest['files'].append(manifest_entry)
_walk(b_abs_path, b_top_level_dir)
else:
if any(fnmatch.fnmatch(b_rel_path, b_pattern) for b_pattern in b_ignore_patterns):
display.vvv("Skipping '%s' for collection build" % to_text(b_abs_path))
continue
manifest_entry = entry_template.copy()
manifest_entry['name'] = rel_path
manifest_entry['ftype'] = 'file'
manifest_entry['chksum_type'] = 'sha256'
manifest_entry['chksum_sha256'] = secure_hash(b_abs_path, hash_func=sha256)
manifest['files'].append(manifest_entry)
_walk(b_collection_path, b_collection_path)
return manifest
def _build_manifest(namespace, name, version, authors, readme, tags, description, license_ids, license_file,
dependencies, repository, documentation, homepage, issues, **kwargs):
manifest = {
'collection_info': {
'namespace': namespace,
'name': name,
'version': version,
'authors': authors,
'readme': readme,
'tags': tags,
'description': description,
'license': license_ids,
'license_file': license_file if license_file else None, # Handle galaxy.yml having an empty string (None)
'dependencies': dependencies,
'repository': repository,
'documentation': documentation,
'homepage': homepage,
'issues': issues,
},
'file_manifest_file': {
'name': 'FILES.json',
'ftype': 'file',
'chksum_type': 'sha256',
'chksum_sha256': None, # Filled out in _build_collection_tar
'format': MANIFEST_FORMAT
},
'format': MANIFEST_FORMAT,
}
return manifest
def _build_collection_tar(b_collection_path, b_tar_path, collection_manifest, file_manifest):
files_manifest_json = to_bytes(json.dumps(file_manifest, indent=True), errors='surrogate_or_strict')
collection_manifest['file_manifest_file']['chksum_sha256'] = secure_hash_s(files_manifest_json, hash_func=sha256)
collection_manifest_json = to_bytes(json.dumps(collection_manifest, indent=True), errors='surrogate_or_strict')
with _tempdir() as b_temp_path:
b_tar_filepath = os.path.join(b_temp_path, os.path.basename(b_tar_path))
with tarfile.open(b_tar_filepath, mode='w:gz') as tar_file:
# Add the MANIFEST.json and FILES.json file to the archive
for name, b in [('MANIFEST.json', collection_manifest_json), ('FILES.json', files_manifest_json)]:
b_io = BytesIO(b)
tar_info = tarfile.TarInfo(name)
tar_info.size = len(b)
tar_info.mtime = time.time()
tar_info.mode = 0o0644
tar_file.addfile(tarinfo=tar_info, fileobj=b_io)
for file_info in file_manifest['files']:
if file_info['name'] == '.':
continue
# arcname expects a native string, cannot be bytes
filename = to_native(file_info['name'], errors='surrogate_or_strict')
b_src_path = os.path.join(b_collection_path, to_bytes(filename, errors='surrogate_or_strict'))
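# Note: the reset_stat filter below normalizes ownership and file modes so archives are reproducible; as written it also drops any executable bit from source files.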
def reset_stat(tarinfo):
tarinfo.mode = 0o0755 if tarinfo.isdir() else 0o0644
tarinfo.uid = tarinfo.gid = 0
tarinfo.uname = tarinfo.gname = ''
return tarinfo
tar_file.add(os.path.realpath(b_src_path), arcname=filename, recursive=False, filter=reset_stat)
shutil.copy(b_tar_filepath, b_tar_path)
collection_name = "%s.%s" % (collection_manifest['collection_info']['namespace'],
collection_manifest['collection_info']['name'])
display.display('Created collection for %s at %s' % (collection_name, to_text(b_tar_path)))
def find_existing_collections(path):
collections = []
b_path = to_bytes(path, errors='surrogate_or_strict')
for b_namespace in os.listdir(b_path):
b_namespace_path = os.path.join(b_path, b_namespace)
if os.path.isfile(b_namespace_path):
continue
for b_collection in os.listdir(b_namespace_path):
b_collection_path = os.path.join(b_namespace_path, b_collection)
if os.path.isdir(b_collection_path):
req = CollectionRequirement.from_path(b_collection_path, False)
display.vvv("Found installed collection %s:%s at '%s'" % (to_text(req), req.latest_version,
to_text(b_collection_path)))
collections.append(req)
return collections
def _build_dependency_map(collections, existing_collections, b_temp_path, apis, validate_certs, force, force_deps,
no_deps, allow_pre_release=False):
dependency_map = {}
# First build the dependency map on the actual requirements
for name, version, source in collections:
_get_collection_info(dependency_map, existing_collections, name, version, source, b_temp_path, apis,
validate_certs, (force or force_deps), allow_pre_release=allow_pre_release)
checked_parents = set([to_text(c) for c in dependency_map.values() if c.skip])
while len(dependency_map) != len(checked_parents):
while not no_deps: # Only parse dependencies if no_deps was not set
parents_to_check = set(dependency_map.keys()).difference(checked_parents)
deps_exhausted = True
for parent in parents_to_check:
parent_info = dependency_map[parent]
if parent_info.dependencies:
deps_exhausted = False
for dep_name, dep_requirement in parent_info.dependencies.items():
_get_collection_info(dependency_map, existing_collections, dep_name, dep_requirement,
parent_info.api, b_temp_path, apis, validate_certs, force_deps,
parent=parent, allow_pre_release=allow_pre_release)
checked_parents.add(parent)
# No extra dependencies were resolved, exit loop
if deps_exhausted:
break
# Now we have resolved the deps to our best extent, now select the latest version for collections with
# multiple versions found and go from there
deps_not_checked = set(dependency_map.keys()).difference(checked_parents)
for collection in deps_not_checked:
dependency_map[collection].set_latest_version()
if no_deps or len(dependency_map[collection].dependencies) == 0:
checked_parents.add(collection)
return dependency_map
def _get_collection_info(dep_map, existing_collections, collection, requirement, source, b_temp_path, apis,
validate_certs, force, parent=None, allow_pre_release=False):
dep_msg = ""
if parent:
dep_msg = " - as dependency of %s" % parent
display.vvv("Processing requirement collection '%s'%s" % (to_text(collection), dep_msg))
b_tar_path = None
if os.path.isfile(to_bytes(collection, errors='surrogate_or_strict')):
display.vvvv("Collection requirement '%s' is a tar artifact" % to_text(collection))
b_tar_path = to_bytes(collection, errors='surrogate_or_strict')
elif urlparse(collection).scheme.lower() in ['http', 'https']:
display.vvvv("Collection requirement '%s' is a URL to a tar artifact" % collection)
try:
b_tar_path = _download_file(collection, b_temp_path, None, validate_certs)
except urllib_error.URLError as err:
raise AnsibleError("Failed to download collection tar from '%s': %s"
% (to_native(collection), to_native(err)))
if b_tar_path:
req = CollectionRequirement.from_tar(b_tar_path, force, parent=parent)
collection_name = to_text(req)
if collection_name in dep_map:
collection_info = dep_map[collection_name]
collection_info.add_requirement(None, req.latest_version)
else:
collection_info = req
else:
validate_collection_name(collection)
display.vvvv("Collection requirement '%s' is the name of a collection" % collection)
if collection in dep_map:
collection_info = dep_map[collection]
collection_info.add_requirement(parent, requirement)
else:
apis = [source] if source else apis
collection_info = CollectionRequirement.from_name(collection, apis, requirement, force, parent=parent,
allow_pre_release=allow_pre_release)
existing = [c for c in existing_collections if to_text(c) == to_text(collection_info)]
if existing and not collection_info.force:
# Test that the installed collection fits the requirement
existing[0].add_requirement(parent, requirement)
collection_info = existing[0]
dep_map[to_text(collection_info)] = collection_info
def _download_file(url, b_path, expected_hash, validate_certs, headers=None):
urlsplit = os.path.splitext(to_text(url.rsplit('/', 1)[1]))
b_file_name = to_bytes(urlsplit[0], errors='surrogate_or_strict')
b_file_ext = to_bytes(urlsplit[1], errors='surrogate_or_strict')
b_file_path = tempfile.NamedTemporaryFile(dir=b_path, prefix=b_file_name, suffix=b_file_ext, delete=False).name
display.vvv("Downloading %s to %s" % (url, to_text(b_path)))
# Galaxy redirects downloads to S3, which rejects requests that include an Authorization header, so don't forward that header when redirected
resp = open_url(to_native(url, errors='surrogate_or_strict'), validate_certs=validate_certs, headers=headers,
unredirected_headers=['Authorization'], http_agent=user_agent())
with open(b_file_path, 'wb') as download_file:
actual_hash = _consume_file(resp, download_file)
if expected_hash:
display.vvvv("Validating downloaded file hash %s with expected hash %s" % (actual_hash, expected_hash))
if expected_hash != actual_hash:
raise AnsibleError("Mismatch artifact hash with downloaded file")
return b_file_path
def _extract_tar_file(tar, filename, b_dest, b_temp_path, expected_hash=None):
with _get_tar_file_member(tar, filename) as tar_obj:
with tempfile.NamedTemporaryFile(dir=b_temp_path, delete=False) as tmpfile_obj:
actual_hash = _consume_file(tar_obj, tmpfile_obj)
if expected_hash and actual_hash != expected_hash:
raise AnsibleError("Checksum mismatch for '%s' inside collection at '%s'"
% (to_native(filename, errors='surrogate_or_strict'), to_native(tar.name)))
b_dest_filepath = os.path.join(b_dest, to_bytes(filename, errors='surrogate_or_strict'))
b_parent_dir = os.path.split(b_dest_filepath)[0]
if not os.path.exists(b_parent_dir):
# Seems like Galaxy does not validate if all file entries have a corresponding dir ftype entry. This check
# makes sure we create the parent directory even if it wasn't set in the metadata.
os.makedirs(b_parent_dir)
shutil.move(to_bytes(tmpfile_obj.name, errors='surrogate_or_strict'), b_dest_filepath)
def _get_tar_file_member(tar, filename):
n_filename = to_native(filename, errors='surrogate_or_strict')
try:
member = tar.getmember(n_filename)
except KeyError:
raise AnsibleError("Collection tar at '%s' does not contain the expected file '%s'." % (
to_native(tar.name),
n_filename))
return _tarfile_extract(tar, member)
def _get_json_from_tar_file(b_path, filename):
file_contents = ''
with tarfile.open(b_path, mode='r') as collection_tar:
with _get_tar_file_member(collection_tar, filename) as tar_obj:
bufsize = 65536
data = tar_obj.read(bufsize)
while data:
file_contents += to_text(data)
data = tar_obj.read(bufsize)
return json.loads(file_contents)
def _get_tar_file_hash(b_path, filename):
with tarfile.open(b_path, mode='r') as collection_tar:
with _get_tar_file_member(collection_tar, filename) as tar_obj:
return _consume_file(tar_obj)
def _consume_file(read_from, write_to=None):
bufsize = 65536
sha256_digest = sha256()
data = read_from.read(bufsize)
while data:
if write_to is not None:
write_to.write(data)
write_to.flush()
sha256_digest.update(data)
data = read_from.read(bufsize)
return sha256_digest.hexdigest()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 66,803 |
Collection tarballs don't preserve file modes
|
##### SUMMARY
If you have executable files in your source code, their executable mode is not kept in the tarball.
Could be related to https://github.com/ansible/ansible/blob/99d7f150873011e7515851db9b44ff486efa9d77/lib/ansible/galaxy/collection.py#L763
This was noticed as I had some auxiliary scripts in my repo.
Need to also consider scripts in `files/` if/when Collections support roles.
This seems to violate the "principle of least surprise"
**Proposal 1: Keep on stripping executable flags**
Add a warning line like the following to the output:
`WARNING: tests/coverage.sh: source file has executable flag, though this will be ignored in generated tar.gz`
Update developing a collection documentation page
Update integration tests to ensure that executable mode is always stripped.
Update code to link to this issue
**Proposal 2: Allow executable flags**
Update developing a collection documentation page
Update integration tests to ensure that executable mode is preserved
Update code to link to this issue
Changelog
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible-galaxy
##### ANSIBLE VERSION
```paste below
2.10
```
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can paste gist.github.com links for larger files -->
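A hedged sketch of the kind of assertion the proposed tests could make; `assert_executable_preserved` is a hypothetical helper, not an existing test utility.
```python
# Hypothetical helper: check that a member of a built artifact kept its executable bit.
import stat
import tarfile

def assert_executable_preserved(artifact_path, member_name):
    with tarfile.open(artifact_path, mode='r:gz') as tar:
        member = tar.getmember(member_name)
        assert member.mode & stat.S_IXUSR, '%s lost its executable bit' % member_name
```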
|
https://github.com/ansible/ansible/issues/66803
|
https://github.com/ansible/ansible/pull/68418
|
a9d2ceafe429171c0e2ad007058b88bae57c74ce
|
127d54b3630c65043ec12c4af2024f8ef0bc6d09
| 2020-01-27T11:28:53Z |
python
| 2020-03-24T22:08:23Z |
test/units/cli/test_galaxy.py
|
# -*- coding: utf-8 -*-
# (c) 2016, Adrian Likins <[email protected]>
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import ansible
import json
import os
import pytest
import shutil
import tarfile
import tempfile
import yaml
import ansible.constants as C
from ansible import context
from ansible.cli.galaxy import GalaxyCLI
from ansible.galaxy.api import GalaxyAPI
from ansible.errors import AnsibleError
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.utils import context_objects as co
from units.compat import unittest
from units.compat.mock import patch, MagicMock
@pytest.fixture(autouse='function')
def reset_cli_args():
co.GlobalCLIArgs._Singleton__instance = None
yield
co.GlobalCLIArgs._Singleton__instance = None
class TestGalaxy(unittest.TestCase):
@classmethod
def setUpClass(cls):
'''creating prerequisites for installing a role; setUpClass occurs ONCE whereas setUp occurs with every method tested.'''
# class data for easy viewing: role_dir, role_tar, role_name, role_req, role_path
cls.temp_dir = tempfile.mkdtemp(prefix='ansible-test_galaxy-')
os.chdir(cls.temp_dir)
if os.path.exists("./delete_me"):
shutil.rmtree("./delete_me")
# creating framework for a role
gc = GalaxyCLI(args=["ansible-galaxy", "init", "--offline", "delete_me"])
gc.run()
cls.role_dir = "./delete_me"
cls.role_name = "delete_me"
# making a temp dir for role installation
cls.role_path = os.path.join(tempfile.mkdtemp(), "roles")
if not os.path.isdir(cls.role_path):
os.makedirs(cls.role_path)
# creating a tar file name for class data
cls.role_tar = './delete_me.tar.gz'
cls.makeTar(cls.role_tar, cls.role_dir)
# creating a temp file with installation requirements
cls.role_req = './delete_me_requirements.yml'
fd = open(cls.role_req, "w")
fd.write("- 'src': '%s'\n 'name': '%s'\n 'path': '%s'" % (cls.role_tar, cls.role_name, cls.role_path))
fd.close()
@classmethod
def makeTar(cls, output_file, source_dir):
''' used for making a tarfile from a role directory '''
# adding directory into a tar file
try:
tar = tarfile.open(output_file, "w:gz")
tar.add(source_dir, arcname=os.path.basename(source_dir))
except AttributeError: # tarfile obj. has no attribute __exit__ prior to python 2.7
pass
finally: # ensuring closure of tarfile obj
tar.close()
@classmethod
def tearDownClass(cls):
'''After tests are finished removes things created in setUpClass'''
# deleting the temp role directory
if os.path.exists(cls.role_dir):
shutil.rmtree(cls.role_dir)
if os.path.exists(cls.role_req):
os.remove(cls.role_req)
if os.path.exists(cls.role_tar):
os.remove(cls.role_tar)
if os.path.isdir(cls.role_path):
shutil.rmtree(cls.role_path)
os.chdir('/')
shutil.rmtree(cls.temp_dir)
def setUp(self):
# Reset the stored command line args
co.GlobalCLIArgs._Singleton__instance = None
self.default_args = ['ansible-galaxy']
def tearDown(self):
# Reset the stored command line args
co.GlobalCLIArgs._Singleton__instance = None
def test_init(self):
galaxy_cli = GalaxyCLI(args=self.default_args)
self.assertTrue(isinstance(galaxy_cli, GalaxyCLI))
def test_display_min(self):
gc = GalaxyCLI(args=self.default_args)
role_info = {'name': 'some_role_name'}
display_result = gc._display_role_info(role_info)
self.assertTrue(display_result.find('some_role_name') > -1)
def test_display_galaxy_info(self):
gc = GalaxyCLI(args=self.default_args)
galaxy_info = {}
role_info = {'name': 'some_role_name',
'galaxy_info': galaxy_info}
display_result = gc._display_role_info(role_info)
if display_result.find('\n\tgalaxy_info:') == -1:
self.fail('Expected galaxy_info to be indented once')
def test_run(self):
''' verifies that the GalaxyCLI object's api is created and that execute() is called. '''
gc = GalaxyCLI(args=["ansible-galaxy", "install", "--ignore-errors", "imaginary_role"])
gc.parse()
with patch.object(ansible.cli.CLI, "run", return_value=None) as mock_run:
gc.run()
# testing
self.assertIsInstance(gc.galaxy, ansible.galaxy.Galaxy)
self.assertEqual(mock_run.call_count, 1)
self.assertTrue(isinstance(gc.api, ansible.galaxy.api.GalaxyAPI))
def test_execute_remove(self):
# installing role
gc = GalaxyCLI(args=["ansible-galaxy", "install", "-p", self.role_path, "-r", self.role_req, '--force'])
gc.run()
# location where the role was installed
role_file = os.path.join(self.role_path, self.role_name)
# removing role
# Have to reset the arguments in the context object manually since we're doing the
# equivalent of running the command line program twice
co.GlobalCLIArgs._Singleton__instance = None
gc = GalaxyCLI(args=["ansible-galaxy", "remove", role_file, self.role_name])
gc.run()
# testing role was removed
removed_role = not os.path.exists(role_file)
self.assertTrue(removed_role)
def test_exit_without_ignore_without_flag(self):
''' tests that GalaxyCLI exits with the error specified if the --ignore-errors flag is not used '''
gc = GalaxyCLI(args=["ansible-galaxy", "install", "--server=None", "fake_role_name"])
with patch.object(ansible.utils.display.Display, "display", return_value=None) as mocked_display:
# testing that error expected is raised
self.assertRaises(AnsibleError, gc.run)
self.assertTrue(mocked_display.called_once_with("- downloading role 'fake_role_name', owned by "))
def test_exit_without_ignore_with_flag(self):
''' tests that GalaxyCLI exits without the error specified if the --ignore-errors flag is used '''
# testing with --ignore-errors flag
gc = GalaxyCLI(args=["ansible-galaxy", "install", "--server=None", "fake_role_name", "--ignore-errors"])
with patch.object(ansible.utils.display.Display, "display", return_value=None) as mocked_display:
gc.run()
self.assertTrue(mocked_display.called_once_with("- downloading role 'fake_role_name', owned by "))
def test_parse_no_action(self):
''' testing the options parser when no action is given '''
gc = GalaxyCLI(args=["ansible-galaxy", ""])
self.assertRaises(SystemExit, gc.parse)
def test_parse_invalid_action(self):
''' testing the options parser when an invalid action is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "NOT_ACTION"])
self.assertRaises(SystemExit, gc.parse)
def test_parse_delete(self):
''' testing the options parser when the action 'delete' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "delete", "foo", "bar"])
gc.parse()
self.assertEqual(context.CLIARGS['verbosity'], 0)
def test_parse_import(self):
''' testing the options parser when the action 'import' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "import", "foo", "bar"])
gc.parse()
self.assertEqual(context.CLIARGS['wait'], True)
self.assertEqual(context.CLIARGS['reference'], None)
self.assertEqual(context.CLIARGS['check_status'], False)
self.assertEqual(context.CLIARGS['verbosity'], 0)
def test_parse_info(self):
''' testing the options parser when the action 'info' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "info", "foo", "bar"])
gc.parse()
self.assertEqual(context.CLIARGS['offline'], False)
def test_parse_init(self):
''' testing the options parser when the action 'init' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "init", "foo"])
gc.parse()
self.assertEqual(context.CLIARGS['offline'], False)
self.assertEqual(context.CLIARGS['force'], False)
def test_parse_install(self):
''' testing the options parser when the action 'install' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "install"])
gc.parse()
self.assertEqual(context.CLIARGS['ignore_errors'], False)
self.assertEqual(context.CLIARGS['no_deps'], False)
self.assertEqual(context.CLIARGS['role_file'], None)
self.assertEqual(context.CLIARGS['force'], False)
def test_parse_list(self):
''' testing the options parser when the action 'list' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "list"])
gc.parse()
self.assertEqual(context.CLIARGS['verbosity'], 0)
def test_parse_login(self):
''' testing the options parser when the action 'login' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "login"])
gc.parse()
self.assertEqual(context.CLIARGS['verbosity'], 0)
self.assertEqual(context.CLIARGS['token'], None)
def test_parse_remove(self):
''' testing the options parser when the action 'remove' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "remove", "foo"])
gc.parse()
self.assertEqual(context.CLIARGS['verbosity'], 0)
def test_parse_search(self):
''' testing the options parser when the action 'search' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "search"])
gc.parse()
self.assertEqual(context.CLIARGS['platforms'], None)
self.assertEqual(context.CLIARGS['galaxy_tags'], None)
self.assertEqual(context.CLIARGS['author'], None)
def test_parse_setup(self):
''' testing the options parser when the action 'setup' is given '''
gc = GalaxyCLI(args=["ansible-galaxy", "setup", "source", "github_user", "github_repo", "secret"])
gc.parse()
self.assertEqual(context.CLIARGS['verbosity'], 0)
self.assertEqual(context.CLIARGS['remove_id'], None)
self.assertEqual(context.CLIARGS['setup_list'], False)
class ValidRoleTests(object):
expected_role_dirs = ('defaults', 'files', 'handlers', 'meta', 'tasks', 'templates', 'vars', 'tests')
@classmethod
def setUpRole(cls, role_name, galaxy_args=None, skeleton_path=None, use_explicit_type=False):
if galaxy_args is None:
galaxy_args = []
if skeleton_path is not None:
cls.role_skeleton_path = skeleton_path
galaxy_args += ['--role-skeleton', skeleton_path]
# Make temp directory for testing
cls.test_dir = tempfile.mkdtemp()
if not os.path.isdir(cls.test_dir):
os.makedirs(cls.test_dir)
cls.role_dir = os.path.join(cls.test_dir, role_name)
cls.role_name = role_name
# create role using default skeleton
args = ['ansible-galaxy']
if use_explicit_type:
args += ['role']
args += ['init', '-c', '--offline'] + galaxy_args + ['--init-path', cls.test_dir, cls.role_name]
gc = GalaxyCLI(args=args)
gc.run()
cls.gc = gc
if skeleton_path is None:
cls.role_skeleton_path = gc.galaxy.default_role_skeleton_path
@classmethod
def tearDownClass(cls):
if os.path.isdir(cls.test_dir):
shutil.rmtree(cls.test_dir)
def test_metadata(self):
with open(os.path.join(self.role_dir, 'meta', 'main.yml'), 'r') as mf:
metadata = yaml.safe_load(mf)
self.assertIn('galaxy_info', metadata, msg='unable to find galaxy_info in metadata')
self.assertIn('dependencies', metadata, msg='unable to find dependencies in metadata')
def test_readme(self):
readme_path = os.path.join(self.role_dir, 'README.md')
self.assertTrue(os.path.exists(readme_path), msg='Readme doesn\'t exist')
def test_main_ymls(self):
need_main_ymls = set(self.expected_role_dirs) - set(['meta', 'tests', 'files', 'templates'])
for d in need_main_ymls:
main_yml = os.path.join(self.role_dir, d, 'main.yml')
self.assertTrue(os.path.exists(main_yml))
expected_string = "---\n# {0} file for {1}".format(d, self.role_name)
with open(main_yml, 'r') as f:
self.assertEqual(expected_string, f.read().strip())
def test_role_dirs(self):
for d in self.expected_role_dirs:
self.assertTrue(os.path.isdir(os.path.join(self.role_dir, d)), msg="Expected role subdirectory {0} doesn't exist".format(d))
def test_travis_yml(self):
with open(os.path.join(self.role_dir, '.travis.yml'), 'r') as f:
contents = f.read()
with open(os.path.join(self.role_skeleton_path, '.travis.yml'), 'r') as f:
expected_contents = f.read()
self.assertEqual(expected_contents, contents, msg='.travis.yml does not match expected')
def test_readme_contents(self):
with open(os.path.join(self.role_dir, 'README.md'), 'r') as readme:
contents = readme.read()
with open(os.path.join(self.role_skeleton_path, 'README.md'), 'r') as f:
expected_contents = f.read()
self.assertEqual(expected_contents, contents, msg='README.md does not match expected')
def test_test_yml(self):
with open(os.path.join(self.role_dir, 'tests', 'test.yml'), 'r') as f:
test_playbook = yaml.safe_load(f)
print(test_playbook)
self.assertEqual(len(test_playbook), 1)
self.assertEqual(test_playbook[0]['hosts'], 'localhost')
self.assertEqual(test_playbook[0]['remote_user'], 'root')
self.assertListEqual(test_playbook[0]['roles'], [self.role_name], msg='The list of roles included in the test play doesn\'t match')
class TestGalaxyInitDefault(unittest.TestCase, ValidRoleTests):
@classmethod
def setUpClass(cls):
cls.setUpRole(role_name='delete_me')
def test_metadata_contents(self):
with open(os.path.join(self.role_dir, 'meta', 'main.yml'), 'r') as mf:
metadata = yaml.safe_load(mf)
self.assertEqual(metadata.get('galaxy_info', dict()).get('author'), 'your name', msg='author was not set properly in metadata')
class TestGalaxyInitAPB(unittest.TestCase, ValidRoleTests):
@classmethod
def setUpClass(cls):
cls.setUpRole('delete_me_apb', galaxy_args=['--type=apb'])
def test_metadata_apb_tag(self):
with open(os.path.join(self.role_dir, 'meta', 'main.yml'), 'r') as mf:
metadata = yaml.safe_load(mf)
self.assertIn('apb', metadata.get('galaxy_info', dict()).get('galaxy_tags', []), msg='apb tag not set in role metadata')
def test_metadata_contents(self):
with open(os.path.join(self.role_dir, 'meta', 'main.yml'), 'r') as mf:
metadata = yaml.safe_load(mf)
self.assertEqual(metadata.get('galaxy_info', dict()).get('author'), 'your name', msg='author was not set properly in metadata')
def test_apb_yml(self):
self.assertTrue(os.path.exists(os.path.join(self.role_dir, 'apb.yml')), msg='apb.yml was not created')
def test_test_yml(self):
with open(os.path.join(self.role_dir, 'tests', 'test.yml'), 'r') as f:
test_playbook = yaml.safe_load(f)
print(test_playbook)
self.assertEqual(len(test_playbook), 1)
self.assertEqual(test_playbook[0]['hosts'], 'localhost')
self.assertFalse(test_playbook[0]['gather_facts'])
self.assertEqual(test_playbook[0]['connection'], 'local')
self.assertIsNone(test_playbook[0]['tasks'], msg='We\'re expecting an unset list of tasks in test.yml')
class TestGalaxyInitContainer(unittest.TestCase, ValidRoleTests):
@classmethod
def setUpClass(cls):
cls.setUpRole('delete_me_container', galaxy_args=['--type=container'])
def test_metadata_container_tag(self):
with open(os.path.join(self.role_dir, 'meta', 'main.yml'), 'r') as mf:
metadata = yaml.safe_load(mf)
self.assertIn('container', metadata.get('galaxy_info', dict()).get('galaxy_tags', []), msg='container tag not set in role metadata')
def test_metadata_contents(self):
with open(os.path.join(self.role_dir, 'meta', 'main.yml'), 'r') as mf:
metadata = yaml.safe_load(mf)
self.assertEqual(metadata.get('galaxy_info', dict()).get('author'), 'your name', msg='author was not set properly in metadata')
def test_meta_container_yml(self):
self.assertTrue(os.path.exists(os.path.join(self.role_dir, 'meta', 'container.yml')), msg='container.yml was not created')
def test_test_yml(self):
with open(os.path.join(self.role_dir, 'tests', 'test.yml'), 'r') as f:
test_playbook = yaml.safe_load(f)
print(test_playbook)
self.assertEqual(len(test_playbook), 1)
self.assertEqual(test_playbook[0]['hosts'], 'localhost')
self.assertFalse(test_playbook[0]['gather_facts'])
self.assertEqual(test_playbook[0]['connection'], 'local')
self.assertIsNone(test_playbook[0]['tasks'], msg='We\'re expecting an unset list of tasks in test.yml')
class TestGalaxyInitSkeleton(unittest.TestCase, ValidRoleTests):
@classmethod
def setUpClass(cls):
role_skeleton_path = os.path.join(os.path.split(__file__)[0], 'test_data', 'role_skeleton')
cls.setUpRole('delete_me_skeleton', skeleton_path=role_skeleton_path, use_explicit_type=True)
def test_empty_files_dir(self):
files_dir = os.path.join(self.role_dir, 'files')
self.assertTrue(os.path.isdir(files_dir))
self.assertListEqual(os.listdir(files_dir), [], msg='we expect the files directory to be empty, is ignore working?')
def test_template_ignore_jinja(self):
test_conf_j2 = os.path.join(self.role_dir, 'templates', 'test.conf.j2')
self.assertTrue(os.path.exists(test_conf_j2), msg="The test.conf.j2 template doesn't seem to exist, is it being rendered as test.conf?")
with open(test_conf_j2, 'r') as f:
contents = f.read()
expected_contents = '[defaults]\ntest_key = {{ test_variable }}'
self.assertEqual(expected_contents, contents.strip(), msg="test.conf.j2 doesn't contain what it should, is it being rendered?")
def test_template_ignore_jinja_subfolder(self):
test_conf_j2 = os.path.join(self.role_dir, 'templates', 'subfolder', 'test.conf.j2')
self.assertTrue(os.path.exists(test_conf_j2), msg="The test.conf.j2 template doesn't seem to exist, is it being rendered as test.conf?")
with open(test_conf_j2, 'r') as f:
contents = f.read()
expected_contents = '[defaults]\ntest_key = {{ test_variable }}'
self.assertEqual(expected_contents, contents.strip(), msg="test.conf.j2 doesn't contain what it should, is it being rendered?")
def test_template_ignore_similar_folder(self):
self.assertTrue(os.path.exists(os.path.join(self.role_dir, 'templates_extra', 'templates.txt')))
def test_skeleton_option(self):
self.assertEqual(self.role_skeleton_path, context.CLIARGS['role_skeleton'], msg='Skeleton path was not parsed properly from the command line')
@pytest.mark.parametrize('cli_args, expected', [
(['ansible-galaxy', 'collection', 'init', 'abc.def'], 0),
(['ansible-galaxy', 'collection', 'init', 'abc.def', '-vvv'], 3),
(['ansible-galaxy', '-vv', 'collection', 'init', 'abc.def'], 2),
# Due to our manual parsing we want to verify that -v set in the sub parser takes precedence. This behaviour is
# deprecated and tests should be removed when the code that handles it is removed
(['ansible-galaxy', '-vv', 'collection', 'init', 'abc.def', '-v'], 1),
(['ansible-galaxy', '-vv', 'collection', 'init', 'abc.def', '-vvvv'], 4),
(['ansible-galaxy', '-vvv', 'init', 'name'], 3),
(['ansible-galaxy', '-vvvvv', 'init', '-v', 'name'], 1),
])
def test_verbosity_arguments(cli_args, expected, monkeypatch):
# Mock out the functions so we don't actually execute anything
for func_name in [f for f in dir(GalaxyCLI) if f.startswith("execute_")]:
monkeypatch.setattr(GalaxyCLI, func_name, MagicMock())
cli = GalaxyCLI(args=cli_args)
cli.run()
assert context.CLIARGS['verbosity'] == expected
@pytest.fixture()
def collection_skeleton(request, tmp_path_factory):
name, skeleton_path = request.param
galaxy_args = ['ansible-galaxy', 'collection', 'init', '-c']
if skeleton_path is not None:
galaxy_args += ['--collection-skeleton', skeleton_path]
test_dir = to_text(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections'))
galaxy_args += ['--init-path', test_dir, name]
GalaxyCLI(args=galaxy_args).run()
namespace_name, collection_name = name.split('.', 1)
collection_dir = os.path.join(test_dir, namespace_name, collection_name)
return collection_dir
@pytest.mark.parametrize('collection_skeleton', [
('ansible_test.my_collection', None),
], indirect=True)
def test_collection_default(collection_skeleton):
meta_path = os.path.join(collection_skeleton, 'galaxy.yml')
with open(meta_path, 'r') as galaxy_meta:
metadata = yaml.safe_load(galaxy_meta)
assert metadata['namespace'] == 'ansible_test'
assert metadata['name'] == 'my_collection'
assert metadata['authors'] == ['your name <[email protected]>']
assert metadata['readme'] == 'README.md'
assert metadata['version'] == '1.0.0'
assert metadata['description'] == 'your collection description'
assert metadata['license'] == ['GPL-2.0-or-later']
assert metadata['tags'] == []
assert metadata['dependencies'] == {}
assert metadata['documentation'] == 'http://docs.example.com'
assert metadata['repository'] == 'http://example.com/repository'
assert metadata['homepage'] == 'http://example.com'
assert metadata['issues'] == 'http://example.com/issue/tracker'
for d in ['docs', 'plugins', 'roles']:
assert os.path.isdir(os.path.join(collection_skeleton, d)), \
"Expected collection subdirectory {0} doesn't exist".format(d)
@pytest.mark.parametrize('collection_skeleton', [
('ansible_test.delete_me_skeleton', os.path.join(os.path.split(__file__)[0], 'test_data', 'collection_skeleton')),
], indirect=True)
def test_collection_skeleton(collection_skeleton):
meta_path = os.path.join(collection_skeleton, 'galaxy.yml')
with open(meta_path, 'r') as galaxy_meta:
metadata = yaml.safe_load(galaxy_meta)
assert metadata['namespace'] == 'ansible_test'
assert metadata['name'] == 'delete_me_skeleton'
assert metadata['authors'] == ['Ansible Cow <[email protected]>', 'Tu Cow <[email protected]>']
assert metadata['version'] == '0.1.0'
assert metadata['readme'] == 'README.md'
assert len(metadata) == 5
assert os.path.exists(os.path.join(collection_skeleton, 'README.md'))
# Test empty directories exist and are empty
for empty_dir in ['plugins/action', 'plugins/filter', 'plugins/inventory', 'plugins/lookup',
'plugins/module_utils', 'plugins/modules']:
assert os.listdir(os.path.join(collection_skeleton, empty_dir)) == []
# Test files that don't end with .j2 were not templated
doc_file = os.path.join(collection_skeleton, 'docs', 'My Collection.md')
with open(doc_file, 'r') as f:
doc_contents = f.read()
assert doc_contents.strip() == 'Welcome to my test collection doc for {{ namespace }}.'
# Test files that end with .j2 but are in the templates directory were not templated
for template_dir in ['playbooks/templates', 'playbooks/templates/subfolder',
'roles/common/templates', 'roles/common/templates/subfolder']:
test_conf_j2 = os.path.join(collection_skeleton, template_dir, 'test.conf.j2')
assert os.path.exists(test_conf_j2)
with open(test_conf_j2, 'r') as f:
contents = f.read()
expected_contents = '[defaults]\ntest_key = {{ test_variable }}'
assert expected_contents == contents.strip()
@pytest.fixture()
def collection_artifact(collection_skeleton, tmp_path_factory):
''' Creates a collection artifact tarball that is ready to be published and installed '''
output_dir = to_text(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Output'))
# Because we call GalaxyCLI in collection_skeleton we need to reset the singleton back to None so it uses the new
# args; we restore the original args once it is done.
orig_cli_args = co.GlobalCLIArgs._Singleton__instance
try:
co.GlobalCLIArgs._Singleton__instance = None
galaxy_args = ['ansible-galaxy', 'collection', 'build', collection_skeleton, '--output-path', output_dir]
gc = GalaxyCLI(args=galaxy_args)
gc.run()
yield output_dir
finally:
co.GlobalCLIArgs._Singleton__instance = orig_cli_args
def test_invalid_skeleton_path():
expected = "- the skeleton path '/fake/path' does not exist, cannot init collection"
gc = GalaxyCLI(args=['ansible-galaxy', 'collection', 'init', 'my.collection', '--collection-skeleton',
'/fake/path'])
with pytest.raises(AnsibleError, match=expected):
gc.run()
@pytest.mark.parametrize("name", [
"",
"invalid",
"hypen-ns.collection",
"ns.hyphen-collection",
"ns.collection.weird",
])
def test_invalid_collection_name_init(name):
expected = "Invalid collection name '%s', name must be in the format <namespace>.<collection>" % name
gc = GalaxyCLI(args=['ansible-galaxy', 'collection', 'init', name])
with pytest.raises(AnsibleError, match=expected):
gc.run()
@pytest.mark.parametrize("name, expected", [
("", ""),
("invalid", "invalid"),
("invalid:1.0.0", "invalid"),
("hypen-ns.collection", "hypen-ns.collection"),
("ns.hyphen-collection", "ns.hyphen-collection"),
("ns.collection.weird", "ns.collection.weird"),
])
def test_invalid_collection_name_install(name, expected, tmp_path_factory):
install_path = to_text(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections'))
expected = "Invalid collection name '%s', name must be in the format <namespace>.<collection>" % expected
gc = GalaxyCLI(args=['ansible-galaxy', 'collection', 'install', name, '-p', os.path.join(install_path, 'install')])
with pytest.raises(AnsibleError, match=expected):
gc.run()
@pytest.mark.parametrize('collection_skeleton', [
('ansible_test.build_collection', None),
], indirect=True)
def test_collection_build(collection_artifact):
tar_path = os.path.join(collection_artifact, 'ansible_test-build_collection-1.0.0.tar.gz')
assert tarfile.is_tarfile(tar_path)
with tarfile.open(tar_path, mode='r') as tar:
tar_members = tar.getmembers()
valid_files = ['MANIFEST.json', 'FILES.json', 'roles', 'docs', 'plugins', 'plugins/README.md', 'README.md']
assert len(tar_members) == 7
# Verify the uid and gid is 0 and the correct perms are set
for member in tar_members:
assert member.name in valid_files
assert member.gid == 0
assert member.gname == ''
assert member.uid == 0
assert member.uname == ''
if member.isdir():
assert member.mode == 0o0755
else:
assert member.mode == 0o0644
manifest_file = tar.extractfile(tar_members[0])
try:
manifest = json.loads(to_text(manifest_file.read()))
finally:
manifest_file.close()
coll_info = manifest['collection_info']
file_manifest = manifest['file_manifest_file']
assert manifest['format'] == 1
assert len(manifest.keys()) == 3
assert coll_info['namespace'] == 'ansible_test'
assert coll_info['name'] == 'build_collection'
assert coll_info['version'] == '1.0.0'
assert coll_info['authors'] == ['your name <[email protected]>']
assert coll_info['readme'] == 'README.md'
assert coll_info['tags'] == []
assert coll_info['description'] == 'your collection description'
assert coll_info['license'] == ['GPL-2.0-or-later']
assert coll_info['license_file'] is None
assert coll_info['dependencies'] == {}
assert coll_info['repository'] == 'http://example.com/repository'
assert coll_info['documentation'] == 'http://docs.example.com'
assert coll_info['homepage'] == 'http://example.com'
assert coll_info['issues'] == 'http://example.com/issue/tracker'
assert len(coll_info.keys()) == 14
assert file_manifest['name'] == 'FILES.json'
assert file_manifest['ftype'] == 'file'
assert file_manifest['chksum_type'] == 'sha256'
assert file_manifest['chksum_sha256'] is not None # Order of keys makes it hard to verify the checksum
assert file_manifest['format'] == 1
assert len(file_manifest.keys()) == 5
files_file = tar.extractfile(tar_members[1])
try:
files = json.loads(to_text(files_file.read()))
finally:
files_file.close()
assert len(files['files']) == 6
assert files['format'] == 1
assert len(files.keys()) == 2
valid_files_entries = ['.', 'roles', 'docs', 'plugins', 'plugins/README.md', 'README.md']
for file_entry in files['files']:
assert file_entry['name'] in valid_files_entries
assert file_entry['format'] == 1
if file_entry['name'] == 'plugins/README.md':
assert file_entry['ftype'] == 'file'
assert file_entry['chksum_type'] == 'sha256'
# Can't test the actual checksum as the html link changes based on the version.
assert file_entry['chksum_sha256'] is not None
elif file_entry['name'] == 'README.md':
assert file_entry['ftype'] == 'file'
assert file_entry['chksum_type'] == 'sha256'
assert file_entry['chksum_sha256'] == '45923ca2ece0e8ce31d29e5df9d8b649fe55e2f5b5b61c9724d7cc187bd6ad4a'
else:
assert file_entry['ftype'] == 'dir'
assert file_entry['chksum_type'] is None
assert file_entry['chksum_sha256'] is None
assert len(file_entry.keys()) == 5
@pytest.fixture()
def collection_install(reset_cli_args, tmp_path_factory, monkeypatch):
mock_install = MagicMock()
monkeypatch.setattr(ansible.cli.galaxy, 'install_collections', mock_install)
mock_warning = MagicMock()
monkeypatch.setattr(ansible.utils.display.Display, 'warning', mock_warning)
output_dir = to_text((tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Output')))
yield mock_install, mock_warning, output_dir
def test_collection_install_with_names(collection_install):
mock_install, mock_warning, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', 'namespace2.collection:1.0.1',
'--collections-path', output_dir]
GalaxyCLI(args=galaxy_args).run()
collection_path = os.path.join(output_dir, 'ansible_collections')
assert os.path.isdir(collection_path)
assert mock_warning.call_count == 1
assert "The specified collections path '%s' is not part of the configured Ansible collections path" % output_dir \
in mock_warning.call_args[0][0]
assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.collection', '*', None),
('namespace2.collection', '1.0.1', None)]
assert mock_install.call_args[0][1] == collection_path
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
assert mock_install.call_args[0][3] is True
assert mock_install.call_args[0][4] is False
assert mock_install.call_args[0][5] is False
assert mock_install.call_args[0][6] is False
assert mock_install.call_args[0][7] is False
def test_collection_install_with_requirements_file(collection_install):
mock_install, mock_warning, output_dir = collection_install
requirements_file = os.path.join(output_dir, 'requirements.yml')
with open(requirements_file, 'wb') as req_obj:
req_obj.write(b'''---
collections:
- namespace.coll
- name: namespace2.coll
version: '>2.0.1'
''')
galaxy_args = ['ansible-galaxy', 'collection', 'install', '--requirements-file', requirements_file,
'--collections-path', output_dir]
GalaxyCLI(args=galaxy_args).run()
collection_path = os.path.join(output_dir, 'ansible_collections')
assert os.path.isdir(collection_path)
assert mock_warning.call_count == 1
assert "The specified collections path '%s' is not part of the configured Ansible collections path" % output_dir \
in mock_warning.call_args[0][0]
assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.coll', '*', None),
('namespace2.coll', '>2.0.1', None)]
assert mock_install.call_args[0][1] == collection_path
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
assert mock_install.call_args[0][3] is True
assert mock_install.call_args[0][4] is False
assert mock_install.call_args[0][5] is False
assert mock_install.call_args[0][6] is False
assert mock_install.call_args[0][7] is False
def test_collection_install_with_relative_path(collection_install, monkeypatch):
mock_install = collection_install[0]
mock_req = MagicMock()
mock_req.return_value = {'collections': [('namespace.coll', '*', None)]}
monkeypatch.setattr(ansible.cli.galaxy.GalaxyCLI, '_parse_requirements_file', mock_req)
monkeypatch.setattr(os, 'makedirs', MagicMock())
requirements_file = './requirements.myl'
collections_path = './ansible_collections'
galaxy_args = ['ansible-galaxy', 'collection', 'install', '--requirements-file', requirements_file,
'--collections-path', collections_path]
GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.coll', '*', None)]
assert mock_install.call_args[0][1] == os.path.abspath(collections_path)
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
assert mock_install.call_args[0][3] is True
assert mock_install.call_args[0][4] is False
assert mock_install.call_args[0][5] is False
assert mock_install.call_args[0][6] is False
assert mock_install.call_args[0][7] is False
assert mock_req.call_count == 1
assert mock_req.call_args[0][0] == os.path.abspath(requirements_file)
def test_collection_install_with_unexpanded_path(collection_install, monkeypatch):
mock_install = collection_install[0]
mock_req = MagicMock()
mock_req.return_value = {'collections': [('namespace.coll', '*', None)]}
monkeypatch.setattr(ansible.cli.galaxy.GalaxyCLI, '_parse_requirements_file', mock_req)
monkeypatch.setattr(os, 'makedirs', MagicMock())
requirements_file = '~/requirements.myl'
collections_path = '~/ansible_collections'
galaxy_args = ['ansible-galaxy', 'collection', 'install', '--requirements-file', requirements_file,
'--collections-path', collections_path]
GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.coll', '*', None)]
assert mock_install.call_args[0][1] == os.path.expanduser(os.path.expandvars(collections_path))
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
assert mock_install.call_args[0][3] is True
assert mock_install.call_args[0][4] is False
assert mock_install.call_args[0][5] is False
assert mock_install.call_args[0][6] is False
assert mock_install.call_args[0][7] is False
assert mock_req.call_count == 1
assert mock_req.call_args[0][0] == os.path.expanduser(os.path.expandvars(requirements_file))
def test_collection_install_in_collection_dir(collection_install, monkeypatch):
mock_install, mock_warning, output_dir = collection_install
collections_path = C.COLLECTIONS_PATHS[0]
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', 'namespace2.collection:1.0.1',
'--collections-path', collections_path]
GalaxyCLI(args=galaxy_args).run()
assert mock_warning.call_count == 0
assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.collection', '*', None),
('namespace2.collection', '1.0.1', None)]
assert mock_install.call_args[0][1] == os.path.join(collections_path, 'ansible_collections')
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
assert mock_install.call_args[0][3] is True
assert mock_install.call_args[0][4] is False
assert mock_install.call_args[0][5] is False
assert mock_install.call_args[0][6] is False
assert mock_install.call_args[0][7] is False
def test_collection_install_with_url(collection_install):
mock_install, dummy, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'https://foo/bar/foo-bar-v1.0.0.tar.gz',
'--collections-path', output_dir]
GalaxyCLI(args=galaxy_args).run()
collection_path = os.path.join(output_dir, 'ansible_collections')
assert os.path.isdir(collection_path)
assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('https://foo/bar/foo-bar-v1.0.0.tar.gz', '*', None)]
assert mock_install.call_args[0][1] == collection_path
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
assert mock_install.call_args[0][3] is True
assert mock_install.call_args[0][4] is False
assert mock_install.call_args[0][5] is False
assert mock_install.call_args[0][6] is False
assert mock_install.call_args[0][7] is False
def test_collection_install_name_and_requirements_fail(collection_install):
test_path = collection_install[2]
expected = 'The positional collection_name arg and --requirements-file are mutually exclusive.'
with pytest.raises(AnsibleError, match=expected):
GalaxyCLI(args=['ansible-galaxy', 'collection', 'install', 'namespace.collection', '--collections-path',
test_path, '--requirements-file', test_path]).run()
def test_collection_install_no_name_and_requirements_fail(collection_install):
test_path = collection_install[2]
expected = 'You must specify a collection name or a requirements file.'
with pytest.raises(AnsibleError, match=expected):
GalaxyCLI(args=['ansible-galaxy', 'collection', 'install', '--collections-path', test_path]).run()
def test_collection_install_path_with_ansible_collections(collection_install):
mock_install, mock_warning, output_dir = collection_install
collection_path = os.path.join(output_dir, 'ansible_collections')
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', 'namespace2.collection:1.0.1',
'--collections-path', collection_path]
GalaxyCLI(args=galaxy_args).run()
assert os.path.isdir(collection_path)
assert mock_warning.call_count == 1
assert "The specified collections path '%s' is not part of the configured Ansible collections path" \
% collection_path in mock_warning.call_args[0][0]
assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.collection', '*', None),
('namespace2.collection', '1.0.1', None)]
assert mock_install.call_args[0][1] == collection_path
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
assert mock_install.call_args[0][3] is True
assert mock_install.call_args[0][4] is False
assert mock_install.call_args[0][5] is False
assert mock_install.call_args[0][6] is False
assert mock_install.call_args[0][7] is False
def test_collection_install_ignore_certs(collection_install):
mock_install, mock_warning, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', '--collections-path', output_dir,
'--ignore-certs']
GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_args[0][3] is False
def test_collection_install_force(collection_install):
mock_install, mock_warning, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', '--collections-path', output_dir,
'--force']
GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_args[0][6] is True
def test_collection_install_force_deps(collection_install):
mock_install, mock_warning, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', '--collections-path', output_dir,
'--force-with-deps']
GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_args[0][7] is True
def test_collection_install_no_deps(collection_install):
mock_install, mock_warning, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', '--collections-path', output_dir,
'--no-deps']
GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_args[0][5] is True
def test_collection_install_ignore(collection_install):
mock_install, mock_warning, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', '--collections-path', output_dir,
'--ignore-errors']
GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_args[0][4] is True
def test_collection_install_custom_server(collection_install):
mock_install, mock_warning, output_dir = collection_install
galaxy_args = ['ansible-galaxy', 'collection', 'install', 'namespace.collection', '--collections-path', output_dir,
'--server', 'https://galaxy-dev.ansible.com']
GalaxyCLI(args=galaxy_args).run()
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy-dev.ansible.com'
assert mock_install.call_args[0][2][0].validate_certs is True
@pytest.fixture()
def requirements_file(request, tmp_path_factory):
content = request.param
test_dir = to_text(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Requirements'))
requirements_file = os.path.join(test_dir, 'requirements.yml')
if content:
with open(requirements_file, 'wb') as req_obj:
req_obj.write(to_bytes(content))
yield requirements_file
@pytest.fixture()
def requirements_cli(monkeypatch):
monkeypatch.setattr(GalaxyCLI, 'execute_install', MagicMock())
cli = GalaxyCLI(args=['ansible-galaxy', 'install'])
cli.run()
return cli
@pytest.mark.parametrize('requirements_file', [None], indirect=True)
def test_parse_requirements_file_that_doesnt_exist(requirements_cli, requirements_file):
expected = "The requirements file '%s' does not exist." % to_native(requirements_file)
with pytest.raises(AnsibleError, match=expected):
requirements_cli._parse_requirements_file(requirements_file)
@pytest.mark.parametrize('requirements_file', ['not a valid yml file: hi: world'], indirect=True)
def test_parse_requirements_file_that_isnt_yaml(requirements_cli, requirements_file):
expected = "Failed to parse the requirements yml at '%s' with the following error" % to_native(requirements_file)
with pytest.raises(AnsibleError, match=expected):
requirements_cli._parse_requirements_file(requirements_file)
@pytest.mark.parametrize('requirements_file', [('''
# Older role based requirements.yml
- galaxy.role
- anotherrole
''')], indirect=True)
def test_parse_requirements_in_older_format_illega(requirements_cli, requirements_file):
expected = "Expecting requirements file to be a dict with the key 'collections' that contains a list of " \
"collections to install"
with pytest.raises(AnsibleError, match=expected):
requirements_cli._parse_requirements_file(requirements_file, allow_old_format=False)
@pytest.mark.parametrize('requirements_file', ['''
collections:
- version: 1.0.0
'''], indirect=True)
def test_parse_requirements_without_mandatory_name_key(requirements_cli, requirements_file):
expected = "Collections requirement entry should contain the key name."
with pytest.raises(AnsibleError, match=expected):
requirements_cli._parse_requirements_file(requirements_file)
@pytest.mark.parametrize('requirements_file', [('''
collections:
- namespace.collection1
- namespace.collection2
'''), ('''
collections:
- name: namespace.collection1
- name: namespace.collection2
''')], indirect=True)
def test_parse_requirements(requirements_cli, requirements_file):
expected = {
'roles': [],
'collections': [('namespace.collection1', '*', None), ('namespace.collection2', '*', None)]
}
actual = requirements_cli._parse_requirements_file(requirements_file)
assert actual == expected
@pytest.mark.parametrize('requirements_file', ['''
collections:
- name: namespace.collection1
version: ">=1.0.0,<=2.0.0"
source: https://galaxy-dev.ansible.com
- namespace.collection2'''], indirect=True)
def test_parse_requirements_with_extra_info(requirements_cli, requirements_file):
actual = requirements_cli._parse_requirements_file(requirements_file)
assert len(actual['roles']) == 0
assert len(actual['collections']) == 2
assert actual['collections'][0][0] == 'namespace.collection1'
assert actual['collections'][0][1] == '>=1.0.0,<=2.0.0'
assert actual['collections'][0][2].api_server == 'https://galaxy-dev.ansible.com'
assert actual['collections'][0][2].name == 'explicit_requirement_namespace.collection1'
assert actual['collections'][0][2].token is None
assert actual['collections'][0][2].username is None
assert actual['collections'][0][2].password is None
assert actual['collections'][0][2].validate_certs is True
assert actual['collections'][1] == ('namespace.collection2', '*', None)
@pytest.mark.parametrize('requirements_file', ['''
roles:
- username.role_name
- src: username2.role_name2
- src: ssh://github.com/user/repo
scm: git
collections:
- namespace.collection2
'''], indirect=True)
def test_parse_requirements_with_roles_and_collections(requirements_cli, requirements_file):
actual = requirements_cli._parse_requirements_file(requirements_file)
assert len(actual['roles']) == 3
assert actual['roles'][0].name == 'username.role_name'
assert actual['roles'][1].name == 'username2.role_name2'
assert actual['roles'][2].name == 'repo'
assert actual['roles'][2].src == 'ssh://github.com/user/repo'
assert len(actual['collections']) == 1
assert actual['collections'][0] == ('namespace.collection2', '*', None)
@pytest.mark.parametrize('requirements_file', ['''
collections:
- name: namespace.collection
- name: namespace2.collection2
source: https://galaxy-dev.ansible.com/
- name: namespace3.collection3
source: server
'''], indirect=True)
def test_parse_requirements_with_collection_source(requirements_cli, requirements_file):
galaxy_api = GalaxyAPI(requirements_cli.api, 'server', 'https://config-server')
requirements_cli.api_servers.append(galaxy_api)
actual = requirements_cli._parse_requirements_file(requirements_file)
assert actual['roles'] == []
assert len(actual['collections']) == 3
assert actual['collections'][0] == ('namespace.collection', '*', None)
assert actual['collections'][1][0] == 'namespace2.collection2'
assert actual['collections'][1][1] == '*'
assert actual['collections'][1][2].api_server == 'https://galaxy-dev.ansible.com/'
assert actual['collections'][1][2].name == 'explicit_requirement_namespace2.collection2'
assert actual['collections'][1][2].token is None
assert actual['collections'][2] == ('namespace3.collection3', '*', galaxy_api)
@pytest.mark.parametrize('requirements_file', ['''
- username.included_role
- src: https://github.com/user/repo
'''], indirect=True)
def test_parse_requirements_roles_with_include(requirements_cli, requirements_file):
reqs = [
'ansible.role',
{'include': requirements_file},
]
parent_requirements = os.path.join(os.path.dirname(requirements_file), 'parent.yaml')
with open(to_bytes(parent_requirements), 'wb') as req_fd:
req_fd.write(to_bytes(yaml.safe_dump(reqs)))
actual = requirements_cli._parse_requirements_file(parent_requirements)
assert len(actual['roles']) == 3
assert actual['collections'] == []
assert actual['roles'][0].name == 'ansible.role'
assert actual['roles'][1].name == 'username.included_role'
assert actual['roles'][2].name == 'repo'
assert actual['roles'][2].src == 'https://github.com/user/repo'
@pytest.mark.parametrize('requirements_file', ['''
- username.role
- include: missing.yml
'''], indirect=True)
def test_parse_requirements_roles_with_include_missing(requirements_cli, requirements_file):
expected = "Failed to find include requirements file 'missing.yml' in '%s'" % to_native(requirements_file)
with pytest.raises(AnsibleError, match=expected):
requirements_cli._parse_requirements_file(requirements_file)
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 66,803 |
Collection tarballs don't preserve file modes
|
##### SUMMARY
If you have executable files in your source code, their executable mode is not preserved in the built tarball.
Could be related to https://github.com/ansible/ansible/blob/99d7f150873011e7515851db9b44ff486efa9d77/lib/ansible/galaxy/collection.py#L763
This was noticed because I had some auxiliary scripts in my repo.
We also need to consider scripts in `files/` if/when collections support roles.
This seems to violate the "principle of least surprise" (a quick way to check the symptom is shown below).
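A minimal check of the symptom (the artifact name below is illustrative): every regular file in the built archive reports mode `0644` even when the source file was `0755`, while directories report `0755`.
```python
import tarfile

# Inspect the member modes of a built collection artifact (path is illustrative).
with tarfile.open('my_namespace-my_collection-1.0.0.tar.gz', mode='r') as tar:
    for member in tar.getmembers():
        print(member.name, oct(member.mode))  # files show 0o644, directories 0o755
```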
**Proposal 1: Keep stripping executable flags**
- Add a line to the output, e.g. `WARNING: tests/coverage.sh: source file has executable flag, though this will be ignored in generated tar.gz`
- Update the "developing a collection" documentation page
- Update integration tests to ensure that the executable mode is always stripped
- Update the code to link to this issue
**Proposal 2: Allow executable flags** (a sketch of this option follows below)
- Update the "developing a collection" documentation page
- Update integration tests to ensure that the executable mode is preserved
- Update the code to link to this issue
- Add a changelog entry
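A minimal sketch of what Proposal 2 could look like when members are added to the archive. This is not the actual `ansible-galaxy` implementation; the `_add_member` helper and the paths are hypothetical. The idea is to keep `0755` when the source file has an executable bit set, otherwise normalize to `0644`, while still forcing the owner to `uid`/`gid` 0.
```python
import os
import stat
import tarfile

def _add_member(tar, src_path, arcname, preserve_exec=True):
    # Hypothetical helper: add src_path to the tar under arcname with a
    # normalized owner and an explicit mode.
    tar_info = tar.gettarinfo(src_path, arcname=arcname)
    tar_info.uid = tar_info.gid = 0
    tar_info.uname = tar_info.gname = ''
    if preserve_exec and os.stat(src_path).st_mode & stat.S_IXUSR:
        tar_info.mode = 0o0755  # Proposal 2: keep the executable bit
    else:
        tar_info.mode = 0o0644  # Proposal 1 behaviour: always strip it
    with open(src_path, 'rb') as src_obj:
        tar.addfile(tarinfo=tar_info, fileobj=src_obj)
```
Proposal 1 would instead keep the `0o0644` branch unconditionally and emit the warning shown above whenever the source mode has an executable bit.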
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible-galaxy
##### ANSIBLE VERSION
```paste below
2.10
```
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/66803
|
https://github.com/ansible/ansible/pull/68418
|
a9d2ceafe429171c0e2ad007058b88bae57c74ce
|
127d54b3630c65043ec12c4af2024f8ef0bc6d09
| 2020-01-27T11:28:53Z |
python
| 2020-03-24T22:08:23Z |
test/units/galaxy/test_collection_install.py
|
# -*- coding: utf-8 -*-
# Copyright: (c) 2019, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import copy
import json
import os
import pytest
import re
import shutil
import tarfile
import yaml
from io import BytesIO, StringIO
from units.compat.mock import MagicMock
import ansible.module_utils.six.moves.urllib.error as urllib_error
from ansible import context
from ansible.cli.galaxy import GalaxyCLI
from ansible.errors import AnsibleError
from ansible.galaxy import collection, api
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.utils import context_objects as co
from ansible.utils.display import Display
def call_galaxy_cli(args):
orig = co.GlobalCLIArgs._Singleton__instance
co.GlobalCLIArgs._Singleton__instance = None
try:
GalaxyCLI(args=['ansible-galaxy', 'collection'] + args).run()
finally:
co.GlobalCLIArgs._Singleton__instance = orig
def artifact_json(namespace, name, version, dependencies, server):
json_str = json.dumps({
'artifact': {
'filename': '%s-%s-%s.tar.gz' % (namespace, name, version),
'sha256': '2d76f3b8c4bab1072848107fb3914c345f71a12a1722f25c08f5d3f51f4ab5fd',
'size': 1234,
},
'download_url': '%s/download/%s-%s-%s.tar.gz' % (server, namespace, name, version),
'metadata': {
'namespace': namespace,
'name': name,
'dependencies': dependencies,
},
'version': version
})
return to_text(json_str)
def artifact_versions_json(namespace, name, versions, galaxy_api, available_api_versions=None):
results = []
available_api_versions = available_api_versions or {}
api_version = 'v2'
if 'v3' in available_api_versions:
api_version = 'v3'
for version in versions:
results.append({
'href': '%s/api/%s/%s/%s/versions/%s/' % (galaxy_api.api_server, api_version, namespace, name, version),
'version': version,
})
if api_version == 'v2':
json_str = json.dumps({
'count': len(versions),
'next': None,
'previous': None,
'results': results
})
if api_version == 'v3':
response = {'meta': {'count': len(versions)},
'data': results,
'links': {'first': None,
'last': None,
'next': None,
'previous': None},
}
json_str = json.dumps(response)
return to_text(json_str)
def error_json(galaxy_api, errors_to_return=None, available_api_versions=None):
errors_to_return = errors_to_return or []
available_api_versions = available_api_versions or {}
response = {}
api_version = 'v2'
if 'v3' in available_api_versions:
api_version = 'v3'
if api_version == 'v2':
assert len(errors_to_return) <= 1
if errors_to_return:
response = errors_to_return[0]
if api_version == 'v3':
response['errors'] = errors_to_return
json_str = json.dumps(response)
return to_text(json_str)
@pytest.fixture(autouse='function')
def reset_cli_args():
co.GlobalCLIArgs._Singleton__instance = None
yield
co.GlobalCLIArgs._Singleton__instance = None
@pytest.fixture()
def collection_artifact(request, tmp_path_factory):
test_dir = to_text(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
namespace = 'ansible_namespace'
collection = 'collection'
skeleton_path = os.path.join(os.path.dirname(os.path.split(__file__)[0]), 'cli', 'test_data', 'collection_skeleton')
collection_path = os.path.join(test_dir, namespace, collection)
call_galaxy_cli(['init', '%s.%s' % (namespace, collection), '-c', '--init-path', test_dir,
'--collection-skeleton', skeleton_path])
dependencies = getattr(request, 'param', None)
if dependencies:
galaxy_yml = os.path.join(collection_path, 'galaxy.yml')
with open(galaxy_yml, 'rb+') as galaxy_obj:
existing_yaml = yaml.safe_load(galaxy_obj)
existing_yaml['dependencies'] = dependencies
galaxy_obj.seek(0)
galaxy_obj.write(to_bytes(yaml.safe_dump(existing_yaml)))
galaxy_obj.truncate()
call_galaxy_cli(['build', collection_path, '--output-path', test_dir])
collection_tar = os.path.join(test_dir, '%s-%s-0.1.0.tar.gz' % (namespace, collection))
return to_bytes(collection_path), to_bytes(collection_tar)
@pytest.fixture()
def galaxy_server():
context.CLIARGS._store = {'ignore_certs': False}
galaxy_api = api.GalaxyAPI(None, 'test_server', 'https://galaxy.ansible.com')
return galaxy_api
def test_build_requirement_from_path(collection_artifact):
actual = collection.CollectionRequirement.from_path(collection_artifact[0], True)
assert actual.namespace == u'ansible_namespace'
assert actual.name == u'collection'
assert actual.b_path == collection_artifact[0]
assert actual.api is None
assert actual.skip is True
assert actual.versions == set([u'*'])
assert actual.latest_version == u'*'
assert actual.dependencies == {}
@pytest.mark.parametrize('version', ['1.1.1', '1.1.0', '1.0.0'])
def test_build_requirement_from_path_with_manifest(version, collection_artifact):
manifest_path = os.path.join(collection_artifact[0], b'MANIFEST.json')
manifest_value = json.dumps({
'collection_info': {
'namespace': 'namespace',
'name': 'name',
'version': version,
'dependencies': {
'ansible_namespace.collection': '*'
}
}
})
with open(manifest_path, 'wb') as manifest_obj:
manifest_obj.write(to_bytes(manifest_value))
actual = collection.CollectionRequirement.from_path(collection_artifact[0], True)
# While the folder name suggests a different collection, we treat MANIFEST.json as the source of truth.
assert actual.namespace == u'namespace'
assert actual.name == u'name'
assert actual.b_path == collection_artifact[0]
assert actual.api is None
assert actual.skip is True
assert actual.versions == set([to_text(version)])
assert actual.latest_version == to_text(version)
assert actual.dependencies == {'ansible_namespace.collection': '*'}
def test_build_requirement_from_path_invalid_manifest(collection_artifact):
manifest_path = os.path.join(collection_artifact[0], b'MANIFEST.json')
with open(manifest_path, 'wb') as manifest_obj:
manifest_obj.write(b"not json")
expected = "Collection file at '%s' does not contain a valid json string." % to_native(manifest_path)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_path(collection_artifact[0], True)
def test_build_requirement_from_path_no_version(collection_artifact, monkeypatch):
manifest_path = os.path.join(collection_artifact[0], b'MANIFEST.json')
manifest_value = json.dumps({
'collection_info': {
'namespace': 'namespace',
'name': 'name',
'version': '',
'dependencies': {}
}
})
with open(manifest_path, 'wb') as manifest_obj:
manifest_obj.write(to_bytes(manifest_value))
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
actual = collection.CollectionRequirement.from_path(collection_artifact[0], True)
# While the folder name suggests a different collection, we treat MANIFEST.json as the source of truth.
assert actual.namespace == u'namespace'
assert actual.name == u'name'
assert actual.b_path == collection_artifact[0]
assert actual.api is None
assert actual.skip is True
assert actual.versions == set(['*'])
assert actual.latest_version == u'*'
assert actual.dependencies == {}
assert mock_display.call_count == 1
actual_warn = ' '.join(mock_display.mock_calls[0][1][0].split('\n'))
expected_warn = "Collection at '%s' does not have a valid version set, falling back to '*'. Found version: ''" \
% to_text(collection_artifact[0])
assert expected_warn in actual_warn
def test_build_requirement_from_tar(collection_artifact):
actual = collection.CollectionRequirement.from_tar(collection_artifact[1], True, True)
assert actual.namespace == u'ansible_namespace'
assert actual.name == u'collection'
assert actual.b_path == collection_artifact[1]
assert actual.api is None
assert actual.skip is False
assert actual.versions == set([u'0.1.0'])
assert actual.latest_version == u'0.1.0'
assert actual.dependencies == {}
def test_build_requirement_from_tar_fail_not_tar(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
test_file = os.path.join(test_dir, b'fake.tar.gz')
with open(test_file, 'wb') as test_obj:
test_obj.write(b"\x00\x01\x02\x03")
expected = "Collection artifact at '%s' is not a valid tar file." % to_native(test_file)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(test_file, True, True)
def test_build_requirement_from_tar_no_manifest(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
json_data = to_bytes(json.dumps(
{
'files': [],
'format': 1,
}
))
tar_path = os.path.join(test_dir, b'ansible-collections.tar.gz')
with tarfile.open(tar_path, 'w:gz') as tfile:
b_io = BytesIO(json_data)
tar_info = tarfile.TarInfo('FILES.json')
tar_info.size = len(json_data)
tar_info.mode = 0o0644
tfile.addfile(tarinfo=tar_info, fileobj=b_io)
expected = "Collection at '%s' does not contain the required file MANIFEST.json." % to_native(tar_path)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(tar_path, True, True)
def test_build_requirement_from_tar_no_files(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
json_data = to_bytes(json.dumps(
{
'collection_info': {},
}
))
tar_path = os.path.join(test_dir, b'ansible-collections.tar.gz')
with tarfile.open(tar_path, 'w:gz') as tfile:
b_io = BytesIO(json_data)
tar_info = tarfile.TarInfo('MANIFEST.json')
tar_info.size = len(json_data)
tar_info.mode = 0o0644
tfile.addfile(tarinfo=tar_info, fileobj=b_io)
expected = "Collection at '%s' does not contain the required file FILES.json." % to_native(tar_path)
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(tar_path, True, True)
def test_build_requirement_from_tar_invalid_manifest(tmp_path_factory):
test_dir = to_bytes(tmp_path_factory.mktemp('test-ÅÑŚÌβŁÈ Collections Input'))
json_data = b"not a json"
tar_path = os.path.join(test_dir, b'ansible-collections.tar.gz')
with tarfile.open(tar_path, 'w:gz') as tfile:
b_io = BytesIO(json_data)
tar_info = tarfile.TarInfo('MANIFEST.json')
tar_info.size = len(json_data)
tar_info.mode = 0o0644
tfile.addfile(tarinfo=tar_info, fileobj=b_io)
expected = "Collection tar file member MANIFEST.json does not contain a valid json string."
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_tar(tar_path, True, True)
def test_build_requirement_from_name(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['2.1.9', '2.1.10']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '*', True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.1.9', u'2.1.10'])
assert actual.latest_version == u'2.1.10'
assert actual.dependencies == {}
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
def test_build_requirement_from_name_with_prerelease(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['1.0.1', '2.0.1-beta.1', '2.0.1']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '*', True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'1.0.1', u'2.0.1'])
assert actual.latest_version == u'2.0.1'
assert actual.dependencies == {}
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
def test_build_requirment_from_name_with_prerelease_explicit(galaxy_server, monkeypatch):
mock_get_info = MagicMock()
mock_get_info.return_value = api.CollectionVersionMetadata('namespace', 'collection', '2.0.1-beta.1', None, None,
{})
monkeypatch.setattr(galaxy_server, 'get_collection_version_metadata', mock_get_info)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '2.0.1-beta.1', True,
True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.1-beta.1'])
assert actual.latest_version == u'2.0.1-beta.1'
assert actual.dependencies == {}
assert mock_get_info.call_count == 1
assert mock_get_info.mock_calls[0][1] == ('namespace', 'collection', '2.0.1-beta.1')
def test_build_requirement_from_name_second_server(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['1.0.1', '1.0.2', '1.0.3']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
broken_server = copy.copy(galaxy_server)
broken_server.api_server = 'https://broken.com/'
mock_404 = MagicMock()
mock_404.side_effect = api.GalaxyError(urllib_error.HTTPError('https://galaxy.server.com', 404, 'msg', {},
StringIO()), "custom msg")
monkeypatch.setattr(broken_server, 'get_collection_versions', mock_404)
actual = collection.CollectionRequirement.from_name('namespace.collection', [broken_server, galaxy_server],
'>1.0.1', False, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
# assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'1.0.2', u'1.0.3'])
assert actual.latest_version == u'1.0.3'
assert actual.dependencies == {}
assert mock_404.call_count == 1
assert mock_404.mock_calls[0][1] == ('namespace', 'collection')
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
def test_build_requirement_from_name_missing(galaxy_server, monkeypatch):
mock_open = MagicMock()
mock_open.side_effect = api.GalaxyError(urllib_error.HTTPError('https://galaxy.server.com', 404, 'msg', {},
StringIO()), "")
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_open)
expected = "Failed to find collection namespace.collection:*"
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server, galaxy_server], '*', False,
True)
def test_build_requirement_from_name_401_unauthorized(galaxy_server, monkeypatch):
mock_open = MagicMock()
mock_open.side_effect = api.GalaxyError(urllib_error.HTTPError('https://galaxy.server.com', 401, 'msg', {},
StringIO()), "error")
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_open)
expected = "error (HTTP Code: 401, Message: msg)"
with pytest.raises(api.GalaxyError, match=re.escape(expected)):
collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server, galaxy_server], '*', False)
def test_build_requirement_from_name_single_version(galaxy_server, monkeypatch):
mock_get_info = MagicMock()
mock_get_info.return_value = api.CollectionVersionMetadata('namespace', 'collection', '2.0.0', None, None,
{})
monkeypatch.setattr(galaxy_server, 'get_collection_version_metadata', mock_get_info)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '2.0.0', True,
True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.0'])
assert actual.latest_version == u'2.0.0'
assert actual.dependencies == {}
assert mock_get_info.call_count == 1
assert mock_get_info.mock_calls[0][1] == ('namespace', 'collection', '2.0.0')
def test_build_requirement_from_name_multiple_versions_one_match(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['2.0.0', '2.0.1', '2.0.2']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
mock_get_info = MagicMock()
mock_get_info.return_value = api.CollectionVersionMetadata('namespace', 'collection', '2.0.1', None, None,
{})
monkeypatch.setattr(galaxy_server, 'get_collection_version_metadata', mock_get_info)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '>=2.0.1,<2.0.2',
True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.1'])
assert actual.latest_version == u'2.0.1'
assert actual.dependencies == {}
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
assert mock_get_info.call_count == 1
assert mock_get_info.mock_calls[0][1] == ('namespace', 'collection', '2.0.1')
def test_build_requirement_from_name_multiple_version_results(galaxy_server, monkeypatch):
mock_get_versions = MagicMock()
mock_get_versions.return_value = ['2.0.0', '2.0.1', '2.0.2', '2.0.3', '2.0.4', '2.0.5']
monkeypatch.setattr(galaxy_server, 'get_collection_versions', mock_get_versions)
actual = collection.CollectionRequirement.from_name('namespace.collection', [galaxy_server], '!=2.0.2',
True, True)
assert actual.namespace == u'namespace'
assert actual.name == u'collection'
assert actual.b_path is None
assert actual.api == galaxy_server
assert actual.skip is False
assert actual.versions == set([u'2.0.0', u'2.0.1', u'2.0.3', u'2.0.4', u'2.0.5'])
assert actual.latest_version == u'2.0.5'
assert actual.dependencies == {}
assert mock_get_versions.call_count == 1
assert mock_get_versions.mock_calls[0][1] == ('namespace', 'collection')
@pytest.mark.parametrize('versions, requirement, expected_filter, expected_latest', [
[['1.0.0', '1.0.1'], '*', ['1.0.0', '1.0.1'], '1.0.1'],
[['1.0.0', '1.0.5', '1.1.0'], '>1.0.0,<1.1.0', ['1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '>1.0.0,<=1.0.5', ['1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '>=1.1.0', ['1.1.0'], '1.1.0'],
[['1.0.0', '1.0.5', '1.1.0'], '!=1.1.0', ['1.0.0', '1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '==1.0.5', ['1.0.5'], '1.0.5'],
[['1.0.0', '1.0.5', '1.1.0'], '1.0.5', ['1.0.5'], '1.0.5'],
[['1.0.0', '2.0.0', '3.0.0'], '>=2', ['2.0.0', '3.0.0'], '3.0.0'],
])
def test_add_collection_requirements(versions, requirement, expected_filter, expected_latest):
req = collection.CollectionRequirement('namespace', 'name', None, 'https://galaxy.com', versions, requirement,
False)
assert req.versions == set(expected_filter)
assert req.latest_version == expected_latest
def test_add_collection_requirement_to_unknown_installed_version(monkeypatch):
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
req = collection.CollectionRequirement('namespace', 'name', None, 'https://galaxy.com', ['*'], '*', False,
skip=True)
req.add_requirement('parent.collection', '1.0.0')
assert req.latest_version == '*'
assert mock_display.call_count == 1
actual_warn = ' '.join(mock_display.mock_calls[0][1][0].split('\n'))
assert "Failed to validate the collection requirement 'namespace.name:1.0.0' for parent.collection" in actual_warn
def test_add_collection_wildcard_requirement_to_unknown_installed_version():
req = collection.CollectionRequirement('namespace', 'name', None, 'https://galaxy.com', ['*'], '*', False,
skip=True)
req.add_requirement(str(req), '*')
assert req.versions == set('*')
assert req.latest_version == '*'
def test_add_collection_requirement_with_conflict(galaxy_server):
expected = "Cannot meet requirement ==1.0.2 for dependency namespace.name from source '%s'. Available versions " \
"before last requirement added: 1.0.0, 1.0.1\n" \
"Requirements from:\n" \
"\tbase - 'namespace.name:==1.0.2'" % galaxy_server.api_server
with pytest.raises(AnsibleError, match=expected):
collection.CollectionRequirement('namespace', 'name', None, galaxy_server, ['1.0.0', '1.0.1'], '==1.0.2',
False)
def test_add_requirement_to_existing_collection_with_conflict(galaxy_server):
req = collection.CollectionRequirement('namespace', 'name', None, galaxy_server, ['1.0.0', '1.0.1'], '*', False)
expected = "Cannot meet dependency requirement 'namespace.name:1.0.2' for collection namespace.collection2 from " \
"source '%s'. Available versions before last requirement added: 1.0.0, 1.0.1\n" \
"Requirements from:\n" \
"\tbase - 'namespace.name:*'\n" \
"\tnamespace.collection2 - 'namespace.name:1.0.2'" % galaxy_server.api_server
with pytest.raises(AnsibleError, match=re.escape(expected)):
req.add_requirement('namespace.collection2', '1.0.2')
def test_add_requirement_to_installed_collection_with_conflict():
source = 'https://galaxy.ansible.com'
req = collection.CollectionRequirement('namespace', 'name', None, source, ['1.0.0', '1.0.1'], '*', False,
skip=True)
expected = "Cannot meet requirement namespace.name:1.0.2 as it is already installed at version '1.0.1'. " \
"Use --force to overwrite"
with pytest.raises(AnsibleError, match=re.escape(expected)):
req.add_requirement(None, '1.0.2')
def test_add_requirement_to_installed_collection_with_conflict_as_dep():
source = 'https://galaxy.ansible.com'
req = collection.CollectionRequirement('namespace', 'name', None, source, ['1.0.0', '1.0.1'], '*', False,
skip=True)
expected = "Cannot meet requirement namespace.name:1.0.2 as it is already installed at version '1.0.1'. " \
"Use --force-with-deps to overwrite"
with pytest.raises(AnsibleError, match=re.escape(expected)):
req.add_requirement('namespace.collection2', '1.0.2')
def test_install_skipped_collection(monkeypatch):
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
req = collection.CollectionRequirement('namespace', 'name', None, 'source', ['1.0.0'], '*', False, skip=True)
req.install(None, None)
assert mock_display.call_count == 1
assert mock_display.mock_calls[0][1][0] == "Skipping 'namespace.name' as it is already installed"
def test_install_collection(collection_artifact, monkeypatch):
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
collection_tar = collection_artifact[1]
output_path = os.path.join(os.path.split(collection_tar)[0], b'output')
collection_path = os.path.join(output_path, b'ansible_namespace', b'collection')
os.makedirs(os.path.join(collection_path, b'delete_me')) # Create a folder to verify the install cleans out the dir
temp_path = os.path.join(os.path.split(collection_tar)[0], b'temp')
os.makedirs(temp_path)
req = collection.CollectionRequirement.from_tar(collection_tar, True, True)
req.install(to_text(output_path), temp_path)
# Ensure the temp directory is empty, nothing is left behind
assert os.listdir(temp_path) == []
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
assert mock_display.call_count == 1
assert mock_display.mock_calls[0][1][0] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" \
% to_text(collection_path)
def test_install_collection_with_download(galaxy_server, collection_artifact, monkeypatch):
collection_tar = collection_artifact[1]
output_path = os.path.join(os.path.split(collection_tar)[0], b'output')
collection_path = os.path.join(output_path, b'ansible_namespace', b'collection')
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
mock_download = MagicMock()
mock_download.return_value = collection_tar
monkeypatch.setattr(collection, '_download_file', mock_download)
monkeypatch.setattr(galaxy_server, '_available_api_versions', {'v2': 'v2/'})
temp_path = os.path.join(os.path.split(collection_tar)[0], b'temp')
os.makedirs(temp_path)
meta = api.CollectionVersionMetadata('ansible_namespace', 'collection', '0.1.0', 'https://downloadme.com',
'myhash', {})
req = collection.CollectionRequirement('ansible_namespace', 'collection', None, galaxy_server,
['0.1.0'], '*', False, metadata=meta)
req.install(to_text(output_path), temp_path)
# Ensure the temp directory is empty, nothing is left behind
assert os.listdir(temp_path) == []
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
assert mock_display.call_count == 1
assert mock_display.mock_calls[0][1][0] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" \
% to_text(collection_path)
assert mock_download.call_count == 1
assert mock_download.mock_calls[0][1][0] == 'https://downloadme.com'
assert mock_download.mock_calls[0][1][1] == temp_path
assert mock_download.mock_calls[0][1][2] == 'myhash'
assert mock_download.mock_calls[0][1][3] is True
def test_install_collections_from_tar(collection_artifact, monkeypatch):
collection_path, collection_tar = collection_artifact
temp_path = os.path.split(collection_tar)[0]
shutil.rmtree(collection_path)
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
[u'https://galaxy.ansible.com'], True, False, False, False, False)
assert os.path.isdir(collection_path)
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
with open(os.path.join(collection_path, b'MANIFEST.json'), 'rb') as manifest_obj:
actual_manifest = json.loads(to_text(manifest_obj.read()))
assert actual_manifest['collection_info']['namespace'] == 'ansible_namespace'
assert actual_manifest['collection_info']['name'] == 'collection'
assert actual_manifest['collection_info']['version'] == '0.1.0'
# Filter out the progress cursor display calls.
display_msgs = [m[1][0] for m in mock_display.mock_calls if 'newline' not in m[2] and len(m[1]) == 1]
assert len(display_msgs) == 3
assert display_msgs[0] == "Process install dependency map"
assert display_msgs[1] == "Starting collection install process"
assert display_msgs[2] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" % to_text(collection_path)
def test_install_collections_existing_without_force(collection_artifact, monkeypatch):
collection_path, collection_tar = collection_artifact
temp_path = os.path.split(collection_tar)[0]
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
# If we don't delete collection_path it will think the original build skeleton is installed so we expect a skip
collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
[u'https://galaxy.ansible.com'], True, False, False, False, False)
assert os.path.isdir(collection_path)
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'README.md', b'docs', b'galaxy.yml', b'playbooks', b'plugins', b'roles']
# Filter out the progress cursor display calls.
display_msgs = [m[1][0] for m in mock_display.mock_calls if 'newline' not in m[2] and len(m[1]) == 1]
assert len(display_msgs) == 4
# Msg1 is the warning about the missing MANIFEST.json; we cannot easily check the message as it contains line breaks
# which vary based on the path size
assert display_msgs[1] == "Process install dependency map"
assert display_msgs[2] == "Starting collection install process"
assert display_msgs[3] == "Skipping 'ansible_namespace.collection' as it is already installed"
# Makes sure we don't get stuck in some recursive loop
@pytest.mark.parametrize('collection_artifact', [
{'ansible_namespace.collection': '>=0.0.1'},
], indirect=True)
def test_install_collection_with_circular_dependency(collection_artifact, monkeypatch):
collection_path, collection_tar = collection_artifact
temp_path = os.path.split(collection_tar)[0]
shutil.rmtree(collection_path)
mock_display = MagicMock()
monkeypatch.setattr(Display, 'display', mock_display)
collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
[u'https://galaxy.ansible.com'], True, False, False, False, False)
assert os.path.isdir(collection_path)
actual_files = os.listdir(collection_path)
actual_files.sort()
assert actual_files == [b'FILES.json', b'MANIFEST.json', b'README.md', b'docs', b'playbooks', b'plugins', b'roles']
with open(os.path.join(collection_path, b'MANIFEST.json'), 'rb') as manifest_obj:
actual_manifest = json.loads(to_text(manifest_obj.read()))
assert actual_manifest['collection_info']['namespace'] == 'ansible_namespace'
assert actual_manifest['collection_info']['name'] == 'collection'
assert actual_manifest['collection_info']['version'] == '0.1.0'
# Filter out the progress cursor display calls.
display_msgs = [m[1][0] for m in mock_display.mock_calls if 'newline' not in m[2] and len(m[1]) == 1]
assert len(display_msgs) == 3
assert display_msgs[0] == "Process install dependency map"
assert display_msgs[1] == "Starting collection install process"
assert display_msgs[2] == "Installing 'ansible_namespace.collection:0.1.0' to '%s'" % to_text(collection_path)
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 66,534 |
ansible-galaxy cli should have an option to keep the collection tarball when installing collection
|
<!--- Verify first that your feature was not already discussed on GitHub -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Describe the new feature/improvement briefly below -->
The use case is for customers on air-gapped networks who want to carry the tarball inside their network on a flash drive.
Currently that is not possible because the CLI does not keep the tarball after installing the collection.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ansible-galaxy cli
##### ADDITIONAL INFORMATION
<!--- Describe how the feature would be used, why it is needed and what it would solve -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can also paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/66534
|
https://github.com/ansible/ansible/pull/67632
|
28f8b8976022728b24534cae871d2b3c8724ecce
|
a2deeb8fa27633194d12dfd8e8768ab57100e6d1
| 2020-01-16T15:02:42Z |
python
| 2020-03-24T22:32:43Z |
changelogs/fragments/galaxy-download.yaml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 66,534 |
ansible-galaxy cli should have an option to keep the collection tarball when installing collection
|
<!--- Verify first that your feature was not already discussed on GitHub -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Describe the new feature/improvement briefly below -->
The use case is for customers on air-gapped networks who want to carry the tarball inside their network on a flash drive.
Currently that is not possible because the CLI does not keep the tarball after installing the collection.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ansible-galaxy cli
##### ADDITIONAL INFORMATION
<!--- Describe how the feature would be used, why it is needed and what it would solve -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can also paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/66534
|
https://github.com/ansible/ansible/pull/67632
|
28f8b8976022728b24534cae871d2b3c8724ecce
|
a2deeb8fa27633194d12dfd8e8768ab57100e6d1
| 2020-01-16T15:02:42Z |
python
| 2020-03-24T22:32:43Z |
docs/docsite/rst/user_guide/collections_using.rst
|
.. _collections:
*****************
Using collections
*****************
Collections are a distribution format for Ansible content that can include playbooks, roles, modules, and plugins.
You can install and use collections through `Ansible Galaxy <https://galaxy.ansible.com>`_.
* For details on how to *develop* collections see :ref:`developing_collections`.
* For the current development status of Collections and FAQ see `Ansible Collections Community Guide <https://github.com/ansible-collections/general/blob/master/README.rst>`_.
.. contents::
:local:
:depth: 2
.. _collections_installing:
Installing collections
======================
Installing collections with ``ansible-galaxy``
----------------------------------------------
.. include:: ../shared_snippets/installing_collections.txt
.. _collections_older_version:
Installing an older version of a collection
-------------------------------------------
.. include:: ../shared_snippets/installing_older_collection.txt
.. _collection_requirements_file:
Install multiple collections with a requirements file
-----------------------------------------------------
.. include:: ../shared_snippets/installing_multiple_collections.txt
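As a minimal sketch (the collection names, versions, and source below are placeholders; the shared snippet above documents the full format), a collections ``requirements.yml`` can look like:

.. code-block:: yaml

   ---
   collections:
   # Shorthand entry: install the latest version from the default Galaxy server
   - my_namespace.my_collection
   # Full entry with an explicit version range and source
   - name: my_namespace.other_collection
     version: '>=1.0.0'
     source: https://galaxy.ansible.com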
.. _collection_offline_download:
Downloading a collection for offline use
-----------------------------------------
.. include:: ../shared_snippets/download_tarball_collections.txt
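As a hedged example (the collection name is a placeholder), downloading a collection and its dependencies for later offline installation can look like:

.. code-block:: bash

   ansible-galaxy collection download my_namespace.my_collection

The download directory then holds the collection tarballs together with a ``requirements.yml`` that ``ansible-galaxy collection install -r`` can consume on the disconnected host.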
.. _galaxy_server_config:
Configuring the ``ansible-galaxy`` client
------------------------------------------
.. include:: ../shared_snippets/galaxy_server_list.txt
.. _collections_listing:
Listing collections
===================
To list installed collections, run ``ansible-galaxy collection list``. This shows all of the installed collections found in the configured collections search paths. The path where the collections are located is displayed, as well as version information. If no version information is available, a ``*`` is displayed for the version number.
.. code-block:: shell
# /home/astark/.ansible/collections/ansible_collections
Collection Version
-------------------------- -------
cisco.aci 0.0.5
cisco.mso 0.0.4
sandwiches.ham *
splunk.enterprise_security 0.0.5
# /usr/share/ansible/collections/ansible_collections
Collection Version
----------------- -------
fortinet.fortios 1.0.6
pureport.pureport 0.0.8
sensu.sensu_go 1.3.0
Run with ``-vvv`` to display more detailed information.
To list a specific collection, pass a valid fully qualified collection name (FQCN) to the command ``ansible-galaxy collection list``. All instances of the collection will be listed.
.. code-block:: shell
> ansible-galaxy collection list fortinet.fortios
# /home/astark/.ansible/collections/ansible_collections
Collection Version
---------------- -------
fortinet.fortios 1.0.1
# /usr/share/ansible/collections/ansible_collections
Collection Version
---------------- -------
fortinet.fortios 1.0.6
To search other paths for collections, use the ``-p`` option. Specify multiple search paths by separating them with a ``:``. The list of paths specified on the command line will be added to the beginning of the configured collections search paths.
.. code-block:: shell
> ansible-galaxy collection list -p '/opt/ansible/collections:/etc/ansible/collections'
# /opt/ansible/collections/ansible_collections
Collection Version
--------------- -------
sandwiches.club 1.7.2
# /etc/ansible/collections/ansible_collections
Collection Version
-------------- -------
sandwiches.pbj 1.2.0
# /home/astark/.ansible/collections/ansible_collections
Collection Version
-------------------------- -------
cisco.aci 0.0.5
cisco.mso 0.0.4
fortinet.fortios 1.0.1
sandwiches.ham *
splunk.enterprise_security 0.0.5
# /usr/share/ansible/collections/ansible_collections
Collection Version
----------------- -------
fortinet.fortios 1.0.6
pureport.pureport 0.0.8
sensu.sensu_go 1.3.0
.. _using_collections:
Verifying collections
=====================
Verifying collections with ``ansible-galaxy``
---------------------------------------------
Once installed, you can verify that the content of the installed collection matches the content of the collection on the server. This feature expects that the collection is installed in one of the configured collection paths and that the collection exists on one of the configured galaxy servers.
.. code-block:: bash
ansible-galaxy collection verify my_namespace.my_collection
The output of the ``ansible-galaxy collection verify`` command is quiet if it is successful. If a collection has been modified, the altered files are listed under the collection name.
.. code-block:: bash
ansible-galaxy collection verify my_namespace.my_collection
Collection my_namespace.my_collection contains modified content in the following files:
my_namespace.my_collection
plugins/inventory/my_inventory.py
plugins/modules/my_module.py
You can use the ``-vvv`` flag to display additional information, such as the version and path of the installed collection, the URL of the remote collection used for validation, and successful verification output.
.. code-block:: bash
ansible-galaxy collection verify my_namespace.my_collection -vvv
...
Verifying 'my_namespace.my_collection:1.0.0'.
Installed collection found at '/path/to/ansible_collections/my_namespace/my_collection/'
Remote collection found at 'https://galaxy.ansible.com/download/my_namespace-my_collection-1.0.0.tar.gz'
Successfully verified that checksums for 'my_namespace.my_collection:1.0.0' match the remote collection
If you have a pre-release or non-latest version of a collection installed, you should include the specific version to verify. If the version is omitted, the installed collection is verified against the latest version available on the server.
.. code-block:: bash
ansible-galaxy collection verify my_namespace.my_collection:1.0.0
In addition to the ``namespace.collection_name:version`` format, you can provide the collections to verify in a ``requirements.yml`` file. Dependencies listed in ``requirements.yml`` are not included in the verify process and should be verified separately.
.. code-block:: bash
ansible-galaxy collection verify -r requirements.yml
Verifying against ``tar.gz`` files is not supported. If your ``requirements.yml`` contains paths to tar files or URLs for installation, you can use the ``--ignore-errors`` flag to ensure that all collections using the ``namespace.name`` format in the file are processed.
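For example, assuming the ``requirements.yml`` above mixes collection names with tarball paths, a run that skips the unsupported entries could look like:

.. code-block:: bash

   ansible-galaxy collection verify -r requirements.yml --ignore-errors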
Using collections in a Playbook
===============================
Once installed, you can reference a collection content by its fully qualified collection name (FQCN):
.. code-block:: yaml
- hosts: all
tasks:
- my_namespace.my_collection.mymodule:
option1: value
This works for roles or any type of plugin distributed within the collection:
.. code-block:: yaml
- hosts: all
tasks:
- import_role:
name: my_namespace.my_collection.role1
- my_namespace.my_collection.mymodule:
option1: value
- debug:
msg: '{{ lookup("my_namespace.my_collection.lookup1", "param1") | my_namespace.my_collection.filter1 }}'
Simplifying module names with the ``collections`` keyword
=========================================================
The ``collections`` keyword lets you define a list of collections that your role or playbook should search for unqualified module and action names. So you can use the ``collections`` keyword, then simply refer to modules and action plugins by their short-form names throughout that role or playbook.
.. warning::
If your playbook uses both the ``collections`` keyword and one or more roles, the roles do not inherit the collections set by the playbook. See below for details.
Using ``collections`` in roles
------------------------------
Within a role, you can control which collections Ansible searches for the tasks inside the role using the ``collections`` keyword in the role's ``meta/main.yml``. Ansible will use the collections list defined inside the role even if the playbook that calls the role defines different collections in a separate ``collections`` keyword entry. Roles defined inside a collection always implicitly search their own collection first, so you don't need to use the ``collections`` keyword to access modules, actions, or other roles contained in the same collection.
.. code-block:: yaml
# myrole/meta/main.yml
collections:
- my_namespace.first_collection
- my_namespace.second_collection
- other_namespace.other_collection
Using ``collections`` in playbooks
----------------------------------
In a playbook, you can control the collections Ansible searches for modules and action plugins to execute. However, any roles you call in your playbook define their own collections search order; they do not inherit the calling playbook's settings. This is true even if the role does not define its own ``collections`` keyword.
.. code-block:: yaml
- hosts: all
collections:
- my_namespace.my_collection
tasks:
- import_role:
name: role1
- mymodule:
option1: value
- debug:
msg: '{{ lookup("my_namespace.my_collection.lookup1", "param1") | my_namespace.my_collection.filter1 }}'
The ``collections`` keyword merely creates an ordered 'search path' for non-namespaced plugin and role references. It does not install content or otherwise change Ansible's behavior around the loading of plugins or roles. Note that an FQCN is still required for plugins other than modules and action plugins (for example, lookups, filters, and tests).
.. seealso::
:ref:`developing_collections`
Develop or modify a collection.
:ref:`collections_galaxy_meta`
Understand the collections metadata structure.
`Mailing List <https://groups.google.com/group/ansible-devel>`_
The development mailing list
`irc.freenode.net <http://irc.freenode.net>`_
#ansible IRC chat channel
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 66,534 |
ansible-galaxy cli should have an option to keep the collection tarball when installing collection
|
<!--- Verify first that your feature was not already discussed on GitHub -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Describe the new feature/improvement briefly below -->
The use case is for customers on air-gapped networks who want to carry the tarball inside their network on a flash drive.
Currently that is not possible because the CLI does not keep the tarball after installing the collection.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ansible-galaxy cli
##### ADDITIONAL INFORMATION
<!--- Describe how the feature would be used, why it is needed and what it would solve -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can also paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/66534
|
https://github.com/ansible/ansible/pull/67632
|
28f8b8976022728b24534cae871d2b3c8724ecce
|
a2deeb8fa27633194d12dfd8e8768ab57100e6d1
| 2020-01-16T15:02:42Z |
python
| 2020-03-24T22:32:43Z |
lib/ansible/cli/galaxy.py
|
# Copyright: (c) 2013, James Cammarata <[email protected]>
# Copyright: (c) 2018, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import os.path
import re
import shutil
import textwrap
import time
import yaml
from jinja2 import BaseLoader, Environment, FileSystemLoader
from yaml.error import YAMLError
import ansible.constants as C
from ansible import context
from ansible.cli import CLI
from ansible.cli.arguments import option_helpers as opt_help
from ansible.errors import AnsibleError, AnsibleOptionsError
from ansible.galaxy import Galaxy, get_collections_galaxy_meta_info
from ansible.galaxy.api import GalaxyAPI
from ansible.galaxy.collection import (
build_collection,
CollectionRequirement,
find_existing_collections,
install_collections,
publish_collection,
validate_collection_name,
validate_collection_path,
verify_collections
)
from ansible.galaxy.login import GalaxyLogin
from ansible.galaxy.role import GalaxyRole
from ansible.galaxy.token import BasicAuthToken, GalaxyToken, KeycloakToken, NoTokenSentinel
from ansible.module_utils.ansible_release import __version__ as ansible_version
from ansible.module_utils.common.collections import is_iterable
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.module_utils import six
from ansible.parsing.yaml.loader import AnsibleLoader
from ansible.playbook.role.requirement import RoleRequirement
from ansible.utils.display import Display
from ansible.utils.plugin_docs import get_versioned_doclink
display = Display()
urlparse = six.moves.urllib.parse.urlparse
def _display_header(path, h1, h2, w1=10, w2=7):
display.display('\n# {0}\n{1:{cwidth}} {2:{vwidth}}\n{3} {4}\n'.format(
path,
h1,
h2,
'-' * max([len(h1), w1]), # Make sure that the number of dashes is at least the width of the header
'-' * max([len(h2), w2]),
cwidth=w1,
vwidth=w2,
))
def _display_role(gr):
install_info = gr.install_info
version = None
if install_info:
version = install_info.get("version", None)
if not version:
version = "(unknown version)"
display.display("- %s, %s" % (gr.name, version))
def _display_collection(collection, cwidth=10, vwidth=7, min_cwidth=10, min_vwidth=7):
display.display('{fqcn:{cwidth}} {version:{vwidth}}'.format(
fqcn=to_text(collection),
version=collection.latest_version,
cwidth=max(cwidth, min_cwidth), # Make sure the width isn't smaller than the header
vwidth=max(vwidth, min_vwidth)
))
def _get_collection_widths(collections):
if is_iterable(collections):
fqcn_set = set(to_text(c) for c in collections)
version_set = set(to_text(c.latest_version) for c in collections)
else:
fqcn_set = set([to_text(collections)])
version_set = set([collections.latest_version])
fqcn_length = len(max(fqcn_set, key=len))
version_length = len(max(version_set, key=len))
return fqcn_length, version_length
class GalaxyCLI(CLI):
'''command to manage Ansible roles in shared repositories, the default of which is Ansible Galaxy *https://galaxy.ansible.com*.'''
SKIP_INFO_KEYS = ("name", "description", "readme_html", "related", "summary_fields", "average_aw_composite", "average_aw_score", "url")
def __init__(self, args):
# Inject role into sys.argv[1] as a backwards compatibility step
if len(args) > 1 and args[1] not in ['-h', '--help', '--version'] and 'role' not in args and 'collection' not in args:
# TODO: Should we add a warning here and eventually deprecate the implicit role subcommand choice
# Remove this in Ansible 2.13 when we also remove -v as an option on the root parser for ansible-galaxy.
idx = 2 if args[1].startswith('-v') else 1
args.insert(idx, 'role')
self.api_servers = []
self.galaxy = None
super(GalaxyCLI, self).__init__(args)
def init_parser(self):
''' create an options parser for bin/ansible '''
super(GalaxyCLI, self).init_parser(
desc="Perform various Role and Collection related operations.",
)
# Common arguments that apply to more than 1 action
common = opt_help.argparse.ArgumentParser(add_help=False)
common.add_argument('-s', '--server', dest='api_server', help='The Galaxy API server URL')
common.add_argument('--token', '--api-key', dest='api_key',
help='The Ansible Galaxy API key which can be found at '
'https://galaxy.ansible.com/me/preferences. You can also use ansible-galaxy login to '
'retrieve this key or set the token for the GALAXY_SERVER_LIST entry.')
common.add_argument('-c', '--ignore-certs', action='store_true', dest='ignore_certs',
default=C.GALAXY_IGNORE_CERTS, help='Ignore SSL certificate validation errors.')
opt_help.add_verbosity_options(common)
force = opt_help.argparse.ArgumentParser(add_help=False)
force.add_argument('-f', '--force', dest='force', action='store_true', default=False,
help='Force overwriting an existing role or collection')
github = opt_help.argparse.ArgumentParser(add_help=False)
github.add_argument('github_user', help='GitHub username')
github.add_argument('github_repo', help='GitHub repository')
offline = opt_help.argparse.ArgumentParser(add_help=False)
offline.add_argument('--offline', dest='offline', default=False, action='store_true',
help="Don't query the galaxy API when creating roles")
default_roles_path = C.config.get_configuration_definition('DEFAULT_ROLES_PATH').get('default', '')
roles_path = opt_help.argparse.ArgumentParser(add_help=False)
roles_path.add_argument('-p', '--roles-path', dest='roles_path', type=opt_help.unfrack_path(pathsep=True),
default=C.DEFAULT_ROLES_PATH, action=opt_help.PrependListAction,
help='The path to the directory containing your roles. The default is the first '
'writable one configured via DEFAULT_ROLES_PATH: %s ' % default_roles_path)
collections_path = opt_help.argparse.ArgumentParser(add_help=False)
collections_path.add_argument('-p', '--collection-path', dest='collections_path', type=opt_help.unfrack_path(pathsep=True),
default=C.COLLECTIONS_PATHS, action=opt_help.PrependListAction,
help="One or more directories to search for collections in addition "
"to the default COLLECTIONS_PATHS. Separate multiple paths "
"with '{0}'.".format(os.path.pathsep))
# Add sub parser for the Galaxy role type (role or collection)
type_parser = self.parser.add_subparsers(metavar='TYPE', dest='type')
type_parser.required = True
# Add sub parser for the Galaxy collection actions
collection = type_parser.add_parser('collection', help='Manage an Ansible Galaxy collection.')
collection_parser = collection.add_subparsers(metavar='COLLECTION_ACTION', dest='action')
collection_parser.required = True
self.add_init_options(collection_parser, parents=[common, force])
self.add_build_options(collection_parser, parents=[common, force])
self.add_publish_options(collection_parser, parents=[common])
self.add_install_options(collection_parser, parents=[common, force])
self.add_list_options(collection_parser, parents=[common, collections_path])
self.add_verify_options(collection_parser, parents=[common, collections_path])
# Add sub parser for the Galaxy role actions
role = type_parser.add_parser('role', help='Manage an Ansible Galaxy role.')
role_parser = role.add_subparsers(metavar='ROLE_ACTION', dest='action')
role_parser.required = True
self.add_init_options(role_parser, parents=[common, force, offline])
self.add_remove_options(role_parser, parents=[common, roles_path])
self.add_delete_options(role_parser, parents=[common, github])
self.add_list_options(role_parser, parents=[common, roles_path])
self.add_search_options(role_parser, parents=[common])
self.add_import_options(role_parser, parents=[common, github])
self.add_setup_options(role_parser, parents=[common, roles_path])
self.add_login_options(role_parser, parents=[common])
self.add_info_options(role_parser, parents=[common, roles_path, offline])
self.add_install_options(role_parser, parents=[common, force, roles_path])
def add_init_options(self, parser, parents=None):
galaxy_type = 'collection' if parser.metavar == 'COLLECTION_ACTION' else 'role'
init_parser = parser.add_parser('init', parents=parents,
help='Initialize new {0} with the base structure of a '
'{0}.'.format(galaxy_type))
init_parser.set_defaults(func=self.execute_init)
init_parser.add_argument('--init-path', dest='init_path', default='./',
help='The path in which the skeleton {0} will be created. The default is the '
'current working directory.'.format(galaxy_type))
init_parser.add_argument('--{0}-skeleton'.format(galaxy_type), dest='{0}_skeleton'.format(galaxy_type),
default=C.GALAXY_ROLE_SKELETON,
help='The path to a {0} skeleton that the new {0} should be based '
'upon.'.format(galaxy_type))
obj_name_kwargs = {}
if galaxy_type == 'collection':
obj_name_kwargs['type'] = validate_collection_name
init_parser.add_argument('{0}_name'.format(galaxy_type), help='{0} name'.format(galaxy_type.capitalize()),
**obj_name_kwargs)
if galaxy_type == 'role':
init_parser.add_argument('--type', dest='role_type', action='store', default='default',
help="Initialize using an alternate role type. Valid types include: 'container', "
"'apb' and 'network'.")
def add_remove_options(self, parser, parents=None):
remove_parser = parser.add_parser('remove', parents=parents, help='Delete roles from roles_path.')
remove_parser.set_defaults(func=self.execute_remove)
remove_parser.add_argument('args', help='Role(s)', metavar='role', nargs='+')
def add_delete_options(self, parser, parents=None):
delete_parser = parser.add_parser('delete', parents=parents,
help='Removes the role from Galaxy. It does not remove or alter the actual '
'GitHub repository.')
delete_parser.set_defaults(func=self.execute_delete)
def add_list_options(self, parser, parents=None):
galaxy_type = 'role'
if parser.metavar == 'COLLECTION_ACTION':
galaxy_type = 'collection'
list_parser = parser.add_parser('list', parents=parents,
help='Show the name and version of each {0} installed in the {0}s_path.'.format(galaxy_type))
list_parser.set_defaults(func=self.execute_list)
list_parser.add_argument(galaxy_type, help=galaxy_type.capitalize(), nargs='?', metavar=galaxy_type)
def add_search_options(self, parser, parents=None):
search_parser = parser.add_parser('search', parents=parents,
help='Search the Galaxy database by tags, platforms, author and multiple '
'keywords.')
search_parser.set_defaults(func=self.execute_search)
search_parser.add_argument('--platforms', dest='platforms', help='list of OS platforms to filter by')
search_parser.add_argument('--galaxy-tags', dest='galaxy_tags', help='list of galaxy tags to filter by')
search_parser.add_argument('--author', dest='author', help='GitHub username')
search_parser.add_argument('args', help='Search terms', metavar='searchterm', nargs='*')
def add_import_options(self, parser, parents=None):
import_parser = parser.add_parser('import', parents=parents, help='Import a role')
import_parser.set_defaults(func=self.execute_import)
import_parser.add_argument('--no-wait', dest='wait', action='store_false', default=True,
help="Don't wait for import results.")
import_parser.add_argument('--branch', dest='reference',
help='The name of a branch to import. Defaults to the repository\'s default branch '
'(usually master)')
import_parser.add_argument('--role-name', dest='role_name',
help='The name the role should have, if different than the repo name')
import_parser.add_argument('--status', dest='check_status', action='store_true', default=False,
help='Check the status of the most recent import request for given github_'
'user/github_repo.')
def add_setup_options(self, parser, parents=None):
setup_parser = parser.add_parser('setup', parents=parents,
help='Manage the integration between Galaxy and the given source.')
setup_parser.set_defaults(func=self.execute_setup)
setup_parser.add_argument('--remove', dest='remove_id', default=None,
help='Remove the integration matching the provided ID value. Use --list to see '
'ID values.')
setup_parser.add_argument('--list', dest="setup_list", action='store_true', default=False,
help='List all of your integrations.')
setup_parser.add_argument('source', help='Source')
setup_parser.add_argument('github_user', help='GitHub username')
setup_parser.add_argument('github_repo', help='GitHub repository')
setup_parser.add_argument('secret', help='Secret')
def add_login_options(self, parser, parents=None):
login_parser = parser.add_parser('login', parents=parents,
help="Login to api.github.com server in order to use ansible-galaxy role sub "
"command such as 'import', 'delete', 'publish', and 'setup'")
login_parser.set_defaults(func=self.execute_login)
login_parser.add_argument('--github-token', dest='token', default=None,
help='Identify with github token rather than username and password.')
def add_info_options(self, parser, parents=None):
info_parser = parser.add_parser('info', parents=parents, help='View more details about a specific role.')
info_parser.set_defaults(func=self.execute_info)
info_parser.add_argument('args', nargs='+', help='role', metavar='role_name[,version]')
def add_verify_options(self, parser, parents=None):
galaxy_type = 'collection'
verify_parser = parser.add_parser('verify', parents=parents, help='Compare checksums with the collection(s) '
'found on the server and the installed copy. This does not verify dependencies.')
verify_parser.set_defaults(func=self.execute_verify)
verify_parser.add_argument('args', metavar='{0}_name'.format(galaxy_type), nargs='*', help='The collection(s) name or '
'path/url to a tar.gz collection artifact. This is mutually exclusive with --requirements-file.')
verify_parser.add_argument('-i', '--ignore-errors', dest='ignore_errors', action='store_true', default=False,
help='Ignore errors during verification and continue with the next specified collection.')
verify_parser.add_argument('-r', '--requirements-file', dest='requirements',
help='A file containing a list of collections to be verified.')
def add_install_options(self, parser, parents=None):
galaxy_type = 'collection' if parser.metavar == 'COLLECTION_ACTION' else 'role'
args_kwargs = {}
if galaxy_type == 'collection':
args_kwargs['help'] = 'The collection(s) name or path/url to a tar.gz collection artifact. This is ' \
'mutually exclusive with --requirements-file.'
ignore_errors_help = 'Ignore errors during installation and continue with the next specified ' \
'collection. This will not ignore dependency conflict errors.'
else:
args_kwargs['help'] = 'Role name, URL or tar file'
ignore_errors_help = 'Ignore errors and continue with the next specified role.'
install_parser = parser.add_parser('install', parents=parents,
help='Install {0}(s) from file(s), URL(s) or Ansible '
'Galaxy'.format(galaxy_type))
install_parser.set_defaults(func=self.execute_install)
install_parser.add_argument('args', metavar='{0}_name'.format(galaxy_type), nargs='*', **args_kwargs)
install_parser.add_argument('-i', '--ignore-errors', dest='ignore_errors', action='store_true', default=False,
help=ignore_errors_help)
install_exclusive = install_parser.add_mutually_exclusive_group()
install_exclusive.add_argument('-n', '--no-deps', dest='no_deps', action='store_true', default=False,
help="Don't download {0}s listed as dependencies.".format(galaxy_type))
install_exclusive.add_argument('--force-with-deps', dest='force_with_deps', action='store_true', default=False,
help="Force overwriting an existing {0} and its "
"dependencies.".format(galaxy_type))
if galaxy_type == 'collection':
install_parser.add_argument('-p', '--collections-path', dest='collections_path',
default=C.COLLECTIONS_PATHS[0],
help='The path to the directory containing your collections.')
install_parser.add_argument('-r', '--requirements-file', dest='requirements',
help='A file containing a list of collections to be installed.')
install_parser.add_argument('--pre', dest='allow_pre_release', action='store_true',
help='Include pre-release versions. Semantic versioning pre-releases are ignored by default')
else:
install_parser.add_argument('-r', '--role-file', dest='role_file',
help='A file containing a list of roles to be imported.')
install_parser.add_argument('-g', '--keep-scm-meta', dest='keep_scm_meta', action='store_true',
default=False,
help='Use tar instead of the scm archive option when packaging the role.')
def add_build_options(self, parser, parents=None):
build_parser = parser.add_parser('build', parents=parents,
help='Build an Ansible collection artifact that can be published to Ansible '
'Galaxy.')
build_parser.set_defaults(func=self.execute_build)
build_parser.add_argument('args', metavar='collection', nargs='*', default=('.',),
help='Path to the collection(s) directory to build. This should be the directory '
'that contains the galaxy.yml file. The default is the current working '
'directory.')
build_parser.add_argument('--output-path', dest='output_path', default='./',
help='The path in which the collection is built to. The default is the current '
'working directory.')
def add_publish_options(self, parser, parents=None):
publish_parser = parser.add_parser('publish', parents=parents,
help='Publish a collection artifact to Ansible Galaxy.')
publish_parser.set_defaults(func=self.execute_publish)
publish_parser.add_argument('args', metavar='collection_path',
help='The path to the collection tarball to publish.')
publish_parser.add_argument('--no-wait', dest='wait', action='store_false', default=True,
help="Don't wait for import validation results.")
publish_parser.add_argument('--import-timeout', dest='import_timeout', type=int, default=0,
help="The time to wait for the collection import process to finish.")
def post_process_args(self, options):
options = super(GalaxyCLI, self).post_process_args(options)
display.verbosity = options.verbosity
return options
def run(self):
super(GalaxyCLI, self).run()
self.galaxy = Galaxy()
def server_config_def(section, key, required):
return {
'description': 'The %s of the %s Galaxy server' % (key, section),
'ini': [
{
'section': 'galaxy_server.%s' % section,
'key': key,
}
],
'env': [
{'name': 'ANSIBLE_GALAXY_SERVER_%s_%s' % (section.upper(), key.upper())},
],
'required': required,
}
server_def = [('url', True), ('username', False), ('password', False), ('token', False),
('auth_url', False)]
config_servers = []
# Need to filter out empty strings or non truthy values as an empty server list env var is equal to [''].
server_list = [s for s in C.GALAXY_SERVER_LIST or [] if s]
for server_key in server_list:
# Config definitions are looked up dynamically based on the C.GALAXY_SERVER_LIST entry. We look up the
# section [galaxy_server.<server>] for the values url, username, password, and token.
config_dict = dict((k, server_config_def(server_key, k, req)) for k, req in server_def)
defs = AnsibleLoader(yaml.safe_dump(config_dict)).get_single_data()
C.config.initialize_plugin_configuration_definitions('galaxy_server', server_key, defs)
server_options = C.config.get_plugin_options('galaxy_server', server_key)
# auth_url is used to create the token, but not directly by GalaxyAPI, so
# it doesn't need to be passed as kwarg to GalaxyApi
auth_url = server_options.pop('auth_url', None)
token_val = server_options['token'] or NoTokenSentinel
username = server_options['username']
# default case if no auth info is provided.
server_options['token'] = None
if username:
server_options['token'] = BasicAuthToken(username,
server_options['password'])
else:
if token_val:
if auth_url:
server_options['token'] = KeycloakToken(access_token=token_val,
auth_url=auth_url,
validate_certs=not context.CLIARGS['ignore_certs'])
else:
# The galaxy v1 / github / django / 'Token'
server_options['token'] = GalaxyToken(token=token_val)
config_servers.append(GalaxyAPI(self.galaxy, server_key, **server_options))
cmd_server = context.CLIARGS['api_server']
cmd_token = GalaxyToken(token=context.CLIARGS['api_key'])
if cmd_server:
# Cmd args take precedence over the config entry but first check if the arg was a name and use that config
# entry, otherwise create a new API entry for the server specified.
config_server = next((s for s in config_servers if s.name == cmd_server), None)
if config_server:
self.api_servers.append(config_server)
else:
self.api_servers.append(GalaxyAPI(self.galaxy, 'cmd_arg', cmd_server, token=cmd_token))
else:
self.api_servers = config_servers
# Default to C.GALAXY_SERVER if no servers were defined
if len(self.api_servers) == 0:
self.api_servers.append(GalaxyAPI(self.galaxy, 'default', C.GALAXY_SERVER, token=cmd_token))
context.CLIARGS['func']()
@property
def api(self):
return self.api_servers[0]
def _parse_requirements_file(self, requirements_file, allow_old_format=True):
"""
Parses an Ansible requirements.yml file and returns all the roles and/or collections defined in it. There are 2
requirements file formats:
# v1 (roles only)
- src: The source of the role, required if include is not set. Can be Galaxy role name, URL to a SCM repo or tarball.
name: Downloads the role to the specified name, defaults to Galaxy name from Galaxy or name of repo if src is a URL.
scm: If src is a URL, specify the SCM. Only git or hg are supported and defaults to git.
version: The version of the role to download. Can also be tag, commit, or branch name and defaults to master.
include: Path to additional requirements.yml files.
# v2 (roles and collections)
---
roles:
# Same as v1 format just under the roles key
collections:
- namespace.collection
- name: namespace.collection
version: version identifier, multiple identifiers are separated by ','
source: the URL or a predefined source name that relates to C.GALAXY_SERVER_LIST
:param requirements_file: The path to the requirements file.
:param allow_old_format: Will fail if a v1 requirements file is found and this is set to False.
:return: a dict containing the roles and collections found in the requirements file.
"""
requirements = {
'roles': [],
'collections': [],
}
b_requirements_file = to_bytes(requirements_file, errors='surrogate_or_strict')
if not os.path.exists(b_requirements_file):
raise AnsibleError("The requirements file '%s' does not exist." % to_native(requirements_file))
display.vvv("Reading requirement file at '%s'" % requirements_file)
with open(b_requirements_file, 'rb') as req_obj:
try:
file_requirements = yaml.safe_load(req_obj)
except YAMLError as err:
raise AnsibleError(
"Failed to parse the requirements yml at '%s' with the following error:\n%s"
% (to_native(requirements_file), to_native(err)))
if file_requirements is None:
raise AnsibleError("No requirements found in file '%s'" % to_native(requirements_file))
def parse_role_req(requirement):
if "include" not in requirement:
role = RoleRequirement.role_yaml_parse(requirement)
display.vvv("found role %s in yaml file" % to_text(role))
if "name" not in role and "src" not in role:
raise AnsibleError("Must specify name or src for role")
return [GalaxyRole(self.galaxy, self.api, **role)]
else:
b_include_path = to_bytes(requirement["include"], errors="surrogate_or_strict")
if not os.path.isfile(b_include_path):
raise AnsibleError("Failed to find include requirements file '%s' in '%s'"
% (to_native(b_include_path), to_native(requirements_file)))
with open(b_include_path, 'rb') as f_include:
try:
return [GalaxyRole(self.galaxy, self.api, **r) for r in
(RoleRequirement.role_yaml_parse(i) for i in yaml.safe_load(f_include))]
except Exception as e:
raise AnsibleError("Unable to load data from include requirements file: %s %s"
% (to_native(requirements_file), to_native(e)))
if isinstance(file_requirements, list):
# Older format that contains only roles
if not allow_old_format:
raise AnsibleError("Expecting requirements file to be a dict with the key 'collections' that contains "
"a list of collections to install")
for role_req in file_requirements:
requirements['roles'] += parse_role_req(role_req)
else:
# Newer format with a collections and/or roles key
extra_keys = set(file_requirements.keys()).difference(set(['roles', 'collections']))
if extra_keys:
raise AnsibleError("Expecting only 'roles' and/or 'collections' as base keys in the requirements "
"file. Found: %s" % (to_native(", ".join(extra_keys))))
for role_req in file_requirements.get('roles', []):
requirements['roles'] += parse_role_req(role_req)
for collection_req in file_requirements.get('collections', []):
if isinstance(collection_req, dict):
req_name = collection_req.get('name', None)
if req_name is None:
raise AnsibleError("Collections requirement entry should contain the key name.")
req_version = collection_req.get('version', '*')
req_source = collection_req.get('source', None)
if req_source:
# Try and match up the requirement source with our list of Galaxy API servers defined in the
# config, otherwise create a server with that URL without any auth.
req_source = next(iter([a for a in self.api_servers if req_source in [a.name, a.api_server]]),
GalaxyAPI(self.galaxy, "explicit_requirement_%s" % req_name, req_source))
requirements['collections'].append((req_name, req_version, req_source))
else:
requirements['collections'].append((collection_req, '*', None))
return requirements
@staticmethod
def exit_without_ignore(rc=1):
"""
Exits with the specified return code unless the
option --ignore-errors was specified
"""
if not context.CLIARGS['ignore_errors']:
raise AnsibleError('- you can use --ignore-errors to skip failed roles and finish processing the list.')
@staticmethod
def _display_role_info(role_info):
text = [u"", u"Role: %s" % to_text(role_info['name'])]
text.append(u"\tdescription: %s" % role_info.get('description', ''))
for k in sorted(role_info.keys()):
if k in GalaxyCLI.SKIP_INFO_KEYS:
continue
if isinstance(role_info[k], dict):
text.append(u"\t%s:" % (k))
for key in sorted(role_info[k].keys()):
if key in GalaxyCLI.SKIP_INFO_KEYS:
continue
text.append(u"\t\t%s: %s" % (key, role_info[k][key]))
else:
text.append(u"\t%s: %s" % (k, role_info[k]))
return u'\n'.join(text)
@staticmethod
def _resolve_path(path):
return os.path.abspath(os.path.expanduser(os.path.expandvars(path)))
@staticmethod
def _get_skeleton_galaxy_yml(template_path, inject_data):
with open(to_bytes(template_path, errors='surrogate_or_strict'), 'rb') as template_obj:
meta_template = to_text(template_obj.read(), errors='surrogate_or_strict')
galaxy_meta = get_collections_galaxy_meta_info()
required_config = []
optional_config = []
for meta_entry in galaxy_meta:
config_list = required_config if meta_entry.get('required', False) else optional_config
value = inject_data.get(meta_entry['key'], None)
if not value:
meta_type = meta_entry.get('type', 'str')
if meta_type == 'str':
value = ''
elif meta_type == 'list':
value = []
elif meta_type == 'dict':
value = {}
meta_entry['value'] = value
config_list.append(meta_entry)
link_pattern = re.compile(r"L\(([^)]+),\s+([^)]+)\)")
const_pattern = re.compile(r"C\(([^)]+)\)")
def comment_ify(v):
if isinstance(v, list):
v = ". ".join([l.rstrip('.') for l in v])
v = link_pattern.sub(r"\1 <\2>", v)
v = const_pattern.sub(r"'\1'", v)
return textwrap.fill(v, width=117, initial_indent="# ", subsequent_indent="# ", break_on_hyphens=False)
def to_yaml(v):
return yaml.safe_dump(v, default_flow_style=False).rstrip()
env = Environment(loader=BaseLoader)
env.filters['comment_ify'] = comment_ify
env.filters['to_yaml'] = to_yaml
template = env.from_string(meta_template)
meta_value = template.render({'required_config': required_config, 'optional_config': optional_config})
return meta_value
def _require_one_of_collections_requirements(self, collections, requirements_file):
if collections and requirements_file:
raise AnsibleError("The positional collection_name arg and --requirements-file are mutually exclusive.")
elif not collections and not requirements_file:
raise AnsibleError("You must specify a collection name or a requirements file.")
elif requirements_file:
requirements_file = GalaxyCLI._resolve_path(requirements_file)
requirements = self._parse_requirements_file(requirements_file, allow_old_format=False)['collections']
else:
requirements = []
for collection_input in collections:
requirement = None
if os.path.isfile(to_bytes(collection_input, errors='surrogate_or_strict')) or \
urlparse(collection_input).scheme.lower() in ['http', 'https']:
# Arg is a file path or URL to a collection
name = collection_input
else:
name, dummy, requirement = collection_input.partition(':')
requirements.append((name, requirement or '*', None))
return requirements
############################
# execute actions
############################
def execute_role(self):
"""
Perform the action on an Ansible Galaxy role. Must be combined with a further action like delete/install/init
as listed below.
"""
# To satisfy doc build
pass
def execute_collection(self):
"""
Perform the action on an Ansible Galaxy collection. Must be combined with a further action like init/install as
listed below.
"""
# To satisfy doc build
pass
def execute_build(self):
"""
Build an Ansible Galaxy collection artifact that can be stored in a central repository like Ansible Galaxy.
By default, this command builds from the current working directory. You can optionally pass in the
collection input path (where the ``galaxy.yml`` file is).
"""
force = context.CLIARGS['force']
output_path = GalaxyCLI._resolve_path(context.CLIARGS['output_path'])
b_output_path = to_bytes(output_path, errors='surrogate_or_strict')
if not os.path.exists(b_output_path):
os.makedirs(b_output_path)
elif os.path.isfile(b_output_path):
raise AnsibleError("- the output collection directory %s is a file - aborting" % to_native(output_path))
for collection_path in context.CLIARGS['args']:
collection_path = GalaxyCLI._resolve_path(collection_path)
build_collection(collection_path, output_path, force)
def execute_init(self):
"""
Creates the skeleton framework of a role or collection that complies with the Galaxy metadata format.
Requires a role or collection name. The collection name must be in the format ``<namespace>.<collection>``.
"""
galaxy_type = context.CLIARGS['type']
init_path = context.CLIARGS['init_path']
force = context.CLIARGS['force']
obj_skeleton = context.CLIARGS['{0}_skeleton'.format(galaxy_type)]
obj_name = context.CLIARGS['{0}_name'.format(galaxy_type)]
inject_data = dict(
description='your {0} description'.format(galaxy_type),
ansible_plugin_list_dir=get_versioned_doclink('plugins/plugins.html'),
)
if galaxy_type == 'role':
inject_data.update(dict(
author='your name',
company='your company (optional)',
license='license (GPL-2.0-or-later, MIT, etc)',
role_name=obj_name,
role_type=context.CLIARGS['role_type'],
issue_tracker_url='http://example.com/issue/tracker',
repository_url='http://example.com/repository',
documentation_url='http://docs.example.com',
homepage_url='http://example.com',
min_ansible_version=ansible_version[:3], # x.y
))
obj_path = os.path.join(init_path, obj_name)
elif galaxy_type == 'collection':
namespace, collection_name = obj_name.split('.', 1)
inject_data.update(dict(
namespace=namespace,
collection_name=collection_name,
version='1.0.0',
readme='README.md',
authors=['your name <[email protected]>'],
license=['GPL-2.0-or-later'],
repository='http://example.com/repository',
documentation='http://docs.example.com',
homepage='http://example.com',
issues='http://example.com/issue/tracker',
build_ignore=[],
))
obj_path = os.path.join(init_path, namespace, collection_name)
b_obj_path = to_bytes(obj_path, errors='surrogate_or_strict')
if os.path.exists(b_obj_path):
if os.path.isfile(obj_path):
raise AnsibleError("- the path %s already exists, but is a file - aborting" % to_native(obj_path))
elif not force:
raise AnsibleError("- the directory %s already exists. "
"You can use --force to re-initialize this directory,\n"
"however it will reset any main.yml files that may have\n"
"been modified there already." % to_native(obj_path))
if obj_skeleton is not None:
own_skeleton = False
skeleton_ignore_expressions = C.GALAXY_ROLE_SKELETON_IGNORE
else:
own_skeleton = True
obj_skeleton = self.galaxy.default_role_skeleton_path
skeleton_ignore_expressions = ['^.*/.git_keep$']
obj_skeleton = os.path.expanduser(obj_skeleton)
skeleton_ignore_re = [re.compile(x) for x in skeleton_ignore_expressions]
if not os.path.exists(obj_skeleton):
raise AnsibleError("- the skeleton path '{0}' does not exist, cannot init {1}".format(
to_native(obj_skeleton), galaxy_type)
)
template_env = Environment(loader=FileSystemLoader(obj_skeleton))
# create role directory
if not os.path.exists(b_obj_path):
os.makedirs(b_obj_path)
for root, dirs, files in os.walk(obj_skeleton, topdown=True):
rel_root = os.path.relpath(root, obj_skeleton)
rel_dirs = rel_root.split(os.sep)
rel_root_dir = rel_dirs[0]
if galaxy_type == 'collection':
# A collection can contain templates in playbooks/*/templates and roles/*/templates
in_templates_dir = rel_root_dir in ['playbooks', 'roles'] and 'templates' in rel_dirs
else:
in_templates_dir = rel_root_dir == 'templates'
dirs[:] = [d for d in dirs if not any(r.match(d) for r in skeleton_ignore_re)]
for f in files:
filename, ext = os.path.splitext(f)
if any(r.match(os.path.join(rel_root, f)) for r in skeleton_ignore_re):
continue
if galaxy_type == 'collection' and own_skeleton and rel_root == '.' and f == 'galaxy.yml.j2':
# Special use case for galaxy.yml.j2 in our own default collection skeleton. We build the options
# dynamically which requires special options to be set.
# The templated data's keys must match the key name but the inject data contains collection_name
# instead of name. We just make a copy and change the key back to name for this file.
template_data = inject_data.copy()
template_data['name'] = template_data.pop('collection_name')
meta_value = GalaxyCLI._get_skeleton_galaxy_yml(os.path.join(root, rel_root, f), template_data)
b_dest_file = to_bytes(os.path.join(obj_path, rel_root, filename), errors='surrogate_or_strict')
with open(b_dest_file, 'wb') as galaxy_obj:
galaxy_obj.write(to_bytes(meta_value, errors='surrogate_or_strict'))
elif ext == ".j2" and not in_templates_dir:
src_template = os.path.join(rel_root, f)
dest_file = os.path.join(obj_path, rel_root, filename)
template_env.get_template(src_template).stream(inject_data).dump(dest_file, encoding='utf-8')
else:
f_rel_path = os.path.relpath(os.path.join(root, f), obj_skeleton)
shutil.copyfile(os.path.join(root, f), os.path.join(obj_path, f_rel_path))
for d in dirs:
b_dir_path = to_bytes(os.path.join(obj_path, rel_root, d), errors='surrogate_or_strict')
if not os.path.exists(b_dir_path):
os.makedirs(b_dir_path)
display.display("- %s %s was created successfully" % (galaxy_type.title(), obj_name))
def execute_info(self):
"""
prints out detailed information about an installed role as well as info available from the galaxy API.
"""
roles_path = context.CLIARGS['roles_path']
data = ''
for role in context.CLIARGS['args']:
role_info = {'path': roles_path}
gr = GalaxyRole(self.galaxy, self.api, role)
install_info = gr.install_info
if install_info:
if 'version' in install_info:
install_info['installed_version'] = install_info['version']
del install_info['version']
role_info.update(install_info)
remote_data = False
if not context.CLIARGS['offline']:
remote_data = self.api.lookup_role_by_name(role, False)
if remote_data:
role_info.update(remote_data)
if gr.metadata:
role_info.update(gr.metadata)
req = RoleRequirement()
role_spec = req.role_yaml_parse({'role': role})
if role_spec:
role_info.update(role_spec)
data = self._display_role_info(role_info)
# FIXME: This is broken in both 1.9 and 2.0 as
# _display_role_info() always returns something
if not data:
data = u"\n- the role %s was not found" % role
self.pager(data)
def execute_verify(self):
collections = context.CLIARGS['args']
search_paths = context.CLIARGS['collections_path']
ignore_certs = context.CLIARGS['ignore_certs']
ignore_errors = context.CLIARGS['ignore_errors']
requirements_file = context.CLIARGS['requirements']
requirements = self._require_one_of_collections_requirements(collections, requirements_file)
resolved_paths = [validate_collection_path(GalaxyCLI._resolve_path(path)) for path in search_paths]
verify_collections(requirements, resolved_paths, self.api_servers, (not ignore_certs), ignore_errors,
allow_pre_release=True)
return 0
def execute_install(self):
"""
Install one or more roles(``ansible-galaxy role install``), or one or more collections(``ansible-galaxy collection install``).
You can pass in a list (roles or collections) or use the file
option listed below (these are mutually exclusive). If you pass in a list, it
can be a name (which will be downloaded via the galaxy API and github), or it can be a local tar archive file.
"""
if context.CLIARGS['type'] == 'collection':
collections = context.CLIARGS['args']
force = context.CLIARGS['force']
output_path = context.CLIARGS['collections_path']
ignore_certs = context.CLIARGS['ignore_certs']
ignore_errors = context.CLIARGS['ignore_errors']
requirements_file = context.CLIARGS['requirements']
no_deps = context.CLIARGS['no_deps']
force_deps = context.CLIARGS['force_with_deps']
if collections and requirements_file:
raise AnsibleError("The positional collection_name arg and --requirements-file are mutually exclusive.")
elif not collections and not requirements_file:
raise AnsibleError("You must specify a collection name or a requirements file.")
if requirements_file:
requirements_file = GalaxyCLI._resolve_path(requirements_file)
requirements = self._require_one_of_collections_requirements(collections, requirements_file)
output_path = GalaxyCLI._resolve_path(output_path)
collections_path = C.COLLECTIONS_PATHS
if len([p for p in collections_path if p.startswith(output_path)]) == 0:
display.warning("The specified collections path '%s' is not part of the configured Ansible "
"collections paths '%s'. The installed collection won't be picked up in an Ansible "
"run." % (to_text(output_path), to_text(":".join(collections_path))))
output_path = validate_collection_path(output_path)
b_output_path = to_bytes(output_path, errors='surrogate_or_strict')
if not os.path.exists(b_output_path):
os.makedirs(b_output_path)
install_collections(requirements, output_path, self.api_servers, (not ignore_certs), ignore_errors,
no_deps, force, force_deps, context.CLIARGS['allow_pre_release'])
return 0
role_file = context.CLIARGS['role_file']
if not context.CLIARGS['args'] and role_file is None:
# the user needs to specify one of either --role-file or specify a single user/role name
raise AnsibleOptionsError("- you must specify a user/role name or a roles file")
no_deps = context.CLIARGS['no_deps']
force_deps = context.CLIARGS['force_with_deps']
force = context.CLIARGS['force'] or force_deps
roles_left = []
if role_file:
if not (role_file.endswith('.yaml') or role_file.endswith('.yml')):
raise AnsibleError("Invalid role requirements file, it must end with a .yml or .yaml extension")
roles_left = self._parse_requirements_file(role_file)['roles']
else:
# roles were specified directly, so we'll just go out grab them
# (and their dependencies, unless the user doesn't want us to).
for rname in context.CLIARGS['args']:
role = RoleRequirement.role_yaml_parse(rname.strip())
roles_left.append(GalaxyRole(self.galaxy, self.api, **role))
for role in roles_left:
# only process roles from the roles file whose names match the given args, if any were given
if role_file and context.CLIARGS['args'] and role.name not in context.CLIARGS['args']:
display.vvv('Skipping role %s' % role.name)
continue
display.vvv('Processing role %s ' % role.name)
# query the galaxy API for the role data
if role.install_info is not None:
if role.install_info['version'] != role.version or force:
if force:
display.display('- changing role %s from %s to %s' %
(role.name, role.install_info['version'], role.version or "unspecified"))
role.remove()
else:
display.warning('- %s (%s) is already installed - use --force to change version to %s' %
(role.name, role.install_info['version'], role.version or "unspecified"))
continue
else:
if not force:
display.display('- %s is already installed, skipping.' % str(role))
continue
try:
installed = role.install()
except AnsibleError as e:
display.warning(u"- %s was NOT installed successfully: %s " % (role.name, to_text(e)))
self.exit_without_ignore()
continue
# install dependencies, if we want them
if not no_deps and installed:
if not role.metadata:
display.warning("Meta file %s is empty. Skipping dependencies." % role.path)
else:
role_dependencies = role.metadata.get('dependencies') or []
for dep in role_dependencies:
display.debug('Installing dep %s' % dep)
dep_req = RoleRequirement()
dep_info = dep_req.role_yaml_parse(dep)
dep_role = GalaxyRole(self.galaxy, self.api, **dep_info)
if '.' not in dep_role.name and '.' not in dep_role.src and dep_role.scm is None:
# we know we can skip this, as it's not going to
# be found on galaxy.ansible.com
continue
if dep_role.install_info is None:
if dep_role not in roles_left:
display.display('- adding dependency: %s' % to_text(dep_role))
roles_left.append(dep_role)
else:
display.display('- dependency %s already pending installation.' % dep_role.name)
else:
if dep_role.install_info['version'] != dep_role.version:
if force_deps:
display.display('- changing dependent role %s from %s to %s' %
(dep_role.name, dep_role.install_info['version'], dep_role.version or "unspecified"))
dep_role.remove()
roles_left.append(dep_role)
else:
display.warning('- dependency %s (%s) from role %s differs from already installed version (%s), skipping' %
(to_text(dep_role), dep_role.version, role.name, dep_role.install_info['version']))
else:
if force_deps:
roles_left.append(dep_role)
else:
display.display('- dependency %s is already installed, skipping.' % dep_role.name)
if not installed:
display.warning("- %s was NOT installed successfully." % role.name)
self.exit_without_ignore()
return 0
def execute_remove(self):
"""
removes the list of roles passed as arguments from the local system.
"""
if not context.CLIARGS['args']:
raise AnsibleOptionsError('- you must specify at least one role to remove.')
for role_name in context.CLIARGS['args']:
role = GalaxyRole(self.galaxy, self.api, role_name)
try:
if role.remove():
display.display('- successfully removed %s' % role_name)
else:
display.display('- %s is not installed, skipping.' % role_name)
except Exception as e:
raise AnsibleError("Failed to remove role %s: %s" % (role_name, to_native(e)))
return 0
def execute_list(self):
"""
List installed collections or roles
"""
if context.CLIARGS['type'] == 'role':
self.execute_list_role()
elif context.CLIARGS['type'] == 'collection':
self.execute_list_collection()
def execute_list_role(self):
"""
List all roles installed on the local system or a specific role
"""
path_found = False
role_found = False
warnings = []
roles_search_paths = context.CLIARGS['roles_path']
role_name = context.CLIARGS['role']
for path in roles_search_paths:
role_path = GalaxyCLI._resolve_path(path)
if os.path.isdir(path):
path_found = True
else:
warnings.append("- the configured path {0} does not exist.".format(path))
continue
if role_name:
# show the requested role, if it exists
gr = GalaxyRole(self.galaxy, self.api, role_name, path=os.path.join(role_path, role_name))
if os.path.isdir(gr.path):
role_found = True
display.display('# %s' % os.path.dirname(gr.path))
_display_role(gr)
break
warnings.append("- the role %s was not found" % role_name)
else:
if not os.path.exists(role_path):
warnings.append("- the configured path %s does not exist." % role_path)
continue
if not os.path.isdir(role_path):
warnings.append("- the configured path %s, exists, but it is not a directory." % role_path)
continue
display.display('# %s' % role_path)
path_files = os.listdir(role_path)
for path_file in path_files:
gr = GalaxyRole(self.galaxy, self.api, path_file, path=path)
if gr.metadata:
_display_role(gr)
# Do not warn if the role was found in any of the search paths
if role_found and role_name:
warnings = []
for w in warnings:
display.warning(w)
if not path_found:
raise AnsibleOptionsError("- None of the provided paths were usable. Please specify a valid path with --{0}s-path".format(context.CLIARGS['type']))
return 0
def execute_list_collection(self):
"""
List all collections installed on the local system
"""
collections_search_paths = set(context.CLIARGS['collections_path'])
collection_name = context.CLIARGS['collection']
default_collections_path = C.config.get_configuration_definition('COLLECTIONS_PATHS').get('default')
warnings = []
path_found = False
collection_found = False
for path in collections_search_paths:
collection_path = GalaxyCLI._resolve_path(path)
if not os.path.exists(path):
if path in default_collections_path:
# don't warn for missing default paths
continue
warnings.append("- the configured path {0} does not exist.".format(collection_path))
continue
if not os.path.isdir(collection_path):
warnings.append("- the configured path {0}, exists, but it is not a directory.".format(collection_path))
continue
path_found = True
if collection_name:
# list a specific collection
validate_collection_name(collection_name)
namespace, collection = collection_name.split('.')
collection_path = validate_collection_path(collection_path)
b_collection_path = to_bytes(os.path.join(collection_path, namespace, collection), errors='surrogate_or_strict')
if not os.path.exists(b_collection_path):
warnings.append("- unable to find {0} in collection paths".format(collection_name))
continue
if not os.path.isdir(collection_path):
warnings.append("- the configured path {0}, exists, but it is not a directory.".format(collection_path))
continue
collection_found = True
collection = CollectionRequirement.from_path(b_collection_path, False)
fqcn_width, version_width = _get_collection_widths(collection)
_display_header(collection_path, 'Collection', 'Version', fqcn_width, version_width)
_display_collection(collection, fqcn_width, version_width)
else:
# list all collections
collection_path = validate_collection_path(path)
if os.path.isdir(collection_path):
display.vvv("Searching {0} for collections".format(collection_path))
collections = find_existing_collections(collection_path)
else:
# There was no 'ansible_collections/' directory in the path, so there
# are no collections here.
display.vvv("No 'ansible_collections' directory found at {0}".format(collection_path))
continue
if not collections:
display.vvv("No collections found at {0}".format(collection_path))
continue
# Display header
fqcn_width, version_width = _get_collection_widths(collections)
_display_header(collection_path, 'Collection', 'Version', fqcn_width, version_width)
# Sort collections by the namespace and name
collections.sort(key=to_text)
for collection in collections:
_display_collection(collection, fqcn_width, version_width)
# Do not warn if the specific collection was found in any of the search paths
if collection_found and collection_name:
warnings = []
for w in warnings:
display.warning(w)
if not path_found:
raise AnsibleOptionsError("- None of the provided paths were usable. Please specify a valid path with --{0}s-path".format(context.CLIARGS['type']))
return 0
def execute_publish(self):
"""
Publish a collection into Ansible Galaxy. Requires the path to the collection tarball to publish.
"""
collection_path = GalaxyCLI._resolve_path(context.CLIARGS['args'])
wait = context.CLIARGS['wait']
timeout = context.CLIARGS['import_timeout']
publish_collection(collection_path, self.api, wait, timeout)
def execute_search(self):
''' searches for roles on the Ansible Galaxy server'''
page_size = 1000
search = None
if context.CLIARGS['args']:
search = '+'.join(context.CLIARGS['args'])
if not search and not context.CLIARGS['platforms'] and not context.CLIARGS['galaxy_tags'] and not context.CLIARGS['author']:
raise AnsibleError("Invalid query. At least one search term, platform, galaxy tag or author must be provided.")
response = self.api.search_roles(search, platforms=context.CLIARGS['platforms'],
tags=context.CLIARGS['galaxy_tags'], author=context.CLIARGS['author'], page_size=page_size)
if response['count'] == 0:
display.display("No roles match your search.", color=C.COLOR_ERROR)
return True
data = [u'']
if response['count'] > page_size:
data.append(u"Found %d roles matching your search. Showing first %s." % (response['count'], page_size))
else:
data.append(u"Found %d roles matching your search:" % response['count'])
max_len = []
for role in response['results']:
max_len.append(len(role['username'] + '.' + role['name']))
name_len = max(max_len)
format_str = u" %%-%ds %%s" % name_len
data.append(u'')
data.append(format_str % (u"Name", u"Description"))
data.append(format_str % (u"----", u"-----------"))
for role in response['results']:
data.append(format_str % (u'%s.%s' % (role['username'], role['name']), role['description']))
data = u'\n'.join(data)
self.pager(data)
return True
def execute_login(self):
"""
Verify the user's identity via GitHub and retrieve an auth token from Ansible Galaxy.
"""
# Authenticate with github and retrieve a token
if context.CLIARGS['token'] is None:
if C.GALAXY_TOKEN:
github_token = C.GALAXY_TOKEN
else:
login = GalaxyLogin(self.galaxy)
github_token = login.create_github_token()
else:
github_token = context.CLIARGS['token']
galaxy_response = self.api.authenticate(github_token)
if context.CLIARGS['token'] is None and C.GALAXY_TOKEN is None:
# Remove the token we created
login.remove_github_token()
# Store the Galaxy token
token = GalaxyToken()
token.set(galaxy_response['token'])
display.display("Successfully logged into Galaxy as %s" % galaxy_response['username'])
return 0
def execute_import(self):
""" used to import a role into Ansible Galaxy """
colors = {
'INFO': 'normal',
'WARNING': C.COLOR_WARN,
'ERROR': C.COLOR_ERROR,
'SUCCESS': C.COLOR_OK,
'FAILED': C.COLOR_ERROR,
}
github_user = to_text(context.CLIARGS['github_user'], errors='surrogate_or_strict')
github_repo = to_text(context.CLIARGS['github_repo'], errors='surrogate_or_strict')
if context.CLIARGS['check_status']:
task = self.api.get_import_task(github_user=github_user, github_repo=github_repo)
else:
# Submit an import request
task = self.api.create_import_task(github_user, github_repo,
reference=context.CLIARGS['reference'],
role_name=context.CLIARGS['role_name'])
if len(task) > 1:
# found multiple roles associated with github_user/github_repo
display.display("WARNING: More than one Galaxy role associated with Github repo %s/%s." % (github_user, github_repo),
color='yellow')
display.display("The following Galaxy roles are being updated:" + u'\n', color=C.COLOR_CHANGED)
for t in task:
display.display('%s.%s' % (t['summary_fields']['role']['namespace'], t['summary_fields']['role']['name']), color=C.COLOR_CHANGED)
display.display(u'\nTo properly namespace this role, remove each of the above and re-import %s/%s from scratch' % (github_user, github_repo),
color=C.COLOR_CHANGED)
return 0
# found a single role as expected
display.display("Successfully submitted import request %d" % task[0]['id'])
if not context.CLIARGS['wait']:
display.display("Role name: %s" % task[0]['summary_fields']['role']['name'])
display.display("Repo: %s/%s" % (task[0]['github_user'], task[0]['github_repo']))
if context.CLIARGS['check_status'] or context.CLIARGS['wait']:
# Get the status of the import
msg_list = []
finished = False
while not finished:
task = self.api.get_import_task(task_id=task[0]['id'])
for msg in task[0]['summary_fields']['task_messages']:
if msg['id'] not in msg_list:
display.display(msg['message_text'], color=colors[msg['message_type']])
msg_list.append(msg['id'])
if task[0]['state'] in ['SUCCESS', 'FAILED']:
finished = True
else:
time.sleep(10)
return 0
def execute_setup(self):
""" Setup an integration from Github or Travis for Ansible Galaxy roles"""
if context.CLIARGS['setup_list']:
# List existing integration secrets
secrets = self.api.list_secrets()
if len(secrets) == 0:
# None found
display.display("No integrations found.")
return 0
display.display(u'\n' + "ID Source Repo", color=C.COLOR_OK)
display.display("---------- ---------- ----------", color=C.COLOR_OK)
for secret in secrets:
display.display("%-10s %-10s %s/%s" % (secret['id'], secret['source'], secret['github_user'],
secret['github_repo']), color=C.COLOR_OK)
return 0
if context.CLIARGS['remove_id']:
# Remove a secret
self.api.remove_secret(context.CLIARGS['remove_id'])
display.display("Secret removed. Integrations using this secret will not longer work.", color=C.COLOR_OK)
return 0
source = context.CLIARGS['source']
github_user = context.CLIARGS['github_user']
github_repo = context.CLIARGS['github_repo']
secret = context.CLIARGS['secret']
resp = self.api.add_secret(source, github_user, github_repo, secret)
display.display("Added integration for %s %s/%s" % (resp['source'], resp['github_user'], resp['github_repo']))
return 0
def execute_delete(self):
""" Delete a role from Ansible Galaxy. """
github_user = context.CLIARGS['github_user']
github_repo = context.CLIARGS['github_repo']
resp = self.api.delete_role(github_user, github_repo)
if len(resp['deleted_roles']) > 1:
display.display("Deleted the following roles:")
display.display("ID User Name")
display.display("------ --------------- ----------")
for role in resp['deleted_roles']:
display.display("%-8s %-15s %s" % (role.id, role.namespace, role.name))
display.display(resp['status'])
return True
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 66,534 |
ansible-galaxy cli should have an option to keep the collection tarball when installing collection
|
<!--- Verify first that your feature was not already discussed on GitHub -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Describe the new feature/improvement briefly below -->
The use case is for customers on air-gapped networks who want to carry the tarball into their network on a flash drive.
Currently that is not possible because the CLI does not keep the tarball after installing the collection.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ansible-galaxy cli
##### ADDITIONAL INFORMATION
<!--- Describe how the feature would be used, why it is needed and what it would solve -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can also paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/66534
|
https://github.com/ansible/ansible/pull/67632
|
28f8b8976022728b24534cae871d2b3c8724ecce
|
a2deeb8fa27633194d12dfd8e8768ab57100e6d1
| 2020-01-16T15:02:42Z |
python
| 2020-03-24T22:32:43Z |
lib/ansible/galaxy/collection.py
|
# Copyright: (c) 2019, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import fnmatch
import json
import operator
import os
import shutil
import stat
import sys
import tarfile
import tempfile
import threading
import time
import yaml
from collections import namedtuple
from contextlib import contextmanager
from distutils.version import LooseVersion
from hashlib import sha256
from io import BytesIO
from yaml.error import YAMLError
try:
import queue
except ImportError:
import Queue as queue # Python 2
import ansible.constants as C
from ansible.errors import AnsibleError
from ansible.galaxy import get_collections_galaxy_meta_info
from ansible.galaxy.api import CollectionVersionMetadata, GalaxyError
from ansible.galaxy.user_agent import user_agent
from ansible.module_utils import six
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.utils.collection_loader import AnsibleCollectionRef
from ansible.utils.display import Display
from ansible.utils.hashing import secure_hash, secure_hash_s
from ansible.utils.version import SemanticVersion
from ansible.module_utils.urls import open_url
urlparse = six.moves.urllib.parse.urlparse
urllib_error = six.moves.urllib.error
display = Display()
MANIFEST_FORMAT = 1
ModifiedContent = namedtuple('ModifiedContent', ['filename', 'expected', 'installed'])
class CollectionRequirement:
_FILE_MAPPING = [(b'MANIFEST.json', 'manifest_file'), (b'FILES.json', 'files_file')]
def __init__(self, namespace, name, b_path, api, versions, requirement, force, parent=None, metadata=None,
files=None, skip=False, allow_pre_releases=False):
"""
Represents a collection requirement, the versions that are available to be installed as well as any
dependencies the collection has.
:param namespace: The collection namespace.
:param name: The collection name.
:param b_path: Byte str of the path to the collection tarball if it has already been downloaded.
:param api: The GalaxyAPI to use if the collection is from Galaxy.
:param versions: A list of versions of the collection that are available.
:param requirement: The version requirement string used to verify the list of versions fit the requirements.
:param force: Whether the force flag was applied to the collection.
:param parent: The name of the parent the collection is a dependency of.
:param metadata: The galaxy.api.CollectionVersionMetadata that has already been retrieved from the Galaxy
server.
:param files: The files that exist inside the collection. This is based on the FILES.json file inside the
collection artifact.
:param skip: Whether to skip installing the collection. Should be set if the collection is already installed
and force is not set.
:param allow_pre_releases: Whether to allow pre-release versions of collections.
"""
self.namespace = namespace
self.name = name
self.b_path = b_path
self.api = api
self._versions = set(versions)
self.force = force
self.skip = skip
self.required_by = []
self.allow_pre_releases = allow_pre_releases
self._metadata = metadata
self._files = files
self.add_requirement(parent, requirement)
def __str__(self):
return to_native("%s.%s" % (self.namespace, self.name))
def __unicode__(self):
return u"%s.%s" % (self.namespace, self.name)
@property
def metadata(self):
self._get_metadata()
return self._metadata
@property
def versions(self):
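# Filters out pre-release versions (per SemVer) unless allow_pre_releases is set; for example
# (assumed values), {'1.0.0', '1.1.0-beta.1'} would be reduced to {'1.0.0'}. The '*' wildcard
# is always kept.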
if self.allow_pre_releases:
return self._versions
return set(v for v in self._versions if v == '*' or not SemanticVersion(v).is_prerelease)
@versions.setter
def versions(self, value):
self._versions = set(value)
@property
def pre_releases(self):
return set(v for v in self._versions if SemanticVersion(v).is_prerelease)
@property
def latest_version(self):
try:
return max([v for v in self.versions if v != '*'], key=SemanticVersion)
except ValueError: # ValueError: max() arg is an empty sequence
return '*'
@property
def dependencies(self):
if not self._metadata:
if len(self.versions) > 1:
return {}
self._get_metadata()
dependencies = self._metadata.dependencies
if dependencies is None:
return {}
return dependencies
def add_requirement(self, parent, requirement):
self.required_by.append((parent, requirement))
new_versions = set(v for v in self.versions if self._meets_requirements(v, requirement, parent))
if len(new_versions) == 0:
if self.skip:
force_flag = '--force-with-deps' if parent else '--force'
version = self.latest_version if self.latest_version != '*' else 'unknown'
msg = "Cannot meet requirement %s:%s as it is already installed at version '%s'. Use %s to overwrite" \
% (to_text(self), requirement, version, force_flag)
raise AnsibleError(msg)
elif parent is None:
msg = "Cannot meet requirement %s for dependency %s" % (requirement, to_text(self))
else:
msg = "Cannot meet dependency requirement '%s:%s' for collection %s" \
% (to_text(self), requirement, parent)
collection_source = to_text(self.b_path, nonstring='passthru') or self.api.api_server
req_by = "\n".join(
"\t%s - '%s:%s'" % (to_text(p) if p else 'base', to_text(self), r)
for p, r in self.required_by
)
versions = ", ".join(sorted(self.versions, key=SemanticVersion))
if not self.versions and self.pre_releases:
pre_release_msg = (
'\nThis collection only contains pre-releases. Utilize `--pre` to install pre-releases, or '
'explicitly provide the pre-release version.'
)
else:
pre_release_msg = ''
raise AnsibleError(
"%s from source '%s'. Available versions before last requirement added: %s\nRequirements from:\n%s%s"
% (msg, collection_source, versions, req_by, pre_release_msg)
)
self.versions = new_versions
def install(self, path, b_temp_path):
if self.skip:
display.display("Skipping '%s' as it is already installed" % to_text(self))
return
# Install if it is not
collection_path = os.path.join(path, self.namespace, self.name)
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
display.display("Installing '%s:%s' to '%s'" % (to_text(self), self.latest_version, collection_path))
if self.b_path is None:
download_url = self._metadata.download_url
artifact_hash = self._metadata.artifact_sha256
headers = {}
self.api._add_auth_token(headers, download_url, required=False)
self.b_path = _download_file(download_url, b_temp_path, artifact_hash, self.api.validate_certs,
headers=headers)
if os.path.exists(b_collection_path):
shutil.rmtree(b_collection_path)
os.makedirs(b_collection_path)
with tarfile.open(self.b_path, mode='r') as collection_tar:
files_member_obj = collection_tar.getmember('FILES.json')
with _tarfile_extract(collection_tar, files_member_obj) as files_obj:
files = json.loads(to_text(files_obj.read(), errors='surrogate_or_strict'))
_extract_tar_file(collection_tar, 'MANIFEST.json', b_collection_path, b_temp_path)
_extract_tar_file(collection_tar, 'FILES.json', b_collection_path, b_temp_path)
for file_info in files['files']:
file_name = file_info['name']
if file_name == '.':
continue
if file_info['ftype'] == 'file':
_extract_tar_file(collection_tar, file_name, b_collection_path, b_temp_path,
expected_hash=file_info['chksum_sha256'])
else:
os.makedirs(os.path.join(b_collection_path, to_bytes(file_name, errors='surrogate_or_strict')))
def set_latest_version(self):
self.versions = set([self.latest_version])
self._get_metadata()
def verify(self, remote_collection, path, b_temp_tar_path):
if not self.skip:
display.display("'%s' has not been installed, nothing to verify" % (to_text(self)))
return
collection_path = os.path.join(path, self.namespace, self.name)
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
display.vvv("Verifying '%s:%s'." % (to_text(self), self.latest_version))
display.vvv("Installed collection found at '%s'" % collection_path)
display.vvv("Remote collection found at '%s'" % remote_collection.metadata.download_url)
# Compare installed version versus requirement version
if self.latest_version != remote_collection.latest_version:
err = "%s has the version '%s' but is being compared to '%s'" % (to_text(self), self.latest_version, remote_collection.latest_version)
display.display(err)
return
modified_content = []
# Verify the manifest hash matches before verifying the file manifest
expected_hash = _get_tar_file_hash(b_temp_tar_path, 'MANIFEST.json')
self._verify_file_hash(b_collection_path, 'MANIFEST.json', expected_hash, modified_content)
manifest = _get_json_from_tar_file(b_temp_tar_path, 'MANIFEST.json')
# Use the manifest to verify the file manifest checksum
file_manifest_data = manifest['file_manifest_file']
file_manifest_filename = file_manifest_data['name']
expected_hash = file_manifest_data['chksum_%s' % file_manifest_data['chksum_type']]
# Verify the file manifest before using it to verify individual files
self._verify_file_hash(b_collection_path, file_manifest_filename, expected_hash, modified_content)
file_manifest = _get_json_from_tar_file(b_temp_tar_path, file_manifest_filename)
# Use the file manifest to verify individual file checksums
for manifest_data in file_manifest['files']:
if manifest_data['ftype'] == 'file':
expected_hash = manifest_data['chksum_%s' % manifest_data['chksum_type']]
self._verify_file_hash(b_collection_path, manifest_data['name'], expected_hash, modified_content)
if modified_content:
display.display("Collection %s contains modified content in the following files:" % to_text(self))
display.display(to_text(self))
display.vvv(to_text(self.b_path))
for content_change in modified_content:
display.display(' %s' % content_change.filename)
display.vvv(" Expected: %s\n Found: %s" % (content_change.expected, content_change.installed))
else:
display.vvv("Successfully verified that checksums for '%s:%s' match the remote collection" % (to_text(self), self.latest_version))
def _verify_file_hash(self, b_path, filename, expected_hash, error_queue):
b_file_path = to_bytes(os.path.join(to_text(b_path), filename), errors='surrogate_or_strict')
if not os.path.isfile(b_file_path):
actual_hash = None
else:
with open(b_file_path, mode='rb') as file_object:
actual_hash = _consume_file(file_object)
if expected_hash != actual_hash:
error_queue.append(ModifiedContent(filename=filename, expected=expected_hash, installed=actual_hash))
def _get_metadata(self):
if self._metadata:
return
self._metadata = self.api.get_collection_version_metadata(self.namespace, self.name, self.latest_version)
def _meets_requirements(self, version, requirements, parent):
"""
Supported version identifiers are '==', '!=', '>', '>=', '<', '<=', and '*'. Multiple requirements are delimited by ','.
"""
op_map = {
'!=': operator.ne,
'==': operator.eq,
'=': operator.eq,
'>=': operator.ge,
'>': operator.gt,
'<=': operator.le,
'<': operator.lt,
}
for req in list(requirements.split(',')):
op_pos = 2 if len(req) > 1 and req[1] == '=' else 1
op = op_map.get(req[:op_pos])
requirement = req[op_pos:]
if not op:
requirement = req
op = operator.eq
# In the case we are checking a new requirement on a base requirement (parent != None) we can't accept
# version as '*' (unknown version) unless the requirement is also '*'.
if parent and version == '*' and requirement != '*':
display.warning("Failed to validate the collection requirement '%s:%s' for %s when the existing "
"install does not have a version set, the collection may not work."
% (to_text(self), req, parent))
continue
elif requirement == '*' or version == '*':
continue
if not op(SemanticVersion(version), SemanticVersion.from_loose_version(LooseVersion(requirement))):
break
else:
return True
# The loop was broken early, it does not meet all the requirements
return False
@staticmethod
def from_tar(b_path, force, parent=None):
if not tarfile.is_tarfile(b_path):
raise AnsibleError("Collection artifact at '%s' is not a valid tar file." % to_native(b_path))
info = {}
with tarfile.open(b_path, mode='r') as collection_tar:
for b_member_name, property_name in CollectionRequirement._FILE_MAPPING:
n_member_name = to_native(b_member_name)
try:
member = collection_tar.getmember(n_member_name)
except KeyError:
raise AnsibleError("Collection at '%s' does not contain the required file %s."
% (to_native(b_path), n_member_name))
with _tarfile_extract(collection_tar, member) as member_obj:
try:
info[property_name] = json.loads(to_text(member_obj.read(), errors='surrogate_or_strict'))
except ValueError:
raise AnsibleError("Collection tar file member %s does not contain a valid json string."
% n_member_name)
meta = info['manifest_file']['collection_info']
files = info['files_file']['files']
namespace = meta['namespace']
name = meta['name']
version = meta['version']
meta = CollectionVersionMetadata(namespace, name, version, None, None, meta['dependencies'])
if SemanticVersion(version).is_prerelease:
allow_pre_release = True
else:
allow_pre_release = False
return CollectionRequirement(namespace, name, b_path, None, [version], version, force, parent=parent,
metadata=meta, files=files, allow_pre_releases=allow_pre_release)
@staticmethod
def from_path(b_path, force, parent=None):
info = {}
for b_file_name, property_name in CollectionRequirement._FILE_MAPPING:
b_file_path = os.path.join(b_path, b_file_name)
if not os.path.exists(b_file_path):
continue
with open(b_file_path, 'rb') as file_obj:
try:
info[property_name] = json.loads(to_text(file_obj.read(), errors='surrogate_or_strict'))
except ValueError:
raise AnsibleError("Collection file at '%s' does not contain a valid json string."
% to_native(b_file_path))
allow_pre_release = False
if 'manifest_file' in info:
manifest = info['manifest_file']['collection_info']
namespace = manifest['namespace']
name = manifest['name']
version = to_text(manifest['version'], errors='surrogate_or_strict')
try:
_v = SemanticVersion()
_v.parse(version)
if _v.is_prerelease:
allow_pre_release = True
except ValueError:
display.warning("Collection at '%s' does not have a valid version set, falling back to '*'. Found "
"version: '%s'" % (to_text(b_path), version))
version = '*'
dependencies = manifest['dependencies']
else:
display.warning("Collection at '%s' does not have a MANIFEST.json file, cannot detect version."
% to_text(b_path))
parent_dir, name = os.path.split(to_text(b_path, errors='surrogate_or_strict'))
namespace = os.path.split(parent_dir)[1]
version = '*'
dependencies = {}
meta = CollectionVersionMetadata(namespace, name, version, None, None, dependencies)
files = info.get('files_file', {}).get('files', {})
return CollectionRequirement(namespace, name, b_path, None, [version], version, force, parent=parent,
metadata=meta, files=files, skip=True, allow_pre_releases=allow_pre_release)
@staticmethod
def from_name(collection, apis, requirement, force, parent=None, allow_pre_release=False):
namespace, name = collection.split('.', 1)
galaxy_meta = None
for api in apis:
try:
if not (requirement == '*' or requirement.startswith('<') or requirement.startswith('>') or
requirement.startswith('!=')):
# Exact requirement
allow_pre_release = True
if requirement.startswith('='):
requirement = requirement.lstrip('=')
resp = api.get_collection_version_metadata(namespace, name, requirement)
galaxy_meta = resp
versions = [resp.version]
else:
versions = api.get_collection_versions(namespace, name)
except GalaxyError as err:
if err.http_code == 404:
display.vvv("Collection '%s' is not available from server %s %s"
% (collection, api.name, api.api_server))
continue
raise
display.vvv("Collection '%s' obtained from server %s %s" % (collection, api.name, api.api_server))
break
else:
raise AnsibleError("Failed to find collection %s:%s" % (collection, requirement))
req = CollectionRequirement(namespace, name, None, api, versions, requirement, force, parent=parent,
metadata=galaxy_meta, allow_pre_releases=allow_pre_release)
return req
def build_collection(collection_path, output_path, force):
"""
Creates the Ansible collection artifact in a .tar.gz file.
:param collection_path: The path to the collection to build. This should be the directory that contains the
galaxy.yml file.
:param output_path: The path to create the collection build artifact. This should be a directory.
:param force: Whether to overwrite an existing collection build artifact or fail.
:return: The path to the collection build artifact.
"""
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
b_galaxy_path = os.path.join(b_collection_path, b'galaxy.yml')
if not os.path.exists(b_galaxy_path):
raise AnsibleError("The collection galaxy.yml path '%s' does not exist." % to_native(b_galaxy_path))
collection_meta = _get_galaxy_yml(b_galaxy_path)
file_manifest = _build_files_manifest(b_collection_path, collection_meta['namespace'], collection_meta['name'],
collection_meta['build_ignore'])
collection_manifest = _build_manifest(**collection_meta)
collection_output = os.path.join(output_path, "%s-%s-%s.tar.gz" % (collection_meta['namespace'],
collection_meta['name'],
collection_meta['version']))
b_collection_output = to_bytes(collection_output, errors='surrogate_or_strict')
if os.path.exists(b_collection_output):
if os.path.isdir(b_collection_output):
raise AnsibleError("The output collection artifact '%s' already exists, "
"but is a directory - aborting" % to_native(collection_output))
elif not force:
raise AnsibleError("The file '%s' already exists. You can use --force to re-create "
"the collection artifact." % to_native(collection_output))
_build_collection_tar(b_collection_path, b_collection_output, collection_manifest, file_manifest)
def publish_collection(collection_path, api, wait, timeout):
"""
Publish an Ansible collection tarball into an Ansible Galaxy server.
:param collection_path: The path to the collection tarball to publish.
:param api: A GalaxyAPI to publish the collection to.
:param wait: Whether to wait until the import process is complete.
:param timeout: The time in seconds to wait for the import process to finish, 0 is indefinite.
"""
import_uri = api.publish_collection(collection_path)
if wait:
# Galaxy returns a url fragment which differs between v2 and v3. The second to last entry is
# always the task_id, though.
# v2: {"task": "https://galaxy-dev.ansible.com/api/v2/collection-imports/35573/"}
# v3: {"task": "/api/automation-hub/v3/imports/collections/838d1308-a8f4-402c-95cb-7823f3806cd8/"}
task_id = None
for path_segment in reversed(import_uri.split('/')):
if path_segment:
task_id = path_segment
break
if not task_id:
raise AnsibleError("Publishing the collection did not return valid task info. Cannot wait for task status. Returned task info: '%s'" % import_uri)
display.display("Collection has been published to the Galaxy server %s %s" % (api.name, api.api_server))
with _display_progress():
api.wait_import_task(task_id, timeout)
display.display("Collection has been successfully published and imported to the Galaxy server %s %s"
% (api.name, api.api_server))
else:
display.display("Collection has been pushed to the Galaxy server %s %s, not waiting until import has "
"completed due to --no-wait being set. Import task results can be found at %s"
% (api.name, api.api_server, import_uri))
def install_collections(collections, output_path, apis, validate_certs, ignore_errors, no_deps, force, force_deps,
allow_pre_release=False):
"""
Install Ansible collections to the path specified.
:param collections: The collections to install, should be a list of tuples with (name, requirement, Galaxy server).
:param output_path: The path to install the collections to.
:param apis: A list of GalaxyAPIs to query when searching for a collection.
:param validate_certs: Whether to validate the certificates if downloading a tarball.
:param ignore_errors: Whether to ignore any errors when installing the collection.
:param no_deps: Ignore any collection dependencies and only install the base requirements.
:param force: Re-install a collection if it has already been installed.
:param force_deps: Re-install a collection as well as its dependencies if they have already been installed.
"""
existing_collections = find_existing_collections(output_path)
with _tempdir() as b_temp_path:
display.display("Process install dependency map")
with _display_progress():
dependency_map = _build_dependency_map(collections, existing_collections, b_temp_path, apis,
validate_certs, force, force_deps, no_deps,
allow_pre_release=allow_pre_release)
display.display("Starting collection install process")
with _display_progress():
for collection in dependency_map.values():
try:
collection.install(output_path, b_temp_path)
except AnsibleError as err:
if ignore_errors:
display.warning("Failed to install collection %s but skipping due to --ignore-errors being set. "
"Error: %s" % (to_text(collection), to_text(err)))
else:
raise
def validate_collection_name(name):
"""
Validates that a collection name given by the user or a requirements file fits the expected format.
:param name: The input name with optional range specifier split by ':'.
:return: The input value, required for argparse validation.
"""
collection, dummy, dummy = name.partition(':')
if AnsibleCollectionRef.is_valid_collection_name(collection):
return name
raise AnsibleError("Invalid collection name '%s', "
"name must be in the format <namespace>.<collection>. \n"
"Please make sure namespace and collection name contains "
"characters from [a-zA-Z0-9_] only." % name)
def validate_collection_path(collection_path):
""" Ensure a given path ends with 'ansible_collections'
:param collection_path: The path that should end in 'ansible_collections'
:return: collection_path ending in 'ansible_collections' if it does not already.
"""
if os.path.split(collection_path)[1] != 'ansible_collections':
return os.path.join(collection_path, 'ansible_collections')
return collection_path
def verify_collections(collections, search_paths, apis, validate_certs, ignore_errors, allow_pre_release=False):
with _display_progress():
with _tempdir() as b_temp_path:
for collection in collections:
try:
local_collection = None
b_collection = to_bytes(collection[0], errors='surrogate_or_strict')
if os.path.isfile(b_collection) or urlparse(collection[0]).scheme.lower() in ['http', 'https'] or len(collection[0].split('.')) != 2:
raise AnsibleError(message="'%s' is not a valid collection name. The format namespace.name is expected." % collection[0])
collection_name = collection[0]
namespace, name = collection_name.split('.')
collection_version = collection[1]
# Verify local collection exists before downloading it from a galaxy server
for search_path in search_paths:
b_search_path = to_bytes(os.path.join(search_path, namespace, name), errors='surrogate_or_strict')
if os.path.isdir(b_search_path):
local_collection = CollectionRequirement.from_path(b_search_path, False)
break
if local_collection is None:
raise AnsibleError(message='Collection %s is not installed in any of the collection paths.' % collection_name)
# Download collection on a galaxy server for comparison
try:
remote_collection = CollectionRequirement.from_name(collection_name, apis, collection_version, False, parent=None,
allow_pre_release=allow_pre_release)
except AnsibleError as e:
if e.message == 'Failed to find collection %s:%s' % (collection[0], collection[1]):
raise AnsibleError('Failed to find remote collection %s:%s on any of the galaxy servers' % (collection[0], collection[1]))
raise
download_url = remote_collection.metadata.download_url
headers = {}
remote_collection.api._add_auth_token(headers, download_url, required=False)
b_temp_tar_path = _download_file(download_url, b_temp_path, None, validate_certs, headers=headers)
local_collection.verify(remote_collection, search_path, b_temp_tar_path)
except AnsibleError as err:
if ignore_errors:
display.warning("Failed to verify collection %s but skipping due to --ignore-errors being set. "
"Error: %s" % (collection[0], to_text(err)))
else:
raise
@contextmanager
def _tempdir():
b_temp_path = tempfile.mkdtemp(dir=to_bytes(C.DEFAULT_LOCAL_TMP, errors='surrogate_or_strict'))
yield b_temp_path
shutil.rmtree(b_temp_path)
@contextmanager
def _tarfile_extract(tar, member):
tar_obj = tar.extractfile(member)
yield tar_obj
tar_obj.close()
@contextmanager
def _display_progress():
config_display = C.GALAXY_DISPLAY_PROGRESS
display_wheel = sys.stdout.isatty() if config_display is None else config_display
if not display_wheel:
yield
return
def progress(display_queue, actual_display):
actual_display.debug("Starting display_progress display thread")
t = threading.current_thread()
while True:
for c in "|/-\\":
actual_display.display(c + "\b", newline=False)
time.sleep(0.1)
# Display a message from the main thread
while True:
try:
method, args, kwargs = display_queue.get(block=False, timeout=0.1)
except queue.Empty:
break
else:
func = getattr(actual_display, method)
func(*args, **kwargs)
if getattr(t, "finish", False):
actual_display.debug("Received end signal for display_progress display thread")
return
class DisplayThread(object):
def __init__(self, display_queue):
self.display_queue = display_queue
def __getattr__(self, attr):
def call_display(*args, **kwargs):
self.display_queue.put((attr, args, kwargs))
return call_display
# Temporarily override the global display class with our own, which adds the calls to a queue for the thread to process.
global display
old_display = display
try:
display_queue = queue.Queue()
display = DisplayThread(display_queue)
t = threading.Thread(target=progress, args=(display_queue, old_display))
t.daemon = True
t.start()
try:
yield
finally:
t.finish = True
t.join()
except Exception:
# The exception is re-raised so we can be sure the thread is finished and not using the display anymore
raise
finally:
display = old_display
def _get_galaxy_yml(b_galaxy_yml_path):
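# Loads galaxy.yml, validates its keys against the bundled collection metadata schema
# (mandatory, string, list and dict keys), fills in defaults for optional keys and returns
# the resulting dict.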
meta_info = get_collections_galaxy_meta_info()
mandatory_keys = set()
string_keys = set()
list_keys = set()
dict_keys = set()
for info in meta_info:
if info.get('required', False):
mandatory_keys.add(info['key'])
key_list_type = {
'str': string_keys,
'list': list_keys,
'dict': dict_keys,
}[info.get('type', 'str')]
key_list_type.add(info['key'])
all_keys = frozenset(list(mandatory_keys) + list(string_keys) + list(list_keys) + list(dict_keys))
try:
with open(b_galaxy_yml_path, 'rb') as g_yaml:
galaxy_yml = yaml.safe_load(g_yaml)
except YAMLError as err:
raise AnsibleError("Failed to parse the galaxy.yml at '%s' with the following error:\n%s"
% (to_native(b_galaxy_yml_path), to_native(err)))
set_keys = set(galaxy_yml.keys())
missing_keys = mandatory_keys.difference(set_keys)
if missing_keys:
raise AnsibleError("The collection galaxy.yml at '%s' is missing the following mandatory keys: %s"
% (to_native(b_galaxy_yml_path), ", ".join(sorted(missing_keys))))
extra_keys = set_keys.difference(all_keys)
if len(extra_keys) > 0:
display.warning("Found unknown keys in collection galaxy.yml at '%s': %s"
% (to_text(b_galaxy_yml_path), ", ".join(extra_keys)))
# Add the defaults if they have not been set
for optional_string in string_keys:
if optional_string not in galaxy_yml:
galaxy_yml[optional_string] = None
for optional_list in list_keys:
list_val = galaxy_yml.get(optional_list, None)
if list_val is None:
galaxy_yml[optional_list] = []
elif not isinstance(list_val, list):
galaxy_yml[optional_list] = [list_val]
for optional_dict in dict_keys:
if optional_dict not in galaxy_yml:
galaxy_yml[optional_dict] = {}
# license is a builtin var in Python, to avoid confusion we just rename it to license_ids
galaxy_yml['license_ids'] = galaxy_yml['license']
del galaxy_yml['license']
return galaxy_yml
def _build_files_manifest(b_collection_path, namespace, name, ignore_patterns):
# We always ignore .pyc and .retry files as well as some well known version control directories. The ignore
# patterns can be extended by the build_ignore key in galaxy.yml
b_ignore_patterns = [
b'galaxy.yml',
b'*.pyc',
b'*.retry',
b'tests/output', # Ignore ansible-test result output directory.
to_bytes('{0}-{1}-*.tar.gz'.format(namespace, name)), # Ignores previously built artifacts in the root dir.
]
b_ignore_patterns += [to_bytes(p) for p in ignore_patterns]
b_ignore_dirs = frozenset([b'CVS', b'.bzr', b'.hg', b'.git', b'.svn', b'__pycache__', b'.tox'])
entry_template = {
'name': None,
'ftype': None,
'chksum_type': None,
'chksum_sha256': None,
'format': MANIFEST_FORMAT
}
manifest = {
'files': [
{
'name': '.',
'ftype': 'dir',
'chksum_type': None,
'chksum_sha256': None,
'format': MANIFEST_FORMAT,
},
],
'format': MANIFEST_FORMAT,
}
def _walk(b_path, b_top_level_dir):
for b_item in os.listdir(b_path):
b_abs_path = os.path.join(b_path, b_item)
b_rel_base_dir = b'' if b_path == b_top_level_dir else b_path[len(b_top_level_dir) + 1:]
b_rel_path = os.path.join(b_rel_base_dir, b_item)
rel_path = to_text(b_rel_path, errors='surrogate_or_strict')
if os.path.isdir(b_abs_path):
if any(b_item == b_path for b_path in b_ignore_dirs) or \
any(fnmatch.fnmatch(b_rel_path, b_pattern) for b_pattern in b_ignore_patterns):
display.vvv("Skipping '%s' for collection build" % to_text(b_abs_path))
continue
if os.path.islink(b_abs_path):
b_link_target = os.path.realpath(b_abs_path)
if not b_link_target.startswith(b_top_level_dir):
display.warning("Skipping '%s' as it is a symbolic link to a directory outside the collection"
% to_text(b_abs_path))
continue
manifest_entry = entry_template.copy()
manifest_entry['name'] = rel_path
manifest_entry['ftype'] = 'dir'
manifest['files'].append(manifest_entry)
_walk(b_abs_path, b_top_level_dir)
else:
if any(fnmatch.fnmatch(b_rel_path, b_pattern) for b_pattern in b_ignore_patterns):
display.vvv("Skipping '%s' for collection build" % to_text(b_abs_path))
continue
manifest_entry = entry_template.copy()
manifest_entry['name'] = rel_path
manifest_entry['ftype'] = 'file'
manifest_entry['chksum_type'] = 'sha256'
manifest_entry['chksum_sha256'] = secure_hash(b_abs_path, hash_func=sha256)
manifest['files'].append(manifest_entry)
_walk(b_collection_path, b_collection_path)
return manifest
def _build_manifest(namespace, name, version, authors, readme, tags, description, license_ids, license_file,
dependencies, repository, documentation, homepage, issues, **kwargs):
manifest = {
'collection_info': {
'namespace': namespace,
'name': name,
'version': version,
'authors': authors,
'readme': readme,
'tags': tags,
'description': description,
'license': license_ids,
'license_file': license_file if license_file else None, # Handle galaxy.yml having an empty string (None)
'dependencies': dependencies,
'repository': repository,
'documentation': documentation,
'homepage': homepage,
'issues': issues,
},
'file_manifest_file': {
'name': 'FILES.json',
'ftype': 'file',
'chksum_type': 'sha256',
'chksum_sha256': None, # Filled out in _build_collection_tar
'format': MANIFEST_FORMAT
},
'format': MANIFEST_FORMAT,
}
return manifest
def _build_collection_tar(b_collection_path, b_tar_path, collection_manifest, file_manifest):
files_manifest_json = to_bytes(json.dumps(file_manifest, indent=True), errors='surrogate_or_strict')
collection_manifest['file_manifest_file']['chksum_sha256'] = secure_hash_s(files_manifest_json, hash_func=sha256)
collection_manifest_json = to_bytes(json.dumps(collection_manifest, indent=True), errors='surrogate_or_strict')
with _tempdir() as b_temp_path:
b_tar_filepath = os.path.join(b_temp_path, os.path.basename(b_tar_path))
with tarfile.open(b_tar_filepath, mode='w:gz') as tar_file:
# Add the MANIFEST.json and FILES.json file to the archive
for name, b in [('MANIFEST.json', collection_manifest_json), ('FILES.json', files_manifest_json)]:
b_io = BytesIO(b)
tar_info = tarfile.TarInfo(name)
tar_info.size = len(b)
tar_info.mtime = time.time()
tar_info.mode = 0o0644
tar_file.addfile(tarinfo=tar_info, fileobj=b_io)
for file_info in file_manifest['files']:
if file_info['name'] == '.':
continue
# arcname expects a native string, cannot be bytes
filename = to_native(file_info['name'], errors='surrogate_or_strict')
b_src_path = os.path.join(b_collection_path, to_bytes(filename, errors='surrogate_or_strict'))
def reset_stat(tarinfo):
existing_is_exec = tarinfo.mode & stat.S_IXUSR
tarinfo.mode = 0o0755 if existing_is_exec or tarinfo.isdir() else 0o0644
tarinfo.uid = tarinfo.gid = 0
tarinfo.uname = tarinfo.gname = ''
return tarinfo
tar_file.add(os.path.realpath(b_src_path), arcname=filename, recursive=False, filter=reset_stat)
shutil.copy(b_tar_filepath, b_tar_path)
collection_name = "%s.%s" % (collection_manifest['collection_info']['namespace'],
collection_manifest['collection_info']['name'])
display.display('Created collection for %s at %s' % (collection_name, to_text(b_tar_path)))
def find_existing_collections(path):
collections = []
b_path = to_bytes(path, errors='surrogate_or_strict')
for b_namespace in os.listdir(b_path):
b_namespace_path = os.path.join(b_path, b_namespace)
if os.path.isfile(b_namespace_path):
continue
for b_collection in os.listdir(b_namespace_path):
b_collection_path = os.path.join(b_namespace_path, b_collection)
if os.path.isdir(b_collection_path):
req = CollectionRequirement.from_path(b_collection_path, False)
display.vvv("Found installed collection %s:%s at '%s'" % (to_text(req), req.latest_version,
to_text(b_collection_path)))
collections.append(req)
return collections
def _build_dependency_map(collections, existing_collections, b_temp_path, apis, validate_certs, force, force_deps,
no_deps, allow_pre_release=False):
dependency_map = {}
# First build the dependency map on the actual requirements
for name, version, source in collections:
_get_collection_info(dependency_map, existing_collections, name, version, source, b_temp_path, apis,
validate_certs, (force or force_deps), allow_pre_release=allow_pre_release)
checked_parents = set([to_text(c) for c in dependency_map.values() if c.skip])
while len(dependency_map) != len(checked_parents):
while not no_deps: # Only parse dependencies if no_deps was not set
parents_to_check = set(dependency_map.keys()).difference(checked_parents)
deps_exhausted = True
for parent in parents_to_check:
parent_info = dependency_map[parent]
if parent_info.dependencies:
deps_exhausted = False
for dep_name, dep_requirement in parent_info.dependencies.items():
_get_collection_info(dependency_map, existing_collections, dep_name, dep_requirement,
parent_info.api, b_temp_path, apis, validate_certs, force_deps,
parent=parent, allow_pre_release=allow_pre_release)
checked_parents.add(parent)
# No extra dependencies were resolved, exit loop
if deps_exhausted:
break
# Now we have resolved the deps to our best extent, now select the latest version for collections with
# multiple versions found and go from there
deps_not_checked = set(dependency_map.keys()).difference(checked_parents)
for collection in deps_not_checked:
dependency_map[collection].set_latest_version()
if no_deps or len(dependency_map[collection].dependencies) == 0:
checked_parents.add(collection)
return dependency_map
def _get_collection_info(dep_map, existing_collections, collection, requirement, source, b_temp_path, apis,
validate_certs, force, parent=None, allow_pre_release=False):
dep_msg = ""
if parent:
dep_msg = " - as dependency of %s" % parent
display.vvv("Processing requirement collection '%s'%s" % (to_text(collection), dep_msg))
b_tar_path = None
if os.path.isfile(to_bytes(collection, errors='surrogate_or_strict')):
display.vvvv("Collection requirement '%s' is a tar artifact" % to_text(collection))
b_tar_path = to_bytes(collection, errors='surrogate_or_strict')
elif urlparse(collection).scheme.lower() in ['http', 'https']:
display.vvvv("Collection requirement '%s' is a URL to a tar artifact" % collection)
try:
b_tar_path = _download_file(collection, b_temp_path, None, validate_certs)
except urllib_error.URLError as err:
raise AnsibleError("Failed to download collection tar from '%s': %s"
% (to_native(collection), to_native(err)))
if b_tar_path:
req = CollectionRequirement.from_tar(b_tar_path, force, parent=parent)
collection_name = to_text(req)
if collection_name in dep_map:
collection_info = dep_map[collection_name]
collection_info.add_requirement(None, req.latest_version)
else:
collection_info = req
else:
validate_collection_name(collection)
display.vvvv("Collection requirement '%s' is the name of a collection" % collection)
if collection in dep_map:
collection_info = dep_map[collection]
collection_info.add_requirement(parent, requirement)
else:
apis = [source] if source else apis
collection_info = CollectionRequirement.from_name(collection, apis, requirement, force, parent=parent,
allow_pre_release=allow_pre_release)
existing = [c for c in existing_collections if to_text(c) == to_text(collection_info)]
if existing and not collection_info.force:
# Test that the installed collection fits the requirement
existing[0].add_requirement(parent, requirement)
collection_info = existing[0]
dep_map[to_text(collection_info)] = collection_info
def _download_file(url, b_path, expected_hash, validate_certs, headers=None):
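# Downloads the URL into a uniquely named temporary file under b_path and, when expected_hash
# is provided, verifies the SHA256 digest of the downloaded bytes against it.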
urlsplit = os.path.splitext(to_text(url.rsplit('/', 1)[1]))
b_file_name = to_bytes(urlsplit[0], errors='surrogate_or_strict')
b_file_ext = to_bytes(urlsplit[1], errors='surrogate_or_strict')
b_file_path = tempfile.NamedTemporaryFile(dir=b_path, prefix=b_file_name, suffix=b_file_ext, delete=False).name
display.vvv("Downloading %s to %s" % (url, to_text(b_path)))
# Galaxy redirects downloads to S3, which rejects the request if an Authorization header is attached, so don't forward that header on redirect
resp = open_url(to_native(url, errors='surrogate_or_strict'), validate_certs=validate_certs, headers=headers,
unredirected_headers=['Authorization'], http_agent=user_agent())
with open(b_file_path, 'wb') as download_file:
actual_hash = _consume_file(resp, download_file)
if expected_hash:
display.vvvv("Validating downloaded file hash %s with expected hash %s" % (actual_hash, expected_hash))
if expected_hash != actual_hash:
raise AnsibleError("Mismatch artifact hash with downloaded file")
return b_file_path
def _extract_tar_file(tar, filename, b_dest, b_temp_path, expected_hash=None):
with _get_tar_file_member(tar, filename) as tar_obj:
with tempfile.NamedTemporaryFile(dir=b_temp_path, delete=False) as tmpfile_obj:
actual_hash = _consume_file(tar_obj, tmpfile_obj)
if expected_hash and actual_hash != expected_hash:
raise AnsibleError("Checksum mismatch for '%s' inside collection at '%s'"
% (to_native(filename, errors='surrogate_or_strict'), to_native(tar.name)))
b_dest_filepath = os.path.join(b_dest, to_bytes(filename, errors='surrogate_or_strict'))
b_parent_dir = os.path.split(b_dest_filepath)[0]
if not os.path.exists(b_parent_dir):
# Seems like Galaxy does not validate if all file entries have a corresponding dir ftype entry. This check
# makes sure we create the parent directory even if it wasn't set in the metadata.
os.makedirs(b_parent_dir, mode=0o0755)
shutil.move(to_bytes(tmpfile_obj.name, errors='surrogate_or_strict'), b_dest_filepath)
# Default to rw-r--r-- and only add execute if the tar file has execute.
tar_member = tar.getmember(to_native(filename, errors='surrogate_or_strict'))
new_mode = 0o644
if stat.S_IMODE(tar_member.mode) & stat.S_IXUSR:
new_mode |= 0o0111
os.chmod(b_dest_filepath, new_mode)
def _get_tar_file_member(tar, filename):
n_filename = to_native(filename, errors='surrogate_or_strict')
try:
member = tar.getmember(n_filename)
except KeyError:
raise AnsibleError("Collection tar at '%s' does not contain the expected file '%s'." % (
to_native(tar.name),
n_filename))
return _tarfile_extract(tar, member)
def _get_json_from_tar_file(b_path, filename):
file_contents = ''
with tarfile.open(b_path, mode='r') as collection_tar:
with _get_tar_file_member(collection_tar, filename) as tar_obj:
bufsize = 65536
data = tar_obj.read(bufsize)
while data:
file_contents += to_text(data)
data = tar_obj.read(bufsize)
return json.loads(file_contents)
def _get_tar_file_hash(b_path, filename):
with tarfile.open(b_path, mode='r') as collection_tar:
with _get_tar_file_member(collection_tar, filename) as tar_obj:
return _consume_file(tar_obj)
def _consume_file(read_from, write_to=None):
bufsize = 65536
sha256_digest = sha256()
data = read_from.read(bufsize)
while data:
if write_to is not None:
write_to.write(data)
write_to.flush()
sha256_digest.update(data)
data = read_from.read(bufsize)
return sha256_digest.hexdigest()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 66,534 |
ansible-galaxy cli should have an option to keep the collection tarball when installing collection
|
<!--- Verify first that your feature was not already discussed on GitHub -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Describe the new feature/improvement briefly below -->
The use case is for customers on air-gapped networks who want to carry the tarball into their network on a flash drive.
Currently that is not possible because the CLI does not keep the tarball after installing the collection.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ansible-galaxy cli
##### ADDITIONAL INFORMATION
<!--- Describe how the feature would be used, why it is needed and what it would solve -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can also paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/66534
|
https://github.com/ansible/ansible/pull/67632
|
28f8b8976022728b24534cae871d2b3c8724ecce
|
a2deeb8fa27633194d12dfd8e8768ab57100e6d1
| 2020-01-16T15:02:42Z |
python
| 2020-03-24T22:32:43Z |
test/integration/targets/ansible-galaxy-collection/tasks/download.yml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 66,534 |
ansible-galaxy cli should have an option to keep the collection tarball when installing collection
|
<!--- Verify first that your feature was not already discussed on GitHub -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Describe the new feature/improvement briefly below -->
The use case is for customers on air-gapped networks who want to move the tarball inside their network using a flash drive.
Currently that is not possible because the CLI does not keep the tarball after installing the collection.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ansible-galaxy cli
##### ADDITIONAL INFORMATION
<!--- Describe how the feature would be used, why it is needed and what it would solve -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can also paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/66534
|
https://github.com/ansible/ansible/pull/67632
|
28f8b8976022728b24534cae871d2b3c8724ecce
|
a2deeb8fa27633194d12dfd8e8768ab57100e6d1
| 2020-01-16T15:02:42Z |
python
| 2020-03-24T22:32:43Z |
test/integration/targets/ansible-galaxy-collection/tasks/main.yml
|
---
- name: set some facts for tests
set_fact:
galaxy_dir: "{{ remote_tmp_dir }}/galaxy"
- name: create scratch dir used for testing
file:
path: '{{ galaxy_dir }}/scratch'
state: directory
- name: run ansible-galaxy collection init tests
import_tasks: init.yml
- name: run ansible-galaxy collection build tests
import_tasks: build.yml
- name: create test ansible.cfg that contains the Galaxy server list
template:
src: ansible.cfg.j2
dest: '{{ galaxy_dir }}/ansible.cfg'
- name: run ansible-galaxy collection publish tests for {{ test_name }}
include_tasks: publish.yml
vars:
test_name: '{{ item.name }}'
test_server: '{{ item.server }}'
with_items:
- name: galaxy
server: '{{ fallaxy_galaxy_server }}'
- name: automation_hub
server: '{{ fallaxy_ah_server }}'
# We use a module for this so we can speed up the test time.
- name: setup test collections for install test
setup_collections:
server: '{{ fallaxy_galaxy_server }}'
token: '{{ fallaxy_token }}'
collections:
# Scenario to test out pre-release being ignored unless explicitly set and version pagination.
- namespace: namespace1
name: name1
version: 0.0.1
- namespace: namespace1
name: name1
version: 0.0.2
- namespace: namespace1
name: name1
version: 0.0.3
- namespace: namespace1
name: name1
version: 0.0.4
- namespace: namespace1
name: name1
version: 0.0.5
- namespace: namespace1
name: name1
version: 0.0.6
- namespace: namespace1
name: name1
version: 0.0.7
- namespace: namespace1
name: name1
version: 0.0.8
- namespace: namespace1
name: name1
version: 0.0.9
- namespace: namespace1
name: name1
version: 0.0.10
- namespace: namespace1
name: name1
version: 0.1.0
- namespace: namespace1
name: name1
version: 1.0.0
- namespace: namespace1
name: name1
version: 1.0.9
- namespace: namespace1
name: name1
version: 1.1.0-beta.1
# Pad out number of namespaces for pagination testing
- namespace: namespace2
name: name
- namespace: namespace3
name: name
- namespace: namespace4
name: name
- namespace: namespace5
name: name
- namespace: namespace6
name: name
- namespace: namespace7
name: name
- namespace: namespace8
name: name
- namespace: namespace9
name: name
# Complex dependency resolution
- namespace: parent_dep
name: parent_collection
dependencies:
child_dep.child_collection: '>=0.5.0,<1.0.0'
- namespace: child_dep
name: child_collection
version: 0.4.0
- namespace: child_dep
name: child_collection
version: 0.5.0
- namespace: child_dep
name: child_collection
version: 0.9.9
dependencies:
child_dep.child_dep2: '!=1.2.3'
- namespace: child_dep
name: child_collection
- namespace: child_dep
name: child_dep2
version: 1.2.2
- namespace: child_dep
name: child_dep2
version: 1.2.3
# Dep resolution failure
- namespace: fail_namespace
name: fail_collection
version: 2.1.2
dependencies:
fail_dep.name: '0.0.5'
fail_dep2.name: '<0.0.5'
- namespace: fail_dep
name: name
version: '0.0.5'
dependencies:
fail_dep2.name: '>0.0.5'
- namespace: fail_dep2
name: name
- name: run ansible-galaxy collection install tests for {{ test_name }}
include_tasks: install.yml
vars:
test_name: '{{ item.name }}'
test_server: '{{ item.server }}'
with_items:
- name: galaxy
server: '{{ fallaxy_galaxy_server }}'
- name: automation_hub
server: '{{ fallaxy_ah_server }}'
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,060 |
Extend FQCN behavior to ActionModule._execute_module()
|
<!--- Verify first that your feature was not already discussed on GitHub -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Describe the new feature/improvement briefly below -->
Action plugins should be able to utilize modules provided in separate collections. I can't seem to get the desired behavior using standard methods.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ActionModule._execute_module()
##### ADDITIONAL INFORMATION
<!--- Describe how the feature would be used, why it is needed and what it would solve -->
Collections may benefit from the ability to leverage another module (which is located in a separate collection) as part of their own action plugins.
<!--- Paste example playbooks or commands between quotes below -->
```python
class ActionModule(ActionBase):
def run(self, tmp=None, task_vars=None):
''' handler for file transfer operations '''
if task_vars is None:
task_vars = dict()
result = super(ActionModule, self).run(tmp, task_vars)
if result.get('skipped'):
return result
module_args = self._task.args.copy()
result.update(
            self._execute_module(
module_name='ansible.collectionname.modulename',
module_args=module_args,
task_vars=task_vars,
)
)
return result
```
<!--- HINT: You can also paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/68060
|
https://github.com/ansible/ansible/pull/68080
|
6acaf9fa9521fb6462d8e89e5e8f0248dca80383
|
ecd66a6a6e4195a7e7fe734701a8762a059132c1
| 2020-03-06T00:18:15Z |
python
| 2020-03-25T15:57:53Z |
test/integration/targets/collections/collections/ansible_collections/me/mycoll1/plugins/action/action1.py
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,060 |
Extend FQCN behavior to ActionModule._execute_module()
|
<!--- Verify first that your feature was not already discussed on GitHub -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Describe the new feature/improvement briefly below -->
Action plugins should be able to utilize modules provided in separate collections. I can't seem to get the desired behavior using standard methods.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ActionModule._execute_module()
##### ADDITIONAL INFORMATION
<!--- Describe how the feature would be used, why it is needed and what it would solve -->
Collections may benefit from the ability to leverage another module (which is located in a separate collection) as part of their own action plugins.
<!--- Paste example playbooks or commands between quotes below -->
```python
class ActionModule(ActionBase):
def run(self, tmp=None, task_vars=None):
''' handler for file transfer operations '''
if task_vars is None:
task_vars = dict()
result = super(ActionModule, self).run(tmp, task_vars)
if result.get('skipped'):
return result
module_args = self._task.args.copy()
result.update(
            self._execute_module(
module_name='ansible.collectionname.modulename',
module_args=module_args,
task_vars=task_vars,
)
)
return result
```
<!--- HINT: You can also paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/68060
|
https://github.com/ansible/ansible/pull/68080
|
6acaf9fa9521fb6462d8e89e5e8f0248dca80383
|
ecd66a6a6e4195a7e7fe734701a8762a059132c1
| 2020-03-06T00:18:15Z |
python
| 2020-03-25T15:57:53Z |
test/integration/targets/collections/collections/ansible_collections/me/mycoll1/plugins/modules/action1.py
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,060 |
Extend FQCN behavior to ActionModule._execute_module()
|
<!--- Verify first that your feature was not already discussed on GitHub -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Describe the new feature/improvement briefly below -->
Action plugins should be able to utilize modules provided in separate collections. I can't seem to get the desired behavior using standard methods.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ActionModule._execute_module()
##### ADDITIONAL INFORMATION
<!--- Describe how the feature would be used, why it is needed and what it would solve -->
Collections may benefit from the ability to leverage another module (which is located in a separate collection) as part of their own action plugins.
<!--- Paste example playbooks or commands between quotes below -->
```python
class ActionModule(ActionBase):
def run(self, tmp=None, task_vars=None):
''' handler for file transfer operations '''
if task_vars is None:
task_vars = dict()
result = super(ActionModule, self).run(tmp, task_vars)
if result.get('skipped'):
return result
module_args = self._task.args.copy()
result.update(
            self._execute_module(
module_name='ansible.collectionname.modulename',
module_args=module_args,
task_vars=task_vars,
)
)
return result
```
<!--- HINT: You can also paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/68060
|
https://github.com/ansible/ansible/pull/68080
|
6acaf9fa9521fb6462d8e89e5e8f0248dca80383
|
ecd66a6a6e4195a7e7fe734701a8762a059132c1
| 2020-03-06T00:18:15Z |
python
| 2020-03-25T15:57:53Z |
test/integration/targets/collections/collections/ansible_collections/me/mycoll2/plugins/modules/module1.py
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,060 |
Extend FQCN behavior to ActionModule._execute_module()
|
<!--- Verify first that your feature was not already discussed on GitHub -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Describe the new feature/improvement briefly below -->
Action plugins should be able to utilize modules provided in separate collections. I can't seem to get the desired behavior using standard methods.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ActionModule._execute_module()
##### ADDITIONAL INFORMATION
<!--- Describe how the feature would be used, why it is needed and what it would solve -->
Collections may benefit from the ability to leverage another module (which is located in a separate collection) as part of their own action plugins.
<!--- Paste example playbooks or commands between quotes below -->
```python
class ActionModule(ActionBase):
def run(self, tmp=None, task_vars=None):
''' handler for file transfer operations '''
if task_vars is None:
task_vars = dict()
result = super(ActionModule, self).run(tmp, task_vars)
if result.get('skipped'):
return result
module_args = self._task.args.copy()
result.update(
            self._execute_module(
module_name='ansible.collectionname.modulename',
module_args=module_args,
task_vars=task_vars,
)
)
return result
```
<!--- HINT: You can also paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/68060
|
https://github.com/ansible/ansible/pull/68080
|
6acaf9fa9521fb6462d8e89e5e8f0248dca80383
|
ecd66a6a6e4195a7e7fe734701a8762a059132c1
| 2020-03-06T00:18:15Z |
python
| 2020-03-25T15:57:53Z |
test/integration/targets/collections/invocation_tests.yml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,060 |
Extend FQCN behavior to ActionModule._execute_module()
|
<!--- Verify first that your feature was not already discussed on GitHub -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Describe the new feature/improvement briefly below -->
Action plugins should be able to utilize modules provided in separate collections. I can't seem to get the desired behavior using standard methods.
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ActionModule._execute_module()
##### ADDITIONAL INFORMATION
<!--- Describe how the feature would be used, why it is needed and what it would solve -->
Collections may benefit from the ability to leverage another module (which is located in a separate collection) as part of their own action plugins.
<!--- Paste example playbooks or commands between quotes below -->
```python
class ActionModule(ActionBase):
def run(self, tmp=None, task_vars=None):
''' handler for file transfer operations '''
if task_vars is None:
task_vars = dict()
result = super(ActionModule, self).run(tmp, task_vars)
if result.get('skipped'):
return result
module_args = self._task.args.copy()
result.update(
            self._execute_module(
module_name='ansible.collectionname.modulename',
module_args=module_args,
task_vars=task_vars,
)
)
return result
```
<!--- HINT: You can also paste gist.github.com links for larger files -->
|
https://github.com/ansible/ansible/issues/68060
|
https://github.com/ansible/ansible/pull/68080
|
6acaf9fa9521fb6462d8e89e5e8f0248dca80383
|
ecd66a6a6e4195a7e7fe734701a8762a059132c1
| 2020-03-06T00:18:15Z |
python
| 2020-03-25T15:57:53Z |
test/integration/targets/collections/runme.sh
|
#!/usr/bin/env bash
set -eux
export ANSIBLE_COLLECTIONS_PATHS=$PWD/collection_root_user:$PWD/collection_root_sys
export ANSIBLE_GATHERING=explicit
export ANSIBLE_GATHER_SUBSET=minimal
export ANSIBLE_HOST_PATTERN_MISMATCH=error
# FUTURE: just use INVENTORY_PATH as-is once ansible-test sets the right dir
ipath=../../$(basename "${INVENTORY_PATH}")
export INVENTORY_PATH="$ipath"
# test callback
ANSIBLE_CALLBACK_WHITELIST=testns.testcoll.usercallback ansible localhost -m ping | grep "usercallback says ok"
# test documentation
ansible-doc testns.testcoll.testmodule -vvv | grep -- "- normal_doc_frag"
# test adhoc default collection resolution (use unqualified collection module with playbook dir under its collection)
echo "testing adhoc default collection support with explicit playbook dir"
ANSIBLE_PLAYBOOK_DIR=./collection_root_user/ansible_collections/testns/testcoll ansible localhost -m testmodule
echo "testing bad doc_fragments (expected ERROR message follows)"
# test documentation failure
ansible-doc testns.testcoll.testmodule_bad_docfrags -vvv 2>&1 | grep -- "unknown doc_fragment"
# we need multiple plays, and conditional import_playbook is noisy and causes problems, so choose here which one to use...
if [[ ${INVENTORY_PATH} == *.winrm ]]; then
export TEST_PLAYBOOK=windows.yml
else
export TEST_PLAYBOOK=posix.yml
echo "testing default collection support"
ansible-playbook -i "${INVENTORY_PATH}" collection_root_user/ansible_collections/testns/testcoll/playbooks/default_collection_playbook.yml
fi
# run test playbook
ansible-playbook -i "${INVENTORY_PATH}" -i ./a.statichost.yml -v "${TEST_PLAYBOOK}" "$@"
# test adjacent with --playbook-dir
export ANSIBLE_COLLECTIONS_PATHS=''
ANSIBLE_INVENTORY_ANY_UNPARSED_IS_FAILED=1 ansible-inventory -i a.statichost.yml --list --export --playbook-dir=. -v "$@"
./vars_plugin_tests.sh
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,185 |
Ansible Playbook on Windows fails if a UNC Path is present in PSModulePath
|
### SUMMARY
Ansible playbook on Windows fails with something like:
```
powershell The 'Out-String' command was found in the module 'Microsoft.PowerShell.Utility', but the module could not be loaded. For more information, run 'Import-Module Microsoft.PowerShell.Utility'.
```
It was discovered that a [UNC path was present in PSModulePath](https://support.microsoft.com/en-us/help/4076842), which seems to trigger the double-hop problem with certain authentication methods, such as Kerberos.
The PSModulePath contents can be displayed with `$env:PSModulePath`.
This could be documented e.g. in ansible/docs/docsite/rst/user_guide/windows_setup.rst
##### ISSUE TYPE
- Documentation Report
##### COMPONENT NAME
windows_setup.rst
##### ANSIBLE VERSION
```paste below
ansible 2.7.7
config file = /home/administrator/Documents/BGH/automation/ansible/ansible.cfg
configured module search path = ['/home/administrator/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 3.7.3 (default, Dec 20 2019, 18:57:59) [GCC 8.3.0]
```
##### CONFIGURATION
```paste below
DEFAULT_FORKS(/home/administrator/Documents/BGH/automation/ansible/ansible.cfg) = 25
DEFAULT_HOST_LIST(/home/administrator/Documents/BGH/automation/ansible/ansible.cfg) = ['/home/administrator/Documents/BGH/automation/ansible/hosts']
DEFAULT_STDOUT_CALLBACK(/home/administrator/Documents/BGH/automation/ansible/ansible.cfg) = yaml
```
##### OS / ENVIRONMENT
Debian 10, connecting with Ansible to Windows Server 2008 R2 with current updates, .Net 4.8 and PowerShell 5.1
##### ADDITIONAL INFORMATION
A deeper explanation of double-hop issues such as this one could probably save many people a lot of time and make Ansible with Windows more approachable. Not everybody can immediately connect all the dots when the documentation offers only brief hints.
|
https://github.com/ansible/ansible/issues/68185
|
https://github.com/ansible/ansible/pull/68421
|
02e36fbfc27c0948f0e6929d7cecbbca999256dc
|
7ec0d59c30ae508302320d390285785096779004
| 2020-03-12T09:46:43Z |
python
| 2020-03-25T18:09:32Z |
docs/docsite/rst/user_guide/windows_setup.rst
|
.. _windows_setup:
Setting up a Windows Host
=========================
This document discusses the setup that is required before Ansible can communicate with a Microsoft Windows host.
.. contents::
:local:
Host Requirements
`````````````````
For Ansible to communicate to a Windows host and use Windows modules, the
Windows host must meet these requirements:
* Ansible can generally manage Windows versions under current
and extended support from Microsoft. Ansible can manage desktop OSs including
Windows 7, 8.1, and 10, and server OSs including Windows Server 2008,
2008 R2, 2012, 2012 R2, 2016, and 2019.
* Ansible requires PowerShell 3.0 or newer and at least .NET 4.0 to be
installed on the Windows host.
* A WinRM listener should be created and activated. More details for this can be
found below.
.. Note:: While these are the base requirements for Ansible connectivity, some Ansible
modules have additional requirements, such as a newer OS or PowerShell
version. Please consult the module's documentation page
to determine whether a host meets those requirements.
Upgrading PowerShell and .NET Framework
---------------------------------------
Ansible requires PowerShell version 3.0 and .NET Framework 4.0 or newer to function on older operating systems like Server 2008 and Windows 7. The base image does not meet this
requirement. You can use the `Upgrade-PowerShell.ps1 <https://github.com/jborean93/ansible-windows/blob/master/scripts/Upgrade-PowerShell.ps1>`_ script to update these.
This is an example of how to run this script from PowerShell:
.. code-block:: powershell
$url = "https://raw.githubusercontent.com/jborean93/ansible-windows/master/scripts/Upgrade-PowerShell.ps1"
$file = "$env:temp\Upgrade-PowerShell.ps1"
$username = "Administrator"
$password = "Password"
(New-Object -TypeName System.Net.WebClient).DownloadFile($url, $file)
Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Force
# Version can be 3.0, 4.0 or 5.1
&$file -Version 5.1 -Username $username -Password $password -Verbose
Once completed, you will need to remove auto logon
and set the execution policy back to the default of ``Restricted``. You can
do this with the following PowerShell commands:
.. code-block:: powershell
# This isn't needed but is a good security practice to complete
Set-ExecutionPolicy -ExecutionPolicy Restricted -Force
$reg_winlogon_path = "HKLM:\Software\Microsoft\Windows NT\CurrentVersion\Winlogon"
Set-ItemProperty -Path $reg_winlogon_path -Name AutoAdminLogon -Value 0
Remove-ItemProperty -Path $reg_winlogon_path -Name DefaultUserName -ErrorAction SilentlyContinue
Remove-ItemProperty -Path $reg_winlogon_path -Name DefaultPassword -ErrorAction SilentlyContinue
The script works by checking to see what programs need to be installed
(such as .NET Framework 4.5.2) and what PowerShell version is required. If a reboot
is required and the ``username`` and ``password`` parameters are set, the
script will automatically reboot and logon when it comes back up from the
reboot. The script will continue until no more actions are required and the
PowerShell version matches the target version. If the ``username`` and
``password`` parameters are not set, the script will prompt the user to
manually reboot and logon when required. When the user is next logged in, the
script will continue where it left off and the process continues until no more
actions are required.
.. Note:: If running on Server 2008, then SP2 must be installed. If running on
Server 2008 R2 or Windows 7, then SP1 must be installed.
.. Note:: Windows Server 2008 can only install PowerShell 3.0; specifying a
newer version will result in the script failing.
.. Note:: The ``username`` and ``password`` parameters are stored in plain text
in the registry. Make sure the cleanup commands are run after the script finishes
to ensure no credentials are still stored on the host.
WinRM Memory Hotfix
-------------------
When running on PowerShell v3.0, there is a bug with the WinRM service that
limits the amount of memory available to WinRM. Without this hotfix installed,
Ansible will fail to execute certain commands on the Windows host. These
hotfixes should be installed as part of the system bootstrapping or
imaging process. The script `Install-WMF3Hotfix.ps1 <https://github.com/jborean93/ansible-windows/blob/master/scripts/Install-WMF3Hotfix.ps1>`_ can be used to install the hotfix on affected hosts.
The following PowerShell command will install the hotfix:
.. code-block:: powershell
$url = "https://raw.githubusercontent.com/jborean93/ansible-windows/master/scripts/Install-WMF3Hotfix.ps1"
$file = "$env:temp\Install-WMF3Hotfix.ps1"
(New-Object -TypeName System.Net.WebClient).DownloadFile($url, $file)
powershell.exe -ExecutionPolicy ByPass -File $file -Verbose
For more details, please refer to the `Hotfix document <https://support.microsoft.com/en-us/help/2842230/out-of-memory-error-on-a-computer-that-has-a-customized-maxmemorypersh>`_ from Microsoft.
WinRM Setup
```````````
Once PowerShell has been upgraded to at least version 3.0, the final step is for the
WinRM service to be configured so that Ansible can connect to it. There are two
main components of the WinRM service that governs how Ansible can interface with
the Windows host: the ``listener`` and the ``service`` configuration settings.
Details about each component can be read below, but the script
`ConfigureRemotingForAnsible.ps1 <https://github.com/ansible/ansible/blob/devel/examples/scripts/ConfigureRemotingForAnsible.ps1>`_
can be used to set up the basics. This script sets up both HTTP and HTTPS
listeners with a self-signed certificate and enables the ``Basic``
authentication option on the service.
To use this script, run the following in PowerShell:
.. code-block:: powershell
$url = "https://raw.githubusercontent.com/ansible/ansible/devel/examples/scripts/ConfigureRemotingForAnsible.ps1"
$file = "$env:temp\ConfigureRemotingForAnsible.ps1"
(New-Object -TypeName System.Net.WebClient).DownloadFile($url, $file)
powershell.exe -ExecutionPolicy ByPass -File $file
There are different switches and parameters (like ``-EnableCredSSP`` and
``-ForceNewSSLCert``) that can be set alongside this script. The documentation
for these options are located at the top of the script itself.
.. Note:: The ConfigureRemotingForAnsible.ps1 script is intended for training and
development purposes only and should not be used in a
production environment, since it enables settings (like ``Basic`` authentication)
that can be inherently insecure.
WinRM Listener
--------------
The WinRM service listens for requests on one or more ports. Each of these ports must have a
listener created and configured.
To view the current listeners that are running on the WinRM service, run the
following command:
.. code-block:: powershell
winrm enumerate winrm/config/Listener
This will output something like::
Listener
Address = *
Transport = HTTP
Port = 5985
Hostname
Enabled = true
URLPrefix = wsman
CertificateThumbprint
ListeningOn = 10.0.2.15, 127.0.0.1, 192.168.56.155, ::1, fe80::5efe:10.0.2.15%6, fe80::5efe:192.168.56.155%8, fe80::
ffff:ffff:fffe%2, fe80::203d:7d97:c2ed:ec78%3, fe80::e8ea:d765:2c69:7756%7
Listener
Address = *
Transport = HTTPS
Port = 5986
Hostname = SERVER2016
Enabled = true
URLPrefix = wsman
CertificateThumbprint = E6CDAA82EEAF2ECE8546E05DB7F3E01AA47D76CE
ListeningOn = 10.0.2.15, 127.0.0.1, 192.168.56.155, ::1, fe80::5efe:10.0.2.15%6, fe80::5efe:192.168.56.155%8, fe80::
ffff:ffff:fffe%2, fe80::203d:7d97:c2ed:ec78%3, fe80::e8ea:d765:2c69:7756%7
In the example above there are two listeners activated; one is listening on
port 5985 over HTTP and the other is listening on port 5986 over HTTPS. Some of
the key options that are useful to understand are:
* ``Transport``: Whether the listener is run over HTTP or HTTPS. It is
  recommended to use a listener over HTTPS, as the data is then encrypted without
  any further changes required.
* ``Port``: The port the listener runs on, by default it is ``5985`` for HTTP
and ``5986`` for HTTPS. This port can be changed to whatever is required and
corresponds to the host var ``ansible_port``.
* ``URLPrefix``: The URL prefix to listen on, by default it is ``wsman``. If
this is changed, the host var ``ansible_winrm_path`` must be set to the same
value.
* ``CertificateThumbprint``: If running over an HTTPS listener, this is the
thumbprint of the certificate in the Windows Certificate Store that is used
in the connection. To get the details of the certificate itself, run this
command with the relevant certificate thumbprint in PowerShell::
$thumbprint = "E6CDAA82EEAF2ECE8546E05DB7F3E01AA47D76CE"
Get-ChildItem -Path cert:\LocalMachine\My -Recurse | Where-Object { $_.Thumbprint -eq $thumbprint } | Select-Object *
Setup WinRM Listener
++++++++++++++++++++
There are three ways to set up a WinRM listener:
* Using ``winrm quickconfig`` for HTTP or
``winrm quickconfig -transport:https`` for HTTPS. This is the easiest option
to use when running outside of a domain environment and a simple listener is
required. Unlike the other options, this process also has the added benefit of
opening up the Firewall for the ports required and starts the WinRM service.
* Using Group Policy Objects. This is the best way to create a listener when the
host is a member of a domain because the configuration is done automatically
without any user input. For more information on group policy objects, see the
`Group Policy Objects documentation <https://msdn.microsoft.com/en-us/library/aa374162(v=vs.85).aspx>`_.
* Using PowerShell to create the listener with a specific configuration. This
can be done by running the following PowerShell commands:
.. code-block:: powershell
$selector_set = @{
Address = "*"
Transport = "HTTPS"
}
$value_set = @{
CertificateThumbprint = "E6CDAA82EEAF2ECE8546E05DB7F3E01AA47D76CE"
}
New-WSManInstance -ResourceURI "winrm/config/Listener" -SelectorSet $selector_set -ValueSet $value_set
To see the other options with this PowerShell cmdlet, see
`New-WSManInstance <https://docs.microsoft.com/en-us/powershell/module/microsoft.wsman.management/new-wsmaninstance?view=powershell-5.1>`_.
.. Note:: When creating an HTTPS listener, a certificate needs to be
created and stored in the ``LocalMachine\My`` certificate store. Without a
certificate being present in this store, most commands will fail.
Delete WinRM Listener
+++++++++++++++++++++
To remove a WinRM listener::
# Remove all listeners
Remove-Item -Path WSMan:\localhost\Listener\* -Recurse -Force
# Only remove listeners that are run over HTTPS
Get-ChildItem -Path WSMan:\localhost\Listener | Where-Object { $_.Keys -contains "Transport=HTTPS" } | Remove-Item -Recurse -Force
.. Note:: The ``Keys`` object is an array of strings, so it can contain different
values. By default it contains a key for ``Transport=`` and ``Address=``
which correspond to the values from winrm enumerate winrm/config/Listeners.
WinRM Service Options
---------------------
There are a number of options that can be set to control the behavior of the WinRM service component,
including authentication options and memory settings.
To get an output of the current service configuration options, run the
following command:
.. code-block:: powershell
winrm get winrm/config/Service
winrm get winrm/config/Winrs
This will output something like::
Service
RootSDDL = O:NSG:BAD:P(A;;GA;;;BA)(A;;GR;;;IU)S:P(AU;FA;GA;;;WD)(AU;SA;GXGW;;;WD)
MaxConcurrentOperations = 4294967295
MaxConcurrentOperationsPerUser = 1500
EnumerationTimeoutms = 240000
MaxConnections = 300
MaxPacketRetrievalTimeSeconds = 120
AllowUnencrypted = false
Auth
Basic = true
Kerberos = true
Negotiate = true
Certificate = true
CredSSP = true
CbtHardeningLevel = Relaxed
DefaultPorts
HTTP = 5985
HTTPS = 5986
IPv4Filter = *
IPv6Filter = *
EnableCompatibilityHttpListener = false
EnableCompatibilityHttpsListener = false
CertificateThumbprint
AllowRemoteAccess = true
Winrs
AllowRemoteShellAccess = true
IdleTimeout = 7200000
MaxConcurrentUsers = 2147483647
MaxShellRunTime = 2147483647
MaxProcessesPerShell = 2147483647
MaxMemoryPerShellMB = 2147483647
MaxShellsPerUser = 2147483647
While many of these options should rarely be changed, a few can easily impact
the operations over WinRM and are useful to understand. Some of the important
options are:
* ``Service\AllowUnencrypted``: This option defines whether WinRM will allow
traffic that is run over HTTP without message encryption. Message level
encryption is only possible when ``ansible_winrm_transport`` is ``ntlm``,
``kerberos`` or ``credssp``. By default this is ``false`` and should only be
set to ``true`` when debugging WinRM messages.
* ``Service\Auth\*``: These flags define what authentication
options are allowed with the WinRM service. By default, ``Negotiate (NTLM)``
and ``Kerberos`` are enabled.
* ``Service\Auth\CbtHardeningLevel``: Specifies whether channel binding tokens are
not verified (None), verified but not required (Relaxed), or verified and
required (Strict). CBT is only used when connecting with NTLM or Kerberos
over HTTPS.
* ``Service\CertificateThumbprint``: This is the thumbprint of the certificate
used to encrypt the TLS channel used with CredSSP authentication. By default
this is empty; a self-signed certificate is generated when the WinRM service
starts and is used in the TLS process.
* ``Winrs\MaxShellRunTime``: This is the maximum time, in milliseconds, that a
remote command is allowed to execute.
* ``Winrs\MaxMemoryPerShellMB``: This is the maximum amount of memory allocated
per shell, including the shell's child processes.
To modify a setting under the ``Service`` key in PowerShell::
# substitute {path} with the path to the option after winrm/config/Service
Set-Item -Path WSMan:\localhost\Service\{path} -Value "value here"
# for example, to change Service\Auth\CbtHardeningLevel run
Set-Item -Path WSMan:\localhost\Service\Auth\CbtHardeningLevel -Value Strict
To modify a setting under the ``Winrs`` key in PowerShell::
# Substitute {path} with the path to the option after winrm/config/Winrs
Set-Item -Path WSMan:\localhost\Shell\{path} -Value "value here"
# For example, to change Winrs\MaxShellRunTime run
Set-Item -Path WSMan:\localhost\Shell\MaxShellRunTime -Value 2147483647
.. Note:: If running in a domain environment, some of these options are set by
GPO and cannot be changed on the host itself. When a key has been
configured with GPO, it contains the text ``[Source="GPO"]`` next to the value.
Common WinRM Issues
-------------------
Because WinRM has a wide range of configuration options, it can be difficult
to set up and configure. Because of this complexity, issues that are shown by Ansible
could in fact be issues with the host setup instead.
One easy way to determine whether a problem is a host issue is to
run the following command from another Windows host to connect to the
target Windows host::
# Test out HTTP
winrs -r:http://server:5985/wsman -u:Username -p:Password ipconfig
# Test out HTTPS (will fail if the cert is not verifiable)
winrs -r:https://server:5986/wsman -u:Username -p:Password -ssl ipconfig
# Test out HTTPS, ignoring certificate verification
$username = "Username"
$password = ConvertTo-SecureString -String "Password" -AsPlainText -Force
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $username, $password
$session_option = New-PSSessionOption -SkipCACheck -SkipCNCheck -SkipRevocationCheck
Invoke-Command -ComputerName server -UseSSL -ScriptBlock { ipconfig } -Credential $cred -SessionOption $session_option
If this fails, the issue is probably related to the WinRM setup. If it works, the issue may not be related to the WinRM setup; please continue reading for more troubleshooting suggestions.
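You can also run a similar check from the Ansible control node with the ``pywinrm``
library that Ansible uses for the ``winrm`` connection plugin. This is a minimal
sketch; the hostname, credentials, transport, and certificate handling below are
placeholders and should be replaced with values that match your inventory:
.. code-block:: python
    import winrm
    # Placeholders - substitute your own host, credentials and transport
    session = winrm.Session(
        'https://server:5986/wsman',
        auth=('Username', 'Password'),
        transport='ntlm',
        server_cert_validation='ignore',  # only for testing self-signed certificates
    )
    result = session.run_cmd('ipconfig')
    print(result.status_code)
    print(result.std_out.decode())
If the ``winrs`` test succeeds but the ``pywinrm`` check fails, compare the
authentication transport, port, and URL prefix with the listener settings shown
earlier.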
HTTP 401/Credentials Rejected
+++++++++++++++++++++++++++++
An HTTP 401 error indicates the authentication process failed during the initial
connection. Some things to check for this are:
* Verify that the credentials are correct and set properly in your inventory with
``ansible_user`` and ``ansible_password``
* Ensure that the user is a member of the local Administrators group or has been explicitly
granted access (a connection test with the ``winrs`` command can be used to
rule this out).
* Make sure that the authentication option set by ``ansible_winrm_transport`` is enabled under
``Service\Auth\*``
* If running over HTTP and not HTTPS, use ``ntlm``, ``kerberos`` or ``credssp``
with ``ansible_winrm_message_encryption: auto`` to enable message encryption.
If using another authentication option or if the installed pywinrm version cannot be
upgraded, the ``Service\AllowUnencrypted`` can be set to ``true`` but this is
only recommended for troubleshooting
* Ensure the downstream packages ``pywinrm``, ``requests-ntlm``,
``requests-kerberos``, and/or ``requests-credssp`` are up to date using ``pip``.
* If using Kerberos authentication, ensure that ``Service\Auth\CbtHardeningLevel`` is
not set to ``Strict``.
* When using Basic or Certificate authentication, make sure that the user is a local account and
not a domain account. Domain accounts do not work with Basic and Certificate
authentication.
HTTP 500 Error
++++++++++++++
These indicate an error has occurred with the WinRM service. Some things
to check for include:
* Verify that the number of currently open shells has not exceeded
  ``WinRsMaxShellsPerUser`` and that none of the other Winrs quotas have been
  exceeded.
Timeout Errors
+++++++++++++++
These usually indicate an error with the network connection where
Ansible is unable to reach the host. Some things to check for include:
* Make sure the firewall is not set to block the configured WinRM listener ports
* Ensure that a WinRM listener is enabled on the port and path set by the host vars
* Ensure that the ``winrm`` service is running on the Windows host and configured for
automatic start
Connection Refused Errors
+++++++++++++++++++++++++
These usually indicate an error when trying to communicate with the
WinRM service on the host. Some things to check for:
* Ensure that the WinRM service is up and running on the host. Use
``(Get-Service -Name winrm).Status`` to get the status of the service.
* Check that the host firewall is allowing traffic over the WinRM port. By default
this is ``5985`` for HTTP and ``5986`` for HTTPS.
Sometimes an installer may restart the WinRM or HTTP service and cause this error. The
best way to deal with this is to use ``win_psexec`` from another
Windows host.
Windows SSH Setup
`````````````````
Ansible 2.8 has added an experimental SSH connection for Windows managed nodes.
.. warning::
Use this feature at your own risk!
    Using SSH with Windows is experimental; the implementation may make
    backwards-incompatible changes in feature releases. The server-side
components can be unreliable depending on the version that is installed.
Installing Win32-OpenSSH
------------------------
The first step to using SSH with Windows is to install the `Win32-OpenSSH <https://github.com/PowerShell/Win32-OpenSSH>`_
service on the Windows host. Microsoft offers a way to install ``Win32-OpenSSH`` through a Windows
capability but currently the version that is installed through this process is
too old to work with Ansible. To install ``Win32-OpenSSH`` for use with
Ansible, select one of these three installation options:
* Manually install the service, following the `install instructions <https://github.com/PowerShell/Win32-OpenSSH/wiki/Install-Win32-OpenSSH>`_
from Microsoft.
* Use ``win_chocolatey`` to install the service::
- name: install the Win32-OpenSSH service
win_chocolatey:
name: openssh
package_params: /SSHServerFeature
state: present
* Use an existing Ansible Galaxy role like `jborean93.win_openssh <https://galaxy.ansible.com/jborean93/win_openssh>`_::
# Make sure the role has been downloaded first
ansible-galaxy install jborean93.win_openssh
# main.yml
- name: install Win32-OpenSSH service
hosts: windows
gather_facts: no
roles:
- role: jborean93.win_openssh
opt_openssh_setup_service: True
.. note:: ``Win32-OpenSSH`` is still a beta product and is constantly
being updated to include new features and bugfixes. If you are using SSH as
   a connection option for Windows, it is highly recommended that you install the
   latest release using one of the three methods above.
Configuring the Win32-OpenSSH shell
-----------------------------------
By default ``Win32-OpenSSH`` will use ``cmd.exe`` as a shell. To configure a
different shell, use an Ansible task to define the registry setting::
- name: set the default shell to PowerShell
win_regedit:
path: HKLM:\SOFTWARE\OpenSSH
name: DefaultShell
data: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
type: string
state: present
# Or revert the settings back to the default, cmd
- name: set the default shell to cmd
win_regedit:
path: HKLM:\SOFTWARE\OpenSSH
name: DefaultShell
state: absent
Win32-OpenSSH Authentication
----------------------------
Win32-OpenSSH authentication with Windows is similar to SSH
authentication on Unix/Linux hosts. You can use a plaintext password or
SSH public key authentication, add public keys to an ``authorized_key`` file
in the ``.ssh`` folder of the user's profile directory, and configure the
service using the ``sshd_config`` file used by the SSH service as you would on
a Unix/Linux host.
When using SSH key authentication with Ansible, the remote session won't have access to the
user's credentials and will fail when attempting to access a network resource.
This is also known as the double-hop or credential delegation issue. There are
two ways to work around this issue:
* Use plaintext password auth by setting ``ansible_password``
* Use ``become`` on the task with the credentials of the user that needs access to the remote resource
Configuring Ansible for SSH on Windows
--------------------------------------
To configure Ansible to use SSH for Windows hosts, you must set two connection variables:
* set ``ansible_connection`` to ``ssh``
* set ``ansible_shell_type`` to ``cmd`` or ``powershell``
The ``ansible_shell_type`` variable should reflect the ``DefaultShell``
configured on the Windows host. Set to ``cmd`` for the default shell or set to
``powershell`` if the ``DefaultShell`` has been changed to PowerShell.
Known issues with SSH on Windows
--------------------------------
Using SSH with Windows is experimental, and we expect to uncover more issues.
Here are the known ones:
* Win32-OpenSSH versions older than ``v7.9.0.0p1-Beta`` do not work when ``powershell`` is the shell type
* While SCP should work, SFTP is the recommended SSH file transfer mechanism to use when copying or fetching a file
.. seealso::
:ref:`about_playbooks`
An introduction to playbooks
:ref:`playbooks_best_practices`
Best practices advice
:ref:`List of Windows Modules <windows_modules>`
Windows specific module list, all implemented in PowerShell
`User Mailing List <https://groups.google.com/group/ansible-project>`_
Have a question? Stop by the google group!
`irc.freenode.net <http://irc.freenode.net>`_
#ansible IRC chat channel
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,495 |
ansible-test: strange failures on CI when adding new module_utils directory
|
##### SUMMARY
In ansible-collections/community.general#49 I am adding a new module_utils directory and a Python module inside it, and at the same time using that new module_utils in some Ansible modules.
Example commit: 7341b4a1e48bf6e652dd5f70c5a5a5808d73de7e
CI run: https://app.shippable.com/github/ansible-collections/community.general/runs/172/summary/console
All CI nodes (at least the ones I randomly checked) fail with the same error:
```
00:45 Detected changes in 4 file(s).
00:45 plugins/module_utils/compat/__init__.py
00:45 plugins/module_utils/compat/ipaddress.py
00:45 plugins/modules/cloud/scaleway/scaleway_security_group_rule.py
00:45 plugins/modules/net_tools/hetzner_firewall.py
00:45 Analyzing python module_utils imports...
00:59 WARNING: No imports found which use the "ansible_collections.community.general.plugins.module_utils.f5_utils" module_util.
00:59 WARNING: No imports found which use the "ansible_collections.community.general.plugins.module_utils.network.aos.aos" module_util.
00:59 WARNING: No imports found which use the "ansible_collections.community.general.plugins.module_utils.network.f5.iworkflow" module_util.
00:59 WARNING: No imports found which use the "ansible_collections.community.general.plugins.module_utils.network.panos.panos" module_util.
00:59 Processed 116 python module_utils in 13 second(s).
00:59 Traceback (most recent call last):
00:59 File "/root/venv/bin/ansible-test", line 28, in <module>
00:59 main()
00:59 File "/root/venv/bin/ansible-test", line 24, in main
00:59 cli_main()
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/cli.py", line 173, in main
00:59 args.func(config)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/sanity/__init__.py", line 85, in command_sanity
00:59 changes = get_changes_filter(args)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/executor.py", line 1493, in get_changes_filter
00:59 changes = categorize_changes(args, paths, args.command)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 89, in categorize_changes
00:59 dependent_paths = mapper.get_dependent_paths(path)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 236, in get_dependent_paths
00:59 unprocessed_paths = set(self.get_dependent_paths_non_recursive(path))
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 258, in get_dependent_paths_non_recursive
00:59 paths = self.get_dependent_paths_internal(path)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 273, in get_dependent_paths_internal
00:59 return self.get_python_module_utils_usage(path)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 306, in get_python_module_utils_usage
00:59 return sorted(self.python_module_utils_imports[name])
00:59 KeyError: u'ansible_collections.community.general.plugins.module_utils.compat'
```
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible-test
##### ANSIBLE VERSION
```paste below
devel
```
|
https://github.com/ansible/ansible/issues/68495
|
https://github.com/ansible/ansible/pull/68519
|
d8b5c11a638737a855b8c59aa5c5202ef2807cc5
|
53a3d1ffdb14a7f367945606d6ca240d47fe5e04
| 2020-03-26T16:45:59Z |
python
| 2020-03-27T22:56:02Z |
changelogs/fragments/ansible-test-change-detection-empty-python.yml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,495 |
ansible-test: strange failures on CI when adding new module_utils directory
|
##### SUMMARY
In ansible-collections/community.general#49 I am adding a new module_utils directory and a Python module inside it, and at the same time using that new module_utils in some Ansible modules.
Example commit: 7341b4a1e48bf6e652dd5f70c5a5a5808d73de7e
CI run: https://app.shippable.com/github/ansible-collections/community.general/runs/172/summary/console
All CI nodes (at least the ones I randomly checked) fail with the same error:
```
00:45 Detected changes in 4 file(s).
00:45 plugins/module_utils/compat/__init__.py
00:45 plugins/module_utils/compat/ipaddress.py
00:45 plugins/modules/cloud/scaleway/scaleway_security_group_rule.py
00:45 plugins/modules/net_tools/hetzner_firewall.py
00:45 Analyzing python module_utils imports...
00:59 WARNING: No imports found which use the "ansible_collections.community.general.plugins.module_utils.f5_utils" module_util.
00:59 WARNING: No imports found which use the "ansible_collections.community.general.plugins.module_utils.network.aos.aos" module_util.
00:59 WARNING: No imports found which use the "ansible_collections.community.general.plugins.module_utils.network.f5.iworkflow" module_util.
00:59 WARNING: No imports found which use the "ansible_collections.community.general.plugins.module_utils.network.panos.panos" module_util.
00:59 Processed 116 python module_utils in 13 second(s).
00:59 Traceback (most recent call last):
00:59 File "/root/venv/bin/ansible-test", line 28, in <module>
00:59 main()
00:59 File "/root/venv/bin/ansible-test", line 24, in main
00:59 cli_main()
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/cli.py", line 173, in main
00:59 args.func(config)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/sanity/__init__.py", line 85, in command_sanity
00:59 changes = get_changes_filter(args)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/executor.py", line 1493, in get_changes_filter
00:59 changes = categorize_changes(args, paths, args.command)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 89, in categorize_changes
00:59 dependent_paths = mapper.get_dependent_paths(path)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 236, in get_dependent_paths
00:59 unprocessed_paths = set(self.get_dependent_paths_non_recursive(path))
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 258, in get_dependent_paths_non_recursive
00:59 paths = self.get_dependent_paths_internal(path)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 273, in get_dependent_paths_internal
00:59 return self.get_python_module_utils_usage(path)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 306, in get_python_module_utils_usage
00:59 return sorted(self.python_module_utils_imports[name])
00:59 KeyError: u'ansible_collections.community.general.plugins.module_utils.compat'
```
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible-test
##### ANSIBLE VERSION
```paste below
devel
```
|
https://github.com/ansible/ansible/issues/68495
|
https://github.com/ansible/ansible/pull/68519
|
d8b5c11a638737a855b8c59aa5c5202ef2807cc5
|
53a3d1ffdb14a7f367945606d6ca240d47fe5e04
| 2020-03-26T16:45:59Z |
python
| 2020-03-27T22:56:02Z |
test/lib/ansible_test/_internal/classification.py
|
"""Classify changes in Ansible code."""
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import collections
import os
import re
import time
from . import types as t
from .target import (
walk_module_targets,
walk_integration_targets,
walk_units_targets,
walk_compile_targets,
walk_sanity_targets,
load_integration_prefixes,
analyze_integration_target_dependencies,
)
from .util import (
display,
is_subdir,
)
from .import_analysis import (
get_python_module_utils_imports,
get_python_module_utils_name,
)
from .csharp_import_analysis import (
get_csharp_module_utils_imports,
get_csharp_module_utils_name,
)
from .powershell_import_analysis import (
get_powershell_module_utils_imports,
get_powershell_module_utils_name,
)
from .config import (
TestConfig,
IntegrationConfig,
)
from .metadata import (
ChangeDescription,
)
from .data import (
data_context,
)
FOCUSED_TARGET = '__focused__'
def categorize_changes(args, paths, verbose_command=None):
"""
:type args: TestConfig
:type paths: list[str]
:type verbose_command: str
:rtype: ChangeDescription
"""
mapper = PathMapper(args)
commands = {
'sanity': set(),
'units': set(),
'integration': set(),
'windows-integration': set(),
'network-integration': set(),
}
focused_commands = collections.defaultdict(set)
deleted_paths = set()
original_paths = set()
additional_paths = set()
no_integration_paths = set()
for path in paths:
if not os.path.exists(path):
deleted_paths.add(path)
continue
original_paths.add(path)
dependent_paths = mapper.get_dependent_paths(path)
if not dependent_paths:
continue
display.info('Expanded "%s" to %d dependent file(s):' % (path, len(dependent_paths)), verbosity=2)
for dependent_path in dependent_paths:
display.info(dependent_path, verbosity=2)
additional_paths.add(dependent_path)
additional_paths -= set(paths) # don't count changed paths as additional paths
if additional_paths:
display.info('Expanded %d changed file(s) into %d additional dependent file(s).' % (len(paths), len(additional_paths)))
paths = sorted(set(paths) | additional_paths)
display.info('Mapping %d changed file(s) to tests.' % len(paths))
none_count = 0
for path in paths:
tests = mapper.classify(path)
if tests is None:
focused_target = False
display.info('%s -> all' % path, verbosity=1)
tests = all_tests(args) # not categorized, run all tests
display.warning('Path not categorized: %s' % path)
else:
focused_target = tests.pop(FOCUSED_TARGET, False) and path in original_paths
tests = dict((key, value) for key, value in tests.items() if value)
if focused_target and not any('integration' in command for command in tests):
no_integration_paths.add(path) # path triggers no integration tests
if verbose_command:
result = '%s: %s' % (verbose_command, tests.get(verbose_command) or 'none')
# identify targeted integration tests (those which only target a single integration command)
if 'integration' in verbose_command and tests.get(verbose_command):
if not any('integration' in command for command in tests if command != verbose_command):
if focused_target:
result += ' (focused)'
result += ' (targeted)'
else:
result = '%s' % tests
if not tests.get(verbose_command):
# minimize excessive output from potentially thousands of files which do not trigger tests
none_count += 1
verbosity = 2
else:
verbosity = 1
if args.verbosity >= verbosity:
display.info('%s -> %s' % (path, result), verbosity=1)
for command, target in tests.items():
commands[command].add(target)
if focused_target:
focused_commands[command].add(target)
if none_count > 0 and args.verbosity < 2:
display.notice('Omitted %d file(s) that triggered no tests.' % none_count)
for command in commands:
commands[command].discard('none')
if any(target == 'all' for target in commands[command]):
commands[command] = set(['all'])
commands = dict((c, sorted(commands[c])) for c in commands if commands[c])
focused_commands = dict((c, sorted(focused_commands[c])) for c in focused_commands)
for command in commands:
if commands[command] == ['all']:
commands[command] = [] # changes require testing all targets, do not filter targets
changes = ChangeDescription()
changes.command = verbose_command
changes.changed_paths = sorted(original_paths)
changes.deleted_paths = sorted(deleted_paths)
changes.regular_command_targets = commands
changes.focused_command_targets = focused_commands
changes.no_integration_paths = sorted(no_integration_paths)
return changes
class PathMapper:
"""Map file paths to test commands and targets."""
def __init__(self, args):
"""
:type args: TestConfig
"""
self.args = args
self.integration_all_target = get_integration_all_target(self.args)
self.integration_targets = list(walk_integration_targets())
self.module_targets = list(walk_module_targets())
self.compile_targets = list(walk_compile_targets())
self.units_targets = list(walk_units_targets())
self.sanity_targets = list(walk_sanity_targets())
self.powershell_targets = [target for target in self.sanity_targets if os.path.splitext(target.path)[1] in ('.ps1', '.psm1')]
self.csharp_targets = [target for target in self.sanity_targets if os.path.splitext(target.path)[1] == '.cs']
self.units_modules = set(target.module for target in self.units_targets if target.module)
self.units_paths = set(a for target in self.units_targets for a in target.aliases)
self.sanity_paths = set(target.path for target in self.sanity_targets)
self.module_names_by_path = dict((target.path, target.module) for target in self.module_targets)
self.integration_targets_by_name = dict((target.name, target) for target in self.integration_targets)
self.integration_targets_by_alias = dict((a, target) for target in self.integration_targets for a in target.aliases)
self.posix_integration_by_module = dict((m, target.name) for target in self.integration_targets
if 'posix/' in target.aliases for m in target.modules)
self.windows_integration_by_module = dict((m, target.name) for target in self.integration_targets
if 'windows/' in target.aliases for m in target.modules)
self.network_integration_by_module = dict((m, target.name) for target in self.integration_targets
if 'network/' in target.aliases for m in target.modules)
self.prefixes = load_integration_prefixes()
self.integration_dependencies = analyze_integration_target_dependencies(self.integration_targets)
self.python_module_utils_imports = {} # populated on first use to reduce overhead when not needed
self.powershell_module_utils_imports = {} # populated on first use to reduce overhead when not needed
self.csharp_module_utils_imports = {} # populated on first use to reduce overhead when not needed
self.paths_to_dependent_targets = {}
for target in self.integration_targets:
for path in target.needs_file:
if path not in self.paths_to_dependent_targets:
self.paths_to_dependent_targets[path] = set()
self.paths_to_dependent_targets[path].add(target)
def get_dependent_paths(self, path):
"""
:type path: str
:rtype: list[str]
"""
unprocessed_paths = set(self.get_dependent_paths_non_recursive(path))
paths = set()
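        # Breadth-first expansion: keep resolving dependents of dependents until no new paths are found.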
while unprocessed_paths:
queued_paths = list(unprocessed_paths)
paths |= unprocessed_paths
unprocessed_paths = set()
for queued_path in queued_paths:
new_paths = self.get_dependent_paths_non_recursive(queued_path)
for new_path in new_paths:
if new_path not in paths:
unprocessed_paths.add(new_path)
return sorted(paths)
def get_dependent_paths_non_recursive(self, path):
"""
:type path: str
:rtype: list[str]
"""
paths = self.get_dependent_paths_internal(path)
paths += [target.path + '/' for target in self.paths_to_dependent_targets.get(path, set())]
paths = sorted(set(paths))
return paths
def get_dependent_paths_internal(self, path):
"""
:type path: str
:rtype: list[str]
"""
ext = os.path.splitext(os.path.split(path)[1])[1]
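        # Dispatch on location and extension: module_utils files expand to the files that import them, integration targets to their dependent targets.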
if is_subdir(path, data_context().content.module_utils_path):
if ext == '.py':
return self.get_python_module_utils_usage(path)
if ext == '.psm1':
return self.get_powershell_module_utils_usage(path)
if ext == '.cs':
return self.get_csharp_module_utils_usage(path)
if is_subdir(path, data_context().content.integration_targets_path):
return self.get_integration_target_usage(path)
return []
def get_python_module_utils_usage(self, path):
"""
:type path: str
:rtype: list[str]
"""
if path == 'lib/ansible/module_utils/__init__.py':
return []
if path == 'plugins/module_utils/__init__.py':
return []
if not self.python_module_utils_imports:
display.info('Analyzing python module_utils imports...')
before = time.time()
self.python_module_utils_imports = get_python_module_utils_imports(self.compile_targets)
after = time.time()
display.info('Processed %d python module_utils in %d second(s).' % (len(self.python_module_utils_imports), after - before))
name = get_python_module_utils_name(path)
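# the derived name is used directly as a dict key below; get_python_module_utils_imports() is expected to have produced an entry for it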
return sorted(self.python_module_utils_imports[name])
def get_powershell_module_utils_usage(self, path):
"""
:type path: str
:rtype: list[str]
"""
if not self.powershell_module_utils_imports:
display.info('Analyzing powershell module_utils imports...')
before = time.time()
self.powershell_module_utils_imports = get_powershell_module_utils_imports(self.powershell_targets)
after = time.time()
display.info('Processed %d powershell module_utils in %d second(s).' % (len(self.powershell_module_utils_imports), after - before))
name = get_powershell_module_utils_name(path)
return sorted(self.powershell_module_utils_imports[name])
def get_csharp_module_utils_usage(self, path):
"""
:type path: str
:rtype: list[str]
"""
if not self.csharp_module_utils_imports:
display.info('Analyzing C# module_utils imports...')
before = time.time()
self.csharp_module_utils_imports = get_csharp_module_utils_imports(self.powershell_targets, self.csharp_targets)
after = time.time()
display.info('Processed %d C# module_utils in %d second(s).' % (len(self.csharp_module_utils_imports), after - before))
name = get_csharp_module_utils_name(path)
return sorted(self.csharp_module_utils_imports[name])
def get_integration_target_usage(self, path):
"""
:type path: str
:rtype: list[str]
"""
target_name = path.split('/')[3]
dependents = [os.path.join(data_context().content.integration_targets_path, target) + os.path.sep
for target in sorted(self.integration_dependencies.get(target_name, set()))]
return dependents
def classify(self, path):
"""
:type path: str
:rtype: dict[str, str] | None
"""
result = self._classify(path)
# run all tests when no result given
if result is None:
return None
# run sanity on path unless result specified otherwise
if path in self.sanity_paths and 'sanity' not in result:
result['sanity'] = path
return result
def _classify(self, path): # type: (str) -> t.Optional[t.Dict[str, str]]
"""Return the classification for the given path."""
if data_context().content.is_ansible:
return self._classify_ansible(path)
if data_context().content.collection:
return self._classify_collection(path)
return None
def _classify_common(self, path): # type: (str) -> t.Optional[t.Dict[str, str]]
"""Return the classification for the given path using rules common to all layouts."""
dirname = os.path.dirname(path)
filename = os.path.basename(path)
name, ext = os.path.splitext(filename)
minimal = {}
if is_subdir(path, '.github'):
return minimal
if is_subdir(path, data_context().content.integration_targets_path):
if not os.path.exists(path):
return minimal
target = self.integration_targets_by_name.get(path.split('/')[3])
if not target:
display.warning('Unexpected non-target found: %s' % path)
return minimal
if 'hidden/' in target.aliases:
return minimal # already expanded using get_dependent_paths
return {
'integration': target.name if 'posix/' in target.aliases else None,
'windows-integration': target.name if 'windows/' in target.aliases else None,
'network-integration': target.name if 'network/' in target.aliases else None,
FOCUSED_TARGET: True,
}
if is_subdir(path, data_context().content.integration_path):
if dirname == data_context().content.integration_path:
for command in (
'integration',
'windows-integration',
'network-integration',
):
if name == command and ext == '.cfg':
return {
command: self.integration_all_target,
}
if name == command + '.requirements' and ext == '.txt':
return {
command: self.integration_all_target,
}
return {
'integration': self.integration_all_target,
'windows-integration': self.integration_all_target,
'network-integration': self.integration_all_target,
}
if is_subdir(path, data_context().content.sanity_path):
return {
'sanity': 'all', # test infrastructure, run all sanity checks
}
if is_subdir(path, data_context().content.unit_path):
if path in self.units_paths:
return {
'units': path,
}
# changes to files which are not unit tests should trigger tests from the nearest parent directory
test_path = os.path.dirname(path)
while test_path:
if test_path + '/' in self.units_paths:
return {
'units': test_path + '/',
}
test_path = os.path.dirname(test_path)
if is_subdir(path, data_context().content.module_path):
module_name = self.module_names_by_path.get(path)
if module_name:
return {
'units': module_name if module_name in self.units_modules else None,
'integration': self.posix_integration_by_module.get(module_name) if ext == '.py' else None,
'windows-integration': self.windows_integration_by_module.get(module_name) if ext in ['.cs', '.ps1'] else None,
'network-integration': self.network_integration_by_module.get(module_name),
FOCUSED_TARGET: True,
}
return minimal
if is_subdir(path, data_context().content.module_utils_path):
if ext == '.cs':
return minimal # already expanded using get_dependent_paths
if ext == '.psm1':
return minimal # already expanded using get_dependent_paths
if ext == '.py':
return minimal # already expanded using get_dependent_paths
if is_subdir(path, data_context().content.plugin_paths['action']):
if ext == '.py':
if name.startswith('net_'):
network_target = 'network/.*_%s' % name[4:]
if any(re.search(r'^%s$' % network_target, alias) for alias in self.integration_targets_by_alias):
return {
'network-integration': network_target,
'units': 'all',
}
return {
'network-integration': self.integration_all_target,
'units': 'all',
}
if self.prefixes.get(name) == 'network':
network_platform = name
elif name.endswith('_config') and self.prefixes.get(name[:-7]) == 'network':
network_platform = name[:-7]
elif name.endswith('_template') and self.prefixes.get(name[:-9]) == 'network':
network_platform = name[:-9]
else:
network_platform = None
if network_platform:
network_target = 'network/%s/' % network_platform
if network_target in self.integration_targets_by_alias:
return {
'network-integration': network_target,
'units': 'all',
}
display.warning('Integration tests for "%s" not found.' % network_target, unique=True)
return {
'units': 'all',
}
if is_subdir(path, data_context().content.plugin_paths['connection']):
if name == '__init__':
return {
'integration': self.integration_all_target,
'windows-integration': self.integration_all_target,
'network-integration': self.integration_all_target,
'units': 'test/units/plugins/connection/',
}
units_path = 'test/units/plugins/connection/test_%s.py' % name
if units_path not in self.units_paths:
units_path = None
integration_name = 'connection_%s' % name
if integration_name not in self.integration_targets_by_name:
integration_name = None
windows_integration_name = 'connection_windows_%s' % name
if windows_integration_name not in self.integration_targets_by_name:
windows_integration_name = None
# entire integration test commands depend on these connection plugins
if name in ['winrm', 'psrp']:
return {
'windows-integration': self.integration_all_target,
'units': units_path,
}
if name == 'local':
return {
'integration': self.integration_all_target,
'network-integration': self.integration_all_target,
'units': units_path,
}
if name == 'network_cli':
return {
'network-integration': self.integration_all_target,
'units': units_path,
}
if name == 'paramiko_ssh':
return {
'integration': integration_name,
'network-integration': self.integration_all_target,
'units': units_path,
}
# other connection plugins have isolated integration and unit tests
return {
'integration': integration_name,
'windows-integration': windows_integration_name,
'units': units_path,
}
if is_subdir(path, data_context().content.plugin_paths['doc_fragments']):
return {
'sanity': 'all',
}
if is_subdir(path, data_context().content.plugin_paths['inventory']):
if name == '__init__':
return all_tests(self.args) # broad impact, run all tests
# These inventory plugins are enabled by default (see INVENTORY_ENABLED).
# Without dedicated integration tests for these we must rely on the incidental coverage from other tests.
test_all = [
'host_list',
'script',
'yaml',
'ini',
'auto',
]
if name in test_all:
posix_integration_fallback = get_integration_all_target(self.args)
else:
posix_integration_fallback = None
target = self.integration_targets_by_name.get('inventory_%s' % name)
units_path = 'test/units/plugins/inventory/test_%s.py' % name
if units_path not in self.units_paths:
units_path = None
return {
'integration': target.name if target and 'posix/' in target.aliases else posix_integration_fallback,
'windows-integration': target.name if target and 'windows/' in target.aliases else None,
'network-integration': target.name if target and 'network/' in target.aliases else None,
'units': units_path,
FOCUSED_TARGET: target is not None,
}
if (is_subdir(path, data_context().content.plugin_paths['terminal']) or
is_subdir(path, data_context().content.plugin_paths['cliconf']) or
is_subdir(path, data_context().content.plugin_paths['netconf'])):
if ext == '.py':
if name in self.prefixes and self.prefixes[name] == 'network':
network_target = 'network/%s/' % name
if network_target in self.integration_targets_by_alias:
return {
'network-integration': network_target,
'units': 'all',
}
display.warning('Integration tests for "%s" not found.' % network_target, unique=True)
return {
'units': 'all',
}
return {
'network-integration': self.integration_all_target,
'units': 'all',
}
return None
def _classify_collection(self, path): # type: (str) -> t.Optional[t.Dict[str, str]]
"""Return the classification for the given path using rules specific to collections."""
result = self._classify_common(path)
if result is not None:
return result
return None
def _classify_ansible(self, path): # type: (str) -> t.Optional[t.Dict[str, str]]
"""Return the classification for the given path using rules specific to Ansible."""
if path.startswith('test/units/compat/'):
return {
'units': 'test/units/',
}
result = self._classify_common(path)
if result is not None:
return result
dirname = os.path.dirname(path)
filename = os.path.basename(path)
name, ext = os.path.splitext(filename)
minimal = {}
if path.startswith('bin/'):
return all_tests(self.args) # broad impact, run all tests
if path.startswith('changelogs/'):
return minimal
if path.startswith('contrib/'):
return {
'units': 'test/units/contrib/'
}
if path.startswith('docs/'):
return minimal
if path.startswith('examples/'):
if path == 'examples/scripts/ConfigureRemotingForAnsible.ps1':
return {
'windows-integration': 'connection_winrm',
}
return minimal
if path.startswith('hacking/'):
return minimal
if path.startswith('lib/ansible/executor/powershell/'):
units_path = 'test/units/executor/powershell/'
if units_path not in self.units_paths:
units_path = None
return {
'windows-integration': self.integration_all_target,
'units': units_path,
}
if path.startswith('lib/ansible/'):
return all_tests(self.args) # broad impact, run all tests
if path.startswith('licenses/'):
return minimal
if path.startswith('packaging/'):
if path.startswith('packaging/requirements/'):
if name.startswith('requirements-') and ext == '.txt':
component = name.split('-', 1)[1]
candidates = (
'cloud/%s/' % component,
)
for candidate in candidates:
if candidate in self.integration_targets_by_alias:
return {
'integration': candidate,
}
return all_tests(self.args) # broad impact, run all tests
return minimal
if path.startswith('test/ansible_test/'):
return minimal # these tests are not invoked from ansible-test
if path.startswith('test/lib/ansible_test/config/'):
if name.startswith('cloud-config-'):
# noinspection PyTypeChecker
cloud_target = 'cloud/%s/' % name.split('-')[2].split('.')[0]
if cloud_target in self.integration_targets_by_alias:
return {
'integration': cloud_target,
}
if path.startswith('test/lib/ansible_test/_data/completion/'):
if path == 'test/lib/ansible_test/_data/completion/docker.txt':
return all_tests(self.args, force=True) # force all tests due to risk of breaking changes in new test environment
if path.startswith('test/lib/ansible_test/_internal/cloud/'):
cloud_target = 'cloud/%s/' % name
if cloud_target in self.integration_targets_by_alias:
return {
'integration': cloud_target,
}
return all_tests(self.args) # test infrastructure, run all tests
if path.startswith('test/lib/ansible_test/_internal/sanity/'):
return {
'sanity': 'all', # test infrastructure, run all sanity checks
}
if path.startswith('test/lib/ansible_test/_data/sanity/'):
return {
'sanity': 'all', # test infrastructure, run all sanity checks
}
if path.startswith('test/lib/ansible_test/_internal/units/'):
return {
'units': 'all', # test infrastructure, run all unit tests
}
if path.startswith('test/lib/ansible_test/_data/units/'):
return {
'units': 'all', # test infrastructure, run all unit tests
}
if path.startswith('test/lib/ansible_test/_data/pytest/'):
return {
'units': 'all', # test infrastructure, run all unit tests
}
if path.startswith('test/lib/ansible_test/_data/requirements/'):
if name in (
'integration',
'network-integration',
'windows-integration',
):
return {
name: self.integration_all_target,
}
if name in (
'sanity',
'units',
):
return {
name: 'all',
}
if name.startswith('integration.cloud.'):
cloud_target = 'cloud/%s/' % name.split('.')[2]
if cloud_target in self.integration_targets_by_alias:
return {
'integration': cloud_target,
}
if path.startswith('test/lib/'):
return all_tests(self.args) # test infrastructure, run all tests
if path.startswith('test/support/'):
return all_tests(self.args) # test infrastructure, run all tests
if path.startswith('test/utils/shippable/'):
if dirname == 'test/utils/shippable':
test_map = {
'cloud.sh': 'integration:cloud/',
'linux.sh': 'integration:all',
'network.sh': 'network-integration:all',
'remote.sh': 'integration:all',
'sanity.sh': 'sanity:all',
'units.sh': 'units:all',
'windows.sh': 'windows-integration:all',
}
test_match = test_map.get(filename)
if test_match:
test_command, test_target = test_match.split(':')
return {
test_command: test_target,
}
cloud_target = 'cloud/%s/' % name
if cloud_target in self.integration_targets_by_alias:
return {
'integration': cloud_target,
}
return all_tests(self.args) # test infrastructure, run all tests
if path.startswith('test/utils/'):
return minimal
if '/' not in path:
if path in (
'.gitattributes',
'.gitignore',
'.mailmap',
'COPYING',
'Makefile',
):
return minimal
if path in (
'setup.py',
'shippable.yml',
):
return all_tests(self.args) # broad impact, run all tests
if ext in (
'.in',
'.md',
'.rst',
'.toml',
'.txt',
):
return minimal
return None # unknown, will result in fall-back to run all tests
def all_tests(args, force=False):
"""
:type args: TestConfig
:type force: bool
:rtype: dict[str, str]
"""
if force:
integration_all_target = 'all'
else:
integration_all_target = get_integration_all_target(args)
return {
'sanity': 'all',
'units': 'all',
'integration': integration_all_target,
'windows-integration': integration_all_target,
'network-integration': integration_all_target,
}
def get_integration_all_target(args):
"""
:type args: TestConfig
:rtype: str
"""
if isinstance(args, IntegrationConfig):
return args.changed_all_target
return 'all'
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,495 |
ansible-test: strange failures on CI when adding new module_utils directory
|
##### SUMMARY
In ansible-collections/community.general#49 I am adding a new module_utils directory, a Python module in there, and at the same time using that new module_utils in some Ansible modules.
Example commit: 7341b4a1e48bf6e652dd5f70c5a5a5808d73de7e
CI run: https://app.shippable.com/github/ansible-collections/community.general/runs/172/summary/console
All CI nodes (at least the ones I randomly checked) fail with the same error:
```
00:45 Detected changes in 4 file(s).
00:45 plugins/module_utils/compat/__init__.py
00:45 plugins/module_utils/compat/ipaddress.py
00:45 plugins/modules/cloud/scaleway/scaleway_security_group_rule.py
00:45 plugins/modules/net_tools/hetzner_firewall.py
00:45 Analyzing python module_utils imports...
00:59 WARNING: No imports found which use the "ansible_collections.community.general.plugins.module_utils.f5_utils" module_util.
00:59 WARNING: No imports found which use the "ansible_collections.community.general.plugins.module_utils.network.aos.aos" module_util.
00:59 WARNING: No imports found which use the "ansible_collections.community.general.plugins.module_utils.network.f5.iworkflow" module_util.
00:59 WARNING: No imports found which use the "ansible_collections.community.general.plugins.module_utils.network.panos.panos" module_util.
00:59 Processed 116 python module_utils in 13 second(s).
00:59 Traceback (most recent call last):
00:59 File "/root/venv/bin/ansible-test", line 28, in <module>
00:59 main()
00:59 File "/root/venv/bin/ansible-test", line 24, in main
00:59 cli_main()
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/cli.py", line 173, in main
00:59 args.func(config)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/sanity/__init__.py", line 85, in command_sanity
00:59 changes = get_changes_filter(args)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/executor.py", line 1493, in get_changes_filter
00:59 changes = categorize_changes(args, paths, args.command)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 89, in categorize_changes
00:59 dependent_paths = mapper.get_dependent_paths(path)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 236, in get_dependent_paths
00:59 unprocessed_paths = set(self.get_dependent_paths_non_recursive(path))
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 258, in get_dependent_paths_non_recursive
00:59 paths = self.get_dependent_paths_internal(path)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 273, in get_dependent_paths_internal
00:59 return self.get_python_module_utils_usage(path)
00:59 File "/root/venv/lib/python2.7/site-packages/ansible_test/_internal/classification.py", line 306, in get_python_module_utils_usage
00:59 return sorted(self.python_module_utils_imports[name])
00:59 KeyError: u'ansible_collections.community.general.plugins.module_utils.compat'
```
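One plausible source of the KeyError, as a minimal sketch (simplified, hypothetical helper names, and assuming the new `__init__.py` is empty): ansible-test skips zero-length files when enumerating module_utils, but the changed path is still resolved to a package name and used as a dictionary key.
```python
import os

PREFIX = 'ansible_collections.community.general.plugins.module_utils.'

def module_utils_name(path):
    # a package __init__.py is mapped to the package name itself
    if path.endswith('/__init__.py'):
        path = os.path.dirname(path)
    rel = os.path.relpath(path, 'plugins/module_utils')
    return PREFIX + os.path.splitext(rel)[0].replace(os.sep, '.')

def enumerate_module_utils(files):
    # zero-length files (such as an empty __init__.py) are skipped here
    return {module_utils_name(p) for p, size in files.items() if size > 0}

files = {
    'plugins/module_utils/compat/__init__.py': 0,     # new, empty package marker
    'plugins/module_utils/compat/ipaddress.py': 1234,
}
imports = {name: set() for name in enumerate_module_utils(files)}

# ...but the changed __init__.py is still resolved to a package name and looked up:
name = module_utils_name('plugins/module_utils/compat/__init__.py')
try:
    print(sorted(imports[name]))
except KeyError as exc:
    print('KeyError:', exc)  # ...plugins.module_utils.compat
```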
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible-test
##### ANSIBLE VERSION
```paste below
devel
```
|
https://github.com/ansible/ansible/issues/68495
|
https://github.com/ansible/ansible/pull/68519
|
d8b5c11a638737a855b8c59aa5c5202ef2807cc5
|
53a3d1ffdb14a7f367945606d6ca240d47fe5e04
| 2020-03-26T16:45:59Z |
python
| 2020-03-27T22:56:02Z |
test/lib/ansible_test/_internal/import_analysis.py
|
"""Analyze python import statements."""
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import ast
import os
from . import types as t
from .io import (
read_binary_file,
)
from .util import (
display,
ApplicationError,
is_subdir,
)
from .data import (
data_context,
)
VIRTUAL_PACKAGES = set([
'ansible.module_utils.six',
])
def get_python_module_utils_imports(compile_targets):
"""Return a dictionary of module_utils names mapped to sets of python file paths.
:type compile_targets: list[TestTarget]
:rtype: dict[str, set[str]]
"""
module_utils = enumerate_module_utils()
virtual_utils = set(m for m in module_utils if any(m.startswith('%s.' % v) for v in VIRTUAL_PACKAGES))
module_utils -= virtual_utils
imports_by_target_path = {}
for target in compile_targets:
imports_by_target_path[target.path] = extract_python_module_utils_imports(target.path, module_utils)
def recurse_import(import_name, depth=0, seen=None): # type: (str, int, t.Optional[t.Set[str]]) -> t.Set[str]
"""Recursively expand module_utils imports from module_utils files."""
display.info('module_utils import: %s%s' % (' ' * depth, import_name), verbosity=4)
if seen is None:
seen = set([import_name])
results = set([import_name])
# virtual packages depend on the modules they contain instead of the reverse
if import_name in VIRTUAL_PACKAGES:
for sub_import in sorted(virtual_utils):
if sub_import.startswith('%s.' % import_name):
if sub_import in seen:
continue
seen.add(sub_import)
matches = sorted(recurse_import(sub_import, depth + 1, seen))
for result in matches:
results.add(result)
import_path = get_import_path(import_name)
if import_path not in imports_by_target_path:
import_path = get_import_path(import_name, package=True)
if import_path not in imports_by_target_path:
raise ApplicationError('Cannot determine path for module_utils import: %s' % import_name)
# process imports in reverse so the deepest imports come first
for name in sorted(imports_by_target_path[import_path], reverse=True):
if name in virtual_utils:
continue
if name in seen:
continue
seen.add(name)
matches = sorted(recurse_import(name, depth + 1, seen))
for result in matches:
results.add(result)
return results
for module_util in module_utils:
# recurse over module_utils imports while excluding self
module_util_imports = recurse_import(module_util)
module_util_imports.remove(module_util)
# add recursive imports to all path entries which import this module_util
for target_path in imports_by_target_path:
if module_util in imports_by_target_path[target_path]:
for module_util_import in sorted(module_util_imports):
if module_util_import not in imports_by_target_path[target_path]:
display.info('%s inherits import %s via %s' % (target_path, module_util_import, module_util), verbosity=6)
imports_by_target_path[target_path].add(module_util_import)
imports = dict([(module_util, set()) for module_util in module_utils | virtual_utils])
for target_path in imports_by_target_path:
for module_util in imports_by_target_path[target_path]:
imports[module_util].add(target_path)
# for purposes of mapping module_utils to paths, treat imports of virtual utils the same as the parent package
for virtual_util in virtual_utils:
parent_package = '.'.join(virtual_util.split('.')[:-1])
imports[virtual_util] = imports[parent_package]
display.info('%s reports imports from parent package %s' % (virtual_util, parent_package), verbosity=6)
for module_util in sorted(imports):
if not imports[module_util]:
display.warning('No imports found which use the "%s" module_util.' % module_util)
return imports
def get_python_module_utils_name(path): # type: (str) -> str
"""Return a namespace and name from the given module_utils path."""
base_path = data_context().content.module_utils_path
if data_context().content.collection:
prefix = 'ansible_collections.' + data_context().content.collection.prefix + 'plugins.module_utils.'
else:
prefix = 'ansible.module_utils.'
if path.endswith('/__init__.py'):
path = os.path.dirname(path)
name = prefix + os.path.splitext(os.path.relpath(path, base_path))[0].replace(os.path.sep, '.')
return name
def enumerate_module_utils():
"""Return a list of available module_utils imports.
:rtype: set[str]
"""
module_utils = []
for path in data_context().content.walk_files(data_context().content.module_utils_path):
ext = os.path.splitext(path)[1]
if ext != '.py':
continue
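# zero-length files (for example an empty __init__.py) are skipped here and never become keys in the imports mapping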
if os.path.getsize(path) == 0:
continue
module_utils.append(get_python_module_utils_name(path))
return set(module_utils)
def extract_python_module_utils_imports(path, module_utils):
"""Return a list of module_utils imports found in the specified source file.
:type path: str
:type module_utils: set[str]
:rtype: set[str]
"""
# Python code must be read as bytes to avoid a SyntaxError when the source uses comments to declare the file encoding.
# See: https://www.python.org/dev/peps/pep-0263
# Specifically: If a Unicode string with a coding declaration is passed to compile(), a SyntaxError will be raised.
code = read_binary_file(path)
try:
tree = ast.parse(code)
except SyntaxError as ex:
# Treat this error as a warning so tests can be executed as best as possible.
# The compile test will detect and report this syntax error.
display.warning('%s:%s Syntax error extracting module_utils imports: %s' % (path, ex.lineno, ex.msg))
return set()
finder = ModuleUtilFinder(path, module_utils)
finder.visit(tree)
return finder.imports
def get_import_path(name, package=False): # type: (str, bool) -> str
"""Return a path from an import name."""
if package:
filename = os.path.join(name.replace('.', '/'), '__init__.py')
else:
filename = '%s.py' % name.replace('.', '/')
if name.startswith('ansible.module_utils.') or name == 'ansible.module_utils':
path = os.path.join('lib', filename)
elif data_context().content.collection and name.startswith('ansible_collections.%s.plugins.module_utils.' % data_context().content.collection.full_name):
path = '/'.join(filename.split('/')[3:])
else:
raise Exception('Unexpected import name: %s' % name)
return path
class ModuleUtilFinder(ast.NodeVisitor):
"""AST visitor to find valid module_utils imports."""
def __init__(self, path, module_utils):
"""Return a list of module_utils imports found in the specified source file.
:type path: str
:type module_utils: set[str]
"""
self.path = path
self.module_utils = module_utils
self.imports = set()
# implicitly import parent package
if path.endswith('/__init__.py'):
path = os.path.split(path)[0]
if path.startswith('lib/ansible/module_utils/'):
package = os.path.split(path)[0].replace('/', '.')[4:]
if package != 'ansible.module_utils' and package not in VIRTUAL_PACKAGES:
self.add_import(package, 0)
# noinspection PyPep8Naming
# pylint: disable=locally-disabled, invalid-name
def visit_Import(self, node):
"""
:type node: ast.Import
"""
self.generic_visit(node)
# import ansible.module_utils.MODULE[.MODULE]
# import ansible_collections.{ns}.{col}.plugins.module_utils.module_utils.MODULE[.MODULE]
self.add_imports([alias.name for alias in node.names], node.lineno)
# noinspection PyPep8Naming
# pylint: disable=locally-disabled, invalid-name
def visit_ImportFrom(self, node):
"""
:type node: ast.ImportFrom
"""
self.generic_visit(node)
if not node.module:
return
if not node.module.startswith('ansible'):
return
# from ansible.module_utils import MODULE[, MODULE]
# from ansible.module_utils.MODULE[.MODULE] import MODULE[, MODULE]
# from ansible_collections.{ns}.{col}.plugins.module_utils import MODULE[, MODULE]
# from ansible_collections.{ns}.{col}.plugins.module_utils.MODULE[.MODULE] import MODULE[, MODULE]
self.add_imports(['%s.%s' % (node.module, alias.name) for alias in node.names], node.lineno)
def add_import(self, name, line_number):
"""
:type name: str
:type line_number: int
"""
import_name = name
while self.is_module_util_name(name):
if name in self.module_utils:
if name not in self.imports:
display.info('%s:%d imports module_utils: %s' % (self.path, line_number, name), verbosity=5)
self.imports.add(name)
return # duplicate imports are ignored
name = '.'.join(name.split('.')[:-1])
if is_subdir(self.path, data_context().content.test_path):
return # invalid imports in tests are ignored
path = get_import_path(name, True)
if os.path.exists(path) and os.path.getsize(path) == 0:
return # zero length __init__.py files are ignored during earlier processing, do not warn about them now
# Treat this error as a warning so tests can be executed as best as possible.
# This error should be detected by unit or integration tests.
display.warning('%s:%d Invalid module_utils import: %s' % (self.path, line_number, import_name))
def add_imports(self, names, line_no): # type: (t.List[str], int) -> None
"""Add the given import names if they are module_utils imports."""
for name in names:
if self.is_module_util_name(name):
self.add_import(name, line_no)
@staticmethod
def is_module_util_name(name): # type: (str) -> bool
"""Return True if the given name is a module_util name for the content under test. External module_utils are ignored."""
if data_context().content.is_ansible and name.startswith('ansible.module_utils.'):
return True
if data_context().content.collection and name.startswith('ansible_collections.%s.plugins.module_utils.' % data_context().content.collection.full_name):
return True
return False
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,530 |
ansible-test sanity failing with "No module named 'jinja2'"
|
##### SUMMARY
In the Kubernetes collection's CI tests, the sanity check, which uses `ansible-test sanity --docker -v --color --python 3.6`, started failing with the following error:
```
ERROR: Command "importer.py" returned exit status 1.
>>> Standard Error
Traceback (most recent call last):
File "/root/ansible/ansible_collections/community/kubernetes/tests/output/.tmp/sanity/import/minimal-py36/bin/importer.py", line 447, in <module>
main()
File "/root/ansible/ansible_collections/community/kubernetes/tests/output/.tmp/sanity/import/minimal-py36/bin/importer.py", line 51, in main
from ansible.utils.collection_loader import AnsibleCollectionLoader
File "/root/ansible/lib/ansible/utils/collection_loader.py", line 15, in <module>
from ansible import constants as C
File "/root/ansible/lib/ansible/constants.py", line 12, in <module>
from jinja2 import Template
ModuleNotFoundError: No module named 'jinja2'
```
Nothing in the collection has changed in the past week, and no runs failed until this morning's CI run, so it seems something committed to ansible/ansible `devel` has caused this failure.
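Not the actual fix from the linked PR, just a sketch of the general pattern that avoids this class of failure: defer the `jinja2` import so that merely importing a constants-style module does not require jinja2 to be installed (hypothetical module/function names).
```python
# hypothetical constants-like module: no jinja2 import at module level
def _render_templated_default(template_text, variables):
    """Only needs jinja2 when a templated default is actually rendered."""
    try:
        from jinja2 import Template  # deferred import
    except ImportError:
        return template_text  # degrade gracefully if jinja2 is unavailable
    return Template(template_text).render(**variables)

print(_render_templated_default('{{ x }}-suffix', {'x': 'value'}))  # 'value-suffix' if jinja2 is installed
```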
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible-test
##### ANSIBLE VERSION
devel (as of this morning)
##### CONFIGURATION
N/A - defaults
##### OS / ENVIRONMENT
Linux
##### STEPS TO REPRODUCE
1. Clone kubernetes collection repo
2. Install Ansible @ devel
3. Run `ansible-test sanity --docker -v --color --python 3.6`
##### EXPECTED RESULTS
Tests should pass, as they have for the past few weeks.
##### ACTUAL RESULTS
Tests fail, with the message in this issue's summary.
|
https://github.com/ansible/ansible/issues/68530
|
https://github.com/ansible/ansible/pull/68531
|
0f5a63f1b99e586f86932907c49b1e8877128957
|
7777189954347e98310ac8d067f3141b81cf1c07
| 2020-03-28T15:23:03Z |
python
| 2020-03-28T18:21:08Z |
lib/ansible/cli/doc.py
|
# Copyright: (c) 2014, James Tanner <[email protected]>
# Copyright: (c) 2018, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import datetime
import json
import os
import textwrap
import traceback
import yaml
import ansible.plugins.loader as plugin_loader
from ansible import constants as C
from ansible import context
from ansible.cli import CLI
from ansible.cli.arguments import option_helpers as opt_help
from ansible.errors import AnsibleError, AnsibleOptionsError
from ansible.module_utils._text import to_native
from ansible.module_utils.common._collections_compat import Container, Sequence
from ansible.module_utils.six import string_types
from ansible.parsing.metadata import extract_metadata
from ansible.parsing.plugin_docs import read_docstub
from ansible.parsing.yaml.dumper import AnsibleDumper
from ansible.plugins.loader import action_loader, fragment_loader
from ansible.utils.collection_loader import set_collection_playbook_paths, list_collection_dirs, get_collection_name_from_path
from ansible.utils.display import Display
from ansible.utils.plugin_docs import BLACKLIST, get_docstring, get_versioned_doclink
display = Display()
def jdump(text):
display.display(json.dumps(text, sort_keys=True, indent=4))
def add_collection_plugins(plugin_list, plugin_type):
colldirs = list_collection_dirs()
for ns in colldirs.keys():
for path in colldirs[ns]:
collname = get_collection_name_from_path(path)
ptype = C.COLLECTION_PTYPE_COMPAT.get(plugin_type, plugin_type)
plugin_list.update(DocCLI.find_plugins(os.path.join(path, 'plugins', ptype), plugin_type, collname))
class RemovedPlugin(Exception):
pass
class PluginNotFound(Exception):
pass
class DocCLI(CLI):
''' displays information on modules installed in Ansible libraries.
It displays a terse listing of plugins and their short descriptions,
provides a printout of their DOCUMENTATION strings,
and it can create a short "snippet" which can be pasted into a playbook. '''
# default ignore list for detailed views
IGNORE = ('module', 'docuri', 'version_added', 'short_description', 'now_date', 'plainexamples', 'returndocs')
def __init__(self, args):
super(DocCLI, self).__init__(args)
self.plugin_list = set()
def init_parser(self):
super(DocCLI, self).init_parser(
desc="plugin documentation tool",
epilog="See man pages for Ansible CLI options or website for tutorials https://docs.ansible.com"
)
opt_help.add_module_options(self.parser)
opt_help.add_basedir_options(self.parser)
self.parser.add_argument('args', nargs='*', help='Plugin', metavar='plugin')
self.parser.add_argument("-t", "--type", action="store", default='module', dest='type',
help='Choose which plugin type (defaults to "module"). '
'Available plugin types are : {0}'.format(C.DOCUMENTABLE_PLUGINS),
choices=C.DOCUMENTABLE_PLUGINS)
self.parser.add_argument("-j", "--json", action="store_true", default=False, dest='json_format',
help='Change output into json format.')
exclusive = self.parser.add_mutually_exclusive_group()
exclusive.add_argument("-F", "--list_files", action="store_true", default=False, dest="list_files",
help='Show plugin names and their source files without summaries (implies --list)')
exclusive.add_argument("-l", "--list", action="store_true", default=False, dest='list_dir',
help='List available plugins')
exclusive.add_argument("-s", "--snippet", action="store_true", default=False, dest='show_snippet',
help='Show playbook snippet for specified plugin(s)')
exclusive.add_argument("--metadata-dump", action="store_true", default=False, dest='dump',
help='**For internal testing only** Dump json metadata for all plugins.')
def post_process_args(self, options):
options = super(DocCLI, self).post_process_args(options)
display.verbosity = options.verbosity
return options
def run(self):
super(DocCLI, self).run()
plugin_type = context.CLIARGS['type']
do_json = context.CLIARGS['json_format']
if plugin_type in C.DOCUMENTABLE_PLUGINS:
loader = getattr(plugin_loader, '%s_loader' % plugin_type)
else:
raise AnsibleOptionsError("Unknown or undocumentable plugin type: %s" % plugin_type)
# add to plugin paths from command line
basedir = context.CLIARGS['basedir']
if basedir:
set_collection_playbook_paths(basedir)
loader.add_directory(basedir, with_subdir=True)
if context.CLIARGS['module_path']:
for path in context.CLIARGS['module_path']:
if path:
loader.add_directory(path)
# save only top level paths for errors
search_paths = DocCLI.print_paths(loader)
loader._paths = None # reset so we can use subdirs below
# list plugins names and filepath for type
if context.CLIARGS['list_files']:
paths = loader._get_paths()
for path in paths:
self.plugin_list.update(DocCLI.find_plugins(path, plugin_type))
add_collection_plugins(self.plugin_list, plugin_type)
plugins = self._get_plugin_list_filenames(loader)
if do_json:
jdump(plugins)
else:
# format for user
displace = max(len(x) for x in self.plugin_list)
linelimit = display.columns - displace - 5
text = []
for plugin in plugins.keys():
filename = plugins[plugin]
text.append("%-*s %-*.*s" % (displace, plugin, linelimit, len(filename), filename))
DocCLI.pager("\n".join(text))
# list file plugins for type (does not read docs, very fast)
elif context.CLIARGS['list_dir']:
paths = loader._get_paths()
for path in paths:
self.plugin_list.update(DocCLI.find_plugins(path, plugin_type))
add_collection_plugins(self.plugin_list, plugin_type)
descs = self._get_plugin_list_descriptions(loader)
if do_json:
jdump(descs)
else:
displace = max(len(x) for x in self.plugin_list)
linelimit = display.columns - displace - 5
text = []
deprecated = []
for plugin in descs.keys():
desc = DocCLI.tty_ify(descs[plugin])
if len(desc) > linelimit:
desc = desc[:linelimit] + '...'
if plugin.startswith('_'): # Handle deprecated
deprecated.append("%-*s %-*.*s" % (displace, plugin[1:], linelimit, len(desc), desc))
else:
text.append("%-*s %-*.*s" % (displace, plugin, linelimit, len(desc), desc))
if len(deprecated) > 0:
text.append("\nDEPRECATED:")
text.extend(deprecated)
DocCLI.pager("\n".join(text))
# dump plugin desc/metadata as JSON
elif context.CLIARGS['dump']:
plugin_data = {}
plugin_names = DocCLI.get_all_plugins_of_type(plugin_type)
for plugin_name in plugin_names:
plugin_info = DocCLI.get_plugin_metadata(plugin_type, plugin_name)
if plugin_info is not None:
plugin_data[plugin_name] = plugin_info
jdump(plugin_data)
else:
# display specific plugin docs
if len(context.CLIARGS['args']) == 0:
raise AnsibleOptionsError("Incorrect options passed")
# get the docs for plugins in the command line list
plugin_docs = {}
for plugin in context.CLIARGS['args']:
try:
doc, plainexamples, returndocs, metadata = DocCLI._get_plugin_doc(plugin, loader, search_paths)
except PluginNotFound:
display.warning("%s %s not found in:\n%s\n" % (plugin_type, plugin, search_paths))
continue
except RemovedPlugin:
display.warning("%s %s has been removed\n" % (plugin_type, plugin))
continue
except Exception as e:
display.vvv(traceback.format_exc())
raise AnsibleError("%s %s missing documentation (or could not parse"
" documentation): %s\n" %
(plugin_type, plugin, to_native(e)))
if not doc:
# The doc section existed but was empty
continue
plugin_docs[plugin] = {'doc': doc, 'examples': plainexamples,
'return': returndocs, 'metadata': metadata}
if do_json:
# Some changes to how json docs are formatted
for plugin, doc_data in plugin_docs.items():
try:
doc_data['return'] = yaml.load(doc_data['return'])
except Exception:
pass
jdump(plugin_docs)
else:
# Some changes to how plain text docs are formatted
text = []
for plugin, doc_data in plugin_docs.items():
textret = DocCLI.format_plugin_doc(plugin, plugin_type,
doc_data['doc'], doc_data['examples'],
doc_data['return'], doc_data['metadata'])
if textret:
text.append(textret)
if text:
DocCLI.pager(''.join(text))
return 0
@staticmethod
def get_all_plugins_of_type(plugin_type):
loader = getattr(plugin_loader, '%s_loader' % plugin_type)
plugin_list = set()
paths = loader._get_paths()
for path in paths:
plugins_to_add = DocCLI.find_plugins(path, plugin_type)
plugin_list.update(plugins_to_add)
return sorted(set(plugin_list))
@staticmethod
def get_plugin_metadata(plugin_type, plugin_name):
# if the plugin lives in a non-python file (eg, win_X.ps1), require the corresponding python file for docs
loader = getattr(plugin_loader, '%s_loader' % plugin_type)
filename = loader.find_plugin(plugin_name, mod_type='.py', ignore_deprecated=True, check_aliases=True)
if filename is None:
raise AnsibleError("unable to load {0} plugin named {1} ".format(plugin_type, plugin_name))
try:
doc, __, __, metadata = get_docstring(filename, fragment_loader, verbose=(context.CLIARGS['verbosity'] > 0))
except Exception:
display.vvv(traceback.format_exc())
raise AnsibleError(
"%s %s at %s has a documentation error formatting or is missing documentation." %
(plugin_type, plugin_name, filename))
if doc is None:
if 'removed' not in metadata.get('status', []):
raise AnsibleError(
"%s %s at %s has a documentation error formatting or is missing documentation." %
(plugin_type, plugin_name, filename))
# Removed plugins don't have any documentation
return None
return dict(
name=plugin_name,
namespace=DocCLI.namespace_from_plugin_filepath(filename, plugin_name, loader.package_path),
description=doc.get('short_description', "UNKNOWN"),
version_added=doc.get('version_added', "UNKNOWN")
)
@staticmethod
def namespace_from_plugin_filepath(filepath, plugin_name, basedir):
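# e.g. ('lib/ansible/modules/cloud/amazon/ec2.py', 'ec2', 'lib/ansible/modules') -> 'cloud.amazon'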
if not basedir.endswith('/'):
basedir += '/'
rel_path = filepath.replace(basedir, '')
extension_free = os.path.splitext(rel_path)[0]
namespace_only = extension_free.rsplit(plugin_name, 1)[0].strip('/_')
clean_ns = namespace_only.replace('/', '.')
if clean_ns == '':
clean_ns = None
return clean_ns
@staticmethod
def _get_plugin_doc(plugin, loader, search_paths):
# if the plugin lives in a non-python file (eg, win_X.ps1), require the corresponding python file for docs
filename = loader.find_plugin(plugin, mod_type='.py', ignore_deprecated=True, check_aliases=True)
if filename is None:
raise PluginNotFound('%s was not found in %s' % (plugin, search_paths))
doc, plainexamples, returndocs, metadata = get_docstring(filename, fragment_loader, verbose=(context.CLIARGS['verbosity'] > 0))
# If the plugin existed but did not have a DOCUMENTATION element and was not removed, it's
# an error
if doc is None:
# doc may be None when the module has been removed. Calling code may choose to
# handle that but we can't.
if 'status' in metadata and isinstance(metadata['status'], Container):
if 'removed' in metadata['status']:
raise RemovedPlugin('%s has been removed' % plugin)
# Backwards compat: no documentation but valid metadata (or no metadata, which results in using the default metadata).
# Probably should make this an error in 2.10
return {}, {}, {}, metadata
else:
# If metadata is invalid, warn but don't error
display.warning(u'%s has an invalid ANSIBLE_METADATA field' % plugin)
raise ValueError('%s did not contain a DOCUMENTATION attribute' % plugin)
doc['filename'] = filename
return doc, plainexamples, returndocs, metadata
@staticmethod
def format_plugin_doc(plugin, plugin_type, doc, plainexamples, returndocs, metadata):
# assign from other sections
doc['plainexamples'] = plainexamples
doc['returndocs'] = returndocs
doc['metadata'] = metadata
# generate extra data
if plugin_type == 'module':
# is there corresponding action plugin?
if plugin in action_loader:
doc['action'] = True
else:
doc['action'] = False
doc['now_date'] = datetime.date.today().strftime('%Y-%m-%d')
if 'docuri' in doc:
doc['docuri'] = doc[plugin_type].replace('_', '-')
if context.CLIARGS['show_snippet'] and plugin_type == 'module':
text = DocCLI.get_snippet_text(doc)
else:
text = DocCLI.get_man_text(doc)
return text
@staticmethod
def find_plugins(path, ptype, collection=None):
display.vvvv("Searching %s for plugins" % path)
plugin_list = set()
if not os.path.exists(path):
display.vvvv("%s does not exist" % path)
return plugin_list
if not os.path.isdir(path):
display.vvvv("%s is not a directory" % path)
return plugin_list
bkey = ptype.upper()
for plugin in os.listdir(path):
display.vvvv("Found %s" % plugin)
full_path = '/'.join([path, plugin])
if plugin.startswith('.'):
continue
elif os.path.isdir(full_path):
continue
elif any(plugin.endswith(x) for x in C.BLACKLIST_EXTS):
continue
elif plugin.startswith('__'):
continue
elif plugin in C.IGNORE_FILES:
continue
elif plugin.startswith('_'):
if os.path.islink(full_path): # avoids aliases
continue
plugin = os.path.splitext(plugin)[0] # removes the extension
plugin = plugin.lstrip('_') # remove underscore from deprecated plugins
if plugin not in BLACKLIST.get(bkey, ()):
if collection:
plugin = '%s.%s' % (collection, plugin)
plugin_list.add(plugin)
display.vvvv("Added %s" % plugin)
return plugin_list
def _get_plugin_list_descriptions(self, loader):
descs = {}
plugins = self._get_plugin_list_filenames(loader)
for plugin in plugins.keys():
filename = plugins[plugin]
doc = None
try:
doc = read_docstub(filename)
except Exception:
display.warning("%s has a documentation formatting error" % plugin)
continue
if not doc or not isinstance(doc, dict):
with open(filename) as f:
metadata = extract_metadata(module_data=f.read())
if metadata[0]:
if 'removed' not in metadata[0].get('status', []):
display.warning("%s parsing did not produce documentation." % plugin)
else:
continue
desc = 'UNDOCUMENTED'
else:
desc = doc.get('short_description', 'INVALID SHORT DESCRIPTION').strip()
descs[plugin] = desc
return descs
def _get_plugin_list_filenames(self, loader):
pfiles = {}
for plugin in sorted(self.plugin_list):
try:
# if the module lives in a non-python file (eg, win_X.ps1), require the corresponding python file for docs
filename = loader.find_plugin(plugin, mod_type='.py', ignore_deprecated=True, check_aliases=True)
if filename is None:
continue
if filename.endswith(".ps1"):
continue
if os.path.isdir(filename):
continue
pfiles[plugin] = filename
except Exception as e:
raise AnsibleError("Failed reading docs at %s: %s" % (plugin, to_native(e)), orig_exc=e)
return pfiles
@staticmethod
def print_paths(finder):
''' Returns a string suitable for printing of the search path '''
# Uses a list to get the order right
ret = []
for i in finder._get_paths(subdirs=False):
if i not in ret:
ret.append(i)
return os.pathsep.join(ret)
@staticmethod
def get_snippet_text(doc):
text = []
desc = DocCLI.tty_ify(doc['short_description'])
text.append("- name: %s" % (desc))
text.append(" %s:" % (doc['module']))
pad = 31
subdent = " " * pad
limit = display.columns - pad
for o in sorted(doc['options'].keys()):
opt = doc['options'][o]
if isinstance(opt['description'], string_types):
desc = DocCLI.tty_ify(opt['description'])
else:
desc = DocCLI.tty_ify(" ".join(opt['description']))
required = opt.get('required', False)
if not isinstance(required, bool):
raise("Incorrect value for 'Required', a boolean is needed.: %s" % required)
if required:
desc = "(required) %s" % desc
o = '%s:' % o
text.append(" %-20s # %s" % (o, textwrap.fill(desc, limit, subsequent_indent=subdent)))
text.append('')
return "\n".join(text)
@staticmethod
def _dump_yaml(struct, indent):
return DocCLI.tty_ify('\n'.join([indent + line for line in
yaml.dump(struct, default_flow_style=False,
Dumper=AnsibleDumper).split('\n')]))
@staticmethod
def add_fields(text, fields, limit, opt_indent):
for o in sorted(fields):
opt = fields[o]
required = opt.pop('required', False)
if not isinstance(required, bool):
raise AnsibleError("Incorrect value for 'Required', a boolean is needed.: %s" % required)
if required:
opt_leadin = "="
else:
opt_leadin = "-"
text.append("%s %s" % (opt_leadin, o))
if isinstance(opt['description'], list):
for entry_idx, entry in enumerate(opt['description'], 1):
if not isinstance(entry, string_types):
raise AnsibleError("Expected string in description of %s at index %s, got %s" % (o, entry_idx, type(entry)))
text.append(textwrap.fill(DocCLI.tty_ify(entry), limit, initial_indent=opt_indent, subsequent_indent=opt_indent))
else:
if not isinstance(opt['description'], string_types):
raise AnsibleError("Expected string in description of %s, got %s" % (o, type(opt['description'])))
text.append(textwrap.fill(DocCLI.tty_ify(opt['description']), limit, initial_indent=opt_indent, subsequent_indent=opt_indent))
del opt['description']
aliases = ''
if 'aliases' in opt:
if len(opt['aliases']) > 0:
aliases = "(Aliases: " + ", ".join(str(i) for i in opt['aliases']) + ")"
del opt['aliases']
choices = ''
if 'choices' in opt:
if len(opt['choices']) > 0:
choices = "(Choices: " + ", ".join(str(i) for i in opt['choices']) + ")"
del opt['choices']
default = ''
if 'default' in opt or not required:
default = "[Default: %s" % str(opt.pop('default', '(null)')) + "]"
text.append(textwrap.fill(DocCLI.tty_ify(aliases + choices + default), limit,
initial_indent=opt_indent, subsequent_indent=opt_indent))
if 'options' in opt:
text.append("%soptions:\n" % opt_indent)
DocCLI.add_fields(text, opt.pop('options'), limit, opt_indent + opt_indent)
if 'spec' in opt:
text.append("%sspec:\n" % opt_indent)
DocCLI.add_fields(text, opt.pop('spec'), limit, opt_indent + opt_indent)
conf = {}
for config in ('env', 'ini', 'yaml', 'vars', 'keywords'):
if config in opt and opt[config]:
conf[config] = opt.pop(config)
for ignore in DocCLI.IGNORE:
for item in conf[config]:
if ignore in item:
del item[ignore]
if conf:
text.append(DocCLI._dump_yaml({'set_via': conf}, opt_indent))
for k in sorted(opt):
if k.startswith('_'):
continue
if isinstance(opt[k], string_types):
text.append('%s%s: %s' % (opt_indent, k,
textwrap.fill(DocCLI.tty_ify(opt[k]),
limit - (len(k) + 2),
subsequent_indent=opt_indent)))
elif isinstance(opt[k], (Sequence)) and all(isinstance(x, string_types) for x in opt[k]):
text.append(DocCLI.tty_ify('%s%s: %s' % (opt_indent, k, ', '.join(opt[k]))))
else:
text.append(DocCLI._dump_yaml({k: opt[k]}, opt_indent))
text.append('')
@staticmethod
def get_support_block(doc):
# Note: 'curated' is deprecated and not used in any of the modules we ship
support_level_msg = {'core': 'The Ansible Core Team',
'network': 'The Ansible Network Team',
'certified': 'an Ansible Partner',
'community': 'The Ansible Community',
'curated': 'A Third Party',
}
return [" * This module is maintained by %s" % support_level_msg[doc['metadata']['supported_by']]]
@staticmethod
def get_metadata_block(doc):
text = []
text.append("METADATA:")
text.append('\tSUPPORT LEVEL: %s' % doc['metadata']['supported_by'])
for k in (m for m in doc['metadata'] if m != 'supported_by'):
if isinstance(k, list):
text.append("\t%s: %s" % (k.capitalize(), ", ".join(doc['metadata'][k])))
else:
text.append("\t%s: %s" % (k.capitalize(), doc['metadata'][k]))
return text
@staticmethod
def get_man_text(doc):
DocCLI.IGNORE = DocCLI.IGNORE + (context.CLIARGS['type'],)
opt_indent = " "
text = []
pad = display.columns * 0.20
limit = max(display.columns - int(pad), 70)
text.append("> %s (%s)\n" % (doc.get(context.CLIARGS['type'], doc.get('plugin_type')).upper(), doc.pop('filename')))
if isinstance(doc['description'], list):
desc = " ".join(doc.pop('description'))
else:
desc = doc.pop('description')
text.append("%s\n" % textwrap.fill(DocCLI.tty_ify(desc), limit, initial_indent=opt_indent,
subsequent_indent=opt_indent))
if 'deprecated' in doc and doc['deprecated'] is not None and len(doc['deprecated']) > 0:
text.append("DEPRECATED: \n")
if isinstance(doc['deprecated'], dict):
if 'version' in doc['deprecated'] and 'removed_in' not in doc['deprecated']:
doc['deprecated']['removed_in'] = doc['deprecated']['version']
text.append("\tReason: %(why)s\n\tWill be removed in: Ansible %(removed_in)s\n\tAlternatives: %(alternative)s" % doc.pop('deprecated'))
else:
text.append("%s" % doc.pop('deprecated'))
text.append("\n")
try:
support_block = DocCLI.get_support_block(doc)
if support_block:
text.extend(support_block)
except Exception:
pass  # FIXME: not supported by plugins
if doc.pop('action', False):
text.append(" * note: %s\n" % "This module has a corresponding action plugin.")
if 'options' in doc and doc['options']:
text.append("OPTIONS (= is mandatory):\n")
DocCLI.add_fields(text, doc.pop('options'), limit, opt_indent)
text.append('')
if 'notes' in doc and doc['notes'] and len(doc['notes']) > 0:
text.append("NOTES:")
for note in doc['notes']:
text.append(textwrap.fill(DocCLI.tty_ify(note), limit - 6,
initial_indent=opt_indent[:-2] + "* ", subsequent_indent=opt_indent))
text.append('')
text.append('')
del doc['notes']
if 'seealso' in doc and doc['seealso']:
text.append("SEE ALSO:")
for item in doc['seealso']:
if 'module' in item:
text.append(textwrap.fill(DocCLI.tty_ify('Module %s' % item['module']),
limit - 6, initial_indent=opt_indent[:-2] + "* ", subsequent_indent=opt_indent))
description = item.get('description', 'The official documentation on the %s module.' % item['module'])
text.append(textwrap.fill(DocCLI.tty_ify(description), limit - 6, initial_indent=opt_indent + ' ', subsequent_indent=opt_indent + ' '))
text.append(textwrap.fill(DocCLI.tty_ify(get_versioned_doclink('modules/%s_module.html' % item['module'])),
limit - 6, initial_indent=opt_indent + ' ', subsequent_indent=opt_indent))
elif 'name' in item and 'link' in item and 'description' in item:
text.append(textwrap.fill(DocCLI.tty_ify(item['name']),
limit - 6, initial_indent=opt_indent[:-2] + "* ", subsequent_indent=opt_indent))
text.append(textwrap.fill(DocCLI.tty_ify(item['description']),
limit - 6, initial_indent=opt_indent + ' ', subsequent_indent=opt_indent + ' '))
text.append(textwrap.fill(DocCLI.tty_ify(item['link']),
limit - 6, initial_indent=opt_indent + ' ', subsequent_indent=opt_indent + ' '))
elif 'ref' in item and 'description' in item:
text.append(textwrap.fill(DocCLI.tty_ify('Ansible documentation [%s]' % item['ref']),
limit - 6, initial_indent=opt_indent[:-2] + "* ", subsequent_indent=opt_indent))
text.append(textwrap.fill(DocCLI.tty_ify(item['description']),
limit - 6, initial_indent=opt_indent + ' ', subsequent_indent=opt_indent + ' '))
text.append(textwrap.fill(DocCLI.tty_ify(get_versioned_doclink('/#stq=%s&stp=1' % item['ref'])),
limit - 6, initial_indent=opt_indent + ' ', subsequent_indent=opt_indent + ' '))
text.append('')
text.append('')
del doc['seealso']
if 'requirements' in doc and doc['requirements'] is not None and len(doc['requirements']) > 0:
req = ", ".join(doc.pop('requirements'))
text.append("REQUIREMENTS:%s\n" % textwrap.fill(DocCLI.tty_ify(req), limit - 16, initial_indent=" ", subsequent_indent=opt_indent))
# Generic handler
for k in sorted(doc):
if k in DocCLI.IGNORE or not doc[k]:
continue
if isinstance(doc[k], string_types):
text.append('%s: %s' % (k.upper(), textwrap.fill(DocCLI.tty_ify(doc[k]), limit - (len(k) + 2), subsequent_indent=opt_indent)))
elif isinstance(doc[k], (list, tuple)):
text.append('%s: %s' % (k.upper(), ', '.join(doc[k])))
else:
text.append(DocCLI._dump_yaml({k.upper(): doc[k]}, opt_indent))
del doc[k]
text.append('')
if 'plainexamples' in doc and doc['plainexamples'] is not None:
text.append("EXAMPLES:")
text.append('')
if isinstance(doc['plainexamples'], string_types):
text.append(doc.pop('plainexamples').strip())
else:
text.append(yaml.dump(doc.pop('plainexamples'), indent=2, default_flow_style=False))
text.append('')
text.append('')
if 'returndocs' in doc and doc['returndocs'] is not None:
text.append("RETURN VALUES:")
if isinstance(doc['returndocs'], string_types):
text.append(doc.pop('returndocs'))
else:
text.append(yaml.dump(doc.pop('returndocs'), indent=2, default_flow_style=False))
text.append('')
try:
metadata_block = DocCLI.get_metadata_block(doc)
if metadata_block:
text.extend(metadata_block)
text.append('')
except Exception:
pass # metadata is optional
return "\n".join(text)
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,530 |
ansible-test sanity failing with "No module named 'jinja2'"
|
##### SUMMARY
In the Kubernetes collection's CI tests, the sanity check, which uses `ansible-test sanity --docker -v --color --python 3.6`, started failing with the following error:
```
ERROR: Command "importer.py" returned exit status 1.
>>> Standard Error
Traceback (most recent call last):
File "/root/ansible/ansible_collections/community/kubernetes/tests/output/.tmp/sanity/import/minimal-py36/bin/importer.py", line 447, in <module>
main()
File "/root/ansible/ansible_collections/community/kubernetes/tests/output/.tmp/sanity/import/minimal-py36/bin/importer.py", line 51, in main
from ansible.utils.collection_loader import AnsibleCollectionLoader
File "/root/ansible/lib/ansible/utils/collection_loader.py", line 15, in <module>
from ansible import constants as C
File "/root/ansible/lib/ansible/constants.py", line 12, in <module>
from jinja2 import Template
ModuleNotFoundError: No module named 'jinja2'
```
Nothing in the collection has changed in the past week, and no runs failed until this morning's CI run, so it seems something committed to ansible/ansible `devel` has caused this failure.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible-test
##### ANSIBLE VERSION
devel (as of this morning)
##### CONFIGURATION
N/A - defaults
##### OS / ENVIRONMENT
Linux
##### STEPS TO REPRODUCE
1. Clone kubernetes collection repo
2. Install Ansible @ devel
3. Run `ansible-test sanity --docker -v --color --python 3.6`
##### EXPECTED RESULTS
Tests should pass, as they have for the past few weeks.
##### ACTUAL RESULTS
Tests fail, with the message in this issue's summary.
|
https://github.com/ansible/ansible/issues/68530
|
https://github.com/ansible/ansible/pull/68531
|
0f5a63f1b99e586f86932907c49b1e8877128957
|
7777189954347e98310ac8d067f3141b81cf1c07
| 2020-03-28T15:23:03Z |
python
| 2020-03-28T18:21:08Z |
lib/ansible/constants.py
|
# Copyright: (c) 2012-2014, Michael DeHaan <[email protected]>
# Copyright: (c) 2017, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import os
import re
from ast import literal_eval
from jinja2 import Template
from string import ascii_letters, digits
from ansible.module_utils._text import to_text
from ansible.module_utils.common.collections import Sequence
from ansible.module_utils.parsing.convert_bool import boolean, BOOLEANS_TRUE
from ansible.module_utils.six import string_types
from ansible.config.manager import ConfigManager, ensure_type, get_ini_config_value
def _warning(msg):
''' Display is not guaranteed here, nor is it guaranteed to be the full class, but try anyway; fall back to sys.stderr.write '''
try:
from ansible.utils.display import Display
Display().warning(msg)
except Exception:
import sys
sys.stderr.write(' [WARNING] %s\n' % (msg))
def _deprecated(msg, version='2.8'):
''' Display is not guaranteed here, nor is it guaranteed to be the full class, but try anyway; fall back to sys.stderr.write '''
try:
from ansible.utils.display import Display
Display().deprecated(msg, version=version)
except Exception:
import sys
sys.stderr.write(' [DEPRECATED] %s, to be removed in %s\n' % (msg, version))
def mk_boolean(value):
''' moved to module_utils'''
_deprecated('ansible.constants.mk_boolean() is deprecated. Use ansible.module_utils.parsing.convert_bool.boolean() instead')
return boolean(value, strict=False)
def get_config(parser, section, key, env_var, default_value, value_type=None, expand_relative_paths=False):
''' kept for backwards compatibility, but deprecated '''
_deprecated('ansible.constants.get_config() is deprecated. There is new config API, see porting docs.')
value = None
# small reconstruction of the old code env/ini/default
value = os.environ.get(env_var, None)
if value is None:
try:
value = get_ini_config_value(parser, {'key': key, 'section': section})
except Exception:
pass
if value is None:
value = default_value
value = ensure_type(value, value_type)
return value
def set_constant(name, value, export=vars()):
''' sets constants and returns resolved options dict '''
export[name] = value
class _DeprecatedSequenceConstant(Sequence):
def __init__(self, value, msg, version):
self._value = value
self._msg = msg
self._version = version
def __len__(self):
_deprecated(self._msg, version=self._version)
return len(self._value)
def __getitem__(self, y):
_deprecated(self._msg, version=self._version)
return self._value[y]
# Deprecated constants
BECOME_METHODS = _DeprecatedSequenceConstant(
['sudo', 'su', 'pbrun', 'pfexec', 'doas', 'dzdo', 'ksu', 'runas', 'pmrun', 'enable', 'machinectl'],
('ansible.constants.BECOME_METHODS is deprecated, please use '
'ansible.plugins.loader.become_loader. This list is statically '
'defined and may not include all become methods'),
'2.10'
)
# CONSTANTS ### yes, actual ones
BLACKLIST_EXTS = ('.pyc', '.pyo', '.swp', '.bak', '~', '.rpm', '.md', '.txt', '.rst')
BOOL_TRUE = BOOLEANS_TRUE
COLLECTION_PTYPE_COMPAT = {'module': 'modules'}
DEFAULT_BECOME_PASS = None
DEFAULT_PASSWORD_CHARS = to_text(ascii_letters + digits + ".,:-_", errors='strict') # characters included in auto-generated passwords
DEFAULT_REMOTE_PASS = None
DEFAULT_SUBSET = None
# FIXME: expand to other plugins, but never doc fragments
CONFIGURABLE_PLUGINS = ('become', 'cache', 'callback', 'cliconf', 'connection', 'httpapi', 'inventory', 'lookup', 'netconf', 'shell', 'vars')
# NOTE: always update the docs/docsite/Makefile to match
DOCUMENTABLE_PLUGINS = CONFIGURABLE_PLUGINS + ('module', 'strategy')
IGNORE_FILES = ("COPYING", "CONTRIBUTING", "LICENSE", "README", "VERSION", "GUIDELINES") # ignore during module search
INTERNAL_RESULT_KEYS = ('add_host', 'add_group')
LOCALHOST = ('127.0.0.1', 'localhost', '::1')
MODULE_REQUIRE_ARGS = ('command', 'win_command', 'ansible.windows.win_command', 'shell', 'win_shell',
'ansible.windows.win_shell', 'raw', 'script')
MODULE_NO_JSON = ('command', 'win_command', 'ansible.windows.win_command', 'shell', 'win_shell',
'ansible.windows.win_shell', 'raw')
RESTRICTED_RESULT_KEYS = ('ansible_rsync_path', 'ansible_playbook_python', 'ansible_facts')
TREE_DIR = None
VAULT_VERSION_MIN = 1.0
VAULT_VERSION_MAX = 1.0
# This matches a string that cannot be used as a valid python variable name i.e 'not-valid', 'not!valid@either' '1_nor_This'
INVALID_VARIABLE_NAMES = re.compile(r'^[\d\W]|[^\w]')
# FIXME: remove once play_context mangling is removed
# the magic variable mapping dictionary below is used to translate
# host/inventory variables to fields in the PlayContext
# object. The dictionary values are tuples, to account for aliases
# in variable names.
COMMON_CONNECTION_VARS = frozenset(('ansible_connection', 'ansible_host', 'ansible_user', 'ansible_shell_executable',
'ansible_port', 'ansible_pipelining', 'ansible_password', 'ansible_timeout',
'ansible_shell_type', 'ansible_module_compression', 'ansible_private_key_file'))
MAGIC_VARIABLE_MAPPING = dict(
# base
connection=('ansible_connection', ),
module_compression=('ansible_module_compression', ),
shell=('ansible_shell_type', ),
executable=('ansible_shell_executable', ),
# connection common
remote_addr=('ansible_ssh_host', 'ansible_host'),
remote_user=('ansible_ssh_user', 'ansible_user'),
password=('ansible_ssh_pass', 'ansible_password'),
port=('ansible_ssh_port', 'ansible_port'),
pipelining=('ansible_ssh_pipelining', 'ansible_pipelining'),
timeout=('ansible_ssh_timeout', 'ansible_timeout'),
private_key_file=('ansible_ssh_private_key_file', 'ansible_private_key_file'),
# networking modules
network_os=('ansible_network_os', ),
connection_user=('ansible_connection_user',),
# ssh TODO: remove
ssh_executable=('ansible_ssh_executable', ),
ssh_common_args=('ansible_ssh_common_args', ),
sftp_extra_args=('ansible_sftp_extra_args', ),
scp_extra_args=('ansible_scp_extra_args', ),
ssh_extra_args=('ansible_ssh_extra_args', ),
ssh_transfer_method=('ansible_ssh_transfer_method', ),
# docker TODO: remove
docker_extra_args=('ansible_docker_extra_args', ),
# become
become=('ansible_become', ),
become_method=('ansible_become_method', ),
become_user=('ansible_become_user', ),
become_pass=('ansible_become_password', 'ansible_become_pass'),
become_exe=('ansible_become_exe', ),
become_flags=('ansible_become_flags', ),
)
# POPULATE SETTINGS FROM CONFIG ###
config = ConfigManager()
# Generate constants from config
for setting in config.data.get_settings():
value = setting.value
if setting.origin == 'default' and \
isinstance(setting.value, string_types) and \
(setting.value.startswith('{{') and setting.value.endswith('}}')):
try:
t = Template(setting.value)
value = t.render(vars())
try:
value = literal_eval(value)
except ValueError:
pass # not a python data structure
except Exception:
pass # not templatable
value = ensure_type(value, setting.type)
set_constant(setting.name, value)
for warn in config.WARNINGS:
_warning(warn)
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,530 |
ansible-test sanity failing with "No module named 'jinja2'"
|
##### SUMMARY
In the Kubernetes collection's CI tests, the sanity check, which uses `ansible-test sanity --docker -v --color --python 3.6`, started failing with the following error:
```
ERROR: Command "importer.py" returned exit status 1.
>>> Standard Error
Traceback (most recent call last):
File "/root/ansible/ansible_collections/community/kubernetes/tests/output/.tmp/sanity/import/minimal-py36/bin/importer.py", line 447, in <module>
main()
File "/root/ansible/ansible_collections/community/kubernetes/tests/output/.tmp/sanity/import/minimal-py36/bin/importer.py", line 51, in main
from ansible.utils.collection_loader import AnsibleCollectionLoader
File "/root/ansible/lib/ansible/utils/collection_loader.py", line 15, in <module>
from ansible import constants as C
File "/root/ansible/lib/ansible/constants.py", line 12, in <module>
from jinja2 import Template
ModuleNotFoundError: No module named 'jinja2'
```
Nothing in the collection has changed in the past week, and no runs failed until this morning's CI run, so it seems something committed to ansible/ansible `devel` has caused this failure.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible-test
##### ANSIBLE VERSION
devel (as of this morning)
##### CONFIGURATION
N/A - defaults
##### OS / ENVIRONMENT
Linux
##### STEPS TO REPRODUCE
1. Clone kubernetes collection repo
2. Install Ansible @ devel
3. Run `ansible-test sanity --docker -v --color --python 3.6`
##### EXPECTED RESULTS
Tests should pass, as they have for the past few weeks.
##### ACTUAL RESULTS
Tests fail, with the message in this issue's summary.
|
https://github.com/ansible/ansible/issues/68530
|
https://github.com/ansible/ansible/pull/68531
|
0f5a63f1b99e586f86932907c49b1e8877128957
|
7777189954347e98310ac8d067f3141b81cf1c07
| 2020-03-28T15:23:03Z |
python
| 2020-03-28T18:21:08Z |
lib/ansible/utils/collection_loader.py
|
# (c) 2019 Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import os
import os.path
import re
import sys
from types import ModuleType
from collections import defaultdict
from ansible import constants as C
from ansible.utils.display import Display
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.module_utils.compat.importlib import import_module
from ansible.module_utils.six import iteritems, string_types, with_metaclass
from ansible.utils.singleton import Singleton
display = Display()
_SYNTHETIC_PACKAGES = {
# these provide fallback package definitions when there are no on-disk paths
'ansible_collections': dict(type='pkg_only', allow_external_subpackages=True),
'ansible_collections.ansible': dict(type='pkg_only', allow_external_subpackages=True),
# these implement the ansible.builtin synthetic collection mapped to the packages inside the ansible distribution
'ansible_collections.ansible.builtin': dict(type='pkg_only'),
'ansible_collections.ansible.builtin.plugins': dict(type='map', map='ansible.plugins'),
'ansible_collections.ansible.builtin.plugins.module_utils': dict(type='map', map='ansible.module_utils', graft=True),
'ansible_collections.ansible.builtin.plugins.modules': dict(type='flatmap', flatmap='ansible.modules', graft=True),
}
FLAG_FILES = frozenset(['MANIFEST.json', 'galaxy.yml'])
# FIXME: exception handling/error logging
class AnsibleCollectionLoader(with_metaclass(Singleton, object)):
def __init__(self, config=None):
if config:
paths = config.get_config_value('COLLECTIONS_PATHS')
else:
paths = os.environ.get('ANSIBLE_COLLECTIONS_PATHS', '').split(os.pathsep)
if isinstance(paths, string_types):
paths = [paths]
elif paths is None:
paths = []
# expand any placeholders in configured paths
paths = [
to_native(os.path.expanduser(p), errors='surrogate_or_strict')
for p in paths
]
# Append all ``ansible_collections`` dirs from sys.path to the end
for path in sys.path:
if (
path not in paths and
os.path.isdir(to_bytes(
os.path.join(path, 'ansible_collections'),
errors='surrogate_or_strict',
))
):
paths.append(path)
self._n_configured_paths = paths
self._n_playbook_paths = []
self._default_collection = None
# pre-inject grafted package maps so we can force them to use the right loader instead of potentially delegating to a "normal" loader
for syn_pkg_def in (p for p in iteritems(_SYNTHETIC_PACKAGES) if p[1].get('graft')):
pkg_name = syn_pkg_def[0]
pkg_def = syn_pkg_def[1]
newmod = ModuleType(pkg_name)
newmod.__package__ = pkg_name
newmod.__file__ = '<ansible_synthetic_collection_package>'
pkg_type = pkg_def.get('type')
# TODO: need to rethink map style so we can just delegate all the loading
if pkg_type == 'flatmap':
newmod.__loader__ = AnsibleFlatMapLoader(import_module(pkg_def['flatmap']))
newmod.__path__ = []
sys.modules[pkg_name] = newmod
@property
def n_collection_paths(self):
return self._n_playbook_paths + self._n_configured_paths
def get_collection_path(self, collection_name):
if not AnsibleCollectionRef.is_valid_collection_name(collection_name):
raise ValueError('{0} is not a valid collection name'.format(to_native(collection_name)))
m = import_module('ansible_collections.{0}'.format(collection_name))
return m.__file__
def set_playbook_paths(self, b_playbook_paths):
if isinstance(b_playbook_paths, string_types):
b_playbook_paths = [b_playbook_paths]
# track visited paths; we have to preserve the dir order as-passed in case there are duplicate collections (first one wins)
added_paths = set()
# de-dupe and ensure the paths are native strings (Python seems to do this for package paths etc, so assume it's safe)
self._n_playbook_paths = [os.path.join(to_native(p), 'collections') for p in b_playbook_paths if not (p in added_paths or added_paths.add(p))]
# FIXME: only allow setting this once, or handle any necessary cache/package path invalidations internally?
# FIXME: is there a better place to store this?
# FIXME: only allow setting this once
def set_default_collection(self, collection_name):
self._default_collection = collection_name
@property
def default_collection(self):
return self._default_collection
def find_module(self, fullname, path=None):
if self._find_module(fullname, path, load=False)[0]:
return self
return None
def load_module(self, fullname):
mod = self._find_module(fullname, None, load=True)[1]
if not mod:
raise ImportError('module {0} not found'.format(fullname))
return mod
def _find_module(self, fullname, path, load):
# this loader is only concerned with items under the Ansible Collections namespace hierarchy, ignore others
if not fullname.startswith('ansible_collections.') and fullname != 'ansible_collections':
return False, None
if sys.modules.get(fullname):
if not load:
return True, None
return True, sys.modules[fullname]
newmod = None
# this loader implements key functionality for Ansible collections
# * implicit distributed namespace packages for the root Ansible namespace (no pkgutil.extend_path hackery reqd)
# * implicit package support for Python 2.7 (no need for __init__.py in collections, except to use standard Py2.7 tooling)
# * preventing controller-side code injection during collection loading
# * (default loader would execute arbitrary package code from all __init__.py's)
parent_pkg_name = '.'.join(fullname.split('.')[:-1])
parent_pkg = sys.modules.get(parent_pkg_name)
if parent_pkg_name and not parent_pkg:
raise ImportError('parent package {0} not found'.format(parent_pkg_name))
# are we at or below the collection level? eg a.mynamespace.mycollection.something.else
# if so, we don't want distributed namespace behavior; first mynamespace.mycollection on the path is where
# we'll load everything from (ie, don't fall back to another mynamespace.mycollection lower on the path)
sub_collection = fullname.count('.') > 1
synpkg_def = _SYNTHETIC_PACKAGES.get(fullname)
synpkg_remainder = ''
if not synpkg_def:
# if the parent is a grafted package, we have some special work to do, otherwise just look for stuff on disk
parent_synpkg_def = _SYNTHETIC_PACKAGES.get(parent_pkg_name)
if parent_synpkg_def and parent_synpkg_def.get('graft'):
synpkg_def = parent_synpkg_def
synpkg_remainder = '.' + fullname.rpartition('.')[2]
# FUTURE: collapse as much of this back to on-demand as possible (maybe stub packages that get replaced when actually loaded?)
if synpkg_def:
pkg_type = synpkg_def.get('type')
if not pkg_type:
raise KeyError('invalid synthetic package type (no package "type" specified)')
if pkg_type == 'map':
map_package = synpkg_def.get('map')
if not map_package:
raise KeyError('invalid synthetic map package definition (no target "map" defined)')
if not load:
return True, None
mod = import_module(map_package + synpkg_remainder)
sys.modules[fullname] = mod
return True, mod
elif pkg_type == 'flatmap':
raise NotImplementedError()
elif pkg_type == 'pkg_only':
if not load:
return True, None
newmod = ModuleType(fullname)
newmod.__package__ = fullname
newmod.__file__ = '<ansible_synthetic_collection_package>'
newmod.__loader__ = self
newmod.__path__ = []
if not synpkg_def.get('allow_external_subpackages'):
# if external subpackages are NOT allowed, we're done
sys.modules[fullname] = newmod
return True, newmod
# if external subpackages ARE allowed, check for on-disk implementations and return a normal
# package if we find one, otherwise return the one we created here
if not parent_pkg: # top-level package, look for NS subpackages on all collection paths
package_paths = [self._extend_path_with_ns(p, fullname) for p in self.n_collection_paths]
else: # subpackage; search in all subpaths (we'll limit later inside a collection)
package_paths = [self._extend_path_with_ns(p, fullname) for p in parent_pkg.__path__]
for candidate_child_path in package_paths:
code_object = None
is_package = True
location = None
# check for implicit sub-package first
if os.path.isdir(to_bytes(candidate_child_path)):
# Py3.x implicit namespace packages don't have a file location, so they don't support get_data
# (which assumes the parent dir or that the loader has an internal mapping); so we have to provide
# a bogus leaf file on the __file__ attribute for pkgutil.get_data to strip off
location = os.path.join(candidate_child_path, '__synthetic__')
else:
for source_path in [os.path.join(candidate_child_path, '__init__.py'),
candidate_child_path + '.py']:
if not os.path.isfile(to_bytes(source_path)):
continue
if not load:
return True, None
with open(to_bytes(source_path), 'rb') as fd:
source = fd.read()
code_object = compile(source=source, filename=source_path, mode='exec', flags=0, dont_inherit=True)
location = source_path
is_package = source_path.endswith('__init__.py')
break
if not location:
continue
newmod = ModuleType(fullname)
newmod.__file__ = location
newmod.__loader__ = self
if is_package:
if sub_collection: # we never want to search multiple instances of the same collection; use first found
newmod.__path__ = [candidate_child_path]
else:
newmod.__path__ = package_paths
newmod.__package__ = fullname
else:
newmod.__package__ = parent_pkg_name
sys.modules[fullname] = newmod
if code_object:
# FIXME: decide cases where we don't actually want to exec the code?
exec(code_object, newmod.__dict__)
return True, newmod
# even if we didn't find one on disk, fall back to a synthetic package if we have one...
if newmod:
sys.modules[fullname] = newmod
return True, newmod
# FIXME: need to handle the "no dirs present" case for at least the root and synthetic internal collections like ansible.builtin
return False, None
@staticmethod
def _extend_path_with_ns(path, ns):
ns_path_add = ns.rsplit('.', 1)[-1]
return os.path.join(path, ns_path_add)
def get_data(self, filename):
with open(filename, 'rb') as fd:
return fd.read()
class AnsibleFlatMapLoader(object):
_extension_blacklist = ['.pyc', '.pyo']
def __init__(self, root_package):
self._root_package = root_package
self._dirtree = None
def _init_dirtree(self):
# FIXME: thread safety
root_path = os.path.dirname(self._root_package.__file__)
flat_files = []
# FIXME: make this a dict of filename->dir for faster direct lookup?
# FIXME: deal with _ prefixed deprecated files (or require another method for collections?)
# FIXME: fix overloaded filenames (eg, rename Windows setup to win_setup)
for root, dirs, files in os.walk(root_path):
# add all files in this dir that don't have a blacklisted extension
flat_files.extend(((root, f) for f in files if not any((f.endswith(ext) for ext in self._extension_blacklist))))
# HACK: Put Windows modules at the end of the list. This makes collection_loader behave
# the same way as plugin loader, preventing '.ps1' from modules being selected before '.py'
# modules simply because '.ps1' files may be above '.py' files in the flat_files list.
#
# The expected sort order is paths in the order they were in 'flat_files'
# with paths ending in '/windows' at the end, also in the original order they were
# in 'flat_files'. The .sort() method is guaranteed to be stable, so original order is preserved.
flat_files.sort(key=lambda p: p[0].endswith('/windows'))
self._dirtree = flat_files
def find_file(self, filename):
# FIXME: thread safety
if not self._dirtree:
self._init_dirtree()
if '.' not in filename: # no extension specified, use extension regex to filter
extensionless_re = re.compile(r'^{0}(\..+)?$'.format(re.escape(filename)))
# why doesn't Python have first()?
try:
# FIXME: store extensionless in a separate direct lookup?
filepath = next(os.path.join(r, f) for r, f in self._dirtree if extensionless_re.match(f))
except StopIteration:
raise IOError("couldn't find {0}".format(filename))
else: # actual filename, just look it up
# FIXME: this case sucks; make it a lookup
try:
filepath = next(os.path.join(r, f) for r, f in self._dirtree if f == filename)
except StopIteration:
raise IOError("couldn't find {0}".format(filename))
return filepath
def get_data(self, filename):
found_file = self.find_file(filename)
with open(found_file, 'rb') as fd:
return fd.read()
# TODO: implement these for easier inline debugging?
# def get_source(self, fullname):
# def get_code(self, fullname):
# def is_package(self, fullname):
class AnsibleCollectionRef:
# FUTURE: introspect plugin loaders to get these dynamically?
VALID_REF_TYPES = frozenset(to_text(r) for r in ['action', 'become', 'cache', 'callback', 'cliconf', 'connection',
'doc_fragments', 'filter', 'httpapi', 'inventory', 'lookup',
'module_utils', 'modules', 'netconf', 'role', 'shell', 'strategy',
'terminal', 'test', 'vars'])
# FIXME: tighten this up to match Python identifier reqs, etc
VALID_COLLECTION_NAME_RE = re.compile(to_text(r'^(\w+)\.(\w+)$'))
VALID_SUBDIRS_RE = re.compile(to_text(r'^\w+(\.\w+)*$'))
VALID_FQCR_RE = re.compile(to_text(r'^\w+\.\w+\.\w+(\.\w+)*$')) # can have 0-N included subdirs as well
def __init__(self, collection_name, subdirs, resource, ref_type):
"""
Create an AnsibleCollectionRef from components
:param collection_name: a collection name of the form 'namespace.collectionname'
:param subdirs: optional subdir segments to be appended below the plugin type (eg, 'subdir1.subdir2')
:param resource: the name of the resource being referenced (eg, 'mymodule', 'someaction', 'a_role')
:param ref_type: the type of the reference, eg 'module', 'role', 'doc_fragment'
"""
collection_name = to_text(collection_name, errors='strict')
if subdirs is not None:
subdirs = to_text(subdirs, errors='strict')
resource = to_text(resource, errors='strict')
ref_type = to_text(ref_type, errors='strict')
if not self.is_valid_collection_name(collection_name):
raise ValueError('invalid collection name (must be of the form namespace.collection): {0}'.format(to_native(collection_name)))
if ref_type not in self.VALID_REF_TYPES:
raise ValueError('invalid collection ref_type: {0}'.format(ref_type))
self.collection = collection_name
if subdirs:
if not re.match(self.VALID_SUBDIRS_RE, subdirs):
raise ValueError('invalid subdirs entry: {0} (must be empty/None or of the form subdir1.subdir2)'.format(to_native(subdirs)))
self.subdirs = subdirs
else:
self.subdirs = u''
self.resource = resource
self.ref_type = ref_type
package_components = [u'ansible_collections', self.collection]
if self.ref_type == u'role':
package_components.append(u'roles')
else:
# we assume it's a plugin
package_components += [u'plugins', self.ref_type]
if self.subdirs:
package_components.append(self.subdirs)
if self.ref_type == u'role':
# roles are their own resource
package_components.append(self.resource)
self.n_python_package_name = to_native('.'.join(package_components))
@staticmethod
def from_fqcr(ref, ref_type):
"""
Parse a string as a fully-qualified collection reference, raises ValueError if invalid
:param ref: collection reference to parse (a valid ref is of the form 'ns.coll.resource' or 'ns.coll.subdir1.subdir2.resource')
:param ref_type: the type of the reference, eg 'module', 'role', 'doc_fragment'
:return: a populated AnsibleCollectionRef object
"""
# assuming the fq_name is of the form (ns).(coll).(optional_subdir_N).(resource_name),
# we split the resource name off the right, split ns and coll off the left, and we're left with any optional
# subdirs that need to be added back below the plugin-specific subdir we'll add. So:
# ns.coll.resource -> ansible_collections.ns.coll.plugins.(plugintype).resource
# ns.coll.subdir1.resource -> ansible_collections.ns.coll.plugins.subdir1.(plugintype).resource
# ns.coll.rolename -> ansible_collections.ns.coll.roles.rolename
if not AnsibleCollectionRef.is_valid_fqcr(ref):
raise ValueError('{0} is not a valid collection reference'.format(to_native(ref)))
ref = to_text(ref, errors='strict')
ref_type = to_text(ref_type, errors='strict')
resource_splitname = ref.rsplit(u'.', 1)
package_remnant = resource_splitname[0]
resource = resource_splitname[1]
# split the left two components of the collection package name off, anything remaining is plugin-type
# specific subdirs to be added back on below the plugin type
package_splitname = package_remnant.split(u'.', 2)
if len(package_splitname) == 3:
subdirs = package_splitname[2]
else:
subdirs = u''
collection_name = u'.'.join(package_splitname[0:2])
return AnsibleCollectionRef(collection_name, subdirs, resource, ref_type)
@staticmethod
def try_parse_fqcr(ref, ref_type):
"""
Attempt to parse a string as a fully-qualified collection reference, returning None on failure (instead of raising an error)
:param ref: collection reference to parse (a valid ref is of the form 'ns.coll.resource' or 'ns.coll.subdir1.subdir2.resource')
:param ref_type: the type of the reference, eg 'module', 'role', 'doc_fragment'
:return: a populated AnsibleCollectionRef object on successful parsing, else None
"""
try:
return AnsibleCollectionRef.from_fqcr(ref, ref_type)
except ValueError:
pass
@staticmethod
def legacy_plugin_dir_to_plugin_type(legacy_plugin_dir_name):
"""
Utility method to convert from a PluginLoader dir name to a plugin ref_type
:param legacy_plugin_dir_name: PluginLoader dir name (eg, 'action_plugins', 'library')
:return: the corresponding plugin ref_type (eg, 'action', 'role')
"""
legacy_plugin_dir_name = to_text(legacy_plugin_dir_name)
plugin_type = legacy_plugin_dir_name.replace(u'_plugins', u'')
if plugin_type == u'library':
plugin_type = u'modules'
if plugin_type not in AnsibleCollectionRef.VALID_REF_TYPES:
raise ValueError('{0} cannot be mapped to a valid collection ref type'.format(to_native(legacy_plugin_dir_name)))
return plugin_type
@staticmethod
def is_valid_fqcr(ref, ref_type=None):
"""
Validates whether a string is a well-formed fully-qualified collection reference (does not look up the collection itself)
:param ref: candidate collection reference to validate (a valid ref is of the form 'ns.coll.resource' or 'ns.coll.subdir1.subdir2.resource')
:param ref_type: optional reference type to enable deeper validation, eg 'module', 'role', 'doc_fragment'
:return: True if the collection ref passed is well-formed, False otherwise
"""
ref = to_text(ref)
if not ref_type:
return bool(re.match(AnsibleCollectionRef.VALID_FQCR_RE, ref))
return bool(AnsibleCollectionRef.try_parse_fqcr(ref, ref_type))
@staticmethod
def is_valid_collection_name(collection_name):
"""
Validates if the given string is a well-formed collection name (does not look up the collection itself)
:param collection_name: candidate collection name to validate (a valid name is of the form 'ns.collname')
:return: True if the collection name passed is well-formed, False otherwise
"""
collection_name = to_text(collection_name)
return bool(re.match(AnsibleCollectionRef.VALID_COLLECTION_NAME_RE, collection_name))
def get_collection_role_path(role_name, collection_list=None):
acr = AnsibleCollectionRef.try_parse_fqcr(role_name, 'role')
if acr:
# looks like a valid qualified collection ref; skip the collection_list
role = acr.resource
collection_list = [acr.collection]
subdirs = acr.subdirs
resource = acr.resource
elif not collection_list:
return None # not a FQ role and no collection search list spec'd, nothing to do
else:
resource = role_name # treat as unqualified, loop through the collection search list to try and resolve
subdirs = ''
for collection_name in collection_list:
try:
acr = AnsibleCollectionRef(collection_name=collection_name, subdirs=subdirs, resource=resource, ref_type='role')
# FIXME: error handling/logging; need to catch any import failures and move along
# FIXME: this line shouldn't be necessary, but py2 pkgutil.get_data is delegating back to built-in loader when it shouldn't
pkg = import_module(acr.n_python_package_name)
if pkg is not None:
# the package is now loaded, get the collection's package and ask where it lives
path = os.path.dirname(to_bytes(sys.modules[acr.n_python_package_name].__file__, errors='surrogate_or_strict'))
return resource, to_text(path, errors='surrogate_or_strict'), collection_name
except IOError:
continue
except Exception as ex:
# FIXME: pick out typical import errors first, then error logging
continue
return None
_N_COLLECTION_PATH_RE = re.compile(r'/ansible_collections/([^/]+)/([^/]+)')
def get_collection_name_from_path(path):
"""
Return the containing collection name for a given path, or None if the path is not below a configured collection, or
the collection cannot be loaded (eg, the collection is masked by another of the same name higher in the configured
collection roots).
:param path: native-string path to evaluate for collection containment
:return: collection name or None
"""
n_collection_paths = [to_native(os.path.realpath(to_bytes(p))) for p in AnsibleCollectionLoader().n_collection_paths]
b_path = os.path.realpath(to_bytes(path))
n_path = to_native(b_path)
for coll_path in n_collection_paths:
common_prefix = to_native(os.path.commonprefix([b_path, to_bytes(coll_path)]))
if common_prefix == coll_path:
# strip off the common prefix (handle weird testing cases of nested collection roots, eg)
collection_remnant = n_path[len(coll_path):]
# commonprefix may include the trailing /, prepend to the remnant if necessary (eg trailing / on root)
if collection_remnant and collection_remnant[0] != '/':
collection_remnant = '/' + collection_remnant
# the path lives under this collection root, see if it maps to a collection
found_collection = _N_COLLECTION_PATH_RE.search(collection_remnant)
if not found_collection:
continue
n_collection_name = '{0}.{1}'.format(*found_collection.groups())
loaded_collection_path = AnsibleCollectionLoader().get_collection_path(n_collection_name)
if not loaded_collection_path:
return None
# ensure we're using the canonical real path, with the bogus __synthetic__ stripped off
b_loaded_collection_path = os.path.dirname(os.path.realpath(to_bytes(loaded_collection_path)))
# if the collection path prefix matches the path prefix we were passed, it's the same collection that's loaded
if os.path.commonprefix([b_path, b_loaded_collection_path]) == b_loaded_collection_path:
return n_collection_name
return None # if not, it's a collection, but not the same collection the loader sees, so ignore it
def set_collection_playbook_paths(b_playbook_paths):
AnsibleCollectionLoader().set_playbook_paths(b_playbook_paths)
def is_collection_path(path):
if os.path.isdir(path):
for flag in FLAG_FILES:
if os.path.exists(os.path.join(path, flag)):
return True
return False
def list_valid_collection_paths(search_paths=None, warn=False):
found_paths = []
if search_paths is None:
search_paths = C.COLLECTIONS_PATHS
for path in search_paths:
if not os.path.exists(path):
# warn for missing, but not if default
if warn:
display.warning("The configured collection path {0} does not exist.".format(path))
continue
if not os.path.isdir(path):
if warn:
display.warning("The configured collection path {0}, exists, but it is not a directory.".format(path))
continue
found_paths.append(path)
return found_paths
def list_collection_dirs(search_paths=None, namespace=None):
collections = defaultdict(list)
paths = list_valid_collection_paths(search_paths)
for path in paths:
if os.path.isdir(path):
coll_root = os.path.join(path, 'ansible_collections')
if os.path.exists(coll_root) and os.path.isdir(coll_root):
for namespace in os.listdir(coll_root):
namespace_dir = os.path.join(coll_root, namespace)
if os.path.isdir(namespace_dir):
for collection in os.listdir(namespace_dir):
coll_dir = os.path.join(namespace_dir, collection)
if is_collection_path(coll_dir):
collections[namespace].append(os.path.join(namespace_dir, collection))
return collections
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,528 |
Ansible.ModuleUtils.WebRequest: cannot ignore proxy
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
It does not seem possible to run `win_get_url` without a proxy.
Based on the documentation, my expectation was that setting the parameter `use_proxy` to 'no' should ignore the default IE proxy and run without a proxy if `proxy_url` is not set at the same time. However, this is not working: even with `use_proxy: no`, the IE proxy is still being used.
Looking into the actual source code at https://github.com/ansible/ansible/blob/e5995a2eed08c0a0bf4797ae8a38aefd160b0f0a/lib/ansible/module_utils/powershell/Ansible.ModuleUtils.WebRequest.psm1#L252, I believe the issue is this:
```
if (-not $UseProxy) {
$proxy = $null
} elseif ($ProxyUrl) {
$proxy = New-Object -TypeName System.Net.WebProxy -ArgumentList $ProxyUrl, $true
} else {
$proxy = $web_request.Proxy
}
# $web_request.Proxy may return $null for a FTP web request. We only set the credentials if we have an actual
# proxy to work with, otherwise just ignore the credentials property.
if ($null -ne $proxy) {
[...]
$web_request.Proxy = $proxy
}
```
In other words, when `use_proxy: no`, the `$proxy` variable will first (correctly) be set to `$null`, but that value is never written to `$web_request.Proxy`, and thus never used. Instead the default value is used.
Moving `$web_request.Proxy = $proxy` outside of that last `if` block fixes things for me, but might break other things.
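One way to rework that block, sketched below, keeps the credential handling but makes the no-proxy case explicit. This is only an illustration of the idea and not necessarily the change merged in PR #68603; note that assigning `$null` directly to `HttpWebRequest.Proxy` raises `ArgumentNullException`, so an empty `WebProxy` (no address, i.e. a direct connection) stands in for "no proxy":
```
$proxy = $null
if (-not $UseProxy) {
    # An empty WebProxy has no Address, which .NET treats as a direct connection;
    # this bypasses the IE/WinINet default without assigning $null to the request.
    $web_request.Proxy = New-Object -TypeName System.Net.WebProxy
} elseif ($ProxyUrl) {
    $proxy = New-Object -TypeName System.Net.WebProxy -ArgumentList $ProxyUrl, $true
} else {
    $proxy = $web_request.Proxy
}

# Credentials only make sense when a proxy is actually in play
# ($web_request.Proxy may still be $null for a FTP request).
if ($null -ne $proxy) {
    if ($ProxyUseDefaultCredential) {
        $proxy.Credentials = [System.Net.CredentialCache]::DefaultCredentials
    } elseif ($ProxyUsername) {
        $proxy.Credentials = New-Object -TypeName System.Net.NetworkCredential -ArgumentList $ProxyUsername, $ProxyPassword
    } else {
        $proxy.Credentials = $null
    }
    $web_request.Proxy = $proxy
}
```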
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.WebRequest.psm1
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible --version
ansible 2.9.1
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
NA
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
Seeing this on Windows 10.
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
- hosts: all
vars:
# set unexisting proxy
proxy: unexisting
download_url: <local url that can be reached without proxy>
tasks:
- name: Configure IE to use explicit proxy host with port and without auto detection
win_inet_proxy:
auto_detect: no
proxy: "{{ proxy }}"
- name: Download package without proxy
win_get_url:
url: "{{ download_url }}"
dest: C:\Users\ansible\test.jpg
use_proxy: no
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
Playbook should successfully download the test file without trying to connect to the nonexistent proxy.
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
Playbook fails, as it tries to connect to the nonexistent proxy.
|
https://github.com/ansible/ansible/issues/68528
|
https://github.com/ansible/ansible/pull/68603
|
e785bdaa5b08c8426694e6348c52c562d38147cd
|
ae1cd27b575a759e9d2477042fc5dbbb3275cd84
| 2020-03-28T13:26:56Z |
python
| 2020-04-01T21:17:50Z |
changelogs/fragments/win-web-request-no_proxy.yaml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,528 |
Ansible.ModuleUtils.WebRequest: cannot ignore proxy
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
It does not seem possible to run `win_get_url` without a proxy.
Based on the documentation, my expectation was that setting the parameter `use_proxy` to 'no' should ignore the default IE proxy and run without a proxy if `proxy_url` is not set at the same time. However, this is not working: even with `use_proxy: no`, the IE proxy is still being used.
Looking into the actual source code at https://github.com/ansible/ansible/blob/e5995a2eed08c0a0bf4797ae8a38aefd160b0f0a/lib/ansible/module_utils/powershell/Ansible.ModuleUtils.WebRequest.psm1#L252, I believe the issue is this:
```
if (-not $UseProxy) {
$proxy = $null
} elseif ($ProxyUrl) {
$proxy = New-Object -TypeName System.Net.WebProxy -ArgumentList $ProxyUrl, $true
} else {
$proxy = $web_request.Proxy
}
# $web_request.Proxy may return $null for a FTP web request. We only set the credentials if we have an actual
# proxy to work with, otherwise just ignore the credentials property.
if ($null -ne $proxy) {
[...]
$web_request.Proxy = $proxy
}
```
In other words, when `use_proxy: no`, the `$proxy` variable will first (correctly) be set to `$null`, but that value is never written to `$web_request.Proxy`, and thus never used. Instead the default value is used.
Moving `$web_request.Proxy = $proxy` outside of that last `if` block fixes things for me, but might break other things.
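To see the effect from a consumer of `Get-AnsibleWebRequest`, a throwaway module along these lines reports whether a proxy is still attached when the proxy is disabled. This is a hedged sketch: the URL is only a placeholder and the module exists purely for illustration:
```
#!powershell
#AnsibleRequires -CSharpUtil Ansible.Basic
#Requires -Module Ansible.ModuleUtils.WebRequest

$spec = @{ options = @{} }
$spec.options += $ansible_web_request_options
$module = [Ansible.Basic.AnsibleModule]::Create($args, $spec)

# use_proxy: no on the task is expected to mean "no proxy at all" ...
$web_request = Get-AnsibleWebRequest -Uri 'http://example.internal/test.jpg' -UseProxy $false -Module $module

# ... yet before the fix this still carries the IE/WinINet default proxy.
$module.Result.proxy_is_null = ($null -eq $web_request.Proxy)
$module.ExitJson()
```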
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.WebRequest.psm1
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible --version
ansible 2.9.1
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
NA
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
Seeing this on Windows 10.
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
- hosts: all
vars:
# set a nonexistent proxy
proxy: unexisting
download_url: <local url that can be reached without proxy>
tasks:
- name: Configure IE to use explicit proxy host with port and without auto detection
win_inet_proxy:
auto_detect: no
proxy: "{{ proxy }}"
- name: Download package without proxy
win_get_url:
url: "{{ download_url }}"
dest: C:\Users\ansible\test.jpg
use_proxy: no
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
Playbook should successfully download the test file without trying to connect to the nonexistent proxy.
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
Playbook fails, as it tries to connect to the nonexistent proxy.
|
https://github.com/ansible/ansible/issues/68528
|
https://github.com/ansible/ansible/pull/68603
|
e785bdaa5b08c8426694e6348c52c562d38147cd
|
ae1cd27b575a759e9d2477042fc5dbbb3275cd84
| 2020-03-28T13:26:56Z |
python
| 2020-04-01T21:17:50Z |
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.WebRequest.psm1
|
# Copyright (c) 2019 Ansible Project
# Simplified BSD License (see licenses/simplified_bsd.txt or https://opensource.org/licenses/BSD-2-Clause)
Function Get-AnsibleWebRequest {
<#
.SYNOPSIS
Creates a System.Net.WebRequest object based on common URL module options in Ansible.
.DESCRIPTION
Will create a WebRequest based on common input options within Ansible. This can be used manually or with
Invoke-WithWebRequest.
.PARAMETER Uri
The URI to create the web request for.
.PARAMETER Method
The protocol method to use, if omitted, will use the default value for the URI protocol specified.
.PARAMETER FollowRedirects
Whether to follow redirect responses. This is only valid when using a HTTP URI.
all - Will follow all redirects
none - Will follow no redirects
safe - Will only follow redirects when GET or HEAD is used as the Method
.PARAMETER Headers
A hashtable or dictionary of header values to set on the request. This is only valid for a HTTP URI.
.PARAMETER HttpAgent
A string to set for the 'User-Agent' header. This is only valid for a HTTP URI.
.PARAMETER MaximumRedirection
The maximum number of redirections that will be followed. This is only valid for a HTTP URI.
.PARAMETER Timeout
The timeout in seconds that defines how long to wait until the request times out.
.PARAMETER ValidateCerts
Whether to validate SSL certificates, default to True.
.PARAMETER ClientCert
The path to PFX file to use for X509 authentication. This is only valid for a HTTP URI. This path can either
be a filesystem path (C:\folder\cert.pfx) or a PSPath to a credential (Cert:\CurrentUser\My\<thumbprint>).
.PARAMETER ClientCertPassword
The password for the PFX certificate if required. This is only valid for a HTTP URI.
.PARAMETER ForceBasicAuth
Whether to set the Basic auth header on the first request instead of when required. This is only valid for a
HTTP URI.
.PARAMETER UrlUsername
The username to use for authenticating with the target.
.PARAMETER UrlPassword
The password to use for authenticating with the target.
.PARAMETER UseDefaultCredential
Whether to use the current user's credentials if available. This will only work when using Become, using SSH with
password auth, or WinRM with CredSSP or Kerberos with credential delegation.
.PARAMETER UseProxy
Whether to use the default proxy defined in IE (WinINet) for the user or set no proxy at all. This should not
be set to False when ProxyUrl is also defined, as the explicit proxy would then be ignored.
.PARAMETER ProxyUrl
An explicit proxy server to use for the request instead of relying on the default proxy in IE. This is only
valid for a HTTP URI.
.PARAMETER ProxyUsername
An optional username to use for proxy authentication.
.PARAMETER ProxyPassword
The password for ProxyUsername.
.PARAMETER ProxyUseDefaultCredential
Whether to use the current user's credentials for proxy authentication if available. This will only work when
using Become, using SSH with password auth, or WinRM with CredSSP or Kerberos with credential delegation.
.PARAMETER Module
The AnsibleBasic module that can be used as a backup parameter source or a way to return warnings back to the
Ansible controller.
.EXAMPLE
$spec = @{
options = @{}
}
$spec.options += $ansible_web_request_options
$module = [Ansible.Basic.AnsibleModule]::Create($args, $spec)
$web_request = Get-AnsibleWebRequest -Module $module
#>
[CmdletBinding()]
[OutputType([System.Net.WebRequest])]
Param (
[Alias("url")]
[System.Uri]
$Uri,
[System.String]
$Method,
[Alias("follow_redirects")]
[ValidateSet("all", "none", "safe")]
[System.String]
$FollowRedirects = "safe",
[System.Collections.IDictionary]
$Headers,
[Alias("http_agent")]
[System.String]
$HttpAgent = "ansible-httpget",
[Alias("maximum_redirection")]
[System.Int32]
$MaximumRedirection = 50,
[System.Int32]
$Timeout = 30,
[Alias("validate_certs")]
[System.Boolean]
$ValidateCerts = $true,
# Credential params
[Alias("client_cert")]
[System.String]
$ClientCert,
[Alias("client_cert_password")]
[System.String]
$ClientCertPassword,
[Alias("force_basic_auth")]
[Switch]
$ForceBasicAuth,
[Alias("url_username")]
[System.String]
$UrlUsername,
[Alias("url_password")]
[System.String]
$UrlPassword,
[Alias("use_default_credential")]
[Switch]
$UseDefaultCredential,
# Proxy params
[Alias("use_proxy")]
[System.Boolean]
$UseProxy = $true,
[Alias("proxy_url")]
[System.String]
$ProxyUrl,
[Alias("proxy_username")]
[System.String]
$ProxyUsername,
[Alias("proxy_password")]
[System.String]
$ProxyPassword,
[Alias("proxy_use_default_credential")]
[Switch]
$ProxyUseDefaultCredential,
[ValidateScript({ $_.GetType().FullName -eq 'Ansible.Basic.AnsibleModule' })]
[System.Object]
$Module
)
# Set module options for parameters unless they were explicitly passed in.
if ($Module) {
foreach ($param in $PSCmdlet.MyInvocation.MyCommand.Parameters.GetEnumerator()) {
if ($PSBoundParameters.ContainsKey($param.Key)) {
# Was set explicitly we want to use that value
continue
}
foreach ($alias in @($Param.Key) + $param.Value.Aliases) {
if ($Module.Params.ContainsKey($alias)) {
$var_value = $Module.Params.$alias -as $param.Value.ParameterType
Set-Variable -Name $param.Key -Value $var_value
break
}
}
}
}
# Disable certificate validation if requested
# FUTURE: set this on ServerCertificateValidationCallback of the HttpWebRequest once .NET 4.5 is the minimum
if (-not $ValidateCerts) {
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }
}
# Enable TLS1.1/TLS1.2 if they're available but disabled (eg. .NET 4.5)
$security_protocols = [System.Net.ServicePointManager]::SecurityProtocol -bor [System.Net.SecurityProtocolType]::SystemDefault
if ([System.Net.SecurityProtocolType].GetMember("Tls11").Count -gt 0) {
$security_protocols = $security_protocols -bor [System.Net.SecurityProtocolType]::Tls11
}
if ([System.Net.SecurityProtocolType].GetMember("Tls12").Count -gt 0) {
$security_protocols = $security_protocols -bor [System.Net.SecurityProtocolType]::Tls12
}
[System.Net.ServicePointManager]::SecurityProtocol = $security_protocols
$web_request = [System.Net.WebRequest]::Create($Uri)
if ($Method) {
$web_request.Method = $Method
}
$web_request.Timeout = $Timeout * 1000
if ($UseDefaultCredential -and $web_request -is [System.Net.HttpWebRequest]) {
$web_request.UseDefaultCredentials = $true
} elseif ($UrlUsername) {
if ($ForceBasicAuth) {
$auth_value = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $UrlUsername, $UrlPassword)))
$web_request.Headers.Add("Authorization", "Basic $auth_value")
} else {
$credential = New-Object -TypeName System.Net.NetworkCredential -ArgumentList $UrlUsername, $UrlPassword
$web_request.Credentials = $credential
}
}
if ($ClientCert) {
# Expecting either a filepath or PSPath (Cert:\CurrentUser\My\<thumbprint>)
$cert = Get-Item -LiteralPath $ClientCert -ErrorAction SilentlyContinue
if ($null -eq $cert) {
Write-Error -Message "Client certificate '$ClientCert' does not exist" -Category ObjectNotFound
return
}
$crypto_ns = 'System.Security.Cryptography.X509Certificates'
if ($cert.PSProvider.Name -ne 'Certificate') {
try {
$cert = New-Object -TypeName "$crypto_ns.X509Certificate2" -ArgumentList @(
$ClientCert, $ClientCertPassword
)
} catch [System.Security.Cryptography.CryptographicException] {
Write-Error -Message "Failed to read client certificate at '$ClientCert'" -Exception $_.Exception -Category SecurityError
return
}
}
$web_request.ClientCertificates = New-Object -TypeName "$crypto_ns.X509Certificate2Collection" -ArgumentList @(
$cert
)
}
if (-not $UseProxy) {
$proxy = $null
} elseif ($ProxyUrl) {
$proxy = New-Object -TypeName System.Net.WebProxy -ArgumentList $ProxyUrl, $true
} else {
$proxy = $web_request.Proxy
}
# $web_request.Proxy may return $null for a FTP web request. We only set the credentials if we have an actual
# proxy to work with, otherwise just ignore the credentials property.
if ($null -ne $proxy) {
if ($ProxyUseDefaultCredential) {
# Weird hack: $web_request.Proxy returns an IWebProxy object which only guarantees the Credentials
# property. We cannot set UseDefaultCredentials so we just set the Credentials to the
# DefaultCredentials in the CredentialCache which does the same thing.
$proxy.Credentials = [System.Net.CredentialCache]::DefaultCredentials
} elseif ($ProxyUsername) {
$proxy.Credentials = New-Object -TypeName System.Net.NetworkCredential -ArgumentList @(
$ProxyUsername, $ProxyPassword
)
} else {
$proxy.Credentials = $null
}
$web_request.Proxy = $proxy
}
# Some parameters only apply when dealing with a HttpWebRequest
if ($web_request -is [System.Net.HttpWebRequest]) {
if ($Headers) {
foreach ($header in $Headers.GetEnumerator()) {
switch ($header.Key) {
Accept { $web_request.Accept = $header.Value }
Connection { $web_request.Connection = $header.Value }
Content-Length { $web_request.ContentLength = $header.Value }
Content-Type { $web_request.ContentType = $header.Value }
Expect { $web_request.Expect = $header.Value }
Date { $web_request.Date = $header.Value }
Host { $web_request.Host = $header.Value }
If-Modified-Since { $web_request.IfModifiedSince = $header.Value }
Range { $web_request.AddRange($header.Value) }
Referer { $web_request.Referer = $header.Value }
Transfer-Encoding {
$web_request.SendChunked = $true
$web_request.TransferEncoding = $header.Value
}
User-Agent { continue }
default { $web_request.Headers.Add($header.Key, $header.Value) }
}
}
}
# For backwards compatibility we need to support setting the User-Agent if the header was set in the task.
# We just need to make sure that if an explicit http_agent module was set then that takes priority.
if ($Headers -and $Headers.ContainsKey("User-Agent")) {
if ($HttpAgent -eq $ansible_web_request_options.http_agent.default) {
$HttpAgent = $Headers['User-Agent']
} elseif ($null -ne $Module) {
$Module.Warn("The 'User-Agent' header and the 'http_agent' was set, using the 'http_agent' for web request")
}
}
$web_request.UserAgent = $HttpAgent
switch ($FollowRedirects) {
none { $web_request.AllowAutoRedirect = $false }
safe {
if ($web_request.Method -in @("GET", "HEAD")) {
$web_request.AllowAutoRedirect = $true
} else {
$web_request.AllowAutoRedirect = $false
}
}
all { $web_request.AllowAutoRedirect = $true }
}
if ($MaximumRedirection -eq 0) {
$web_request.AllowAutoRedirect = $false
} else {
$web_request.MaximumAutomaticRedirections = $MaximumRedirection
}
}
return $web_request
}
Function Invoke-WithWebRequest {
<#
.SYNOPSIS
Invokes a ScriptBlock with the WebRequest.
.DESCRIPTION
Invokes the ScriptBlock and handles extra information like accessing the response stream, closing those streams
safely, as well as setting common module return values.
.PARAMETER Module
The Ansible.Basic module to set the return values for. This will set the following return values;
elapsed - The total time, in seconds, that it took to send the web request and process the response
msg - The human readable description of the response status code
status_code - An int that is the response status code
.PARAMETER Request
The System.Net.WebRequest to call. This can either be manually crafted or created with Get-AnsibleWebRequest.
.PARAMETER Script
The ScriptBlock to invoke during the web request. This ScriptBlock should take in the params
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
This scriptblock should manage the response based on what it needs to do.
.PARAMETER Body
An optional Stream to send to the target during the request.
.PARAMETER IgnoreBadResponse
By default a WebException will be raised for a non 2xx status code and the Script will not be invoked. This
parameter can be set to process all responses regardless of the status code.
.EXAMPLE Basic module that downloads a file
$spec = @{
options = @{
path = @{ type = "path"; required = $true }
}
}
$spec.options += $ansible_web_request_options
$module = [Ansible.Basic.AnsibleModule]::Create($args, $spec)
$web_request = Get-AnsibleWebRequest -Module $module
Invoke-WithWebRequest -Module $module -Request $web_request -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$fs = [System.IO.File]::Create($module.Params.path)
try {
$Stream.CopyTo($fs)
$fs.Flush()
} finally {
$fs.Dispose()
}
}
#>
[CmdletBinding()]
param (
[Parameter(Mandatory=$true)]
[System.Object]
[ValidateScript({ $_.GetType().FullName -eq 'Ansible.Basic.AnsibleModule' })]
$Module,
[Parameter(Mandatory=$true)]
[System.Net.WebRequest]
$Request,
[Parameter(Mandatory=$true)]
[ScriptBlock]
$Script,
[AllowNull()]
[System.IO.Stream]
$Body,
[Switch]
$IgnoreBadResponse
)
$start = Get-Date
if ($null -ne $Body) {
$request_st = $Request.GetRequestStream()
try {
$Body.CopyTo($request_st)
$request_st.Flush()
} finally {
$request_st.Close()
}
}
try {
try {
$web_response = $Request.GetResponse()
} catch [System.Net.WebException] {
# A WebResponse with a status code not in the 200 range will raise a WebException. We check if the
# exception raised contains the actual response and continue on if IgnoreBadResponse is set. We also
# make sure we set the status_code return value on the Module object if possible
if ($_.Exception.PSObject.Properties.Name -match "Response") {
$web_response = $_.Exception.Response
if (-not $IgnoreBadResponse -or $null -eq $web_response) {
$Module.Result.msg = $_.Exception.StatusDescription
$Module.Result.status_code = $_.Exception.Response.StatusCode
throw $_
}
} else {
throw $_
}
}
if ($Request.RequestUri.IsFile) {
# A FileWebResponse won't have these properties set
$Module.Result.msg = "OK"
$Module.Result.status_code = 200
} else {
$Module.Result.msg = $web_response.StatusDescription
$Module.Result.status_code = $web_response.StatusCode
}
$response_stream = $web_response.GetResponseStream()
try {
# Invoke the ScriptBlock and pass in WebResponse and ResponseStream
&$Script -Response $web_response -Stream $response_stream
} finally {
$response_stream.Dispose()
}
} finally {
if ($web_response) {
$web_response.Close()
}
$Module.Result.elapsed = ((Get-date) - $start).TotalSeconds
}
}
Function Merge-WebRequestSpec {
<#
.SYNOPSIS
Merges a module's spec definition with extra options supplied by this module_util. Options from the module take
priority over the module util spec.
.PARAMETER ModuleSpec
The root $spec of a module option definition to merge with.
.EXAMPLE
$spec = @{
options = @{
name = @{ type = "str" }
}
supports_check_mode = $true
}
$spec = Merge-WebRequestSpec -ModuleSpec $spec
#>
[CmdletBinding()]
param (
[Parameter(Mandatory=$true)]
[System.Collections.IDictionary]
$ModuleSpec,
[System.Collections.IDictionary]
$SpecToMerge = @{ options = $ansible_web_request_options }
)
foreach ($option_kvp in $SpecToMerge.GetEnumerator()) {
$k = $option_kvp.Key
$v = $option_kvp.Value
if ($ModuleSpec.Contains($k)) {
if ($v -is [System.Collections.IDictionary]) {
$ModuleSpec[$k] = Merge-WebRequestSpec -ModuleSpec $ModuleSpec[$k] -SpecToMerge $v
} elseif ($v -is [Array] -or $v -is [System.Collections.IList]) {
$sourceList = [System.Collections.Generic.List[Object]]$ModuleSpec[$k]
foreach ($entry in $v) {
$sourceList.Add($entry)
}
$ModuleSpec[$k] = $sourceList
}
} else {
$ModuleSpec[$k] = $v
}
}
$ModuleSpec
}
# See lib/ansible/plugins/doc_fragments/url_windows.py
$ansible_web_request_options = @{
method = @{ type="str" }
follow_redirects = @{ type="str"; choices=@("all","none","safe"); default="safe" }
headers = @{ type="dict" }
http_agent = @{ type="str"; default="ansible-httpget" }
maximum_redirection = @{ type="int"; default=50 }
timeout = @{ type="int"; default=30 } # Was defaulted to 10 in win_get_url but 30 in win_uri so we use 30
validate_certs = @{ type="bool"; default=$true }
# Credential options
client_cert = @{ type="str" }
client_cert_password = @{ type="str"; no_log=$true }
force_basic_auth = @{ type="bool"; default=$false }
url_username = @{ type="str" }
url_password = @{ type="str"; no_log=$true }
use_default_credential = @{ type="bool"; default=$false }
# Proxy options
use_proxy = @{ type="bool"; default=$true }
proxy_url = @{ type="str" }
proxy_username = @{ type="str" }
proxy_password = @{ type="str"; no_log=$true }
proxy_use_default_credential = @{ type="bool"; default=$false }
}
$export_members = @{
Function = "Get-AnsibleWebRequest", "Invoke-WithWebRequest", "Merge-WebRequestSpec"
Variable = "ansible_web_request_options"
}
Export-ModuleMember @export_members
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,528 |
Ansible.ModuleUtils.WebRequest: cannot ignore proxy
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
It does not seem possible to run `win_get_url` without a proxy.
Based on the documentation, my expectation was that setting the parameter `use_proxy` to 'no' should ignore the default IE proxy and run without a proxy if `proxy_url` is not set at the same time. However, this is not working: even with `use_proxy: no`, the IE proxy is still being used.
Looking into the actual source code at https://github.com/ansible/ansible/blob/e5995a2eed08c0a0bf4797ae8a38aefd160b0f0a/lib/ansible/module_utils/powershell/Ansible.ModuleUtils.WebRequest.psm1#L252, I believe the issue is this:
```
if (-not $UseProxy) {
$proxy = $null
} elseif ($ProxyUrl) {
$proxy = New-Object -TypeName System.Net.WebProxy -ArgumentList $ProxyUrl, $true
} else {
$proxy = $web_request.Proxy
}
# $web_request.Proxy may return $null for a FTP web request. We only set the credentials if we have an actual
# proxy to work with, otherwise just ignore the credentials property.
if ($null -ne $proxy) {
[...]
$web_request.Proxy = $proxy
}
```
In other words, when `use_proxy: no` is set, the `$proxy` variable is first (correctly) set to `$null`, but that value is never written back to `$web_request.Proxy`, so it is never used and the default proxy applies instead.
Moving `$web_request.Proxy = $proxy` outside of that last `if` block fixes things for me, but might break other things.
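Roughly what I have in mind (just a sketch, reusing the variable names from the snippet above; whether this is safe for the FTP case mentioned in the code comment is an open question):
```
if (-not $UseProxy) {
    $proxy = $null
} elseif ($ProxyUrl) {
    $proxy = New-Object -TypeName System.Net.WebProxy -ArgumentList $ProxyUrl, $true
} else {
    $proxy = $web_request.Proxy
}
if ($null -ne $proxy) {
    # proxy credential handling stays inside this block
}
# Assign unconditionally so an explicit $null disables the default proxy.
$web_request.Proxy = $proxy
```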
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.WebRequest.psm1
##### ANSIBLE VERSION
```paste below
ansible --version
ansible 2.9.1
```
##### CONFIGURATION
NA
##### OS / ENVIRONMENT
Seeing this on Windows 10.
##### STEPS TO REPRODUCE
```yaml
- hosts: all
vars:
# set unexisting proxy
proxy: unexisting
download_url: <local url that can be reached without proxy>
tasks:
- name: Configure IE to use explicit proxy host with port and without auto detection
win_inet_proxy:
auto_detect: no
proxy: "{{ proxy }}"
- name: Download package without proxy
win_get_url:
url: "{{ download_url }}"
dest: C:\Users\ansible\test.jpg
use_proxy: no
```
##### EXPECTED RESULTS
The playbook should successfully download the test file without trying to connect to the nonexistent proxy.
##### ACTUAL RESULTS
The playbook fails because it tries to connect to the nonexistent proxy.
|
https://github.com/ansible/ansible/issues/68528
|
https://github.com/ansible/ansible/pull/68603
|
e785bdaa5b08c8426694e6348c52c562d38147cd
|
ae1cd27b575a759e9d2477042fc5dbbb3275cd84
| 2020-03-28T13:26:56Z |
python
| 2020-04-01T21:17:50Z |
test/integration/targets/module_utils_Ansible.ModuleUtils.WebRequest/library/web_request_test.ps1
|
#!powershell
#AnsibleRequires -CSharpUtil Ansible.Basic
#Requires -Module Ansible.ModuleUtils.WebRequest
$spec = @{
options = @{
httpbin_host = @{ type = 'str'; required = $true }
}
}
$module = [Ansible.Basic.AnsibleModule]::Create($args, $spec)
$httpbin_host = $module.Params.httpbin_host
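# Test helper: compares lists element by element (recursively) and everything
# else with a case-sensitive -ceq, failing the module with the calling line and
# expression when the values differ.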
Function Assert-Equals {
param(
[Parameter(Mandatory=$true, ValueFromPipeline=$true)][AllowNull()]$Actual,
[Parameter(Mandatory=$true, Position=0)][AllowNull()]$Expected
)
$matched = $false
if ($Actual -is [System.Collections.ArrayList] -or $Actual -is [Array] -or $Actual -is [System.Collections.IList]) {
$Actual.Count | Assert-Equals -Expected $Expected.Count
for ($i = 0; $i -lt $Actual.Count; $i++) {
$actualValue = $Actual[$i]
$expectedValue = $Expected[$i]
Assert-Equals -Actual $actualValue -Expected $expectedValue
}
$matched = $true
} else {
$matched = $Actual -ceq $Expected
}
if (-not $matched) {
if ($Actual -is [PSObject]) {
$Actual = $Actual.ToString()
}
$call_stack = (Get-PSCallStack)[1]
$module.Result.test = $test
$module.Result.actual = $Actual
$module.Result.expected = $Expected
$module.Result.line = $call_stack.ScriptLineNumber
$module.Result.method = $call_stack.Position.Text
$module.FailJson("AssertionError: actual != expected")
}
}
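# Test helper: drains the response stream into memory and returns its contents
# decoded as UTF-8 text.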
Function Convert-StreamToString {
[CmdletBinding()]
param (
[Parameter(Mandatory=$true)]
[System.IO.Stream]
$Stream
)
$ms = New-Object -TypeName System.IO.MemoryStream
try {
$Stream.CopyTo($ms)
[System.Text.Encoding]::UTF8.GetString($ms.ToArray())
} finally {
$ms.Dispose()
}
}
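# Test cases keyed by name; each scriptblock builds a request with
# Get-AnsibleWebRequest and asserts on the response through Invoke-WithWebRequest.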
$tests = [Ordered]@{
'GET request over http' = {
$r = Get-AnsibleWebRequest -Uri "http://$httpbin_host/get"
$r.Method | Assert-Equals -Expected 'GET'
$r.Timeout | Assert-Equals -Expected 30000
$r.UseDefaultCredentials | Assert-Equals -Expected $false
$r.Credentials | Assert-Equals -Expected $null
$r.ClientCertificates.Count | Assert-Equals -Expected 0
$r.Proxy.Credentials | Assert-Equals -Expected $null
$r.UserAgent | Assert-Equals -Expected 'ansible-httpget'
$actual = Invoke-WithWebRequest -Module $module -Request $r -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.StatusCode | Assert-Equals -Expected 200
Convert-StreamToString -Stream $Stream
} | ConvertFrom-Json
$actual.headers.'User-Agent' | Assert-Equals -Expected 'ansible-httpget'
$actual.headers.'Host' | Assert-Equals -Expected $httpbin_host
$module.Result.msg | Assert-Equals -Expected 'OK'
$module.Result.status_code | Assert-Equals -Expected 200
$module.Result.ContainsKey('elapsed') | Assert-Equals -Expected $true
}
'GET request over https' = {
# url is an alias for the -Uri parameter.
$r = Get-AnsibleWebRequest -url "https://$httpbin_host/get"
$r.Method | Assert-Equals -Expected 'GET'
$r.Timeout | Assert-Equals -Expected 30000
$r.UseDefaultCredentials | Assert-Equals -Expected $false
$r.Credentials | Assert-Equals -Expected $null
$r.ClientCertificates.Count | Assert-Equals -Expected 0
$r.Proxy.Credentials | Assert-Equals -Expected $null
$r.UserAgent | Assert-Equals -Expected 'ansible-httpget'
$actual = Invoke-WithWebRequest -Module $module -Request $r -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.StatusCode | Assert-Equals -Expected 200
Convert-StreamToString -Stream $Stream
} | ConvertFrom-Json
$actual.headers.'User-Agent' | Assert-Equals -Expected 'ansible-httpget'
$actual.headers.'Host' | Assert-Equals -Expected $httpbin_host
}
'POST request' = {
$getParams = @{
Headers = @{
'Content-Type' = 'application/json'
}
Method = 'POST'
Uri = "https://$httpbin_host/post"
}
$r = Get-AnsibleWebRequest @getParams
$r.Method | Assert-Equals -Expected 'POST'
$r.Timeout | Assert-Equals -Expected 30000
$r.UseDefaultCredentials | Assert-Equals -Expected $false
$r.Credentials | Assert-Equals -Expected $null
$r.ClientCertificates.Count | Assert-Equals -Expected 0
$r.Proxy.Credentials | Assert-Equals -Expected $null
$r.ContentType | Assert-Equals -Expected 'application/json'
$r.UserAgent | Assert-Equals -Expected 'ansible-httpget'
$body = New-Object -TypeName System.IO.MemoryStream -ArgumentList @(,
([System.Text.Encoding]::UTF8.GetBytes('{"foo":"bar"}'))
)
$actual = Invoke-WithWebRequest -Module $module -Request $r -Body $body -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.StatusCode | Assert-Equals -Expected 200
Convert-StreamToString -Stream $Stream
} | ConvertFrom-Json
$actual.headers.'User-Agent' | Assert-Equals -Expected 'ansible-httpget'
$actual.headers.'Host' | Assert-Equals -Expected $httpbin_host
$actual.data | Assert-Equals -Expected '{"foo":"bar"}'
}
'Safe redirection of GET' = {
$r = Get-AnsibleWebRequest -Uri "http://$httpbin_host/redirect/2"
Invoke-WithWebRequest -Module $module -Request $r -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.ResponseUri | Assert-Equals -Expected "http://$httpbin_host/get"
$Response.StatusCode | Assert-Equals -Expected 200
}
}
'Safe redirection of HEAD' = {
$r = Get-AnsibleWebRequest -Uri "http://$httpbin_host/redirect/2" -Method HEAD
Invoke-WithWebRequest -Module $module -Request $r -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.ResponseUri | Assert-Equals -Expected "http://$httpbin_host/get"
$Response.StatusCode | Assert-Equals -Expected 200
}
}
'Safe redirection of PUT' = {
$params = @{
Method = 'PUT'
Uri = "http://$httpbin_host/redirect-to?url=https://$httpbin_host/put"
}
$r = Get-AnsibleWebRequest @params
Invoke-WithWebRequest -Module $module -Request $r -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.ResponseUri | Assert-Equals -Expected $r.RequestUri
$Response.StatusCode | Assert-Equals -Expected 302
}
}
'None redirection of GET' = {
$params = @{
FollowRedirects = 'None'
Uri = "http://$httpbin_host/redirect/2"
}
$r = Get-AnsibleWebRequest @params
Invoke-WithWebRequest -Module $module -Request $r -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.ResponseUri | Assert-Equals -Expected $r.RequestUri
$Response.StatusCode | Assert-Equals -Expected 302
}
}
'None redirection of HEAD' = {
$params = @{
follow_redirects = 'None'
method = 'HEAD'
Uri = "http://$httpbin_host/redirect/2"
}
$r = Get-AnsibleWebRequest @params
Invoke-WithWebRequest -Module $module -Request $r -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.ResponseUri | Assert-Equals -Expected $r.RequestUri
$Response.StatusCode | Assert-Equals -Expected 302
}
}
'None redirection of PUT' = {
$params = @{
FollowRedirects = 'None'
Method = 'PUT'
Uri = "http://$httpbin_host/redirect-to?url=https://$httpbin_host/put"
}
$r = Get-AnsibleWebRequest @params
Invoke-WithWebRequest -Module $module -Request $r -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.ResponseUri | Assert-Equals -Expected $r.RequestUri
$Response.StatusCode | Assert-Equals -Expected 302
}
}
'All redirection of GET' = {
$params = @{
FollowRedirects = 'All'
Uri = "http://$httpbin_host/redirect/2"
}
$r = Get-AnsibleWebRequest @params
Invoke-WithWebRequest -Module $module -Request $r -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.ResponseUri | Assert-Equals -Expected "http://$httpbin_host/get"
$Response.StatusCode | Assert-Equals -Expected 200
}
}
'All redirection of HEAD' = {
$params = @{
follow_redirects = 'All'
method = 'HEAD'
Uri = "http://$httpbin_host/redirect/2"
}
$r = Get-AnsibleWebRequest @params
Invoke-WithWebRequest -Module $module -Request $r -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.ResponseUri | Assert-Equals -Expected "http://$httpbin_host/get"
$Response.StatusCode | Assert-Equals -Expected 200
}
}
'All redirection of PUT' = {
$params = @{
FollowRedirects = 'All'
Method = 'PUT'
Uri = "http://$httpbin_host/redirect-to?url=https://$httpbin_host/put"
}
$r = Get-AnsibleWebRequest @params
Invoke-WithWebRequest -Module $module -Request $r -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.ResponseUri | Assert-Equals -Expected "https://$httpbin_host/put"
$Response.StatusCode | Assert-Equals -Expected 200
}
}
'Exceeds maximum redirection - ignored' = {
$params = @{
MaximumRedirection = 4
Uri = "https://$httpbin_host/redirect/5"
}
$r = Get-AnsibleWebRequest @params
Invoke-WithWebRequest -Module $module -Request $r -IgnoreBadResponse -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.ResponseUri | Assert-Equals -Expected "https://$httpbin_host/relative-redirect/1"
$Response.StatusCode | Assert-Equals -Expected 302
}
}
'Exceeds maximum redirection - exception' = {
$params = @{
MaximumRedirection = 1
Uri = "https://$httpbin_host/redirect/2"
}
$r = Get-AnsibleWebRequest @params
$failed = $false
try {
$null = Invoke-WithWebRequest -Module $module -Request $r -Script {}
} catch {
$_.Exception.GetType().Name | Assert-Equals -Expected 'WebException'
$_.Exception.Message | Assert-Equals -Expected 'Too many automatic redirections were attempted.'
$failed = $true
}
$failed | Assert-Equals -Expected $true
}
'Basic auth as Credential' = {
$params = @{
Url = "http://$httpbin_host/basic-auth/username/password"
UrlUsername = 'username'
UrlPassword = 'password'
}
$r = Get-AnsibleWebRequest @params
Invoke-WithWebRequest -Module $module -Request $r -IgnoreBadResponse -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.StatusCode | Assert-Equals -Expected 200
}
}
'Basic auth as Header' = {
$params = @{
Url = "http://$httpbin_host/basic-auth/username/password"
url_username = 'username'
url_password = 'password'
ForceBasicAuth = $true
}
$r = Get-AnsibleWebRequest @params
Invoke-WithWebRequest -Module $module -Request $r -IgnoreBadResponse -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.StatusCode | Assert-Equals -Expected 200
}
}
'Send request with headers' = {
$params = @{
Headers = @{
'Content-Length' = 0
testingheader = 'testing_header'
TestHeader = 'test-header'
'User-Agent' = 'test-agent'
}
Url = "https://$httpbin_host/get"
}
$r = Get-AnsibleWebRequest @params
$actual = Invoke-WithWebRequest -Module $module -Request $r -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.StatusCode | Assert-Equals -Expected 200
Convert-StreamToString -Stream $Stream
} | ConvertFrom-Json
$actual.headers.'Testheader' | Assert-Equals -Expected 'test-header'
$actual.headers.'testingheader' | Assert-Equals -Expected 'testing_header'
$actual.Headers.'User-Agent' | Assert-Equals -Expected 'test-agent'
}
'Request with timeout' = {
$params = @{
Uri = "https://$httpbin_host/delay/5"
Timeout = 1
}
$r = Get-AnsibleWebRequest @params
$failed = $false
try {
$null = Invoke-WithWebRequest -Module $module -Request $r -Script {}
} catch {
$failed = $true
$_.Exception.GetType().Name | Assert-Equals -Expected 'WebException'
$_.Exception.Message | Assert-Equals -Expected 'The operation has timed out'
}
$failed | Assert-Equals -Expected $true
}
'Request with file URI' = {
$filePath = Join-Path $module.Tmpdir -ChildPath 'test.txt'
Set-Content -LiteralPath $filePath -Value 'test'
$r = Get-AnsibleWebRequest -Uri $filePath
$actual = Invoke-WithWebRequest -Module $module -Request $r -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.ContentLength | Assert-Equals -Expected 6
Convert-StreamToString -Stream $Stream
}
$actual | Assert-Equals -Expected "test`r`n"
$module.Result.msg | Assert-Equals -Expected "OK"
$module.Result.status_code | Assert-Equals -Expected 200
}
'Web request based on module options' = {
Set-Variable complex_args -Scope Global -Value @{
url = "https://$httpbin_host/redirect/2"
method = 'GET'
follow_redirects = 'safe'
headers = @{
'User-Agent' = 'other-agent'
}
http_agent = 'actual-agent'
maximum_redirection = 2
timeout = 10
validate_certs = $false
}
$spec = @{
options = @{
url = @{ type = 'str'; required = $true }
test = @{ type = 'str'; choices = 'abc', 'def'}
}
mutually_exclusive = @(,@('url', 'test'))
}
$spec = Merge-WebRequestSpec -ModuleSpec $spec
$testModule = [Ansible.Basic.AnsibleModule]::Create(@(), $spec)
$r = Get-AnsibleWebRequest -Url $testModule.Params.url -Module $testModule
$actual = Invoke-WithWebRequest -Module $testModule -Request $r -Script {
Param ([System.Net.WebResponse]$Response, [System.IO.Stream]$Stream)
$Response.ResponseUri | Assert-Equals -Expected "https://$httpbin_host/get"
Convert-StreamToString -Stream $Stream
} | ConvertFrom-Json
$actual.headers.'User-Agent' | Assert-Equals -Expected 'actual-agent'
}
}
# Run each test with a fresh set of global module args so one case cannot leak options into the next.
foreach ($testImpl in $tests.GetEnumerator()) {
Set-Variable -Name complex_args -Scope Global -Value @{}
$test = $testImpl.Key
&$testImpl.Value
}
$module.Result.data = "success"
$module.ExitJson()
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,637 |
apt_repo should be moved to a collection
|
##### SUMMARY
I've talked about this with @gundalow in https://github.com/theforeman/foreman-ansible-modules/pull/591#issuecomment-561712408, but it seems to have been forgotten.
`apt_repo` manages ALT Linux repositories, unlike `apt_repository`, which manages Debian repositories. Since "base" aims to support only the Debian and Red Hat OS families, `apt_repo` should be moved to a collection.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
apt_repo
|
https://github.com/ansible/ansible/issues/68637
|
https://github.com/ansible/ansible/pull/68641
|
ae1cd27b575a759e9d2477042fc5dbbb3275cd84
|
40d9650f20133cd6942990df205300fec802511f
| 2020-04-02T13:37:22Z |
python
| 2020-04-02T16:06:12Z |
lib/ansible/config/routing.yml
|
plugin_routing:
connection:
buildah:
redirect: containers.podman.buildah
podman:
redirect: containers.podman.podman
aws_ssm:
redirect: community.aws.aws_ssm
chroot:
redirect: community.general.chroot
docker:
redirect: community.general.docker
funcd:
redirect: community.general.funcd
iocage:
redirect: community.general.iocage
jail:
redirect: community.general.jail
kubectl:
redirect: community.general.kubectl
libvirt_lxc:
redirect: community.general.libvirt_lxc
lxc:
redirect: community.general.lxc
lxd:
redirect: community.general.lxd
oc:
redirect: community.general.oc
qubes:
redirect: community.general.qubes
saltstack:
redirect: community.general.saltstack
zone:
redirect: community.general.zone
vmware_tools:
redirect: community.vmware.vmware_tools
httpapi:
redirect: ansible.netcommon.httpapi
napalm:
redirect: ansible.netcommon.napalm
netconf:
redirect: ansible.netcommon.netconf
network_cli:
redirect: ansible.netcommon.network_cli
persistent:
redirect: ansible.netcommon.persistent
modules:
podman_container_info:
redirect: containers.podman.podman_container_info
podman_image_info:
redirect: containers.podman.podman_image_info
podman_image:
redirect: containers.podman.podman_image
podman_volume_info:
redirect: containers.podman.podman_volume_info
frr_facts:
redirect: frr.frr.frr_facts
frr_bgp:
redirect: frr.frr.frr_bgp
aws_acm_facts:
redirect: community.aws.aws_acm_facts
aws_kms_facts:
redirect: community.aws.aws_kms_facts
aws_region_facts:
redirect: community.aws.aws_region_facts
aws_s3_bucket_facts:
redirect: community.aws.aws_s3_bucket_facts
aws_sgw_facts:
redirect: community.aws.aws_sgw_facts
aws_waf_facts:
redirect: community.aws.aws_waf_facts
cloudfront_facts:
redirect: community.aws.cloudfront_facts
cloudwatchlogs_log_group_facts:
redirect: community.aws.cloudwatchlogs_log_group_facts
ec2_asg_facts:
redirect: community.aws.ec2_asg_facts
ec2_customer_gateway_facts:
redirect: community.aws.ec2_customer_gateway_facts
ec2_instance_facts:
redirect: community.aws.ec2_instance_facts
ec2_eip_facts:
redirect: community.aws.ec2_eip_facts
ec2_elb_facts:
redirect: community.aws.ec2_elb_facts
ec2_lc_facts:
redirect: community.aws.ec2_lc_facts
ec2_placement_group_facts:
redirect: community.aws.ec2_placement_group_facts
ec2_vpc_endpoint_facts:
redirect: community.aws.ec2_vpc_endpoint_facts
ec2_vpc_igw_facts:
redirect: community.aws.ec2_vpc_igw_facts
ec2_vpc_nacl_facts:
redirect: community.aws.ec2_vpc_nacl_facts
ec2_vpc_nat_gateway_facts:
redirect: community.aws.ec2_vpc_nat_gateway_facts
ec2_vpc_peering_facts:
redirect: community.aws.ec2_vpc_peering_facts
ec2_vpc_route_table_facts:
redirect: community.aws.ec2_vpc_route_table_facts
ec2_vpc_vgw_facts:
redirect: community.aws.ec2_vpc_vgw_facts
ec2_vpc_vpn_facts:
redirect: community.aws.ec2_vpc_vpn_facts
ecs_service_facts:
redirect: community.aws.ecs_service_facts
ecs_taskdefinition_facts:
redirect: community.aws.ecs_taskdefinition_facts
efs_facts:
redirect: community.aws.efs_facts
elasticache_facts:
redirect: community.aws.elasticache_facts
elb_application_lb_facts:
redirect: community.aws.elb_application_lb_facts
elb_classic_lb_facts:
redirect: community.aws.elb_classic_lb_facts
elb_target_facts:
redirect: community.aws.elb_target_facts
elb_target_group_facts:
redirect: community.aws.elb_target_group_facts
iam_cert_facts:
redirect: community.aws.iam_cert_facts
iam_mfa_device_facts:
redirect: community.aws.iam_mfa_device_facts
iam_role_facts:
redirect: community.aws.iam_role_facts
iam_server_certificate_facts:
redirect: community.aws.iam_server_certificate_facts
lambda_facts:
redirect: community.aws.lambda_facts
rds_instance_facts:
redirect: community.aws.rds_instance_facts
rds_snapshot_facts:
redirect: community.aws.rds_snapshot_facts
redshift_facts:
redirect: community.aws.redshift_facts
route53_facts:
redirect: community.aws.route53_facts
aws_acm:
redirect: community.aws.aws_acm
aws_acm_info:
redirect: community.aws.aws_acm_info
aws_api_gateway:
redirect: community.aws.aws_api_gateway
aws_application_scaling_policy:
redirect: community.aws.aws_application_scaling_policy
aws_batch_compute_environment:
redirect: community.aws.aws_batch_compute_environment
aws_batch_job_definition:
redirect: community.aws.aws_batch_job_definition
aws_batch_job_queue:
redirect: community.aws.aws_batch_job_queue
aws_codebuild:
redirect: community.aws.aws_codebuild
aws_codecommit:
redirect: community.aws.aws_codecommit
aws_codepipeline:
redirect: community.aws.aws_codepipeline
aws_config_aggregation_authorization:
redirect: community.aws.aws_config_aggregation_authorization
aws_config_aggregator:
redirect: community.aws.aws_config_aggregator
aws_config_delivery_channel:
redirect: community.aws.aws_config_delivery_channel
aws_config_recorder:
redirect: community.aws.aws_config_recorder
aws_config_rule:
redirect: community.aws.aws_config_rule
aws_direct_connect_connection:
redirect: community.aws.aws_direct_connect_connection
aws_direct_connect_gateway:
redirect: community.aws.aws_direct_connect_gateway
aws_direct_connect_link_aggregation_group:
redirect: community.aws.aws_direct_connect_link_aggregation_group
aws_direct_connect_virtual_interface:
redirect: community.aws.aws_direct_connect_virtual_interface
aws_eks_cluster:
redirect: community.aws.aws_eks_cluster
aws_elasticbeanstalk_app:
redirect: community.aws.aws_elasticbeanstalk_app
aws_glue_connection:
redirect: community.aws.aws_glue_connection
aws_glue_job:
redirect: community.aws.aws_glue_job
aws_inspector_target:
redirect: community.aws.aws_inspector_target
aws_kms:
redirect: community.aws.aws_kms
aws_kms_info:
redirect: community.aws.aws_kms_info
aws_region_info:
redirect: community.aws.aws_region_info
aws_s3_bucket_info:
redirect: community.aws.aws_s3_bucket_info
aws_s3_cors:
redirect: community.aws.aws_s3_cors
aws_secret:
redirect: community.aws.aws_secret
aws_ses_identity:
redirect: community.aws.aws_ses_identity
aws_ses_identity_policy:
redirect: community.aws.aws_ses_identity_policy
aws_ses_rule_set:
redirect: community.aws.aws_ses_rule_set
aws_sgw_info:
redirect: community.aws.aws_sgw_info
aws_ssm_parameter_store:
redirect: community.aws.aws_ssm_parameter_store
aws_step_functions_state_machine:
redirect: community.aws.aws_step_functions_state_machine
aws_step_functions_state_machine_execution:
redirect: community.aws.aws_step_functions_state_machine_execution
aws_waf_condition:
redirect: community.aws.aws_waf_condition
aws_waf_info:
redirect: community.aws.aws_waf_info
aws_waf_rule:
redirect: community.aws.aws_waf_rule
aws_waf_web_acl:
redirect: community.aws.aws_waf_web_acl
cloudformation_stack_set:
redirect: community.aws.cloudformation_stack_set
cloudformation_exports_info:
redirect: community.aws.cloudformation_exports_info
cloudfront_distribution:
redirect: community.aws.cloudfront_distribution
cloudfront_info:
redirect: community.aws.cloudfront_info
cloudfront_invalidation:
redirect: community.aws.cloudfront_invalidation
cloudfront_origin_access_identity:
redirect: community.aws.cloudfront_origin_access_identity
cloudtrail:
redirect: community.aws.cloudtrail
cloudwatchevent_rule:
redirect: community.aws.cloudwatchevent_rule
cloudwatchlogs_log_group:
redirect: community.aws.cloudwatchlogs_log_group
cloudwatchlogs_log_group_info:
redirect: community.aws.cloudwatchlogs_log_group_info
cloudwatchlogs_log_group_metric_filter:
redirect: community.aws.cloudwatchlogs_log_group_metric_filter
data_pipeline:
redirect: community.aws.data_pipeline
dms_endpoint:
redirect: community.aws.dms_endpoint
dms_replication_subnet_group:
redirect: community.aws.dms_replication_subnet_group
dynamodb_table:
redirect: community.aws.dynamodb_table
dynamodb_ttl:
redirect: community.aws.dynamodb_ttl
ec2_ami_copy:
redirect: community.aws.ec2_ami_copy
ec2_asg:
redirect: community.aws.ec2_asg
ec2_asg_info:
redirect: community.aws.ec2_asg_info
ec2_asg_lifecycle_hook:
redirect: community.aws.ec2_asg_lifecycle_hook
ec2_customer_gateway:
redirect: community.aws.ec2_customer_gateway
ec2_customer_gateway_info:
redirect: community.aws.ec2_customer_gateway_info
ec2_eip:
redirect: community.aws.ec2_eip
ec2_eip_info:
redirect: community.aws.ec2_eip_info
ec2_elb:
redirect: community.aws.ec2_elb
ec2_elb_info:
redirect: community.aws.ec2_elb_info
ec2_instance:
redirect: community.aws.ec2_instance
ec2_instance_info:
redirect: community.aws.ec2_instance_info
ec2_launch_template:
redirect: community.aws.ec2_launch_template
ec2_lc:
redirect: community.aws.ec2_lc
ec2_lc_find:
redirect: community.aws.ec2_lc_find
ec2_lc_info:
redirect: community.aws.ec2_lc_info
ec2_metric_alarm:
redirect: community.aws.ec2_metric_alarm
ec2_placement_group:
redirect: community.aws.ec2_placement_group
ec2_placement_group_info:
redirect: community.aws.ec2_placement_group_info
ec2_scaling_policy:
redirect: community.aws.ec2_scaling_policy
ec2_snapshot_copy:
redirect: community.aws.ec2_snapshot_copy
ec2_transit_gateway:
redirect: community.aws.ec2_transit_gateway
ec2_transit_gateway_info:
redirect: community.aws.ec2_transit_gateway_info
ec2_vpc_egress_igw:
redirect: community.aws.ec2_vpc_egress_igw
ec2_vpc_endpoint:
redirect: community.aws.ec2_vpc_endpoint
ec2_vpc_endpoint_info:
redirect: community.aws.ec2_vpc_endpoint_info
ec2_vpc_igw:
redirect: community.aws.ec2_vpc_igw
ec2_vpc_igw_info:
redirect: community.aws.ec2_vpc_igw_info
ec2_vpc_nacl:
redirect: community.aws.ec2_vpc_nacl
ec2_vpc_nacl_info:
redirect: community.aws.ec2_vpc_nacl_info
ec2_vpc_nat_gateway:
redirect: community.aws.ec2_vpc_nat_gateway
ec2_vpc_nat_gateway_info:
redirect: community.aws.ec2_vpc_nat_gateway_info
ec2_vpc_peer:
redirect: community.aws.ec2_vpc_peer
ec2_vpc_peering_info:
redirect: community.aws.ec2_vpc_peering_info
ec2_vpc_route_table:
redirect: community.aws.ec2_vpc_route_table
ec2_vpc_route_table_info:
redirect: community.aws.ec2_vpc_route_table_info
ec2_vpc_vgw:
redirect: community.aws.ec2_vpc_vgw
ec2_vpc_vgw_info:
redirect: community.aws.ec2_vpc_vgw_info
ec2_vpc_vpn:
redirect: community.aws.ec2_vpc_vpn
ec2_vpc_vpn_info:
redirect: community.aws.ec2_vpc_vpn_info
ec2_win_password:
redirect: community.aws.ec2_win_password
ecs_attribute:
redirect: community.aws.ecs_attribute
ecs_cluster:
redirect: community.aws.ecs_cluster
ecs_ecr:
redirect: community.aws.ecs_ecr
ecs_service:
redirect: community.aws.ecs_service
ecs_service_info:
redirect: community.aws.ecs_service_info
ecs_tag:
redirect: community.aws.ecs_tag
ecs_task:
redirect: community.aws.ecs_task
ecs_taskdefinition:
redirect: community.aws.ecs_taskdefinition
ecs_taskdefinition_info:
redirect: community.aws.ecs_taskdefinition_info
efs:
redirect: community.aws.efs
efs_info:
redirect: community.aws.efs_info
elasticache:
redirect: community.aws.elasticache
elasticache_info:
redirect: community.aws.elasticache_info
elasticache_parameter_group:
redirect: community.aws.elasticache_parameter_group
elasticache_snapshot:
redirect: community.aws.elasticache_snapshot
elasticache_subnet_group:
redirect: community.aws.elasticache_subnet_group
elb_application_lb:
redirect: community.aws.elb_application_lb
elb_application_lb_info:
redirect: community.aws.elb_application_lb_info
elb_classic_lb:
redirect: community.aws.elb_classic_lb
elb_classic_lb_info:
redirect: community.aws.elb_classic_lb_info
elb_instance:
redirect: community.aws.elb_instance
elb_network_lb:
redirect: community.aws.elb_network_lb
elb_target:
redirect: community.aws.elb_target
elb_target_group:
redirect: community.aws.elb_target_group
elb_target_group_info:
redirect: community.aws.elb_target_group_info
elb_target_info:
redirect: community.aws.elb_target_info
execute_lambda:
redirect: community.aws.execute_lambda
iam:
redirect: community.aws.iam
iam_cert:
redirect: community.aws.iam_cert
iam_group:
redirect: community.aws.iam_group
iam_managed_policy:
redirect: community.aws.iam_managed_policy
iam_mfa_device_info:
redirect: community.aws.iam_mfa_device_info
iam_password_policy:
redirect: community.aws.iam_password_policy
iam_policy:
redirect: community.aws.iam_policy
iam_policy_info:
redirect: community.aws.iam_policy_info
iam_role:
redirect: community.aws.iam_role
iam_role_info:
redirect: community.aws.iam_role_info
iam_saml_federation:
redirect: community.aws.iam_saml_federation
iam_server_certificate_info:
redirect: community.aws.iam_server_certificate_info
iam_user:
redirect: community.aws.iam_user
iam_user_info:
redirect: community.aws.iam_user_info
kinesis_stream:
redirect: community.aws.kinesis_stream
lambda:
redirect: community.aws.lambda
lambda_alias:
redirect: community.aws.lambda_alias
lambda_event:
redirect: community.aws.lambda_event
lambda_info:
redirect: community.aws.lambda_info
lambda_policy:
redirect: community.aws.lambda_policy
lightsail:
redirect: community.aws.lightsail
rds:
redirect: community.aws.rds
rds_instance:
redirect: community.aws.rds_instance
rds_instance_info:
redirect: community.aws.rds_instance_info
rds_param_group:
redirect: community.aws.rds_param_group
rds_snapshot:
redirect: community.aws.rds_snapshot
rds_snapshot_info:
redirect: community.aws.rds_snapshot_info
rds_subnet_group:
redirect: community.aws.rds_subnet_group
redshift:
redirect: community.aws.redshift
redshift_cross_region_snapshots:
redirect: community.aws.redshift_cross_region_snapshots
redshift_info:
redirect: community.aws.redshift_info
redshift_subnet_group:
redirect: community.aws.redshift_subnet_group
route53:
redirect: community.aws.route53
route53_health_check:
redirect: community.aws.route53_health_check
route53_info:
redirect: community.aws.route53_info
route53_zone:
redirect: community.aws.route53_zone
s3_bucket_notification:
redirect: community.aws.s3_bucket_notification
s3_lifecycle:
redirect: community.aws.s3_lifecycle
s3_logging:
redirect: community.aws.s3_logging
s3_sync:
redirect: community.aws.s3_sync
s3_website:
redirect: community.aws.s3_website
sns:
redirect: community.aws.sns
sns_topic:
redirect: community.aws.sns_topic
sqs_queue:
redirect: community.aws.sqs_queue
sts_assume_role:
redirect: community.aws.sts_assume_role
sts_session_token:
redirect: community.aws.sts_session_token
ali_instance_facts:
redirect: community.general.ali_instance_facts
ali_instance:
redirect: community.general.ali_instance
ali_instance_info:
redirect: community.general.ali_instance_info
atomic_container:
redirect: community.general.atomic_container
atomic_host:
redirect: community.general.atomic_host
atomic_image:
redirect: community.general.atomic_image
clc_aa_policy:
redirect: community.general.clc_aa_policy
clc_alert_policy:
redirect: community.general.clc_alert_policy
clc_blueprint_package:
redirect: community.general.clc_blueprint_package
clc_firewall_policy:
redirect: community.general.clc_firewall_policy
clc_group:
redirect: community.general.clc_group
clc_loadbalancer:
redirect: community.general.clc_loadbalancer
clc_modify_server:
redirect: community.general.clc_modify_server
clc_publicip:
redirect: community.general.clc_publicip
clc_server:
redirect: community.general.clc_server
clc_server_snapshot:
redirect: community.general.clc_server_snapshot
cloudscale_floating_ip:
redirect: community.general.cloudscale_floating_ip
cloudscale_server:
redirect: community.general.cloudscale_server
cloudscale_server_group:
redirect: community.general.cloudscale_server_group
cloudscale_volume:
redirect: community.general.cloudscale_volume
cs_instance_facts:
redirect: community.general.cs_instance_facts
cs_zone_facts:
redirect: community.general.cs_zone_facts
cs_account:
redirect: community.general.cs_account
cs_affinitygroup:
redirect: community.general.cs_affinitygroup
cs_cluster:
redirect: community.general.cs_cluster
cs_configuration:
redirect: community.general.cs_configuration
cs_disk_offering:
redirect: community.general.cs_disk_offering
cs_domain:
redirect: community.general.cs_domain
cs_facts:
redirect: community.general.cs_facts
cs_firewall:
redirect: community.general.cs_firewall
cs_host:
redirect: community.general.cs_host
cs_image_store:
redirect: community.general.cs_image_store
cs_instance:
redirect: community.general.cs_instance
cs_instance_info:
redirect: community.general.cs_instance_info
cs_instance_nic:
redirect: community.general.cs_instance_nic
cs_instance_nic_secondaryip:
redirect: community.general.cs_instance_nic_secondaryip
cs_instance_password_reset:
redirect: community.general.cs_instance_password_reset
cs_instancegroup:
redirect: community.general.cs_instancegroup
cs_ip_address:
redirect: community.general.cs_ip_address
cs_iso:
redirect: community.general.cs_iso
cs_loadbalancer_rule:
redirect: community.general.cs_loadbalancer_rule
cs_loadbalancer_rule_member:
redirect: community.general.cs_loadbalancer_rule_member
cs_network:
redirect: community.general.cs_network
cs_network_acl:
redirect: community.general.cs_network_acl
cs_network_acl_rule:
redirect: community.general.cs_network_acl_rule
cs_network_offering:
redirect: community.general.cs_network_offering
cs_physical_network:
redirect: community.general.cs_physical_network
cs_pod:
redirect: community.general.cs_pod
cs_portforward:
redirect: community.general.cs_portforward
cs_project:
redirect: community.general.cs_project
cs_region:
redirect: community.general.cs_region
cs_resourcelimit:
redirect: community.general.cs_resourcelimit
cs_role:
redirect: community.general.cs_role
cs_role_permission:
redirect: community.general.cs_role_permission
cs_router:
redirect: community.general.cs_router
cs_securitygroup:
redirect: community.general.cs_securitygroup
cs_securitygroup_rule:
redirect: community.general.cs_securitygroup_rule
cs_service_offering:
redirect: community.general.cs_service_offering
cs_snapshot_policy:
redirect: community.general.cs_snapshot_policy
cs_sshkeypair:
redirect: community.general.cs_sshkeypair
cs_staticnat:
redirect: community.general.cs_staticnat
cs_storage_pool:
redirect: community.general.cs_storage_pool
cs_template:
redirect: community.general.cs_template
cs_traffic_type:
redirect: community.general.cs_traffic_type
cs_user:
redirect: community.general.cs_user
cs_vlan_ip_range:
redirect: community.general.cs_vlan_ip_range
cs_vmsnapshot:
redirect: community.general.cs_vmsnapshot
cs_volume:
redirect: community.general.cs_volume
cs_vpc:
redirect: community.general.cs_vpc
cs_vpc_offering:
redirect: community.general.cs_vpc_offering
cs_vpn_connection:
redirect: community.general.cs_vpn_connection
cs_vpn_customer_gateway:
redirect: community.general.cs_vpn_customer_gateway
cs_vpn_gateway:
redirect: community.general.cs_vpn_gateway
cs_zone:
redirect: community.general.cs_zone
cs_zone_info:
redirect: community.general.cs_zone_info
digital_ocean:
redirect: community.general.digital_ocean
digital_ocean_account_facts:
redirect: community.general.digital_ocean_account_facts
digital_ocean_certificate_facts:
redirect: community.general.digital_ocean_certificate_facts
digital_ocean_domain_facts:
redirect: community.general.digital_ocean_domain_facts
digital_ocean_firewall_facts:
redirect: community.general.digital_ocean_firewall_facts
digital_ocean_floating_ip_facts:
redirect: community.general.digital_ocean_floating_ip_facts
digital_ocean_image_facts:
redirect: community.general.digital_ocean_image_facts
digital_ocean_load_balancer_facts:
redirect: community.general.digital_ocean_load_balancer_facts
digital_ocean_region_facts:
redirect: community.general.digital_ocean_region_facts
digital_ocean_size_facts:
redirect: community.general.digital_ocean_size_facts
digital_ocean_snapshot_facts:
redirect: community.general.digital_ocean_snapshot_facts
digital_ocean_sshkey_facts:
redirect: community.general.digital_ocean_sshkey_facts
digital_ocean_tag_facts:
redirect: community.general.digital_ocean_tag_facts
digital_ocean_volume_facts:
redirect: community.general.digital_ocean_volume_facts
digital_ocean_account_info:
redirect: community.general.digital_ocean_account_info
digital_ocean_block_storage:
redirect: community.general.digital_ocean_block_storage
digital_ocean_certificate:
redirect: community.general.digital_ocean_certificate
digital_ocean_certificate_info:
redirect: community.general.digital_ocean_certificate_info
digital_ocean_domain:
redirect: community.general.digital_ocean_domain
digital_ocean_domain_info:
redirect: community.general.digital_ocean_domain_info
digital_ocean_droplet:
redirect: community.general.digital_ocean_droplet
digital_ocean_firewall_info:
redirect: community.general.digital_ocean_firewall_info
digital_ocean_floating_ip:
redirect: community.general.digital_ocean_floating_ip
digital_ocean_floating_ip_info:
redirect: community.general.digital_ocean_floating_ip_info
digital_ocean_image_info:
redirect: community.general.digital_ocean_image_info
digital_ocean_load_balancer_info:
redirect: community.general.digital_ocean_load_balancer_info
digital_ocean_region_info:
redirect: community.general.digital_ocean_region_info
digital_ocean_size_info:
redirect: community.general.digital_ocean_size_info
digital_ocean_snapshot_info:
redirect: community.general.digital_ocean_snapshot_info
digital_ocean_sshkey:
redirect: community.general.digital_ocean_sshkey
digital_ocean_sshkey_info:
redirect: community.general.digital_ocean_sshkey_info
digital_ocean_tag:
redirect: community.general.digital_ocean_tag
digital_ocean_tag_info:
redirect: community.general.digital_ocean_tag_info
digital_ocean_volume_info:
redirect: community.general.digital_ocean_volume_info
dimensiondata_network:
redirect: community.general.dimensiondata_network
dimensiondata_vlan:
redirect: community.general.dimensiondata_vlan
docker_image_facts:
redirect: community.general.docker_image_facts
docker_service:
redirect: community.general.docker_service
docker_compose:
redirect: community.general.docker_compose
docker_config:
redirect: community.general.docker_config
docker_container:
redirect: community.general.docker_container
docker_container_info:
redirect: community.general.docker_container_info
docker_host_info:
redirect: community.general.docker_host_info
docker_image:
redirect: community.general.docker_image
docker_image_info:
redirect: community.general.docker_image_info
docker_login:
redirect: community.general.docker_login
docker_network:
redirect: community.general.docker_network
docker_network_info:
redirect: community.general.docker_network_info
docker_node:
redirect: community.general.docker_node
docker_node_info:
redirect: community.general.docker_node_info
docker_prune:
redirect: community.general.docker_prune
docker_secret:
redirect: community.general.docker_secret
docker_stack:
redirect: community.general.docker_stack
docker_swarm:
redirect: community.general.docker_swarm
docker_swarm_info:
redirect: community.general.docker_swarm_info
docker_swarm_service:
redirect: community.general.docker_swarm_service
docker_swarm_service_info:
redirect: community.general.docker_swarm_service_info
docker_volume:
redirect: community.general.docker_volume
docker_volume_info:
redirect: community.general.docker_volume_info
gcdns_record:
redirect: community.general.gcdns_record
gcdns_zone:
redirect: community.general.gcdns_zone
gce:
redirect: community.general.gce
gcp_backend_service:
redirect: community.general.gcp_backend_service
gcp_bigquery_dataset_facts:
redirect: community.general.gcp_bigquery_dataset_facts
gcp_bigquery_table_facts:
redirect: community.general.gcp_bigquery_table_facts
gcp_cloudbuild_trigger_facts:
redirect: community.general.gcp_cloudbuild_trigger_facts
gcp_compute_address_facts:
redirect: community.general.gcp_compute_address_facts
gcp_compute_backend_bucket_facts:
redirect: community.general.gcp_compute_backend_bucket_facts
gcp_compute_backend_service_facts:
redirect: community.general.gcp_compute_backend_service_facts
gcp_compute_disk_facts:
redirect: community.general.gcp_compute_disk_facts
gcp_compute_firewall_facts:
redirect: community.general.gcp_compute_firewall_facts
gcp_compute_forwarding_rule_facts:
redirect: community.general.gcp_compute_forwarding_rule_facts
gcp_compute_global_address_facts:
redirect: community.general.gcp_compute_global_address_facts
gcp_compute_global_forwarding_rule_facts:
redirect: community.general.gcp_compute_global_forwarding_rule_facts
gcp_compute_health_check_facts:
redirect: community.general.gcp_compute_health_check_facts
gcp_compute_http_health_check_facts:
redirect: community.general.gcp_compute_http_health_check_facts
gcp_compute_https_health_check_facts:
redirect: community.general.gcp_compute_https_health_check_facts
gcp_compute_image_facts:
redirect: community.general.gcp_compute_image_facts
gcp_compute_instance_facts:
redirect: community.general.gcp_compute_instance_facts
gcp_compute_instance_group_facts:
redirect: community.general.gcp_compute_instance_group_facts
gcp_compute_instance_group_manager_facts:
redirect: community.general.gcp_compute_instance_group_manager_facts
gcp_compute_instance_template_facts:
redirect: community.general.gcp_compute_instance_template_facts
gcp_compute_interconnect_attachment_facts:
redirect: community.general.gcp_compute_interconnect_attachment_facts
gcp_compute_network_facts:
redirect: community.general.gcp_compute_network_facts
gcp_compute_region_disk_facts:
redirect: community.general.gcp_compute_region_disk_facts
gcp_compute_route_facts:
redirect: community.general.gcp_compute_route_facts
gcp_compute_router_facts:
redirect: community.general.gcp_compute_router_facts
gcp_compute_ssl_certificate_facts:
redirect: community.general.gcp_compute_ssl_certificate_facts
gcp_compute_ssl_policy_facts:
redirect: community.general.gcp_compute_ssl_policy_facts
gcp_compute_subnetwork_facts:
redirect: community.general.gcp_compute_subnetwork_facts
gcp_compute_target_http_proxy_facts:
redirect: community.general.gcp_compute_target_http_proxy_facts
gcp_compute_target_https_proxy_facts:
redirect: community.general.gcp_compute_target_https_proxy_facts
gcp_compute_target_pool_facts:
redirect: community.general.gcp_compute_target_pool_facts
gcp_compute_target_ssl_proxy_facts:
redirect: community.general.gcp_compute_target_ssl_proxy_facts
gcp_compute_target_tcp_proxy_facts:
redirect: community.general.gcp_compute_target_tcp_proxy_facts
gcp_compute_target_vpn_gateway_facts:
redirect: community.general.gcp_compute_target_vpn_gateway_facts
gcp_compute_url_map_facts:
redirect: community.general.gcp_compute_url_map_facts
gcp_compute_vpn_tunnel_facts:
redirect: community.general.gcp_compute_vpn_tunnel_facts
gcp_container_cluster_facts:
redirect: community.general.gcp_container_cluster_facts
gcp_container_node_pool_facts:
redirect: community.general.gcp_container_node_pool_facts
gcp_dns_managed_zone_facts:
redirect: community.general.gcp_dns_managed_zone_facts
gcp_dns_resource_record_set_facts:
redirect: community.general.gcp_dns_resource_record_set_facts
gcp_forwarding_rule:
redirect: community.general.gcp_forwarding_rule
gcp_healthcheck:
redirect: community.general.gcp_healthcheck
gcp_iam_role_facts:
redirect: community.general.gcp_iam_role_facts
gcp_iam_service_account_facts:
redirect: community.general.gcp_iam_service_account_facts
gcp_pubsub_subscription_facts:
redirect: community.general.gcp_pubsub_subscription_facts
gcp_pubsub_topic_facts:
redirect: community.general.gcp_pubsub_topic_facts
gcp_redis_instance_facts:
redirect: community.general.gcp_redis_instance_facts
gcp_resourcemanager_project_facts:
redirect: community.general.gcp_resourcemanager_project_facts
gcp_sourcerepo_repository_facts:
redirect: community.general.gcp_sourcerepo_repository_facts
gcp_spanner_database_facts:
redirect: community.general.gcp_spanner_database_facts
gcp_spanner_instance_facts:
redirect: community.general.gcp_spanner_instance_facts
gcp_sql_database_facts:
redirect: community.general.gcp_sql_database_facts
gcp_sql_instance_facts:
redirect: community.general.gcp_sql_instance_facts
gcp_sql_user_facts:
redirect: community.general.gcp_sql_user_facts
gcp_target_proxy:
redirect: community.general.gcp_target_proxy
gcp_tpu_node_facts:
redirect: community.general.gcp_tpu_node_facts
gcp_url_map:
redirect: community.general.gcp_url_map
gcpubsub_facts:
redirect: community.general.gcpubsub_facts
gcspanner:
redirect: community.general.gcspanner
gc_storage:
redirect: community.general.gc_storage
gce_eip:
redirect: community.general.gce_eip
gce_img:
redirect: community.general.gce_img
gce_instance_template:
redirect: community.general.gce_instance_template
gce_labels:
redirect: community.general.gce_labels
gce_lb:
redirect: community.general.gce_lb
gce_mig:
redirect: community.general.gce_mig
gce_net:
redirect: community.general.gce_net
gce_pd:
redirect: community.general.gce_pd
gce_snapshot:
redirect: community.general.gce_snapshot
gce_tag:
redirect: community.general.gce_tag
gcpubsub:
redirect: community.general.gcpubsub
gcpubsub_info:
redirect: community.general.gcpubsub_info
heroku_collaborator:
redirect: community.general.heroku_collaborator
hwc_ecs_instance:
redirect: community.general.hwc_ecs_instance
hwc_evs_disk:
redirect: community.general.hwc_evs_disk
hwc_network_vpc:
redirect: community.general.hwc_network_vpc
hwc_smn_topic:
redirect: community.general.hwc_smn_topic
hwc_vpc_eip:
redirect: community.general.hwc_vpc_eip
hwc_vpc_peering_connect:
redirect: community.general.hwc_vpc_peering_connect
hwc_vpc_port:
redirect: community.general.hwc_vpc_port
hwc_vpc_private_ip:
redirect: community.general.hwc_vpc_private_ip
hwc_vpc_route:
redirect: community.general.hwc_vpc_route
hwc_vpc_security_group:
redirect: community.general.hwc_vpc_security_group
hwc_vpc_security_group_rule:
redirect: community.general.hwc_vpc_security_group_rule
hwc_vpc_subnet:
redirect: community.general.hwc_vpc_subnet
kubevirt_cdi_upload:
redirect: community.general.kubevirt_cdi_upload
kubevirt_preset:
redirect: community.general.kubevirt_preset
kubevirt_pvc:
redirect: community.general.kubevirt_pvc
kubevirt_rs:
redirect: community.general.kubevirt_rs
kubevirt_template:
redirect: community.general.kubevirt_template
kubevirt_vm:
redirect: community.general.kubevirt_vm
linode:
redirect: community.general.linode
linode_v4:
redirect: community.general.linode_v4
lxc_container:
redirect: community.general.lxc_container
lxd_container:
redirect: community.general.lxd_container
lxd_profile:
redirect: community.general.lxd_profile
memset_memstore_facts:
redirect: community.general.memset_memstore_facts
memset_server_facts:
redirect: community.general.memset_server_facts
memset_dns_reload:
redirect: community.general.memset_dns_reload
memset_memstore_info:
redirect: community.general.memset_memstore_info
memset_server_info:
redirect: community.general.memset_server_info
memset_zone:
redirect: community.general.memset_zone
memset_zone_domain:
redirect: community.general.memset_zone_domain
memset_zone_record:
redirect: community.general.memset_zone_record
cloud_init_data_facts:
redirect: community.general.cloud_init_data_facts
helm:
redirect: community.general.helm
ovirt:
redirect: community.general.ovirt
proxmox:
redirect: community.general.proxmox
proxmox_kvm:
redirect: community.general.proxmox_kvm
proxmox_template:
redirect: community.general.proxmox_template
rhevm:
redirect: community.general.rhevm
serverless:
redirect: community.general.serverless
terraform:
redirect: community.general.terraform
virt:
redirect: community.general.virt
virt_net:
redirect: community.general.virt_net
virt_pool:
redirect: community.general.virt_pool
xenserver_facts:
redirect: community.general.xenserver_facts
oneandone_firewall_policy:
redirect: community.general.oneandone_firewall_policy
oneandone_load_balancer:
redirect: community.general.oneandone_load_balancer
oneandone_monitoring_policy:
redirect: community.general.oneandone_monitoring_policy
oneandone_private_network:
redirect: community.general.oneandone_private_network
oneandone_public_ip:
redirect: community.general.oneandone_public_ip
oneandone_server:
redirect: community.general.oneandone_server
online_server_facts:
redirect: community.general.online_server_facts
online_user_facts:
redirect: community.general.online_user_facts
online_server_info:
redirect: community.general.online_server_info
online_user_info:
redirect: community.general.online_user_info
one_image_facts:
redirect: community.general.one_image_facts
one_host:
redirect: community.general.one_host
one_image:
redirect: community.general.one_image
one_image_info:
redirect: community.general.one_image_info
one_service:
redirect: community.general.one_service
one_vm:
redirect: community.general.one_vm
os_flavor_facts:
redirect: community.general.os_flavor_facts
os_image_facts:
redirect: community.general.os_image_facts
os_keystone_domain_facts:
redirect: community.general.os_keystone_domain_facts
os_networks_facts:
redirect: community.general.os_networks_facts
os_port_facts:
redirect: community.general.os_port_facts
os_project_facts:
redirect: community.general.os_project_facts
os_server_facts:
redirect: community.general.os_server_facts
os_subnets_facts:
redirect: community.general.os_subnets_facts
os_user_facts:
redirect: community.general.os_user_facts
oci_vcn:
redirect: community.general.oci_vcn
ovh_ip_failover:
redirect: community.general.ovh_ip_failover
ovh_ip_loadbalancing_backend:
redirect: community.general.ovh_ip_loadbalancing_backend
ovh_monthly_billing:
redirect: community.general.ovh_monthly_billing
ovirt_affinity_label_facts:
redirect: community.general.ovirt_affinity_label_facts
ovirt_api_facts:
redirect: community.general.ovirt_api_facts
ovirt_cluster_facts:
redirect: community.general.ovirt_cluster_facts
ovirt_datacenter_facts:
redirect: community.general.ovirt_datacenter_facts
ovirt_disk_facts:
redirect: community.general.ovirt_disk_facts
ovirt_event_facts:
redirect: community.general.ovirt_event_facts
ovirt_external_provider_facts:
redirect: community.general.ovirt_external_provider_facts
ovirt_group_facts:
redirect: community.general.ovirt_group_facts
ovirt_host_facts:
redirect: community.general.ovirt_host_facts
ovirt_host_storage_facts:
redirect: community.general.ovirt_host_storage_facts
ovirt_network_facts:
redirect: community.general.ovirt_network_facts
ovirt_nic_facts:
redirect: community.general.ovirt_nic_facts
ovirt_permission_facts:
redirect: community.general.ovirt_permission_facts
ovirt_quota_facts:
redirect: community.general.ovirt_quota_facts
ovirt_scheduling_policy_facts:
redirect: community.general.ovirt_scheduling_policy_facts
ovirt_snapshot_facts:
redirect: community.general.ovirt_snapshot_facts
ovirt_storage_domain_facts:
redirect: community.general.ovirt_storage_domain_facts
ovirt_storage_template_facts:
redirect: community.general.ovirt_storage_template_facts
ovirt_storage_vm_facts:
redirect: community.general.ovirt_storage_vm_facts
ovirt_tag_facts:
redirect: community.general.ovirt_tag_facts
ovirt_template_facts:
redirect: community.general.ovirt_template_facts
ovirt_user_facts:
redirect: community.general.ovirt_user_facts
ovirt_vm_facts:
redirect: community.general.ovirt_vm_facts
ovirt_vmpool_facts:
redirect: community.general.ovirt_vmpool_facts
packet_device:
redirect: community.general.packet_device
packet_ip_subnet:
redirect: community.general.packet_ip_subnet
packet_project:
redirect: community.general.packet_project
packet_sshkey:
redirect: community.general.packet_sshkey
packet_volume:
redirect: community.general.packet_volume
packet_volume_attachment:
redirect: community.general.packet_volume_attachment
profitbricks:
redirect: community.general.profitbricks
profitbricks_datacenter:
redirect: community.general.profitbricks_datacenter
profitbricks_nic:
redirect: community.general.profitbricks_nic
profitbricks_volume:
redirect: community.general.profitbricks_volume
profitbricks_volume_attachments:
redirect: community.general.profitbricks_volume_attachments
pubnub_blocks:
redirect: community.general.pubnub_blocks
rax:
redirect: community.general.rax
rax_cbs:
redirect: community.general.rax_cbs
rax_cbs_attachments:
redirect: community.general.rax_cbs_attachments
rax_cdb:
redirect: community.general.rax_cdb
rax_cdb_database:
redirect: community.general.rax_cdb_database
rax_cdb_user:
redirect: community.general.rax_cdb_user
rax_clb:
redirect: community.general.rax_clb
rax_clb_nodes:
redirect: community.general.rax_clb_nodes
rax_clb_ssl:
redirect: community.general.rax_clb_ssl
rax_dns:
redirect: community.general.rax_dns
rax_dns_record:
redirect: community.general.rax_dns_record
rax_facts:
redirect: community.general.rax_facts
rax_files:
redirect: community.general.rax_files
rax_files_objects:
redirect: community.general.rax_files_objects
rax_identity:
redirect: community.general.rax_identity
rax_keypair:
redirect: community.general.rax_keypair
rax_meta:
redirect: community.general.rax_meta
rax_mon_alarm:
redirect: community.general.rax_mon_alarm
rax_mon_check:
redirect: community.general.rax_mon_check
rax_mon_entity:
redirect: community.general.rax_mon_entity
rax_mon_notification:
redirect: community.general.rax_mon_notification
rax_mon_notification_plan:
redirect: community.general.rax_mon_notification_plan
rax_network:
redirect: community.general.rax_network
rax_queue:
redirect: community.general.rax_queue
rax_scaling_group:
redirect: community.general.rax_scaling_group
rax_scaling_policy:
redirect: community.general.rax_scaling_policy
scaleway_image_facts:
redirect: community.general.scaleway_image_facts
scaleway_ip_facts:
redirect: community.general.scaleway_ip_facts
scaleway_organization_facts:
redirect: community.general.scaleway_organization_facts
scaleway_security_group_facts:
redirect: community.general.scaleway_security_group_facts
scaleway_server_facts:
redirect: community.general.scaleway_server_facts
scaleway_snapshot_facts:
redirect: community.general.scaleway_snapshot_facts
scaleway_volume_facts:
redirect: community.general.scaleway_volume_facts
scaleway_compute:
redirect: community.general.scaleway_compute
scaleway_image_info:
redirect: community.general.scaleway_image_info
scaleway_ip:
redirect: community.general.scaleway_ip
scaleway_ip_info:
redirect: community.general.scaleway_ip_info
scaleway_lb:
redirect: community.general.scaleway_lb
scaleway_organization_info:
redirect: community.general.scaleway_organization_info
scaleway_security_group:
redirect: community.general.scaleway_security_group
scaleway_security_group_info:
redirect: community.general.scaleway_security_group_info
scaleway_security_group_rule:
redirect: community.general.scaleway_security_group_rule
scaleway_server_info:
redirect: community.general.scaleway_server_info
scaleway_snapshot_info:
redirect: community.general.scaleway_snapshot_info
scaleway_sshkey:
redirect: community.general.scaleway_sshkey
scaleway_user_data:
redirect: community.general.scaleway_user_data
scaleway_volume:
redirect: community.general.scaleway_volume
scaleway_volume_info:
redirect: community.general.scaleway_volume_info
smartos_image_facts:
redirect: community.general.smartos_image_facts
imgadm:
redirect: community.general.imgadm
nictagadm:
redirect: community.general.nictagadm
smartos_image_info:
redirect: community.general.smartos_image_info
vmadm:
redirect: community.general.vmadm
sl_vm:
redirect: community.general.sl_vm
spotinst_aws_elastigroup:
redirect: community.general.spotinst_aws_elastigroup
udm_dns_record:
redirect: community.general.udm_dns_record
udm_dns_zone:
redirect: community.general.udm_dns_zone
udm_group:
redirect: community.general.udm_group
udm_share:
redirect: community.general.udm_share
udm_user:
redirect: community.general.udm_user
vr_account_facts:
redirect: community.general.vr_account_facts
vr_dns_domain:
redirect: community.general.vr_dns_domain
vr_dns_record:
redirect: community.general.vr_dns_record
vr_firewall_group:
redirect: community.general.vr_firewall_group
vr_firewall_rule:
redirect: community.general.vr_firewall_rule
vr_server:
redirect: community.general.vr_server
vr_ssh_key:
redirect: community.general.vr_ssh_key
vr_startup_script:
redirect: community.general.vr_startup_script
vr_user:
redirect: community.general.vr_user
vultr_account_facts:
redirect: community.general.vultr_account_facts
vultr_block_storage_facts:
redirect: community.general.vultr_block_storage_facts
vultr_dns_domain_facts:
redirect: community.general.vultr_dns_domain_facts
vultr_firewall_group_facts:
redirect: community.general.vultr_firewall_group_facts
vultr_network_facts:
redirect: community.general.vultr_network_facts
vultr_os_facts:
redirect: community.general.vultr_os_facts
vultr_plan_facts:
redirect: community.general.vultr_plan_facts
vultr_region_facts:
redirect: community.general.vultr_region_facts
vultr_server_facts:
redirect: community.general.vultr_server_facts
vultr_ssh_key_facts:
redirect: community.general.vultr_ssh_key_facts
vultr_startup_script_facts:
redirect: community.general.vultr_startup_script_facts
vultr_user_facts:
redirect: community.general.vultr_user_facts
vultr_account_info:
redirect: community.general.vultr_account_info
vultr_block_storage:
redirect: community.general.vultr_block_storage
vultr_block_storage_info:
redirect: community.general.vultr_block_storage_info
vultr_dns_domain:
redirect: community.general.vultr_dns_domain
vultr_dns_domain_info:
redirect: community.general.vultr_dns_domain_info
vultr_dns_record:
redirect: community.general.vultr_dns_record
vultr_firewall_group:
redirect: community.general.vultr_firewall_group
vultr_firewall_group_info:
redirect: community.general.vultr_firewall_group_info
vultr_firewall_rule:
redirect: community.general.vultr_firewall_rule
vultr_network:
redirect: community.general.vultr_network
vultr_network_info:
redirect: community.general.vultr_network_info
vultr_os_info:
redirect: community.general.vultr_os_info
vultr_plan_info:
redirect: community.general.vultr_plan_info
vultr_region_info:
redirect: community.general.vultr_region_info
vultr_server:
redirect: community.general.vultr_server
vultr_server_info:
redirect: community.general.vultr_server_info
vultr_ssh_key:
redirect: community.general.vultr_ssh_key
vultr_ssh_key_info:
redirect: community.general.vultr_ssh_key_info
vultr_startup_script:
redirect: community.general.vultr_startup_script
vultr_startup_script_info:
redirect: community.general.vultr_startup_script_info
vultr_user:
redirect: community.general.vultr_user
vultr_user_info:
redirect: community.general.vultr_user_info
webfaction_app:
redirect: community.general.webfaction_app
webfaction_db:
redirect: community.general.webfaction_db
webfaction_domain:
redirect: community.general.webfaction_domain
webfaction_mailbox:
redirect: community.general.webfaction_mailbox
webfaction_site:
redirect: community.general.webfaction_site
xenserver_guest_facts:
redirect: community.general.xenserver_guest_facts
xenserver_guest:
redirect: community.general.xenserver_guest
xenserver_guest_info:
redirect: community.general.xenserver_guest_info
xenserver_guest_powerstate:
redirect: community.general.xenserver_guest_powerstate
consul:
redirect: community.general.consul
consul_acl:
redirect: community.general.consul_acl
consul_kv:
redirect: community.general.consul_kv
consul_session:
redirect: community.general.consul_session
etcd3:
redirect: community.general.etcd3
pacemaker_cluster:
redirect: community.general.pacemaker_cluster
znode:
redirect: community.general.znode
aerospike_migrations:
redirect: community.general.aerospike_migrations
influxdb_database:
redirect: community.general.influxdb_database
influxdb_query:
redirect: community.general.influxdb_query
influxdb_retention_policy:
redirect: community.general.influxdb_retention_policy
influxdb_user:
redirect: community.general.influxdb_user
influxdb_write:
redirect: community.general.influxdb_write
elasticsearch_plugin:
redirect: community.general.elasticsearch_plugin
kibana_plugin:
redirect: community.general.kibana_plugin
redis:
redirect: community.general.redis
riak:
redirect: community.general.riak
mssql_db:
redirect: community.general.mssql_db
mysql_db:
redirect: community.general.mysql_db
mysql_info:
redirect: community.general.mysql_info
mysql_query:
redirect: community.general.mysql_query
mysql_replication:
redirect: community.general.mysql_replication
mysql_user:
redirect: community.general.mysql_user
mysql_variables:
redirect: community.general.mysql_variables
postgresql_copy:
redirect: community.general.postgresql_copy
postgresql_db:
redirect: community.general.postgresql_db
postgresql_ext:
redirect: community.general.postgresql_ext
postgresql_idx:
redirect: community.general.postgresql_idx
postgresql_info:
redirect: community.general.postgresql_info
postgresql_lang:
redirect: community.general.postgresql_lang
postgresql_membership:
redirect: community.general.postgresql_membership
postgresql_owner:
redirect: community.general.postgresql_owner
postgresql_pg_hba:
redirect: community.general.postgresql_pg_hba
postgresql_ping:
redirect: community.general.postgresql_ping
postgresql_privs:
redirect: community.general.postgresql_privs
postgresql_publication:
redirect: community.general.postgresql_publication
postgresql_query:
redirect: community.general.postgresql_query
postgresql_schema:
redirect: community.general.postgresql_schema
postgresql_sequence:
redirect: community.general.postgresql_sequence
postgresql_set:
redirect: community.general.postgresql_set
postgresql_slot:
redirect: community.general.postgresql_slot
postgresql_subscription:
redirect: community.general.postgresql_subscription
postgresql_table:
redirect: community.general.postgresql_table
postgresql_tablespace:
redirect: community.general.postgresql_tablespace
postgresql_user:
redirect: community.general.postgresql_user
postgresql_user_obj_stat_info:
redirect: community.general.postgresql_user_obj_stat_info
proxysql_backend_servers:
redirect: community.general.proxysql_backend_servers
proxysql_global_variables:
redirect: community.general.proxysql_global_variables
proxysql_manage_config:
redirect: community.general.proxysql_manage_config
proxysql_mysql_users:
redirect: community.general.proxysql_mysql_users
proxysql_query_rules:
redirect: community.general.proxysql_query_rules
proxysql_replication_hostgroups:
redirect: community.general.proxysql_replication_hostgroups
proxysql_scheduler:
redirect: community.general.proxysql_scheduler
vertica_facts:
redirect: community.general.vertica_facts
vertica_configuration:
redirect: community.general.vertica_configuration
vertica_info:
redirect: community.general.vertica_info
vertica_role:
redirect: community.general.vertica_role
vertica_schema:
redirect: community.general.vertica_schema
vertica_user:
redirect: community.general.vertica_user
archive:
redirect: community.general.archive
ini_file:
redirect: community.general.ini_file
iso_extract:
redirect: community.general.iso_extract
patch:
redirect: community.general.patch
read_csv:
redirect: community.general.read_csv
xattr:
redirect: community.general.xattr
xml:
redirect: community.general.xml
onepassword_facts:
redirect: community.general.onepassword_facts
ipa_config:
redirect: community.general.ipa_config
ipa_dnsrecord:
redirect: community.general.ipa_dnsrecord
ipa_dnszone:
redirect: community.general.ipa_dnszone
ipa_group:
redirect: community.general.ipa_group
ipa_hbacrule:
redirect: community.general.ipa_hbacrule
ipa_host:
redirect: community.general.ipa_host
ipa_hostgroup:
redirect: community.general.ipa_hostgroup
ipa_role:
redirect: community.general.ipa_role
ipa_service:
redirect: community.general.ipa_service
ipa_subca:
redirect: community.general.ipa_subca
ipa_sudocmd:
redirect: community.general.ipa_sudocmd
ipa_sudocmdgroup:
redirect: community.general.ipa_sudocmdgroup
ipa_sudorule:
redirect: community.general.ipa_sudorule
ipa_user:
redirect: community.general.ipa_user
ipa_vault:
redirect: community.general.ipa_vault
keycloak_client:
redirect: community.general.keycloak_client
keycloak_clienttemplate:
redirect: community.general.keycloak_clienttemplate
keycloak_group:
redirect: community.general.keycloak_group
onepassword_info:
redirect: community.general.onepassword_info
opendj_backendprop:
redirect: community.general.opendj_backendprop
rabbitmq_binding:
redirect: community.general.rabbitmq_binding
rabbitmq_exchange:
redirect: community.general.rabbitmq_exchange
rabbitmq_global_parameter:
redirect: community.general.rabbitmq_global_parameter
rabbitmq_parameter:
redirect: community.general.rabbitmq_parameter
rabbitmq_plugin:
redirect: community.general.rabbitmq_plugin
rabbitmq_policy:
redirect: community.general.rabbitmq_policy
rabbitmq_queue:
redirect: community.general.rabbitmq_queue
rabbitmq_user:
redirect: community.general.rabbitmq_user
rabbitmq_vhost:
redirect: community.general.rabbitmq_vhost
rabbitmq_vhost_limits:
redirect: community.general.rabbitmq_vhost_limits
airbrake_deployment:
redirect: community.general.airbrake_deployment
bigpanda:
redirect: community.general.bigpanda
circonus_annotation:
redirect: community.general.circonus_annotation
datadog_event:
redirect: community.general.datadog_event
datadog_monitor:
redirect: community.general.datadog_monitor
honeybadger_deployment:
redirect: community.general.honeybadger_deployment
icinga2_feature:
redirect: community.general.icinga2_feature
icinga2_host:
redirect: community.general.icinga2_host
librato_annotation:
redirect: community.general.librato_annotation
logentries:
redirect: community.general.logentries
logicmonitor:
redirect: community.general.logicmonitor
logicmonitor_facts:
redirect: community.general.logicmonitor_facts
logstash_plugin:
redirect: community.general.logstash_plugin
monit:
redirect: community.general.monit
nagios:
redirect: community.general.nagios
newrelic_deployment:
redirect: community.general.newrelic_deployment
pagerduty:
redirect: community.general.pagerduty
pagerduty_alert:
redirect: community.general.pagerduty_alert
pingdom:
redirect: community.general.pingdom
rollbar_deployment:
redirect: community.general.rollbar_deployment
sensu_check:
redirect: community.general.sensu_check
sensu_client:
redirect: community.general.sensu_client
sensu_handler:
redirect: community.general.sensu_handler
sensu_silence:
redirect: community.general.sensu_silence
sensu_subscription:
redirect: community.general.sensu_subscription
spectrum_device:
redirect: community.general.spectrum_device
stackdriver:
redirect: community.general.stackdriver
statusio_maintenance:
redirect: community.general.statusio_maintenance
uptimerobot:
redirect: community.general.uptimerobot
zabbix_group_facts:
redirect: community.general.zabbix_group_facts
zabbix_host_facts:
redirect: community.general.zabbix_host_facts
zabbix_action:
redirect: community.general.zabbix_action
zabbix_group:
redirect: community.general.zabbix_group
zabbix_group_info:
redirect: community.general.zabbix_group_info
zabbix_host:
redirect: community.general.zabbix_host
zabbix_host_events_info:
redirect: community.general.zabbix_host_events_info
zabbix_host_info:
redirect: community.general.zabbix_host_info
zabbix_hostmacro:
redirect: community.general.zabbix_hostmacro
zabbix_maintenance:
redirect: community.general.zabbix_maintenance
zabbix_map:
redirect: community.general.zabbix_map
zabbix_mediatype:
redirect: community.general.zabbix_mediatype
zabbix_proxy:
redirect: community.general.zabbix_proxy
zabbix_screen:
redirect: community.general.zabbix_screen
zabbix_service:
redirect: community.general.zabbix_service
zabbix_template:
redirect: community.general.zabbix_template
zabbix_template_info:
redirect: community.general.zabbix_template_info
zabbix_user:
redirect: community.general.zabbix_user
zabbix_user_info:
redirect: community.general.zabbix_user_info
zabbix_valuemap:
redirect: community.general.zabbix_valuemap
cloudflare_dns:
redirect: community.general.cloudflare_dns
dnsimple:
redirect: community.general.dnsimple
dnsmadeeasy:
redirect: community.general.dnsmadeeasy
exo_dns_domain:
redirect: community.general.exo_dns_domain
exo_dns_record:
redirect: community.general.exo_dns_record
haproxy:
redirect: community.general.haproxy
hetzner_failover_ip:
redirect: community.general.hetzner_failover_ip
hetzner_failover_ip_info:
redirect: community.general.hetzner_failover_ip_info
hetzner_firewall:
redirect: community.general.hetzner_firewall
hetzner_firewall_info:
redirect: community.general.hetzner_firewall_info
infinity:
redirect: community.general.infinity
ip_netns:
redirect: community.general.ip_netns
ipify_facts:
redirect: community.general.ipify_facts
ipinfoio_facts:
redirect: community.general.ipinfoio_facts
ipwcli_dns:
redirect: community.general.ipwcli_dns
ldap_attr:
redirect: community.general.ldap_attr
ldap_attrs:
redirect: community.general.ldap_attrs
ldap_entry:
redirect: community.general.ldap_entry
ldap_passwd:
redirect: community.general.ldap_passwd
lldp:
redirect: community.general.lldp
netcup_dns:
redirect: community.general.netcup_dns
nios_a_record:
redirect: community.general.nios_a_record
nios_aaaa_record:
redirect: community.general.nios_aaaa_record
nios_cname_record:
redirect: community.general.nios_cname_record
nios_dns_view:
redirect: community.general.nios_dns_view
nios_fixed_address:
redirect: community.general.nios_fixed_address
nios_host_record:
redirect: community.general.nios_host_record
nios_member:
redirect: community.general.nios_member
nios_mx_record:
redirect: community.general.nios_mx_record
nios_naptr_record:
redirect: community.general.nios_naptr_record
nios_network:
redirect: community.general.nios_network
nios_network_view:
redirect: community.general.nios_network_view
nios_nsgroup:
redirect: community.general.nios_nsgroup
nios_ptr_record:
redirect: community.general.nios_ptr_record
nios_srv_record:
redirect: community.general.nios_srv_record
nios_txt_record:
redirect: community.general.nios_txt_record
nios_zone:
redirect: community.general.nios_zone
nmcli:
redirect: community.general.nmcli
nsupdate:
redirect: community.general.nsupdate
omapi_host:
redirect: community.general.omapi_host
snmp_facts:
redirect: community.general.snmp_facts
a10_server:
redirect: community.general.a10_server
a10_server_axapi3:
redirect: community.general.a10_server_axapi3
a10_service_group:
redirect: community.general.a10_service_group
a10_virtual_server:
redirect: community.general.a10_virtual_server
aci_intf_policy_fc:
redirect: community.general.aci_intf_policy_fc
aci_intf_policy_l2:
redirect: community.general.aci_intf_policy_l2
aci_intf_policy_lldp:
redirect: community.general.aci_intf_policy_lldp
aci_intf_policy_mcp:
redirect: community.general.aci_intf_policy_mcp
aci_intf_policy_port_channel:
redirect: community.general.aci_intf_policy_port_channel
aci_intf_policy_port_security:
redirect: community.general.aci_intf_policy_port_security
mso_schema_template_external_epg_contract:
redirect: community.general.mso_schema_template_external_epg_contract
mso_schema_template_external_epg_subnet:
redirect: community.general.mso_schema_template_external_epg_subnet
aireos_command:
redirect: community.general.aireos_command
aireos_config:
redirect: community.general.aireos_config
apconos_command:
redirect: community.general.apconos_command
aruba_command:
redirect: community.general.aruba_command
aruba_config:
redirect: community.general.aruba_config
avi_actiongroupconfig:
redirect: community.general.avi_actiongroupconfig
avi_alertconfig:
redirect: community.general.avi_alertconfig
avi_alertemailconfig:
redirect: community.general.avi_alertemailconfig
avi_alertscriptconfig:
redirect: community.general.avi_alertscriptconfig
avi_alertsyslogconfig:
redirect: community.general.avi_alertsyslogconfig
avi_analyticsprofile:
redirect: community.general.avi_analyticsprofile
avi_api_session:
redirect: community.general.avi_api_session
avi_api_version:
redirect: community.general.avi_api_version
avi_applicationpersistenceprofile:
redirect: community.general.avi_applicationpersistenceprofile
avi_applicationprofile:
redirect: community.general.avi_applicationprofile
avi_authprofile:
redirect: community.general.avi_authprofile
avi_autoscalelaunchconfig:
redirect: community.general.avi_autoscalelaunchconfig
avi_backup:
redirect: community.general.avi_backup
avi_backupconfiguration:
redirect: community.general.avi_backupconfiguration
avi_certificatemanagementprofile:
redirect: community.general.avi_certificatemanagementprofile
avi_cloud:
redirect: community.general.avi_cloud
avi_cloudconnectoruser:
redirect: community.general.avi_cloudconnectoruser
avi_cloudproperties:
redirect: community.general.avi_cloudproperties
avi_cluster:
redirect: community.general.avi_cluster
avi_clusterclouddetails:
redirect: community.general.avi_clusterclouddetails
avi_controllerproperties:
redirect: community.general.avi_controllerproperties
avi_customipamdnsprofile:
redirect: community.general.avi_customipamdnsprofile
avi_dnspolicy:
redirect: community.general.avi_dnspolicy
avi_errorpagebody:
redirect: community.general.avi_errorpagebody
avi_errorpageprofile:
redirect: community.general.avi_errorpageprofile
avi_gslb:
redirect: community.general.avi_gslb
avi_gslbgeodbprofile:
redirect: community.general.avi_gslbgeodbprofile
avi_gslbservice:
redirect: community.general.avi_gslbservice
avi_gslbservice_patch_member:
redirect: community.general.avi_gslbservice_patch_member
avi_hardwaresecuritymodulegroup:
redirect: community.general.avi_hardwaresecuritymodulegroup
avi_healthmonitor:
redirect: community.general.avi_healthmonitor
avi_httppolicyset:
redirect: community.general.avi_httppolicyset
avi_ipaddrgroup:
redirect: community.general.avi_ipaddrgroup
avi_ipamdnsproviderprofile:
redirect: community.general.avi_ipamdnsproviderprofile
avi_l4policyset:
redirect: community.general.avi_l4policyset
avi_microservicegroup:
redirect: community.general.avi_microservicegroup
avi_network:
redirect: community.general.avi_network
avi_networkprofile:
redirect: community.general.avi_networkprofile
avi_networksecuritypolicy:
redirect: community.general.avi_networksecuritypolicy
avi_pkiprofile:
redirect: community.general.avi_pkiprofile
avi_pool:
redirect: community.general.avi_pool
avi_poolgroup:
redirect: community.general.avi_poolgroup
avi_poolgroupdeploymentpolicy:
redirect: community.general.avi_poolgroupdeploymentpolicy
avi_prioritylabels:
redirect: community.general.avi_prioritylabels
avi_role:
redirect: community.general.avi_role
avi_scheduler:
redirect: community.general.avi_scheduler
avi_seproperties:
redirect: community.general.avi_seproperties
avi_serverautoscalepolicy:
redirect: community.general.avi_serverautoscalepolicy
avi_serviceengine:
redirect: community.general.avi_serviceengine
avi_serviceenginegroup:
redirect: community.general.avi_serviceenginegroup
avi_snmptrapprofile:
redirect: community.general.avi_snmptrapprofile
avi_sslkeyandcertificate:
redirect: community.general.avi_sslkeyandcertificate
avi_sslprofile:
redirect: community.general.avi_sslprofile
avi_stringgroup:
redirect: community.general.avi_stringgroup
avi_systemconfiguration:
redirect: community.general.avi_systemconfiguration
avi_tenant:
redirect: community.general.avi_tenant
avi_trafficcloneprofile:
redirect: community.general.avi_trafficcloneprofile
avi_user:
redirect: community.general.avi_user
avi_useraccount:
redirect: community.general.avi_useraccount
avi_useraccountprofile:
redirect: community.general.avi_useraccountprofile
avi_virtualservice:
redirect: community.general.avi_virtualservice
avi_vrfcontext:
redirect: community.general.avi_vrfcontext
avi_vsdatascriptset:
redirect: community.general.avi_vsdatascriptset
avi_vsvip:
redirect: community.general.avi_vsvip
avi_webhook:
redirect: community.general.avi_webhook
bcf_switch:
redirect: community.general.bcf_switch
bigmon_chain:
redirect: community.general.bigmon_chain
bigmon_policy:
redirect: community.general.bigmon_policy
checkpoint_access_layer_facts:
redirect: community.general.checkpoint_access_layer_facts
checkpoint_access_rule:
redirect: community.general.checkpoint_access_rule
checkpoint_access_rule_facts:
redirect: community.general.checkpoint_access_rule_facts
checkpoint_host:
redirect: community.general.checkpoint_host
checkpoint_host_facts:
redirect: community.general.checkpoint_host_facts
checkpoint_object_facts:
redirect: community.general.checkpoint_object_facts
checkpoint_run_script:
redirect: community.general.checkpoint_run_script
checkpoint_session:
redirect: community.general.checkpoint_session
checkpoint_task_facts:
redirect: community.general.checkpoint_task_facts
cp_publish:
redirect: community.general.cp_publish
ce_aaa_server:
redirect: community.general.ce_aaa_server
ce_aaa_server_host:
redirect: community.general.ce_aaa_server_host
ce_acl:
redirect: community.general.ce_acl
ce_acl_advance:
redirect: community.general.ce_acl_advance
ce_acl_interface:
redirect: community.general.ce_acl_interface
ce_bfd_global:
redirect: community.general.ce_bfd_global
ce_bfd_session:
redirect: community.general.ce_bfd_session
ce_bfd_view:
redirect: community.general.ce_bfd_view
ce_bgp:
redirect: community.general.ce_bgp
ce_bgp_af:
redirect: community.general.ce_bgp_af
ce_bgp_neighbor:
redirect: community.general.ce_bgp_neighbor
ce_bgp_neighbor_af:
redirect: community.general.ce_bgp_neighbor_af
ce_command:
redirect: community.general.ce_command
ce_config:
redirect: community.general.ce_config
ce_dldp:
redirect: community.general.ce_dldp
ce_dldp_interface:
redirect: community.general.ce_dldp_interface
ce_eth_trunk:
redirect: community.general.ce_eth_trunk
ce_evpn_bd_vni:
redirect: community.general.ce_evpn_bd_vni
ce_evpn_bgp:
redirect: community.general.ce_evpn_bgp
ce_evpn_bgp_rr:
redirect: community.general.ce_evpn_bgp_rr
ce_evpn_global:
redirect: community.general.ce_evpn_global
ce_facts:
redirect: community.general.ce_facts
ce_file_copy:
redirect: community.general.ce_file_copy
ce_info_center_debug:
redirect: community.general.ce_info_center_debug
ce_info_center_global:
redirect: community.general.ce_info_center_global
ce_info_center_log:
redirect: community.general.ce_info_center_log
ce_info_center_trap:
redirect: community.general.ce_info_center_trap
ce_interface:
redirect: community.general.ce_interface
ce_interface_ospf:
redirect: community.general.ce_interface_ospf
ce_ip_interface:
redirect: community.general.ce_ip_interface
ce_is_is_instance:
redirect: community.general.ce_is_is_instance
ce_is_is_interface:
redirect: community.general.ce_is_is_interface
ce_is_is_view:
redirect: community.general.ce_is_is_view
ce_lacp:
redirect: community.general.ce_lacp
ce_link_status:
redirect: community.general.ce_link_status
ce_lldp:
redirect: community.general.ce_lldp
ce_lldp_interface:
redirect: community.general.ce_lldp_interface
ce_mdn_interface:
redirect: community.general.ce_mdn_interface
ce_mlag_config:
redirect: community.general.ce_mlag_config
ce_mlag_interface:
redirect: community.general.ce_mlag_interface
ce_mtu:
redirect: community.general.ce_mtu
ce_multicast_global:
redirect: community.general.ce_multicast_global
ce_multicast_igmp_enable:
redirect: community.general.ce_multicast_igmp_enable
ce_netconf:
redirect: community.general.ce_netconf
ce_netstream_aging:
redirect: community.general.ce_netstream_aging
ce_netstream_export:
redirect: community.general.ce_netstream_export
ce_netstream_global:
redirect: community.general.ce_netstream_global
ce_netstream_template:
redirect: community.general.ce_netstream_template
ce_ntp:
redirect: community.general.ce_ntp
ce_ntp_auth:
redirect: community.general.ce_ntp_auth
ce_ospf:
redirect: community.general.ce_ospf
ce_ospf_vrf:
redirect: community.general.ce_ospf_vrf
ce_reboot:
redirect: community.general.ce_reboot
ce_rollback:
redirect: community.general.ce_rollback
ce_sflow:
redirect: community.general.ce_sflow
ce_snmp_community:
redirect: community.general.ce_snmp_community
ce_snmp_contact:
redirect: community.general.ce_snmp_contact
ce_snmp_location:
redirect: community.general.ce_snmp_location
ce_snmp_target_host:
redirect: community.general.ce_snmp_target_host
ce_snmp_traps:
redirect: community.general.ce_snmp_traps
ce_snmp_user:
redirect: community.general.ce_snmp_user
ce_startup:
redirect: community.general.ce_startup
ce_static_route:
redirect: community.general.ce_static_route
ce_static_route_bfd:
redirect: community.general.ce_static_route_bfd
ce_stp:
redirect: community.general.ce_stp
ce_switchport:
redirect: community.general.ce_switchport
ce_vlan:
redirect: community.general.ce_vlan
ce_vrf:
redirect: community.general.ce_vrf
ce_vrf_af:
redirect: community.general.ce_vrf_af
ce_vrf_interface:
redirect: community.general.ce_vrf_interface
ce_vrrp:
redirect: community.general.ce_vrrp
ce_vxlan_arp:
redirect: community.general.ce_vxlan_arp
ce_vxlan_gateway:
redirect: community.general.ce_vxlan_gateway
ce_vxlan_global:
redirect: community.general.ce_vxlan_global
ce_vxlan_tunnel:
redirect: community.general.ce_vxlan_tunnel
ce_vxlan_vap:
redirect: community.general.ce_vxlan_vap
cv_server_provision:
redirect: community.general.cv_server_provision
cnos_backup:
redirect: community.general.cnos_backup
cnos_banner:
redirect: community.general.cnos_banner
cnos_bgp:
redirect: community.general.cnos_bgp
cnos_command:
redirect: community.general.cnos_command
cnos_conditional_command:
redirect: community.general.cnos_conditional_command
cnos_conditional_template:
redirect: community.general.cnos_conditional_template
cnos_config:
redirect: community.general.cnos_config
cnos_factory:
redirect: community.general.cnos_factory
cnos_facts:
redirect: community.general.cnos_facts
cnos_image:
redirect: community.general.cnos_image
cnos_interface:
redirect: community.general.cnos_interface
cnos_l2_interface:
redirect: community.general.cnos_l2_interface
cnos_l3_interface:
redirect: community.general.cnos_l3_interface
cnos_linkagg:
redirect: community.general.cnos_linkagg
cnos_lldp:
redirect: community.general.cnos_lldp
cnos_logging:
redirect: community.general.cnos_logging
cnos_reload:
redirect: community.general.cnos_reload
cnos_rollback:
redirect: community.general.cnos_rollback
cnos_save:
redirect: community.general.cnos_save
cnos_showrun:
redirect: community.general.cnos_showrun
cnos_static_route:
redirect: community.general.cnos_static_route
cnos_system:
redirect: community.general.cnos_system
cnos_template:
redirect: community.general.cnos_template
cnos_user:
redirect: community.general.cnos_user
cnos_vlag:
redirect: community.general.cnos_vlag
cnos_vlan:
redirect: community.general.cnos_vlan
cnos_vrf:
redirect: community.general.cnos_vrf
nclu:
redirect: community.general.nclu
edgeos_command:
redirect: community.general.edgeos_command
edgeos_config:
redirect: community.general.edgeos_config
edgeos_facts:
redirect: community.general.edgeos_facts
edgeswitch_facts:
redirect: community.general.edgeswitch_facts
edgeswitch_vlan:
redirect: community.general.edgeswitch_vlan
enos_command:
redirect: community.general.enos_command
enos_config:
redirect: community.general.enos_config
enos_facts:
redirect: community.general.enos_facts
eric_eccli_command:
redirect: community.general.eric_eccli_command
exos_command:
redirect: community.general.exos_command
exos_config:
redirect: community.general.exos_config
exos_facts:
redirect: community.general.exos_facts
exos_l2_interfaces:
redirect: community.general.exos_l2_interfaces
exos_lldp_global:
redirect: community.general.exos_lldp_global
exos_lldp_interfaces:
redirect: community.general.exos_lldp_interfaces
exos_vlans:
redirect: community.general.exos_vlans
bigip_asm_policy:
redirect: community.general.bigip_asm_policy
bigip_device_facts:
redirect: community.general.bigip_device_facts
bigip_facts:
redirect: community.general.bigip_facts
bigip_gtm_facts:
redirect: community.general.bigip_gtm_facts
bigip_iapplx_package:
redirect: community.general.bigip_iapplx_package
bigip_security_address_list:
redirect: community.general.bigip_security_address_list
bigip_security_port_list:
redirect: community.general.bigip_security_port_list
bigip_traffic_group:
redirect: community.general.bigip_traffic_group
bigiq_device_facts:
redirect: community.general.bigiq_device_facts
faz_device:
redirect: community.general.faz_device
fmgr_device:
redirect: community.general.fmgr_device
fmgr_device_config:
redirect: community.general.fmgr_device_config
fmgr_device_group:
redirect: community.general.fmgr_device_group
fmgr_device_provision_template:
redirect: community.general.fmgr_device_provision_template
fmgr_fwobj_address:
redirect: community.general.fmgr_fwobj_address
fmgr_fwobj_ippool:
redirect: community.general.fmgr_fwobj_ippool
fmgr_fwobj_ippool6:
redirect: community.general.fmgr_fwobj_ippool6
fmgr_fwobj_service:
redirect: community.general.fmgr_fwobj_service
fmgr_fwobj_vip:
redirect: community.general.fmgr_fwobj_vip
fmgr_fwpol_ipv4:
redirect: community.general.fmgr_fwpol_ipv4
fmgr_fwpol_package:
redirect: community.general.fmgr_fwpol_package
fmgr_ha:
redirect: community.general.fmgr_ha
fmgr_provisioning:
redirect: community.general.fmgr_provisioning
fmgr_query:
redirect: community.general.fmgr_query
fmgr_script:
redirect: community.general.fmgr_script
fmgr_secprof_appctrl:
redirect: community.general.fmgr_secprof_appctrl
fmgr_secprof_av:
redirect: community.general.fmgr_secprof_av
fmgr_secprof_dns:
redirect: community.general.fmgr_secprof_dns
fmgr_secprof_ips:
redirect: community.general.fmgr_secprof_ips
fmgr_secprof_profile_group:
redirect: community.general.fmgr_secprof_profile_group
fmgr_secprof_proxy:
redirect: community.general.fmgr_secprof_proxy
fmgr_secprof_spam:
redirect: community.general.fmgr_secprof_spam
fmgr_secprof_ssl_ssh:
redirect: community.general.fmgr_secprof_ssl_ssh
fmgr_secprof_voip:
redirect: community.general.fmgr_secprof_voip
fmgr_secprof_waf:
redirect: community.general.fmgr_secprof_waf
fmgr_secprof_wanopt:
redirect: community.general.fmgr_secprof_wanopt
fmgr_secprof_web:
redirect: community.general.fmgr_secprof_web
ftd_configuration:
redirect: community.general.ftd_configuration
ftd_file_download:
redirect: community.general.ftd_file_download
ftd_file_upload:
redirect: community.general.ftd_file_upload
ftd_install:
redirect: community.general.ftd_install
icx_banner:
redirect: community.general.icx_banner
icx_command:
redirect: community.general.icx_command
icx_config:
redirect: community.general.icx_config
icx_copy:
redirect: community.general.icx_copy
icx_facts:
redirect: community.general.icx_facts
icx_interface:
redirect: community.general.icx_interface
icx_l3_interface:
redirect: community.general.icx_l3_interface
icx_linkagg:
redirect: community.general.icx_linkagg
icx_lldp:
redirect: community.general.icx_lldp
icx_logging:
redirect: community.general.icx_logging
icx_ping:
redirect: community.general.icx_ping
icx_static_route:
redirect: community.general.icx_static_route
icx_system:
redirect: community.general.icx_system
icx_user:
redirect: community.general.icx_user
icx_vlan:
redirect: community.general.icx_vlan
dladm_etherstub:
redirect: community.general.dladm_etherstub
dladm_iptun:
redirect: community.general.dladm_iptun
dladm_linkprop:
redirect: community.general.dladm_linkprop
dladm_vlan:
redirect: community.general.dladm_vlan
dladm_vnic:
redirect: community.general.dladm_vnic
flowadm:
redirect: community.general.flowadm
ipadm_addr:
redirect: community.general.ipadm_addr
ipadm_addrprop:
redirect: community.general.ipadm_addrprop
ipadm_if:
redirect: community.general.ipadm_if
ipadm_ifprop:
redirect: community.general.ipadm_ifprop
ipadm_prop:
redirect: community.general.ipadm_prop
ig_config:
redirect: community.general.ig_config
ig_unit_information:
redirect: community.general.ig_unit_information
ironware_command:
redirect: community.general.ironware_command
ironware_config:
redirect: community.general.ironware_config
ironware_facts:
redirect: community.general.ironware_facts
iap_start_workflow:
redirect: community.general.iap_start_workflow
iap_token:
redirect: community.general.iap_token
netact_cm_command:
redirect: community.general.netact_cm_command
netscaler_cs_action:
redirect: community.general.netscaler_cs_action
netscaler_cs_policy:
redirect: community.general.netscaler_cs_policy
netscaler_cs_vserver:
redirect: community.general.netscaler_cs_vserver
netscaler_gslb_service:
redirect: community.general.netscaler_gslb_service
netscaler_gslb_site:
redirect: community.general.netscaler_gslb_site
netscaler_gslb_vserver:
redirect: community.general.netscaler_gslb_vserver
netscaler_lb_monitor:
redirect: community.general.netscaler_lb_monitor
netscaler_lb_vserver:
redirect: community.general.netscaler_lb_vserver
netscaler_nitro_request:
redirect: community.general.netscaler_nitro_request
netscaler_save_config:
redirect: community.general.netscaler_save_config
netscaler_server:
redirect: community.general.netscaler_server
netscaler_service:
redirect: community.general.netscaler_service
netscaler_servicegroup:
redirect: community.general.netscaler_servicegroup
netscaler_ssl_certkey:
redirect: community.general.netscaler_ssl_certkey
pn_cluster:
redirect: community.general.pn_cluster
pn_ospf:
redirect: community.general.pn_ospf
pn_ospfarea:
redirect: community.general.pn_ospfarea
pn_show:
redirect: community.general.pn_show
pn_trunk:
redirect: community.general.pn_trunk
pn_vlag:
redirect: community.general.pn_vlag
pn_vlan:
redirect: community.general.pn_vlan
pn_vrouter:
redirect: community.general.pn_vrouter
pn_vrouterbgp:
redirect: community.general.pn_vrouterbgp
pn_vrouterif:
redirect: community.general.pn_vrouterif
pn_vrouterlbif:
redirect: community.general.pn_vrouterlbif
pn_access_list:
redirect: community.general.pn_access_list
pn_access_list_ip:
redirect: community.general.pn_access_list_ip
pn_admin_service:
redirect: community.general.pn_admin_service
pn_admin_session_timeout:
redirect: community.general.pn_admin_session_timeout
pn_admin_syslog:
redirect: community.general.pn_admin_syslog
pn_connection_stats_settings:
redirect: community.general.pn_connection_stats_settings
pn_cpu_class:
redirect: community.general.pn_cpu_class
pn_cpu_mgmt_class:
redirect: community.general.pn_cpu_mgmt_class
pn_dhcp_filter:
redirect: community.general.pn_dhcp_filter
pn_dscp_map:
redirect: community.general.pn_dscp_map
pn_dscp_map_pri_map:
redirect: community.general.pn_dscp_map_pri_map
pn_fabric_local:
redirect: community.general.pn_fabric_local
pn_igmp_snooping:
redirect: community.general.pn_igmp_snooping
pn_ipv6security_raguard:
redirect: community.general.pn_ipv6security_raguard
pn_ipv6security_raguard_port:
redirect: community.general.pn_ipv6security_raguard_port
pn_ipv6security_raguard_vlan:
redirect: community.general.pn_ipv6security_raguard_vlan
pn_log_audit_exception:
redirect: community.general.pn_log_audit_exception
pn_port_config:
redirect: community.general.pn_port_config
pn_port_cos_bw:
redirect: community.general.pn_port_cos_bw
pn_port_cos_rate_setting:
redirect: community.general.pn_port_cos_rate_setting
pn_prefix_list:
redirect: community.general.pn_prefix_list
pn_prefix_list_network:
redirect: community.general.pn_prefix_list_network
pn_role:
redirect: community.general.pn_role
pn_snmp_community:
redirect: community.general.pn_snmp_community
pn_snmp_trap_sink:
redirect: community.general.pn_snmp_trap_sink
pn_snmp_vacm:
redirect: community.general.pn_snmp_vacm
pn_stp:
redirect: community.general.pn_stp
pn_stp_port:
redirect: community.general.pn_stp_port
pn_switch_setup:
redirect: community.general.pn_switch_setup
pn_user:
redirect: community.general.pn_user
pn_vflow_table_profile:
redirect: community.general.pn_vflow_table_profile
pn_vrouter_bgp:
redirect: community.general.pn_vrouter_bgp
pn_vrouter_bgp_network:
redirect: community.general.pn_vrouter_bgp_network
pn_vrouter_interface_ip:
redirect: community.general.pn_vrouter_interface_ip
pn_vrouter_loopback_interface:
redirect: community.general.pn_vrouter_loopback_interface
pn_vrouter_ospf:
redirect: community.general.pn_vrouter_ospf
pn_vrouter_ospf6:
redirect: community.general.pn_vrouter_ospf6
pn_vrouter_packet_relay:
redirect: community.general.pn_vrouter_packet_relay
pn_vrouter_pim_config:
redirect: community.general.pn_vrouter_pim_config
pn_vtep:
redirect: community.general.pn_vtep
nos_command:
redirect: community.general.nos_command
nos_config:
redirect: community.general.nos_config
nos_facts:
redirect: community.general.nos_facts
nso_action:
redirect: community.general.nso_action
nso_config:
redirect: community.general.nso_config
nso_query:
redirect: community.general.nso_query
nso_show:
redirect: community.general.nso_show
nso_verify:
redirect: community.general.nso_verify
nuage_vspk:
redirect: community.general.nuage_vspk
onyx_aaa:
redirect: community.general.onyx_aaa
onyx_bfd:
redirect: community.general.onyx_bfd
onyx_bgp:
redirect: community.general.onyx_bgp
onyx_buffer_pool:
redirect: community.general.onyx_buffer_pool
onyx_command:
redirect: community.general.onyx_command
onyx_config:
redirect: community.general.onyx_config
onyx_facts:
redirect: community.general.onyx_facts
onyx_igmp:
redirect: community.general.onyx_igmp
onyx_igmp_interface:
redirect: community.general.onyx_igmp_interface
onyx_igmp_vlan:
redirect: community.general.onyx_igmp_vlan
onyx_interface:
redirect: community.general.onyx_interface
onyx_l2_interface:
redirect: community.general.onyx_l2_interface
onyx_l3_interface:
redirect: community.general.onyx_l3_interface
onyx_linkagg:
redirect: community.general.onyx_linkagg
onyx_lldp:
redirect: community.general.onyx_lldp
onyx_lldp_interface:
redirect: community.general.onyx_lldp_interface
onyx_magp:
redirect: community.general.onyx_magp
onyx_mlag_ipl:
redirect: community.general.onyx_mlag_ipl
onyx_mlag_vip:
redirect: community.general.onyx_mlag_vip
onyx_ntp:
redirect: community.general.onyx_ntp
onyx_ntp_servers_peers:
redirect: community.general.onyx_ntp_servers_peers
onyx_ospf:
redirect: community.general.onyx_ospf
onyx_pfc_interface:
redirect: community.general.onyx_pfc_interface
onyx_protocol:
redirect: community.general.onyx_protocol
onyx_ptp_global:
redirect: community.general.onyx_ptp_global
onyx_ptp_interface:
redirect: community.general.onyx_ptp_interface
onyx_qos:
redirect: community.general.onyx_qos
onyx_snmp:
redirect: community.general.onyx_snmp
onyx_snmp_hosts:
redirect: community.general.onyx_snmp_hosts
onyx_snmp_users:
redirect: community.general.onyx_snmp_users
onyx_syslog_files:
redirect: community.general.onyx_syslog_files
onyx_syslog_remote:
redirect: community.general.onyx_syslog_remote
onyx_traffic_class:
redirect: community.general.onyx_traffic_class
onyx_username:
redirect: community.general.onyx_username
onyx_vlan:
redirect: community.general.onyx_vlan
onyx_vxlan:
redirect: community.general.onyx_vxlan
onyx_wjh:
redirect: community.general.onyx_wjh
opx_cps:
redirect: community.general.opx_cps
ordnance_config:
redirect: community.general.ordnance_config
ordnance_facts:
redirect: community.general.ordnance_facts
panos_admin:
redirect: community.general.panos_admin
panos_admpwd:
redirect: community.general.panos_admpwd
panos_cert_gen_ssh:
redirect: community.general.panos_cert_gen_ssh
panos_check:
redirect: community.general.panos_check
panos_commit:
redirect: community.general.panos_commit
panos_dag:
redirect: community.general.panos_dag
panos_dag_tags:
redirect: community.general.panos_dag_tags
panos_import:
redirect: community.general.panos_import
panos_interface:
redirect: community.general.panos_interface
panos_lic:
redirect: community.general.panos_lic
panos_loadcfg:
redirect: community.general.panos_loadcfg
panos_match_rule:
redirect: community.general.panos_match_rule
panos_mgtconfig:
redirect: community.general.panos_mgtconfig
panos_nat_rule:
redirect: community.general.panos_nat_rule
panos_object:
redirect: community.general.panos_object
panos_op:
redirect: community.general.panos_op
panos_pg:
redirect: community.general.panos_pg
panos_query_rules:
redirect: community.general.panos_query_rules
panos_restart:
redirect: community.general.panos_restart
panos_sag:
redirect: community.general.panos_sag
panos_security_rule:
redirect: community.general.panos_security_rule
panos_set:
redirect: community.general.panos_set
vdirect_commit:
redirect: community.general.vdirect_commit
vdirect_file:
redirect: community.general.vdirect_file
vdirect_runnable:
redirect: community.general.vdirect_runnable
routeros_command:
redirect: community.general.routeros_command
routeros_facts:
redirect: community.general.routeros_facts
slxos_command:
redirect: community.general.slxos_command
slxos_config:
redirect: community.general.slxos_config
slxos_facts:
redirect: community.general.slxos_facts
slxos_interface:
redirect: community.general.slxos_interface
slxos_l2_interface:
redirect: community.general.slxos_l2_interface
slxos_l3_interface:
redirect: community.general.slxos_l3_interface
slxos_linkagg:
redirect: community.general.slxos_linkagg
slxos_lldp:
redirect: community.general.slxos_lldp
slxos_vlan:
redirect: community.general.slxos_vlan
sros_command:
redirect: community.general.sros_command
sros_config:
redirect: community.general.sros_config
sros_rollback:
redirect: community.general.sros_rollback
voss_command:
redirect: community.general.voss_command
voss_config:
redirect: community.general.voss_config
voss_facts:
redirect: community.general.voss_facts
osx_say:
redirect: community.general.osx_say
bearychat:
redirect: community.general.bearychat
campfire:
redirect: community.general.campfire
catapult:
redirect: community.general.catapult
cisco_spark:
redirect: community.general.cisco_spark
flowdock:
redirect: community.general.flowdock
grove:
redirect: community.general.grove
hipchat:
redirect: community.general.hipchat
irc:
redirect: community.general.irc
jabber:
redirect: community.general.jabber
logentries_msg:
redirect: community.general.logentries_msg
mail:
redirect: community.general.mail
matrix:
redirect: community.general.matrix
mattermost:
redirect: community.general.mattermost
mqtt:
redirect: community.general.mqtt
nexmo:
redirect: community.general.nexmo
office_365_connector_card:
redirect: community.general.office_365_connector_card
pushbullet:
redirect: community.general.pushbullet
pushover:
redirect: community.general.pushover
rabbitmq_publish:
redirect: community.general.rabbitmq_publish
rocketchat:
redirect: community.general.rocketchat
say:
redirect: community.general.say
sendgrid:
redirect: community.general.sendgrid
slack:
redirect: community.general.slack
syslogger:
redirect: community.general.syslogger
telegram:
redirect: community.general.telegram
twilio:
redirect: community.general.twilio
typetalk:
redirect: community.general.typetalk
bower:
redirect: community.general.bower
bundler:
redirect: community.general.bundler
composer:
redirect: community.general.composer
cpanm:
redirect: community.general.cpanm
easy_install:
redirect: community.general.easy_install
gem:
redirect: community.general.gem
maven_artifact:
redirect: community.general.maven_artifact
npm:
redirect: community.general.npm
pear:
redirect: community.general.pear
pip_package_info:
redirect: community.general.pip_package_info
yarn:
redirect: community.general.yarn
apk:
redirect: community.general.apk
apt_rpm:
redirect: community.general.apt_rpm
flatpak:
redirect: community.general.flatpak
flatpak_remote:
redirect: community.general.flatpak_remote
homebrew:
redirect: community.general.homebrew
homebrew_cask:
redirect: community.general.homebrew_cask
homebrew_tap:
redirect: community.general.homebrew_tap
installp:
redirect: community.general.installp
layman:
redirect: community.general.layman
macports:
redirect: community.general.macports
mas:
redirect: community.general.mas
openbsd_pkg:
redirect: community.general.openbsd_pkg
opkg:
redirect: community.general.opkg
pacman:
redirect: community.general.pacman
pkg5:
redirect: community.general.pkg5
pkg5_publisher:
redirect: community.general.pkg5_publisher
pkgin:
redirect: community.general.pkgin
pkgng:
redirect: community.general.pkgng
pkgutil:
redirect: community.general.pkgutil
portage:
redirect: community.general.portage
portinstall:
redirect: community.general.portinstall
pulp_repo:
redirect: community.general.pulp_repo
redhat_subscription:
redirect: community.general.redhat_subscription
rhn_channel:
redirect: community.general.rhn_channel
rhn_register:
redirect: community.general.rhn_register
rhsm_release:
redirect: community.general.rhsm_release
rhsm_repository:
redirect: community.general.rhsm_repository
slackpkg:
redirect: community.general.slackpkg
snap:
redirect: community.general.snap
sorcery:
redirect: community.general.sorcery
svr4pkg:
redirect: community.general.svr4pkg
swdepot:
redirect: community.general.swdepot
swupd:
redirect: community.general.swupd
urpmi:
redirect: community.general.urpmi
xbps:
redirect: community.general.xbps
zypper:
redirect: community.general.zypper
zypper_repository:
redirect: community.general.zypper_repository
cobbler_sync:
redirect: community.general.cobbler_sync
cobbler_system:
redirect: community.general.cobbler_system
idrac_firmware:
redirect: community.general.idrac_firmware
idrac_server_config_profile:
redirect: community.general.idrac_server_config_profile
ome_device_info:
redirect: community.general.ome_device_info
foreman:
redirect: community.general.foreman
katello:
redirect: community.general.katello
hpilo_facts:
redirect: community.general.hpilo_facts
hpilo_boot:
redirect: community.general.hpilo_boot
hpilo_info:
redirect: community.general.hpilo_info
hponcfg:
redirect: community.general.hponcfg
imc_rest:
redirect: community.general.imc_rest
intersight_info:
redirect: community.general.intersight_info
ipmi_boot:
redirect: community.general.ipmi_boot
ipmi_power:
redirect: community.general.ipmi_power
lxca_cmms:
redirect: community.general.lxca_cmms
lxca_nodes:
redirect: community.general.lxca_nodes
manageiq_alert_profiles:
redirect: community.general.manageiq_alert_profiles
manageiq_alerts:
redirect: community.general.manageiq_alerts
manageiq_group:
redirect: community.general.manageiq_group
manageiq_policies:
redirect: community.general.manageiq_policies
manageiq_provider:
redirect: community.general.manageiq_provider
manageiq_tags:
redirect: community.general.manageiq_tags
manageiq_tenant:
redirect: community.general.manageiq_tenant
manageiq_user:
redirect: community.general.manageiq_user
oneview_datacenter_facts:
redirect: community.general.oneview_datacenter_facts
oneview_enclosure_facts:
redirect: community.general.oneview_enclosure_facts
oneview_ethernet_network_facts:
redirect: community.general.oneview_ethernet_network_facts
oneview_fc_network_facts:
redirect: community.general.oneview_fc_network_facts
oneview_fcoe_network_facts:
redirect: community.general.oneview_fcoe_network_facts
oneview_logical_interconnect_group_facts:
redirect: community.general.oneview_logical_interconnect_group_facts
oneview_network_set_facts:
redirect: community.general.oneview_network_set_facts
oneview_san_manager_facts:
redirect: community.general.oneview_san_manager_facts
oneview_datacenter_info:
redirect: community.general.oneview_datacenter_info
oneview_enclosure_info:
redirect: community.general.oneview_enclosure_info
oneview_ethernet_network:
redirect: community.general.oneview_ethernet_network
oneview_ethernet_network_info:
redirect: community.general.oneview_ethernet_network_info
oneview_fc_network:
redirect: community.general.oneview_fc_network
oneview_fc_network_info:
redirect: community.general.oneview_fc_network_info
oneview_fcoe_network:
redirect: community.general.oneview_fcoe_network
oneview_fcoe_network_info:
redirect: community.general.oneview_fcoe_network_info
oneview_logical_interconnect_group:
redirect: community.general.oneview_logical_interconnect_group
oneview_logical_interconnect_group_info:
redirect: community.general.oneview_logical_interconnect_group_info
oneview_network_set:
redirect: community.general.oneview_network_set
oneview_network_set_info:
redirect: community.general.oneview_network_set_info
oneview_san_manager:
redirect: community.general.oneview_san_manager
oneview_san_manager_info:
redirect: community.general.oneview_san_manager_info
idrac_redfish_facts:
redirect: community.general.idrac_redfish_facts
redfish_facts:
redirect: community.general.redfish_facts
idrac_redfish_command:
redirect: community.general.idrac_redfish_command
idrac_redfish_config:
redirect: community.general.idrac_redfish_config
idrac_redfish_info:
redirect: community.general.idrac_redfish_info
redfish_command:
redirect: community.general.redfish_command
redfish_config:
redirect: community.general.redfish_config
redfish_info:
redirect: community.general.redfish_info
stacki_host:
redirect: community.general.stacki_host
wakeonlan:
redirect: community.general.wakeonlan
bitbucket_access_key:
redirect: community.general.bitbucket_access_key
bitbucket_pipeline_key_pair:
redirect: community.general.bitbucket_pipeline_key_pair
bitbucket_pipeline_known_host:
redirect: community.general.bitbucket_pipeline_known_host
bitbucket_pipeline_variable:
redirect: community.general.bitbucket_pipeline_variable
bzr:
redirect: community.general.bzr
git_config:
redirect: community.general.git_config
github_hooks:
redirect: community.general.github_hooks
github_webhook_facts:
redirect: community.general.github_webhook_facts
github_deploy_key:
redirect: community.general.github_deploy_key
github_issue:
redirect: community.general.github_issue
github_key:
redirect: community.general.github_key
github_release:
redirect: community.general.github_release
github_webhook:
redirect: community.general.github_webhook
github_webhook_info:
redirect: community.general.github_webhook_info
gitlab_hooks:
redirect: community.general.gitlab_hooks
gitlab_deploy_key:
redirect: community.general.gitlab_deploy_key
gitlab_group:
redirect: community.general.gitlab_group
gitlab_hook:
redirect: community.general.gitlab_hook
gitlab_project:
redirect: community.general.gitlab_project
gitlab_project_variable:
redirect: community.general.gitlab_project_variable
gitlab_runner:
redirect: community.general.gitlab_runner
gitlab_user:
redirect: community.general.gitlab_user
hg:
redirect: community.general.hg
emc_vnx_sg_member:
redirect: community.general.emc_vnx_sg_member
gluster_heal_facts:
redirect: community.general.gluster_heal_facts
gluster_heal_info:
redirect: community.general.gluster_heal_info
gluster_peer:
redirect: community.general.gluster_peer
gluster_volume:
redirect: community.general.gluster_volume
ss_3par_cpg:
redirect: community.general.ss_3par_cpg
ibm_sa_domain:
redirect: community.general.ibm_sa_domain
ibm_sa_host:
redirect: community.general.ibm_sa_host
ibm_sa_host_ports:
redirect: community.general.ibm_sa_host_ports
ibm_sa_pool:
redirect: community.general.ibm_sa_pool
ibm_sa_vol:
redirect: community.general.ibm_sa_vol
ibm_sa_vol_map:
redirect: community.general.ibm_sa_vol_map
infini_export:
redirect: community.general.infini_export
infini_export_client:
redirect: community.general.infini_export_client
infini_fs:
redirect: community.general.infini_fs
infini_host:
redirect: community.general.infini_host
infini_pool:
redirect: community.general.infini_pool
infini_vol:
redirect: community.general.infini_vol
na_cdot_aggregate:
redirect: community.general.na_cdot_aggregate
na_cdot_license:
redirect: community.general.na_cdot_license
na_cdot_lun:
redirect: community.general.na_cdot_lun
na_cdot_qtree:
redirect: community.general.na_cdot_qtree
na_cdot_svm:
redirect: community.general.na_cdot_svm
na_cdot_user:
redirect: community.general.na_cdot_user
na_cdot_user_role:
redirect: community.general.na_cdot_user_role
na_cdot_volume:
redirect: community.general.na_cdot_volume
na_ontap_gather_facts:
redirect: community.general.na_ontap_gather_facts
sf_account_manager:
redirect: community.general.sf_account_manager
sf_check_connections:
redirect: community.general.sf_check_connections
sf_snapshot_schedule_manager:
redirect: community.general.sf_snapshot_schedule_manager
sf_volume_access_group_manager:
redirect: community.general.sf_volume_access_group_manager
sf_volume_manager:
redirect: community.general.sf_volume_manager
netapp_e_alerts:
redirect: community.general.netapp_e_alerts
netapp_e_amg:
redirect: community.general.netapp_e_amg
netapp_e_amg_role:
redirect: community.general.netapp_e_amg_role
netapp_e_amg_sync:
redirect: community.general.netapp_e_amg_sync
netapp_e_asup:
redirect: community.general.netapp_e_asup
netapp_e_auditlog:
redirect: community.general.netapp_e_auditlog
netapp_e_auth:
redirect: community.general.netapp_e_auth
netapp_e_drive_firmware:
redirect: community.general.netapp_e_drive_firmware
netapp_e_facts:
redirect: community.general.netapp_e_facts
netapp_e_firmware:
redirect: community.general.netapp_e_firmware
netapp_e_flashcache:
redirect: community.general.netapp_e_flashcache
netapp_e_global:
redirect: community.general.netapp_e_global
netapp_e_host:
redirect: community.general.netapp_e_host
netapp_e_hostgroup:
redirect: community.general.netapp_e_hostgroup
netapp_e_iscsi_interface:
redirect: community.general.netapp_e_iscsi_interface
netapp_e_iscsi_target:
redirect: community.general.netapp_e_iscsi_target
netapp_e_ldap:
redirect: community.general.netapp_e_ldap
netapp_e_lun_mapping:
redirect: community.general.netapp_e_lun_mapping
netapp_e_mgmt_interface:
redirect: community.general.netapp_e_mgmt_interface
netapp_e_snapshot_group:
redirect: community.general.netapp_e_snapshot_group
netapp_e_snapshot_images:
redirect: community.general.netapp_e_snapshot_images
netapp_e_snapshot_volume:
redirect: community.general.netapp_e_snapshot_volume
netapp_e_storage_system:
redirect: community.general.netapp_e_storage_system
netapp_e_storagepool:
redirect: community.general.netapp_e_storagepool
netapp_e_syslog:
redirect: community.general.netapp_e_syslog
netapp_e_volume:
redirect: community.general.netapp_e_volume
netapp_e_volume_copy:
redirect: community.general.netapp_e_volume_copy
purefa_facts:
redirect: community.general.purefa_facts
purefb_facts:
redirect: community.general.purefb_facts
vexata_eg:
redirect: community.general.vexata_eg
vexata_volume:
redirect: community.general.vexata_volume
zfs:
redirect: community.general.zfs
zfs_delegate_admin:
redirect: community.general.zfs_delegate_admin
zfs_facts:
redirect: community.general.zfs_facts
zpool_facts:
redirect: community.general.zpool_facts
python_requirements_facts:
redirect: community.general.python_requirements_facts
aix_devices:
redirect: community.general.aix_devices
aix_filesystem:
redirect: community.general.aix_filesystem
aix_inittab:
redirect: community.general.aix_inittab
aix_lvg:
redirect: community.general.aix_lvg
aix_lvol:
redirect: community.general.aix_lvol
alternatives:
redirect: community.general.alternatives
awall:
redirect: community.general.awall
beadm:
redirect: community.general.beadm
capabilities:
redirect: community.general.capabilities
cronvar:
redirect: community.general.cronvar
crypttab:
redirect: community.general.crypttab
dconf:
redirect: community.general.dconf
facter:
redirect: community.general.facter
filesystem:
redirect: community.general.filesystem
firewalld:
redirect: community.general.firewalld
gconftool2:
redirect: community.general.gconftool2
interfaces_file:
redirect: community.general.interfaces_file
java_cert:
redirect: community.general.java_cert
java_keystore:
redirect: community.general.java_keystore
kernel_blacklist:
redirect: community.general.kernel_blacklist
lbu:
redirect: community.general.lbu
listen_ports_facts:
redirect: community.general.listen_ports_facts
locale_gen:
redirect: community.general.locale_gen
lvg:
redirect: community.general.lvg
lvol:
redirect: community.general.lvol
make:
redirect: community.general.make
mksysb:
redirect: community.general.mksysb
modprobe:
redirect: community.general.modprobe
nosh:
redirect: community.general.nosh
ohai:
redirect: community.general.ohai
open_iscsi:
redirect: community.general.open_iscsi
openwrt_init:
redirect: community.general.openwrt_init
osx_defaults:
redirect: community.general.osx_defaults
pam_limits:
redirect: community.general.pam_limits
pamd:
redirect: community.general.pamd
parted:
redirect: community.general.parted
pids:
redirect: community.general.pids
puppet:
redirect: community.general.puppet
python_requirements_info:
redirect: community.general.python_requirements_info
runit:
redirect: community.general.runit
sefcontext:
redirect: community.general.sefcontext
selinux_permissive:
redirect: community.general.selinux_permissive
selogin:
redirect: community.general.selogin
seport:
redirect: community.general.seport
solaris_zone:
redirect: community.general.solaris_zone
svc:
redirect: community.general.svc
syspatch:
redirect: community.general.syspatch
timezone:
redirect: community.general.timezone
ufw:
redirect: community.general.ufw
vdo:
redirect: community.general.vdo
xfconf:
redirect: community.general.xfconf
xfs_quota:
redirect: community.general.xfs_quota
jenkins_job_facts:
redirect: community.general.jenkins_job_facts
nginx_status_facts:
redirect: community.general.nginx_status_facts
apache2_mod_proxy:
redirect: community.general.apache2_mod_proxy
apache2_module:
redirect: community.general.apache2_module
deploy_helper:
redirect: community.general.deploy_helper
django_manage:
redirect: community.general.django_manage
ejabberd_user:
redirect: community.general.ejabberd_user
gunicorn:
redirect: community.general.gunicorn
htpasswd:
redirect: community.general.htpasswd
jboss:
redirect: community.general.jboss
jenkins_job:
redirect: community.general.jenkins_job
jenkins_job_info:
redirect: community.general.jenkins_job_info
jenkins_plugin:
redirect: community.general.jenkins_plugin
jenkins_script:
redirect: community.general.jenkins_script
jira:
redirect: community.general.jira
nginx_status_info:
redirect: community.general.nginx_status_info
rundeck_acl_policy:
redirect: community.general.rundeck_acl_policy
rundeck_project:
redirect: community.general.rundeck_project
utm_aaa_group:
redirect: community.general.utm_aaa_group
utm_aaa_group_info:
redirect: community.general.utm_aaa_group_info
utm_ca_host_key_cert:
redirect: community.general.utm_ca_host_key_cert
utm_ca_host_key_cert_info:
redirect: community.general.utm_ca_host_key_cert_info
utm_dns_host:
redirect: community.general.utm_dns_host
utm_network_interface_address:
redirect: community.general.utm_network_interface_address
utm_network_interface_address_info:
redirect: community.general.utm_network_interface_address_info
utm_proxy_auth_profile:
redirect: community.general.utm_proxy_auth_profile
utm_proxy_exception:
redirect: community.general.utm_proxy_exception
utm_proxy_frontend:
redirect: community.general.utm_proxy_frontend
utm_proxy_frontend_info:
redirect: community.general.utm_proxy_frontend_info
utm_proxy_location:
redirect: community.general.utm_proxy_location
utm_proxy_location_info:
redirect: community.general.utm_proxy_location_info
supervisorctl:
redirect: community.general.supervisorctl
taiga_issue:
redirect: community.general.taiga_issue
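# Grafana modules (dashboard, datasource, plugin) are routed to the community.grafana collection.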
grafana_dashboard:
redirect: community.grafana.grafana_dashboard
grafana_datasource:
redirect: community.grafana.grafana_datasource
grafana_plugin:
redirect: community.grafana.grafana_plugin
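# Kubernetes and OpenShift modules are routed to the community.kubernetes collection.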
k8s_facts:
redirect: community.kubernetes.k8s_facts
k8s_raw:
redirect: community.kubernetes.k8s_raw
k8s:
redirect: community.kubernetes.k8s
k8s_auth:
redirect: community.kubernetes.k8s_auth
k8s_info:
redirect: community.kubernetes.k8s_info
k8s_scale:
redirect: community.kubernetes.k8s_scale
k8s_service:
redirect: community.kubernetes.k8s_service
openshift_raw:
redirect: community.kubernetes.openshift_raw
openshift_scale:
redirect: community.kubernetes.openshift_scale
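# OpenSSL, OpenSSH, ACME and ECS certificate modules are routed to the community.crypto collection.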
openssh_cert:
redirect: community.crypto.openssh_cert
openssl_pkcs12:
redirect: community.crypto.openssl_pkcs12
openssl_csr:
redirect: community.crypto.openssl_csr
openssl_certificate:
redirect: community.crypto.openssl_certificate
openssl_certificate_info:
redirect: community.crypto.openssl_certificate_info
x509_crl:
redirect: community.crypto.x509_crl
openssl_privatekey_info:
redirect: community.crypto.openssl_privatekey_info
x509_crl_info:
redirect: community.crypto.x509_crl_info
get_certificate:
redirect: community.crypto.get_certificate
openssh_keypair:
redirect: community.crypto.openssh_keypair
openssl_publickey:
redirect: community.crypto.openssl_publickey
openssl_csr_info:
redirect: community.crypto.openssl_csr_info
luks_device:
redirect: community.crypto.luks_device
openssl_dhparam:
redirect: community.crypto.openssl_dhparam
openssl_privatekey:
redirect: community.crypto.openssl_privatekey
certificate_complete_chain:
redirect: community.crypto.certificate_complete_chain
acme_inspect:
redirect: community.crypto.acme_inspect
acme_certificate_revoke:
redirect: community.crypto.acme_certificate_revoke
acme_certificate:
redirect: community.crypto.acme_certificate
acme_account:
redirect: community.crypto.acme_account
acme_account_facts:
redirect: community.crypto.acme_account_facts
acme_challenge_cert_helper:
redirect: community.crypto.acme_challenge_cert_helper
acme_account_info:
redirect: community.crypto.acme_account_info
ecs_domain:
redirect: community.crypto.ecs_domain
ecs_certificate:
redirect: community.crypto.ecs_certificate
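# MongoDB modules are routed to the community.mongodb collection.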
mongodb_parameter:
redirect: community.mongodb.mongodb_parameter
mongodb_info:
redirect: community.mongodb.mongodb_info
mongodb_replicaset:
redirect: community.mongodb.mongodb_replicaset
mongodb_user:
redirect: community.mongodb.mongodb_user
mongodb_shard:
redirect: community.mongodb.mongodb_shard
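# VMware appliance info modules are routed to the community.vmware_rest collection.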
vmware_appliance_access_info:
redirect: community.vmware_rest.vmware_appliance_access_info
vmware_appliance_health_info:
redirect: community.vmware_rest.vmware_appliance_health_info
vmware_cis_category_info:
redirect: community.vmware_rest.vmware_cis_category_info
vmware_core_info:
redirect: community.vmware_rest.vmware_core_info
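# VMware vSphere / vCenter modules are routed to the community.vmware collection.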
vcenter_extension_facts:
redirect: community.vmware.vcenter_extension_facts
vmware_about_facts:
redirect: community.vmware.vmware_about_facts
vmware_category_facts:
redirect: community.vmware.vmware_category_facts
vmware_cluster_facts:
redirect: community.vmware.vmware_cluster_facts
vmware_datastore_facts:
redirect: community.vmware.vmware_datastore_facts
vmware_dns_config:
redirect: community.vmware.vmware_dns_config
vmware_drs_group_facts:
redirect: community.vmware.vmware_drs_group_facts
vmware_drs_rule_facts:
redirect: community.vmware.vmware_drs_rule_facts
vmware_dvs_portgroup_facts:
redirect: community.vmware.vmware_dvs_portgroup_facts
vmware_guest_boot_facts:
redirect: community.vmware.vmware_guest_boot_facts
vmware_guest_customization_facts:
redirect: community.vmware.vmware_guest_customization_facts
vmware_guest_disk_facts:
redirect: community.vmware.vmware_guest_disk_facts
vmware_guest_facts:
redirect: community.vmware.vmware_guest_facts
vmware_guest_snapshot_facts:
redirect: community.vmware.vmware_guest_snapshot_facts
vmware_host_capability_facts:
redirect: community.vmware.vmware_host_capability_facts
vmware_host_config_facts:
redirect: community.vmware.vmware_host_config_facts
vmware_host_dns_facts:
redirect: community.vmware.vmware_host_dns_facts
vmware_host_feature_facts:
redirect: community.vmware.vmware_host_feature_facts
vmware_host_firewall_facts:
redirect: community.vmware.vmware_host_firewall_facts
vmware_host_ntp_facts:
redirect: community.vmware.vmware_host_ntp_facts
vmware_host_package_facts:
redirect: community.vmware.vmware_host_package_facts
vmware_host_service_facts:
redirect: community.vmware.vmware_host_service_facts
vmware_host_ssl_facts:
redirect: community.vmware.vmware_host_ssl_facts
vmware_host_vmhba_facts:
redirect: community.vmware.vmware_host_vmhba_facts
vmware_host_vmnic_facts:
redirect: community.vmware.vmware_host_vmnic_facts
vmware_local_role_facts:
redirect: community.vmware.vmware_local_role_facts
vmware_local_user_facts:
redirect: community.vmware.vmware_local_user_facts
vmware_portgroup_facts:
redirect: community.vmware.vmware_portgroup_facts
vmware_resource_pool_facts:
redirect: community.vmware.vmware_resource_pool_facts
vmware_tag_facts:
redirect: community.vmware.vmware_tag_facts
vmware_target_canonical_facts:
redirect: community.vmware.vmware_target_canonical_facts
vmware_vm_facts:
redirect: community.vmware.vmware_vm_facts
vmware_vmkernel_facts:
redirect: community.vmware.vmware_vmkernel_facts
vmware_vswitch_facts:
redirect: community.vmware.vmware_vswitch_facts
vca_fw:
redirect: community.vmware.vca_fw
vca_nat:
redirect: community.vmware.vca_nat
vca_vapp:
redirect: community.vmware.vca_vapp
vcenter_extension:
redirect: community.vmware.vcenter_extension
vcenter_extension_info:
redirect: community.vmware.vcenter_extension_info
vcenter_folder:
redirect: community.vmware.vcenter_folder
vcenter_license:
redirect: community.vmware.vcenter_license
vmware_about_info:
redirect: community.vmware.vmware_about_info
vmware_category:
redirect: community.vmware.vmware_category
vmware_category_info:
redirect: community.vmware.vmware_category_info
vmware_cfg_backup:
redirect: community.vmware.vmware_cfg_backup
vmware_cluster:
redirect: community.vmware.vmware_cluster
vmware_cluster_drs:
redirect: community.vmware.vmware_cluster_drs
vmware_cluster_ha:
redirect: community.vmware.vmware_cluster_ha
vmware_cluster_info:
redirect: community.vmware.vmware_cluster_info
vmware_cluster_vsan:
redirect: community.vmware.vmware_cluster_vsan
vmware_content_deploy_template:
redirect: community.vmware.vmware_content_deploy_template
vmware_content_library_info:
redirect: community.vmware.vmware_content_library_info
vmware_content_library_manager:
redirect: community.vmware.vmware_content_library_manager
vmware_datacenter:
redirect: community.vmware.vmware_datacenter
vmware_datastore_cluster:
redirect: community.vmware.vmware_datastore_cluster
vmware_datastore_info:
redirect: community.vmware.vmware_datastore_info
vmware_datastore_maintenancemode:
redirect: community.vmware.vmware_datastore_maintenancemode
vmware_deploy_ovf:
redirect: community.vmware.vmware_deploy_ovf
vmware_drs_group:
redirect: community.vmware.vmware_drs_group
vmware_drs_group_info:
redirect: community.vmware.vmware_drs_group_info
vmware_drs_rule_info:
redirect: community.vmware.vmware_drs_rule_info
vmware_dvs_host:
redirect: community.vmware.vmware_dvs_host
vmware_dvs_portgroup:
redirect: community.vmware.vmware_dvs_portgroup
vmware_dvs_portgroup_find:
redirect: community.vmware.vmware_dvs_portgroup_find
vmware_dvs_portgroup_info:
redirect: community.vmware.vmware_dvs_portgroup_info
vmware_dvswitch:
redirect: community.vmware.vmware_dvswitch
vmware_dvswitch_lacp:
redirect: community.vmware.vmware_dvswitch_lacp
vmware_dvswitch_nioc:
redirect: community.vmware.vmware_dvswitch_nioc
vmware_dvswitch_pvlans:
redirect: community.vmware.vmware_dvswitch_pvlans
vmware_dvswitch_uplink_pg:
redirect: community.vmware.vmware_dvswitch_uplink_pg
vmware_evc_mode:
redirect: community.vmware.vmware_evc_mode
vmware_export_ovf:
redirect: community.vmware.vmware_export_ovf
vmware_folder_info:
redirect: community.vmware.vmware_folder_info
vmware_guest:
redirect: community.vmware.vmware_guest
vmware_guest_boot_info:
redirect: community.vmware.vmware_guest_boot_info
vmware_guest_boot_manager:
redirect: community.vmware.vmware_guest_boot_manager
vmware_guest_controller:
redirect: community.vmware.vmware_guest_controller
vmware_guest_cross_vc_clone:
redirect: community.vmware.vmware_guest_cross_vc_clone
vmware_guest_custom_attribute_defs:
redirect: community.vmware.vmware_guest_custom_attribute_defs
vmware_guest_custom_attributes:
redirect: community.vmware.vmware_guest_custom_attributes
vmware_guest_customization_info:
redirect: community.vmware.vmware_guest_customization_info
vmware_guest_disk:
redirect: community.vmware.vmware_guest_disk
vmware_guest_disk_info:
redirect: community.vmware.vmware_guest_disk_info
vmware_guest_file_operation:
redirect: community.vmware.vmware_guest_file_operation
vmware_guest_find:
redirect: community.vmware.vmware_guest_find
vmware_guest_info:
redirect: community.vmware.vmware_guest_info
vmware_guest_move:
redirect: community.vmware.vmware_guest_move
vmware_guest_network:
redirect: community.vmware.vmware_guest_network
vmware_guest_powerstate:
redirect: community.vmware.vmware_guest_powerstate
vmware_guest_register_operation:
redirect: community.vmware.vmware_guest_register_operation
vmware_guest_screenshot:
redirect: community.vmware.vmware_guest_screenshot
vmware_guest_sendkey:
redirect: community.vmware.vmware_guest_sendkey
vmware_guest_serial_port:
redirect: community.vmware.vmware_guest_serial_port
vmware_guest_snapshot:
redirect: community.vmware.vmware_guest_snapshot
vmware_guest_snapshot_info:
redirect: community.vmware.vmware_guest_snapshot_info
vmware_guest_tools_info:
redirect: community.vmware.vmware_guest_tools_info
vmware_guest_tools_upgrade:
redirect: community.vmware.vmware_guest_tools_upgrade
vmware_guest_tools_wait:
redirect: community.vmware.vmware_guest_tools_wait
vmware_guest_video:
redirect: community.vmware.vmware_guest_video
vmware_guest_vnc:
redirect: community.vmware.vmware_guest_vnc
vmware_host:
redirect: community.vmware.vmware_host
vmware_host_acceptance:
redirect: community.vmware.vmware_host_acceptance
vmware_host_active_directory:
redirect: community.vmware.vmware_host_active_directory
vmware_host_auto_start:
redirect: community.vmware.vmware_host_auto_start
vmware_host_capability_info:
redirect: community.vmware.vmware_host_capability_info
vmware_host_config_info:
redirect: community.vmware.vmware_host_config_info
vmware_host_config_manager:
redirect: community.vmware.vmware_host_config_manager
vmware_host_datastore:
redirect: community.vmware.vmware_host_datastore
vmware_host_dns:
redirect: community.vmware.vmware_host_dns
vmware_host_dns_info:
redirect: community.vmware.vmware_host_dns_info
vmware_host_facts:
redirect: community.vmware.vmware_host_facts
vmware_host_feature_info:
redirect: community.vmware.vmware_host_feature_info
vmware_host_firewall_info:
redirect: community.vmware.vmware_host_firewall_info
vmware_host_firewall_manager:
redirect: community.vmware.vmware_host_firewall_manager
vmware_host_hyperthreading:
redirect: community.vmware.vmware_host_hyperthreading
vmware_host_ipv6:
redirect: community.vmware.vmware_host_ipv6
vmware_host_kernel_manager:
redirect: community.vmware.vmware_host_kernel_manager
vmware_host_lockdown:
redirect: community.vmware.vmware_host_lockdown
vmware_host_ntp:
redirect: community.vmware.vmware_host_ntp
vmware_host_ntp_info:
redirect: community.vmware.vmware_host_ntp_info
vmware_host_package_info:
redirect: community.vmware.vmware_host_package_info
vmware_host_powermgmt_policy:
redirect: community.vmware.vmware_host_powermgmt_policy
vmware_host_powerstate:
redirect: community.vmware.vmware_host_powerstate
vmware_host_scanhba:
redirect: community.vmware.vmware_host_scanhba
vmware_host_service_info:
redirect: community.vmware.vmware_host_service_info
vmware_host_service_manager:
redirect: community.vmware.vmware_host_service_manager
vmware_host_snmp:
redirect: community.vmware.vmware_host_snmp
vmware_host_ssl_info:
redirect: community.vmware.vmware_host_ssl_info
vmware_host_vmhba_info:
redirect: community.vmware.vmware_host_vmhba_info
vmware_host_vmnic_info:
redirect: community.vmware.vmware_host_vmnic_info
vmware_local_role_info:
redirect: community.vmware.vmware_local_role_info
vmware_local_role_manager:
redirect: community.vmware.vmware_local_role_manager
vmware_local_user_info:
redirect: community.vmware.vmware_local_user_info
vmware_local_user_manager:
redirect: community.vmware.vmware_local_user_manager
vmware_maintenancemode:
redirect: community.vmware.vmware_maintenancemode
vmware_migrate_vmk:
redirect: community.vmware.vmware_migrate_vmk
vmware_object_role_permission:
redirect: community.vmware.vmware_object_role_permission
vmware_portgroup:
redirect: community.vmware.vmware_portgroup
vmware_portgroup_info:
redirect: community.vmware.vmware_portgroup_info
vmware_resource_pool:
redirect: community.vmware.vmware_resource_pool
vmware_resource_pool_info:
redirect: community.vmware.vmware_resource_pool_info
vmware_tag:
redirect: community.vmware.vmware_tag
vmware_tag_info:
redirect: community.vmware.vmware_tag_info
vmware_tag_manager:
redirect: community.vmware.vmware_tag_manager
vmware_target_canonical_info:
redirect: community.vmware.vmware_target_canonical_info
vmware_vcenter_settings:
redirect: community.vmware.vmware_vcenter_settings
vmware_vcenter_statistics:
redirect: community.vmware.vmware_vcenter_statistics
vmware_vm_host_drs_rule:
redirect: community.vmware.vmware_vm_host_drs_rule
vmware_vm_info:
redirect: community.vmware.vmware_vm_info
vmware_vm_shell:
redirect: community.vmware.vmware_vm_shell
vmware_vm_storage_policy_info:
redirect: community.vmware.vmware_vm_storage_policy_info
vmware_vm_vm_drs_rule:
redirect: community.vmware.vmware_vm_vm_drs_rule
vmware_vm_vss_dvs_migrate:
redirect: community.vmware.vmware_vm_vss_dvs_migrate
vmware_vmkernel:
redirect: community.vmware.vmware_vmkernel
vmware_vmkernel_info:
redirect: community.vmware.vmware_vmkernel_info
vmware_vmkernel_ip_config:
redirect: community.vmware.vmware_vmkernel_ip_config
vmware_vmotion:
redirect: community.vmware.vmware_vmotion
vmware_vsan_cluster:
redirect: community.vmware.vmware_vsan_cluster
vmware_vsan_health_info:
redirect: community.vmware.vmware_vsan_health_info
vmware_vspan_session:
redirect: community.vmware.vmware_vspan_session
vmware_vswitch:
redirect: community.vmware.vmware_vswitch
vmware_vswitch_info:
redirect: community.vmware.vmware_vswitch_info
vsphere_copy:
redirect: community.vmware.vsphere_copy
vsphere_file:
redirect: community.vmware.vsphere_file
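# Windows community modules are routed to community.windows; Chocolatey modules go to chocolatey.chocolatey.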
psexec:
redirect: community.windows.psexec
win_audit_policy_system:
redirect: community.windows.win_audit_policy_system
win_audit_rule:
redirect: community.windows.win_audit_rule
win_auto_logon:
redirect: community.windows.win_auto_logon
win_certificate_info:
redirect: community.windows.win_certificate_info
win_chocolatey:
redirect: chocolatey.chocolatey.win_chocolatey
win_chocolatey_config:
redirect: chocolatey.chocolatey.win_chocolatey_config
win_chocolatey_facts:
redirect: chocolatey.chocolatey.win_chocolatey_facts
win_chocolatey_feature:
redirect: chocolatey.chocolatey.win_chocolatey_feature
win_chocolatey_source:
redirect: chocolatey.chocolatey.win_chocolatey_source
win_computer_description:
redirect: community.windows.win_computer_description
win_credential:
redirect: community.windows.win_credential
win_data_deduplication:
redirect: community.windows.win_data_deduplication
win_defrag:
redirect: community.windows.win_defrag
win_disk_facts:
redirect: community.windows.win_disk_facts
win_disk_image:
redirect: community.windows.win_disk_image
win_dns_record:
redirect: community.windows.win_dns_record
win_domain_computer:
redirect: community.windows.win_domain_computer
win_domain_group:
redirect: community.windows.win_domain_group
win_domain_group_membership:
redirect: community.windows.win_domain_group_membership
win_domain_object_info:
redirect: community.windows.win_domain_object_info
win_domain_user:
redirect: community.windows.win_domain_user
win_dotnet_ngen:
redirect: community.windows.win_dotnet_ngen
win_eventlog:
redirect: community.windows.win_eventlog
win_eventlog_entry:
redirect: community.windows.win_eventlog_entry
win_file_compression:
redirect: community.windows.win_file_compression
win_file_version:
redirect: community.windows.win_file_version
win_firewall:
redirect: community.windows.win_firewall
win_firewall_rule:
redirect: community.windows.win_firewall_rule
win_format:
redirect: community.windows.win_format
win_hosts:
redirect: community.windows.win_hosts
win_hotfix:
redirect: community.windows.win_hotfix
win_http_proxy:
redirect: community.windows.win_http_proxy
win_iis_virtualdirectory:
redirect: community.windows.win_iis_virtualdirectory
win_iis_webapplication:
redirect: community.windows.win_iis_webapplication
win_iis_webapppool:
redirect: community.windows.win_iis_webapppool
win_iis_webbinding:
redirect: community.windows.win_iis_webbinding
win_iis_website:
redirect: community.windows.win_iis_website
win_inet_proxy:
redirect: community.windows.win_inet_proxy
win_initialize_disk:
redirect: community.windows.win_initialize_disk
win_lineinfile:
redirect: community.windows.win_lineinfile
win_mapped_drive:
redirect: community.windows.win_mapped_drive
win_msg:
redirect: community.windows.win_msg
win_netbios:
redirect: community.windows.win_netbios
win_nssm:
redirect: community.windows.win_nssm
win_pagefile:
redirect: community.windows.win_pagefile
win_partition:
redirect: community.windows.win_partition
win_pester:
redirect: community.windows.win_pester
win_power_plan:
redirect: community.windows.win_power_plan
win_product_facts:
redirect: community.windows.win_product_facts
win_psexec:
redirect: community.windows.win_psexec
win_psmodule:
redirect: community.windows.win_psmodule
win_psrepository:
redirect: community.windows.win_psrepository
win_psrepository_info:
redirect: community.windows.win_psrepository_info
win_rabbitmq_plugin:
redirect: community.windows.win_rabbitmq_plugin
win_rds_cap:
redirect: community.windows.win_rds_cap
win_rds_rap:
redirect: community.windows.win_rds_rap
win_rds_settings:
redirect: community.windows.win_rds_settings
win_region:
redirect: community.windows.win_region
win_regmerge:
redirect: community.windows.win_regmerge
win_robocopy:
redirect: community.windows.win_robocopy
win_route:
redirect: community.windows.win_route
win_say:
redirect: community.windows.win_say
win_scheduled_task:
redirect: community.windows.win_scheduled_task
win_scheduled_task_stat:
redirect: community.windows.win_scheduled_task_stat
win_security_policy:
redirect: community.windows.win_security_policy
win_shortcut:
redirect: community.windows.win_shortcut
win_snmp:
redirect: community.windows.win_snmp
win_timezone:
redirect: community.windows.win_timezone
win_toast:
redirect: community.windows.win_toast
win_unzip:
redirect: community.windows.win_unzip
win_user_profile:
redirect: community.windows.win_user_profile
win_wait_for_process:
redirect: community.windows.win_wait_for_process
win_wakeonlan:
redirect: community.windows.win_wakeonlan
win_webpicmd:
redirect: community.windows.win_webpicmd
win_xml:
redirect: community.windows.win_xml
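# Azure *_facts aliases and renamed Azure modules are routed to the community.azure collection.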
azure_rm_aks_facts:
redirect: community.azure.azure_rm_aks_facts
azure_rm_dnsrecordset_facts:
redirect: community.azure.azure_rm_dnsrecordset_facts
azure_rm_dnszone_facts:
redirect: community.azure.azure_rm_dnszone_facts
azure_rm_networkinterface_facts:
redirect: community.azure.azure_rm_networkinterface_facts
azure_rm_publicipaddress_facts:
redirect: community.azure.azure_rm_publicipaddress_facts
azure_rm_securitygroup_facts:
redirect: community.azure.azure_rm_securitygroup_facts
azure_rm_storageaccount_facts:
redirect: community.azure.azure_rm_storageaccount_facts
azure_rm_virtualmachine_facts:
redirect: community.azure.azure_rm_virtualmachine_facts
azure_rm_virtualnetwork_facts:
redirect: community.azure.azure_rm_virtualnetwork_facts
azure_rm_roledefinition_facts:
redirect: community.azure.azure_rm_roledefinition_facts
azure_rm_autoscale_facts:
redirect: community.azure.azure_rm_autoscale_facts
azure_rm_mysqldatabase_facts:
redirect: community.azure.azure_rm_mysqldatabase_facts
azure_rm_devtestlabschedule_facts:
redirect: community.azure.azure_rm_devtestlabschedule_facts
azure_rm_virtualmachinescaleset_facts:
redirect: community.azure.azure_rm_virtualmachinescaleset_facts
azure_rm_devtestlabcustomimage_facts:
redirect: community.azure.azure_rm_devtestlabcustomimage_facts
azure_rm_cosmosdbaccount_facts:
redirect: community.azure.azure_rm_cosmosdbaccount_facts
azure_rm_subnet_facts:
redirect: community.azure.azure_rm_subnet_facts
azure_rm_aksversion_facts:
redirect: community.azure.azure_rm_aksversion_facts
azure_rm_hdinsightcluster_facts:
redirect: community.azure.azure_rm_hdinsightcluster_facts
azure_rm_virtualmachinescalesetextension_facts:
redirect: community.azure.azure_rm_virtualmachinescalesetextension_facts
azure_rm_loadbalancer_facts:
redirect: community.azure.azure_rm_loadbalancer_facts
azure_rm_roleassignment_facts:
redirect: community.azure.azure_rm_roleassignment_facts
azure_rm_manageddisk_facts:
redirect: community.azure.azure_rm_manageddisk_facts
azure_rm_mysqlserver_facts:
redirect: community.azure.azure_rm_mysqlserver_facts
azure_rm_servicebus_facts:
redirect: community.azure.azure_rm_servicebus_facts
azure_rm_rediscache_facts:
redirect: community.azure.azure_rm_rediscache_facts
azure_rm_resource_facts:
redirect: community.azure.azure_rm_resource_facts
azure_rm_routetable_facts:
redirect: community.azure.azure_rm_routetable_facts
azure_rm_virtualmachine_extension:
redirect: community.azure.azure_rm_virtualmachine_extension
azure_rm_loganalyticsworkspace_facts:
redirect: community.azure.azure_rm_loganalyticsworkspace_facts
azure_rm_sqldatabase_facts:
redirect: community.azure.azure_rm_sqldatabase_facts
azure_rm_devtestlabartifactsource_facts:
redirect: community.azure.azure_rm_devtestlabartifactsource_facts
azure_rm_deployment_facts:
redirect: community.azure.azure_rm_deployment_facts
azure_rm_virtualmachineextension_facts:
redirect: community.azure.azure_rm_virtualmachineextension_facts
azure_rm_applicationsecuritygroup_facts:
redirect: community.azure.azure_rm_applicationsecuritygroup_facts
azure_rm_availabilityset_facts:
redirect: community.azure.azure_rm_availabilityset_facts
azure_rm_mariadbdatabase_facts:
redirect: community.azure.azure_rm_mariadbdatabase_facts
azure_rm_devtestlabenvironment_facts:
redirect: community.azure.azure_rm_devtestlabenvironment_facts
azure_rm_appserviceplan_facts:
redirect: community.azure.azure_rm_appserviceplan_facts
azure_rm_containerinstance_facts:
redirect: community.azure.azure_rm_containerinstance_facts
azure_rm_devtestlabarmtemplate_facts:
redirect: community.azure.azure_rm_devtestlabarmtemplate_facts
azure_rm_devtestlabartifact_facts:
redirect: community.azure.azure_rm_devtestlabartifact_facts
azure_rm_virtualmachinescalesetinstance_facts:
redirect: community.azure.azure_rm_virtualmachinescalesetinstance_facts
azure_rm_cdnendpoint_facts:
redirect: community.azure.azure_rm_cdnendpoint_facts
azure_rm_trafficmanagerprofile_facts:
redirect: community.azure.azure_rm_trafficmanagerprofile_facts
azure_rm_functionapp_facts:
redirect: community.azure.azure_rm_functionapp_facts
azure_rm_virtualmachineimage_facts:
redirect: community.azure.azure_rm_virtualmachineimage_facts
azure_rm_mariadbconfiguration_facts:
redirect: community.azure.azure_rm_mariadbconfiguration_facts
azure_rm_virtualnetworkpeering_facts:
redirect: community.azure.azure_rm_virtualnetworkpeering_facts
azure_rm_sqlserver_facts:
redirect: community.azure.azure_rm_sqlserver_facts
azure_rm_mariadbfirewallrule_facts:
redirect: community.azure.azure_rm_mariadbfirewallrule_facts
azure_rm_mysqlconfiguration_facts:
redirect: community.azure.azure_rm_mysqlconfiguration_facts
azure_rm_mysqlfirewallrule_facts:
redirect: community.azure.azure_rm_mysqlfirewallrule_facts
azure_rm_postgresqlfirewallrule_facts:
redirect: community.azure.azure_rm_postgresqlfirewallrule_facts
azure_rm_mariadbserver_facts:
redirect: community.azure.azure_rm_mariadbserver_facts
azure_rm_postgresqldatabase_facts:
redirect: community.azure.azure_rm_postgresqldatabase_facts
azure_rm_devtestlabvirtualnetwork_facts:
redirect: community.azure.azure_rm_devtestlabvirtualnetwork_facts
azure_rm_devtestlabpolicy_facts:
redirect: community.azure.azure_rm_devtestlabpolicy_facts
azure_rm_trafficmanagerendpoint_facts:
redirect: community.azure.azure_rm_trafficmanagerendpoint_facts
azure_rm_sqlfirewallrule_facts:
redirect: community.azure.azure_rm_sqlfirewallrule_facts
azure_rm_containerregistry_facts:
redirect: community.azure.azure_rm_containerregistry_facts
azure_rm_postgresqlconfiguration_facts:
redirect: community.azure.azure_rm_postgresqlconfiguration_facts
azure_rm_postgresqlserver_facts:
redirect: community.azure.azure_rm_postgresqlserver_facts
azure_rm_devtestlab_facts:
redirect: community.azure.azure_rm_devtestlab_facts
azure_rm_cdnprofile_facts:
redirect: community.azure.azure_rm_cdnprofile_facts
azure_rm_virtualmachine_scaleset:
redirect: community.azure.azure_rm_virtualmachine_scaleset
azure_rm_webapp_facts:
redirect: community.azure.azure_rm_webapp_facts
azure_rm_devtestlabvirtualmachine_facts:
redirect: community.azure.azure_rm_devtestlabvirtualmachine_facts
azure_rm_image_facts:
redirect: community.azure.azure_rm_image_facts
azure_rm_managed_disk:
redirect: community.azure.azure_rm_managed_disk
azure_rm_automationaccount_facts:
redirect: community.azure.azure_rm_automationaccount_facts
azure_rm_lock_facts:
redirect: community.azure.azure_rm_lock_facts
azure_rm_managed_disk_facts:
redirect: community.azure.azure_rm_managed_disk_facts
azure_rm_resourcegroup_facts:
redirect: community.azure.azure_rm_resourcegroup_facts
azure_rm_virtualmachine_scaleset_facts:
redirect: community.azure.azure_rm_virtualmachine_scaleset_facts
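# ServiceNow modules are routed to the servicenow.servicenow collection.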
snow_record:
redirect: servicenow.servicenow.snow_record
snow_record_find:
redirect: servicenow.servicenow.snow_record_find
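# Core AWS modules are routed to the amazon.aws collection.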
aws_az_facts:
redirect: amazon.aws.aws_az_facts
aws_caller_facts:
redirect: amazon.aws.aws_caller_facts
cloudformation_facts:
redirect: amazon.aws.cloudformation_facts
ec2_ami_facts:
redirect: amazon.aws.ec2_ami_facts
ec2_eni_facts:
redirect: amazon.aws.ec2_eni_facts
ec2_group_facts:
redirect: amazon.aws.ec2_group_facts
ec2_snapshot_facts:
redirect: amazon.aws.ec2_snapshot_facts
ec2_vol_facts:
redirect: amazon.aws.ec2_vol_facts
ec2_vpc_dhcp_option_facts:
redirect: amazon.aws.ec2_vpc_dhcp_option_facts
ec2_vpc_net_facts:
redirect: amazon.aws.ec2_vpc_net_facts
ec2_vpc_subnet_facts:
redirect: amazon.aws.ec2_vpc_subnet_facts
aws_az_info:
redirect: amazon.aws.aws_az_info
aws_caller_info:
redirect: amazon.aws.aws_caller_info
aws_s3:
redirect: amazon.aws.aws_s3
cloudformation:
redirect: amazon.aws.cloudformation
cloudformation_info:
redirect: amazon.aws.cloudformation_info
ec2:
redirect: amazon.aws.ec2
ec2_ami:
redirect: amazon.aws.ec2_ami
ec2_ami_info:
redirect: amazon.aws.ec2_ami_info
ec2_elb_lb:
redirect: amazon.aws.ec2_elb_lb
ec2_eni:
redirect: amazon.aws.ec2_eni
ec2_eni_info:
redirect: amazon.aws.ec2_eni_info
ec2_group:
redirect: amazon.aws.ec2_group
ec2_group_info:
redirect: amazon.aws.ec2_group_info
ec2_key:
redirect: amazon.aws.ec2_key
ec2_metadata_facts:
redirect: amazon.aws.ec2_metadata_facts
ec2_snapshot:
redirect: amazon.aws.ec2_snapshot
ec2_snapshot_info:
redirect: amazon.aws.ec2_snapshot_info
ec2_tag:
redirect: amazon.aws.ec2_tag
ec2_tag_info:
redirect: amazon.aws.ec2_tag_info
ec2_vol:
redirect: amazon.aws.ec2_vol
ec2_vol_info:
redirect: amazon.aws.ec2_vol_info
ec2_vpc_dhcp_option:
redirect: amazon.aws.ec2_vpc_dhcp_option
ec2_vpc_dhcp_option_info:
redirect: amazon.aws.ec2_vpc_dhcp_option_info
ec2_vpc_net:
redirect: amazon.aws.ec2_vpc_net
ec2_vpc_net_info:
redirect: amazon.aws.ec2_vpc_net_info
ec2_vpc_subnet:
redirect: amazon.aws.ec2_vpc_subnet
ec2_vpc_subnet_info:
redirect: amazon.aws.ec2_vpc_subnet_info
s3_bucket:
redirect: amazon.aws.s3_bucket
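# Platform-agnostic networking modules are routed to the ansible.netcommon collection.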
telnet:
redirect: ansible.netcommon.telnet
cli_command:
redirect: ansible.netcommon.cli_command
cli_config:
redirect: ansible.netcommon.cli_config
net_put:
redirect: ansible.netcommon.net_put
net_get:
redirect: ansible.netcommon.net_get
net_linkagg:
redirect: ansible.netcommon.net_linkagg
net_interface:
redirect: ansible.netcommon.net_interface
net_lldp_interface:
redirect: ansible.netcommon.net_lldp_interface
net_vlan:
redirect: ansible.netcommon.net_vlan
net_l2_interface:
redirect: ansible.netcommon.net_l2_interface
net_l3_interface:
redirect: ansible.netcommon.net_l3_interface
net_vrf:
redirect: ansible.netcommon.net_vrf
netconf_config:
redirect: ansible.netcommon.netconf_config
netconf_rpc:
redirect: ansible.netcommon.netconf_rpc
netconf_get:
redirect: ansible.netcommon.netconf_get
net_lldp:
redirect: ansible.netcommon.net_lldp
restconf_get:
redirect: ansible.netcommon.restconf_get
restconf_config:
redirect: ansible.netcommon.restconf_config
net_static_route:
redirect: ansible.netcommon.net_static_route
net_system:
redirect: ansible.netcommon.net_system
net_logging:
redirect: ansible.netcommon.net_logging
net_user:
redirect: ansible.netcommon.net_user
net_ping:
redirect: ansible.netcommon.net_ping
net_banner:
redirect: ansible.netcommon.net_banner
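# POSIX system modules are routed to the ansible.posix collection.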
acl:
redirect: ansible.posix.acl
synchronize:
redirect: ansible.posix.synchronize
at:
redirect: ansible.posix.at
authorized_key:
redirect: ansible.posix.authorized_key
mount:
redirect: ansible.posix.mount
seboolean:
redirect: ansible.posix.seboolean
selinux:
redirect: ansible.posix.selinux
sysctl:
redirect: ansible.posix.sysctl
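# Core Windows modules are routed to the ansible.windows collection.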
async_status:
redirect: ansible.windows.async_status
setup:
redirect: ansible.windows.setup
slurp:
redirect: ansible.windows.slurp
win_acl:
redirect: ansible.windows.win_acl
win_acl_inheritance:
redirect: ansible.windows.win_acl_inheritance
win_certificate_store:
redirect: ansible.windows.win_certificate_store
win_command:
redirect: ansible.windows.win_command
win_copy:
redirect: ansible.windows.win_copy
win_dns_client:
redirect: ansible.windows.win_dns_client
win_domain:
redirect: ansible.windows.win_domain
win_domain_controller:
redirect: ansible.windows.win_domain_controller
win_domain_membership:
redirect: ansible.windows.win_domain_membership
win_dsc:
redirect: ansible.windows.win_dsc
win_environment:
redirect: ansible.windows.win_environment
win_feature:
redirect: ansible.windows.win_feature
win_file:
redirect: ansible.windows.win_file
win_find:
redirect: ansible.windows.win_find
win_get_url:
redirect: ansible.windows.win_get_url
win_group:
redirect: ansible.windows.win_group
win_group_membership:
redirect: ansible.windows.win_group_membership
win_hostname:
redirect: ansible.windows.win_hostname
win_optional_feature:
redirect: ansible.windows.win_optional_feature
win_owner:
redirect: ansible.windows.win_owner
win_package:
redirect: ansible.windows.win_package
win_path:
redirect: ansible.windows.win_path
win_ping:
redirect: ansible.windows.win_ping
win_reboot:
redirect: ansible.windows.win_reboot
win_reg_stat:
redirect: ansible.windows.win_reg_stat
win_regedit:
redirect: ansible.windows.win_regedit
win_service:
redirect: ansible.windows.win_service
win_service_info:
redirect: ansible.windows.win_service_info
win_share:
redirect: ansible.windows.win_share
win_shell:
redirect: ansible.windows.win_shell
win_stat:
redirect: ansible.windows.win_stat
win_tempfile:
redirect: ansible.windows.win_tempfile
win_template:
redirect: ansible.windows.win_template
win_updates:
redirect: ansible.windows.win_updates
win_uri:
redirect: ansible.windows.win_uri
win_user:
redirect: ansible.windows.win_user
win_user_right:
redirect: ansible.windows.win_user_right
win_wait_for:
redirect: ansible.windows.win_wait_for
win_whoami:
redirect: ansible.windows.win_whoami
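# FortiOS modules are routed to the fortinet.fortios collection.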
fortios_address:
redirect: fortinet.fortios.fortios_address
fortios_alertemail_setting:
redirect: fortinet.fortios.fortios_alertemail_setting
fortios_antivirus_heuristic:
redirect: fortinet.fortios.fortios_antivirus_heuristic
fortios_antivirus_profile:
redirect: fortinet.fortios.fortios_antivirus_profile
fortios_antivirus_quarantine:
redirect: fortinet.fortios.fortios_antivirus_quarantine
fortios_antivirus_settings:
redirect: fortinet.fortios.fortios_antivirus_settings
fortios_application_custom:
redirect: fortinet.fortios.fortios_application_custom
fortios_application_group:
redirect: fortinet.fortios.fortios_application_group
fortios_application_list:
redirect: fortinet.fortios.fortios_application_list
fortios_application_name:
redirect: fortinet.fortios.fortios_application_name
fortios_application_rule_settings:
redirect: fortinet.fortios.fortios_application_rule_settings
fortios_authentication_rule:
redirect: fortinet.fortios.fortios_authentication_rule
fortios_authentication_scheme:
redirect: fortinet.fortios.fortios_authentication_scheme
fortios_authentication_setting:
redirect: fortinet.fortios.fortios_authentication_setting
fortios_config:
redirect: fortinet.fortios.fortios_config
fortios_dlp_filepattern:
redirect: fortinet.fortios.fortios_dlp_filepattern
fortios_dlp_fp_doc_source:
redirect: fortinet.fortios.fortios_dlp_fp_doc_source
fortios_dlp_fp_sensitivity:
redirect: fortinet.fortios.fortios_dlp_fp_sensitivity
fortios_dlp_sensor:
redirect: fortinet.fortios.fortios_dlp_sensor
fortios_dlp_settings:
redirect: fortinet.fortios.fortios_dlp_settings
fortios_dnsfilter_domain_filter:
redirect: fortinet.fortios.fortios_dnsfilter_domain_filter
fortios_dnsfilter_profile:
redirect: fortinet.fortios.fortios_dnsfilter_profile
fortios_endpoint_control_client:
redirect: fortinet.fortios.fortios_endpoint_control_client
fortios_endpoint_control_forticlient_ems:
redirect: fortinet.fortios.fortios_endpoint_control_forticlient_ems
fortios_endpoint_control_forticlient_registration_sync:
redirect: fortinet.fortios.fortios_endpoint_control_forticlient_registration_sync
fortios_endpoint_control_profile:
redirect: fortinet.fortios.fortios_endpoint_control_profile
fortios_endpoint_control_settings:
redirect: fortinet.fortios.fortios_endpoint_control_settings
fortios_extender_controller_extender:
redirect: fortinet.fortios.fortios_extender_controller_extender
fortios_facts:
redirect: fortinet.fortios.fortios_facts
fortios_firewall_address:
redirect: fortinet.fortios.fortios_firewall_address
fortios_firewall_address6:
redirect: fortinet.fortios.fortios_firewall_address6
fortios_firewall_address6_template:
redirect: fortinet.fortios.fortios_firewall_address6_template
fortios_firewall_addrgrp:
redirect: fortinet.fortios.fortios_firewall_addrgrp
fortios_firewall_addrgrp6:
redirect: fortinet.fortios.fortios_firewall_addrgrp6
fortios_firewall_auth_portal:
redirect: fortinet.fortios.fortios_firewall_auth_portal
fortios_firewall_central_snat_map:
redirect: fortinet.fortios.fortios_firewall_central_snat_map
fortios_firewall_DoS_policy:
redirect: fortinet.fortios.fortios_firewall_DoS_policy
fortios_firewall_DoS_policy6:
redirect: fortinet.fortios.fortios_firewall_DoS_policy6
fortios_firewall_dnstranslation:
redirect: fortinet.fortios.fortios_firewall_dnstranslation
fortios_firewall_identity_based_route:
redirect: fortinet.fortios.fortios_firewall_identity_based_route
fortios_firewall_interface_policy:
redirect: fortinet.fortios.fortios_firewall_interface_policy
fortios_firewall_interface_policy6:
redirect: fortinet.fortios.fortios_firewall_interface_policy6
fortios_firewall_internet_service:
redirect: fortinet.fortios.fortios_firewall_internet_service
fortios_firewall_internet_service_custom:
redirect: fortinet.fortios.fortios_firewall_internet_service_custom
fortios_firewall_internet_service_group:
redirect: fortinet.fortios.fortios_firewall_internet_service_group
fortios_firewall_ip_translation:
redirect: fortinet.fortios.fortios_firewall_ip_translation
fortios_firewall_ipmacbinding_setting:
redirect: fortinet.fortios.fortios_firewall_ipmacbinding_setting
fortios_firewall_ipmacbinding_table:
redirect: fortinet.fortios.fortios_firewall_ipmacbinding_table
fortios_firewall_ippool:
redirect: fortinet.fortios.fortios_firewall_ippool
fortios_firewall_ippool6:
redirect: fortinet.fortios.fortios_firewall_ippool6
fortios_firewall_ipv6_eh_filter:
redirect: fortinet.fortios.fortios_firewall_ipv6_eh_filter
fortios_firewall_ldb_monitor:
redirect: fortinet.fortios.fortios_firewall_ldb_monitor
fortios_firewall_local_in_policy:
redirect: fortinet.fortios.fortios_firewall_local_in_policy
fortios_firewall_local_in_policy6:
redirect: fortinet.fortios.fortios_firewall_local_in_policy6
fortios_firewall_multicast_address:
redirect: fortinet.fortios.fortios_firewall_multicast_address
fortios_firewall_multicast_address6:
redirect: fortinet.fortios.fortios_firewall_multicast_address6
fortios_firewall_multicast_policy:
redirect: fortinet.fortios.fortios_firewall_multicast_policy
fortios_firewall_multicast_policy6:
redirect: fortinet.fortios.fortios_firewall_multicast_policy6
fortios_firewall_policy:
redirect: fortinet.fortios.fortios_firewall_policy
fortios_firewall_policy46:
redirect: fortinet.fortios.fortios_firewall_policy46
fortios_firewall_policy6:
redirect: fortinet.fortios.fortios_firewall_policy6
fortios_firewall_policy64:
redirect: fortinet.fortios.fortios_firewall_policy64
fortios_firewall_profile_group:
redirect: fortinet.fortios.fortios_firewall_profile_group
fortios_firewall_profile_protocol_options:
redirect: fortinet.fortios.fortios_firewall_profile_protocol_options
fortios_firewall_proxy_address:
redirect: fortinet.fortios.fortios_firewall_proxy_address
fortios_firewall_proxy_addrgrp:
redirect: fortinet.fortios.fortios_firewall_proxy_addrgrp
fortios_firewall_proxy_policy:
redirect: fortinet.fortios.fortios_firewall_proxy_policy
fortios_firewall_schedule_group:
redirect: fortinet.fortios.fortios_firewall_schedule_group
fortios_firewall_schedule_onetime:
redirect: fortinet.fortios.fortios_firewall_schedule_onetime
fortios_firewall_schedule_recurring:
redirect: fortinet.fortios.fortios_firewall_schedule_recurring
fortios_firewall_service_category:
redirect: fortinet.fortios.fortios_firewall_service_category
fortios_firewall_service_custom:
redirect: fortinet.fortios.fortios_firewall_service_custom
fortios_firewall_service_group:
redirect: fortinet.fortios.fortios_firewall_service_group
fortios_firewall_shaper_per_ip_shaper:
redirect: fortinet.fortios.fortios_firewall_shaper_per_ip_shaper
fortios_firewall_shaper_traffic_shaper:
redirect: fortinet.fortios.fortios_firewall_shaper_traffic_shaper
fortios_firewall_shaping_policy:
redirect: fortinet.fortios.fortios_firewall_shaping_policy
fortios_firewall_shaping_profile:
redirect: fortinet.fortios.fortios_firewall_shaping_profile
fortios_firewall_sniffer:
redirect: fortinet.fortios.fortios_firewall_sniffer
fortios_firewall_ssh_host_key:
redirect: fortinet.fortios.fortios_firewall_ssh_host_key
fortios_firewall_ssh_local_ca:
redirect: fortinet.fortios.fortios_firewall_ssh_local_ca
fortios_firewall_ssh_local_key:
redirect: fortinet.fortios.fortios_firewall_ssh_local_key
fortios_firewall_ssh_setting:
redirect: fortinet.fortios.fortios_firewall_ssh_setting
fortios_firewall_ssl_server:
redirect: fortinet.fortios.fortios_firewall_ssl_server
fortios_firewall_ssl_setting:
redirect: fortinet.fortios.fortios_firewall_ssl_setting
fortios_firewall_ssl_ssh_profile:
redirect: fortinet.fortios.fortios_firewall_ssl_ssh_profile
fortios_firewall_ttl_policy:
redirect: fortinet.fortios.fortios_firewall_ttl_policy
fortios_firewall_vip:
redirect: fortinet.fortios.fortios_firewall_vip
fortios_firewall_vip46:
redirect: fortinet.fortios.fortios_firewall_vip46
fortios_firewall_vip6:
redirect: fortinet.fortios.fortios_firewall_vip6
fortios_firewall_vip64:
redirect: fortinet.fortios.fortios_firewall_vip64
fortios_firewall_vipgrp:
redirect: fortinet.fortios.fortios_firewall_vipgrp
fortios_firewall_vipgrp46:
redirect: fortinet.fortios.fortios_firewall_vipgrp46
fortios_firewall_vipgrp6:
redirect: fortinet.fortios.fortios_firewall_vipgrp6
fortios_firewall_vipgrp64:
redirect: fortinet.fortios.fortios_firewall_vipgrp64
fortios_firewall_wildcard_fqdn_custom:
redirect: fortinet.fortios.fortios_firewall_wildcard_fqdn_custom
fortios_firewall_wildcard_fqdn_group:
redirect: fortinet.fortios.fortios_firewall_wildcard_fqdn_group
fortios_ftp_proxy_explicit:
redirect: fortinet.fortios.fortios_ftp_proxy_explicit
fortios_icap_profile:
redirect: fortinet.fortios.fortios_icap_profile
fortios_icap_server:
redirect: fortinet.fortios.fortios_icap_server
fortios_ips_custom:
redirect: fortinet.fortios.fortios_ips_custom
fortios_ips_decoder:
redirect: fortinet.fortios.fortios_ips_decoder
fortios_ips_global:
redirect: fortinet.fortios.fortios_ips_global
fortios_ips_rule:
redirect: fortinet.fortios.fortios_ips_rule
fortios_ips_rule_settings:
redirect: fortinet.fortios.fortios_ips_rule_settings
fortios_ips_sensor:
redirect: fortinet.fortios.fortios_ips_sensor
fortios_ips_settings:
redirect: fortinet.fortios.fortios_ips_settings
fortios_ipv4_policy:
redirect: fortinet.fortios.fortios_ipv4_policy
fortios_log_custom_field:
redirect: fortinet.fortios.fortios_log_custom_field
fortios_log_disk_filter:
redirect: fortinet.fortios.fortios_log_disk_filter
fortios_log_disk_setting:
redirect: fortinet.fortios.fortios_log_disk_setting
fortios_log_eventfilter:
redirect: fortinet.fortios.fortios_log_eventfilter
fortios_log_fortianalyzer2_filter:
redirect: fortinet.fortios.fortios_log_fortianalyzer2_filter
fortios_log_fortianalyzer2_setting:
redirect: fortinet.fortios.fortios_log_fortianalyzer2_setting
fortios_log_fortianalyzer3_filter:
redirect: fortinet.fortios.fortios_log_fortianalyzer3_filter
fortios_log_fortianalyzer3_setting:
redirect: fortinet.fortios.fortios_log_fortianalyzer3_setting
fortios_log_fortianalyzer_filter:
redirect: fortinet.fortios.fortios_log_fortianalyzer_filter
fortios_log_fortianalyzer_override_filter:
redirect: fortinet.fortios.fortios_log_fortianalyzer_override_filter
fortios_log_fortianalyzer_override_setting:
redirect: fortinet.fortios.fortios_log_fortianalyzer_override_setting
fortios_log_fortianalyzer_setting:
redirect: fortinet.fortios.fortios_log_fortianalyzer_setting
fortios_log_fortiguard_filter:
redirect: fortinet.fortios.fortios_log_fortiguard_filter
fortios_log_fortiguard_override_filter:
redirect: fortinet.fortios.fortios_log_fortiguard_override_filter
fortios_log_fortiguard_override_setting:
redirect: fortinet.fortios.fortios_log_fortiguard_override_setting
fortios_log_fortiguard_setting:
redirect: fortinet.fortios.fortios_log_fortiguard_setting
fortios_log_gui_display:
redirect: fortinet.fortios.fortios_log_gui_display
fortios_log_memory_filter:
redirect: fortinet.fortios.fortios_log_memory_filter
fortios_log_memory_global_setting:
redirect: fortinet.fortios.fortios_log_memory_global_setting
fortios_log_memory_setting:
redirect: fortinet.fortios.fortios_log_memory_setting
fortios_log_null_device_filter:
redirect: fortinet.fortios.fortios_log_null_device_filter
fortios_log_null_device_setting:
redirect: fortinet.fortios.fortios_log_null_device_setting
fortios_log_setting:
redirect: fortinet.fortios.fortios_log_setting
fortios_log_syslogd2_filter:
redirect: fortinet.fortios.fortios_log_syslogd2_filter
fortios_log_syslogd2_setting:
redirect: fortinet.fortios.fortios_log_syslogd2_setting
fortios_log_syslogd3_filter:
redirect: fortinet.fortios.fortios_log_syslogd3_filter
fortios_log_syslogd3_setting:
redirect: fortinet.fortios.fortios_log_syslogd3_setting
fortios_log_syslogd4_filter:
redirect: fortinet.fortios.fortios_log_syslogd4_filter
fortios_log_syslogd4_setting:
redirect: fortinet.fortios.fortios_log_syslogd4_setting
fortios_log_syslogd_filter:
redirect: fortinet.fortios.fortios_log_syslogd_filter
fortios_log_syslogd_override_filter:
redirect: fortinet.fortios.fortios_log_syslogd_override_filter
fortios_log_syslogd_override_setting:
redirect: fortinet.fortios.fortios_log_syslogd_override_setting
fortios_log_syslogd_setting:
redirect: fortinet.fortios.fortios_log_syslogd_setting
fortios_log_threat_weight:
redirect: fortinet.fortios.fortios_log_threat_weight
fortios_log_webtrends_filter:
redirect: fortinet.fortios.fortios_log_webtrends_filter
fortios_log_webtrends_setting:
redirect: fortinet.fortios.fortios_log_webtrends_setting
fortios_report_chart:
redirect: fortinet.fortios.fortios_report_chart
fortios_report_dataset:
redirect: fortinet.fortios.fortios_report_dataset
fortios_report_layout:
redirect: fortinet.fortios.fortios_report_layout
fortios_report_setting:
redirect: fortinet.fortios.fortios_report_setting
fortios_report_style:
redirect: fortinet.fortios.fortios_report_style
fortios_report_theme:
redirect: fortinet.fortios.fortios_report_theme
fortios_router_access_list:
redirect: fortinet.fortios.fortios_router_access_list
fortios_router_access_list6:
redirect: fortinet.fortios.fortios_router_access_list6
fortios_router_aspath_list:
redirect: fortinet.fortios.fortios_router_aspath_list
fortios_router_auth_path:
redirect: fortinet.fortios.fortios_router_auth_path
fortios_router_bfd:
redirect: fortinet.fortios.fortios_router_bfd
fortios_router_bfd6:
redirect: fortinet.fortios.fortios_router_bfd6
fortios_router_bgp:
redirect: fortinet.fortios.fortios_router_bgp
fortios_router_community_list:
redirect: fortinet.fortios.fortios_router_community_list
fortios_router_isis:
redirect: fortinet.fortios.fortios_router_isis
fortios_router_key_chain:
redirect: fortinet.fortios.fortios_router_key_chain
fortios_router_multicast:
redirect: fortinet.fortios.fortios_router_multicast
fortios_router_multicast6:
redirect: fortinet.fortios.fortios_router_multicast6
fortios_router_multicast_flow:
redirect: fortinet.fortios.fortios_router_multicast_flow
fortios_router_ospf:
redirect: fortinet.fortios.fortios_router_ospf
fortios_router_ospf6:
redirect: fortinet.fortios.fortios_router_ospf6
fortios_router_policy:
redirect: fortinet.fortios.fortios_router_policy
fortios_router_policy6:
redirect: fortinet.fortios.fortios_router_policy6
fortios_router_prefix_list:
redirect: fortinet.fortios.fortios_router_prefix_list
fortios_router_prefix_list6:
redirect: fortinet.fortios.fortios_router_prefix_list6
fortios_router_rip:
redirect: fortinet.fortios.fortios_router_rip
fortios_router_ripng:
redirect: fortinet.fortios.fortios_router_ripng
fortios_router_route_map:
redirect: fortinet.fortios.fortios_router_route_map
fortios_router_setting:
redirect: fortinet.fortios.fortios_router_setting
fortios_router_static:
redirect: fortinet.fortios.fortios_router_static
fortios_router_static6:
redirect: fortinet.fortios.fortios_router_static6
fortios_spamfilter_bwl:
redirect: fortinet.fortios.fortios_spamfilter_bwl
fortios_spamfilter_bword:
redirect: fortinet.fortios.fortios_spamfilter_bword
fortios_spamfilter_dnsbl:
redirect: fortinet.fortios.fortios_spamfilter_dnsbl
fortios_spamfilter_fortishield:
redirect: fortinet.fortios.fortios_spamfilter_fortishield
fortios_spamfilter_iptrust:
redirect: fortinet.fortios.fortios_spamfilter_iptrust
fortios_spamfilter_mheader:
redirect: fortinet.fortios.fortios_spamfilter_mheader
fortios_spamfilter_options:
redirect: fortinet.fortios.fortios_spamfilter_options
fortios_spamfilter_profile:
redirect: fortinet.fortios.fortios_spamfilter_profile
fortios_ssh_filter_profile:
redirect: fortinet.fortios.fortios_ssh_filter_profile
fortios_switch_controller_802_1X_settings:
redirect: fortinet.fortios.fortios_switch_controller_802_1X_settings
fortios_switch_controller_custom_command:
redirect: fortinet.fortios.fortios_switch_controller_custom_command
fortios_switch_controller_global:
redirect: fortinet.fortios.fortios_switch_controller_global
fortios_switch_controller_igmp_snooping:
redirect: fortinet.fortios.fortios_switch_controller_igmp_snooping
fortios_switch_controller_lldp_profile:
redirect: fortinet.fortios.fortios_switch_controller_lldp_profile
fortios_switch_controller_lldp_settings:
redirect: fortinet.fortios.fortios_switch_controller_lldp_settings
fortios_switch_controller_mac_sync_settings:
redirect: fortinet.fortios.fortios_switch_controller_mac_sync_settings
fortios_switch_controller_managed_switch:
redirect: fortinet.fortios.fortios_switch_controller_managed_switch
fortios_switch_controller_network_monitor_settings:
redirect: fortinet.fortios.fortios_switch_controller_network_monitor_settings
fortios_switch_controller_qos_dot1p_map:
redirect: fortinet.fortios.fortios_switch_controller_qos_dot1p_map
fortios_switch_controller_qos_ip_dscp_map:
redirect: fortinet.fortios.fortios_switch_controller_qos_ip_dscp_map
fortios_switch_controller_qos_qos_policy:
redirect: fortinet.fortios.fortios_switch_controller_qos_qos_policy
fortios_switch_controller_qos_queue_policy:
redirect: fortinet.fortios.fortios_switch_controller_qos_queue_policy
fortios_switch_controller_quarantine:
redirect: fortinet.fortios.fortios_switch_controller_quarantine
fortios_switch_controller_security_policy_802_1X:
redirect: fortinet.fortios.fortios_switch_controller_security_policy_802_1X
fortios_switch_controller_security_policy_captive_portal:
redirect: fortinet.fortios.fortios_switch_controller_security_policy_captive_portal
fortios_switch_controller_sflow:
redirect: fortinet.fortios.fortios_switch_controller_sflow
fortios_switch_controller_storm_control:
redirect: fortinet.fortios.fortios_switch_controller_storm_control
fortios_switch_controller_stp_settings:
redirect: fortinet.fortios.fortios_switch_controller_stp_settings
fortios_switch_controller_switch_group:
redirect: fortinet.fortios.fortios_switch_controller_switch_group
fortios_switch_controller_switch_interface_tag:
redirect: fortinet.fortios.fortios_switch_controller_switch_interface_tag
fortios_switch_controller_switch_log:
redirect: fortinet.fortios.fortios_switch_controller_switch_log
fortios_switch_controller_switch_profile:
redirect: fortinet.fortios.fortios_switch_controller_switch_profile
fortios_switch_controller_system:
redirect: fortinet.fortios.fortios_switch_controller_system
fortios_switch_controller_virtual_port_pool:
redirect: fortinet.fortios.fortios_switch_controller_virtual_port_pool
fortios_switch_controller_vlan:
redirect: fortinet.fortios.fortios_switch_controller_vlan
fortios_system_accprofile:
redirect: fortinet.fortios.fortios_system_accprofile
fortios_system_admin:
redirect: fortinet.fortios.fortios_system_admin
fortios_system_affinity_interrupt:
redirect: fortinet.fortios.fortios_system_affinity_interrupt
fortios_system_affinity_packet_redistribution:
redirect: fortinet.fortios.fortios_system_affinity_packet_redistribution
fortios_system_alarm:
redirect: fortinet.fortios.fortios_system_alarm
fortios_system_alias:
redirect: fortinet.fortios.fortios_system_alias
fortios_system_api_user:
redirect: fortinet.fortios.fortios_system_api_user
fortios_system_arp_table:
redirect: fortinet.fortios.fortios_system_arp_table
fortios_system_auto_install:
redirect: fortinet.fortios.fortios_system_auto_install
fortios_system_auto_script:
redirect: fortinet.fortios.fortios_system_auto_script
fortios_system_automation_action:
redirect: fortinet.fortios.fortios_system_automation_action
fortios_system_automation_destination:
redirect: fortinet.fortios.fortios_system_automation_destination
fortios_system_automation_stitch:
redirect: fortinet.fortios.fortios_system_automation_stitch
fortios_system_automation_trigger:
redirect: fortinet.fortios.fortios_system_automation_trigger
fortios_system_autoupdate_push_update:
redirect: fortinet.fortios.fortios_system_autoupdate_push_update
fortios_system_autoupdate_schedule:
redirect: fortinet.fortios.fortios_system_autoupdate_schedule
fortios_system_autoupdate_tunneling:
redirect: fortinet.fortios.fortios_system_autoupdate_tunneling
fortios_system_central_management:
redirect: fortinet.fortios.fortios_system_central_management
fortios_system_cluster_sync:
redirect: fortinet.fortios.fortios_system_cluster_sync
fortios_system_console:
redirect: fortinet.fortios.fortios_system_console
fortios_system_csf:
redirect: fortinet.fortios.fortios_system_csf
fortios_system_custom_language:
redirect: fortinet.fortios.fortios_system_custom_language
fortios_system_ddns:
redirect: fortinet.fortios.fortios_system_ddns
fortios_system_dedicated_mgmt:
redirect: fortinet.fortios.fortios_system_dedicated_mgmt
fortios_system_dhcp6_server:
redirect: fortinet.fortios.fortios_system_dhcp6_server
fortios_system_dhcp_server:
redirect: fortinet.fortios.fortios_system_dhcp_server
fortios_system_dns:
redirect: fortinet.fortios.fortios_system_dns
fortios_system_dns_database:
redirect: fortinet.fortios.fortios_system_dns_database
fortios_system_dns_server:
redirect: fortinet.fortios.fortios_system_dns_server
fortios_system_dscp_based_priority:
redirect: fortinet.fortios.fortios_system_dscp_based_priority
fortios_system_email_server:
redirect: fortinet.fortios.fortios_system_email_server
fortios_system_external_resource:
redirect: fortinet.fortios.fortios_system_external_resource
fortios_system_fips_cc:
redirect: fortinet.fortios.fortios_system_fips_cc
fortios_system_firmware_upgrade:
redirect: fortinet.fortios.fortios_system_firmware_upgrade
fortios_system_fm:
redirect: fortinet.fortios.fortios_system_fm
fortios_system_fortiguard:
redirect: fortinet.fortios.fortios_system_fortiguard
fortios_system_fortimanager:
redirect: fortinet.fortios.fortios_system_fortimanager
fortios_system_fortisandbox:
redirect: fortinet.fortios.fortios_system_fortisandbox
fortios_system_fsso_polling:
redirect: fortinet.fortios.fortios_system_fsso_polling
fortios_system_ftm_push:
redirect: fortinet.fortios.fortios_system_ftm_push
fortios_system_geoip_override:
redirect: fortinet.fortios.fortios_system_geoip_override
fortios_system_global:
redirect: fortinet.fortios.fortios_system_global
fortios_system_gre_tunnel:
redirect: fortinet.fortios.fortios_system_gre_tunnel
fortios_system_ha:
redirect: fortinet.fortios.fortios_system_ha
fortios_system_ha_monitor:
redirect: fortinet.fortios.fortios_system_ha_monitor
fortios_system_interface:
redirect: fortinet.fortios.fortios_system_interface
fortios_system_ipip_tunnel:
redirect: fortinet.fortios.fortios_system_ipip_tunnel
fortios_system_ips_urlfilter_dns:
redirect: fortinet.fortios.fortios_system_ips_urlfilter_dns
fortios_system_ips_urlfilter_dns6:
redirect: fortinet.fortios.fortios_system_ips_urlfilter_dns6
fortios_system_ipv6_neighbor_cache:
redirect: fortinet.fortios.fortios_system_ipv6_neighbor_cache
fortios_system_ipv6_tunnel:
redirect: fortinet.fortios.fortios_system_ipv6_tunnel
fortios_system_link_monitor:
redirect: fortinet.fortios.fortios_system_link_monitor
fortios_system_mac_address_table:
redirect: fortinet.fortios.fortios_system_mac_address_table
fortios_system_management_tunnel:
redirect: fortinet.fortios.fortios_system_management_tunnel
fortios_system_mobile_tunnel:
redirect: fortinet.fortios.fortios_system_mobile_tunnel
fortios_system_nat64:
redirect: fortinet.fortios.fortios_system_nat64
fortios_system_nd_proxy:
redirect: fortinet.fortios.fortios_system_nd_proxy
fortios_system_netflow:
redirect: fortinet.fortios.fortios_system_netflow
fortios_system_network_visibility:
redirect: fortinet.fortios.fortios_system_network_visibility
fortios_system_ntp:
redirect: fortinet.fortios.fortios_system_ntp
fortios_system_object_tagging:
redirect: fortinet.fortios.fortios_system_object_tagging
fortios_system_password_policy:
redirect: fortinet.fortios.fortios_system_password_policy
fortios_system_password_policy_guest_admin:
redirect: fortinet.fortios.fortios_system_password_policy_guest_admin
fortios_system_pppoe_interface:
redirect: fortinet.fortios.fortios_system_pppoe_interface
fortios_system_probe_response:
redirect: fortinet.fortios.fortios_system_probe_response
fortios_system_proxy_arp:
redirect: fortinet.fortios.fortios_system_proxy_arp
fortios_system_replacemsg_admin:
redirect: fortinet.fortios.fortios_system_replacemsg_admin
fortios_system_replacemsg_alertmail:
redirect: fortinet.fortios.fortios_system_replacemsg_alertmail
fortios_system_replacemsg_auth:
redirect: fortinet.fortios.fortios_system_replacemsg_auth
fortios_system_replacemsg_device_detection_portal:
redirect: fortinet.fortios.fortios_system_replacemsg_device_detection_portal
fortios_system_replacemsg_ec:
redirect: fortinet.fortios.fortios_system_replacemsg_ec
fortios_system_replacemsg_fortiguard_wf:
redirect: fortinet.fortios.fortios_system_replacemsg_fortiguard_wf
fortios_system_replacemsg_ftp:
redirect: fortinet.fortios.fortios_system_replacemsg_ftp
fortios_system_replacemsg_group:
redirect: fortinet.fortios.fortios_system_replacemsg_group
fortios_system_replacemsg_http:
redirect: fortinet.fortios.fortios_system_replacemsg_http
fortios_system_replacemsg_icap:
redirect: fortinet.fortios.fortios_system_replacemsg_icap
fortios_system_replacemsg_image:
redirect: fortinet.fortios.fortios_system_replacemsg_image
fortios_system_replacemsg_mail:
redirect: fortinet.fortios.fortios_system_replacemsg_mail
fortios_system_replacemsg_nac_quar:
redirect: fortinet.fortios.fortios_system_replacemsg_nac_quar
fortios_system_replacemsg_nntp:
redirect: fortinet.fortios.fortios_system_replacemsg_nntp
fortios_system_replacemsg_spam:
redirect: fortinet.fortios.fortios_system_replacemsg_spam
fortios_system_replacemsg_sslvpn:
redirect: fortinet.fortios.fortios_system_replacemsg_sslvpn
fortios_system_replacemsg_traffic_quota:
redirect: fortinet.fortios.fortios_system_replacemsg_traffic_quota
fortios_system_replacemsg_utm:
redirect: fortinet.fortios.fortios_system_replacemsg_utm
fortios_system_replacemsg_webproxy:
redirect: fortinet.fortios.fortios_system_replacemsg_webproxy
fortios_system_resource_limits:
redirect: fortinet.fortios.fortios_system_resource_limits
fortios_system_sdn_connector:
redirect: fortinet.fortios.fortios_system_sdn_connector
fortios_system_session_helper:
redirect: fortinet.fortios.fortios_system_session_helper
fortios_system_session_ttl:
redirect: fortinet.fortios.fortios_system_session_ttl
fortios_system_settings:
redirect: fortinet.fortios.fortios_system_settings
fortios_system_sflow:
redirect: fortinet.fortios.fortios_system_sflow
fortios_system_sit_tunnel:
redirect: fortinet.fortios.fortios_system_sit_tunnel
fortios_system_sms_server:
redirect: fortinet.fortios.fortios_system_sms_server
fortios_system_snmp_community:
redirect: fortinet.fortios.fortios_system_snmp_community
fortios_system_snmp_sysinfo:
redirect: fortinet.fortios.fortios_system_snmp_sysinfo
fortios_system_snmp_user:
redirect: fortinet.fortios.fortios_system_snmp_user
fortios_system_storage:
redirect: fortinet.fortios.fortios_system_storage
fortios_system_switch_interface:
redirect: fortinet.fortios.fortios_system_switch_interface
fortios_system_tos_based_priority:
redirect: fortinet.fortios.fortios_system_tos_based_priority
fortios_system_vdom:
redirect: fortinet.fortios.fortios_system_vdom
fortios_system_vdom_dns:
redirect: fortinet.fortios.fortios_system_vdom_dns
fortios_system_vdom_exception:
redirect: fortinet.fortios.fortios_system_vdom_exception
fortios_system_vdom_link:
redirect: fortinet.fortios.fortios_system_vdom_link
fortios_system_vdom_netflow:
redirect: fortinet.fortios.fortios_system_vdom_netflow
fortios_system_vdom_property:
redirect: fortinet.fortios.fortios_system_vdom_property
fortios_system_vdom_radius_server:
redirect: fortinet.fortios.fortios_system_vdom_radius_server
fortios_system_vdom_sflow:
redirect: fortinet.fortios.fortios_system_vdom_sflow
fortios_system_virtual_wan_link:
redirect: fortinet.fortios.fortios_system_virtual_wan_link
fortios_system_virtual_wire_pair:
redirect: fortinet.fortios.fortios_system_virtual_wire_pair
fortios_system_vxlan:
redirect: fortinet.fortios.fortios_system_vxlan
fortios_system_wccp:
redirect: fortinet.fortios.fortios_system_wccp
fortios_system_zone:
redirect: fortinet.fortios.fortios_system_zone
fortios_user_adgrp:
redirect: fortinet.fortios.fortios_user_adgrp
fortios_user_device:
redirect: fortinet.fortios.fortios_user_device
fortios_user_device_access_list:
redirect: fortinet.fortios.fortios_user_device_access_list
fortios_user_device_category:
redirect: fortinet.fortios.fortios_user_device_category
fortios_user_device_group:
redirect: fortinet.fortios.fortios_user_device_group
fortios_user_domain_controller:
redirect: fortinet.fortios.fortios_user_domain_controller
fortios_user_fortitoken:
redirect: fortinet.fortios.fortios_user_fortitoken
fortios_user_fsso:
redirect: fortinet.fortios.fortios_user_fsso
fortios_user_fsso_polling:
redirect: fortinet.fortios.fortios_user_fsso_polling
fortios_user_group:
redirect: fortinet.fortios.fortios_user_group
fortios_user_krb_keytab:
redirect: fortinet.fortios.fortios_user_krb_keytab
fortios_user_ldap:
redirect: fortinet.fortios.fortios_user_ldap
fortios_user_local:
redirect: fortinet.fortios.fortios_user_local
fortios_user_password_policy:
redirect: fortinet.fortios.fortios_user_password_policy
fortios_user_peer:
redirect: fortinet.fortios.fortios_user_peer
fortios_user_peergrp:
redirect: fortinet.fortios.fortios_user_peergrp
fortios_user_pop3:
redirect: fortinet.fortios.fortios_user_pop3
fortios_user_quarantine:
redirect: fortinet.fortios.fortios_user_quarantine
fortios_user_radius:
redirect: fortinet.fortios.fortios_user_radius
fortios_user_security_exempt_list:
redirect: fortinet.fortios.fortios_user_security_exempt_list
fortios_user_setting:
redirect: fortinet.fortios.fortios_user_setting
fortios_user_tacacsplus:
redirect: fortinet.fortios.fortios_user_tacacsplus
fortios_voip_profile:
redirect: fortinet.fortios.fortios_voip_profile
fortios_vpn_certificate_ca:
redirect: fortinet.fortios.fortios_vpn_certificate_ca
fortios_vpn_certificate_crl:
redirect: fortinet.fortios.fortios_vpn_certificate_crl
fortios_vpn_certificate_local:
redirect: fortinet.fortios.fortios_vpn_certificate_local
fortios_vpn_certificate_ocsp_server:
redirect: fortinet.fortios.fortios_vpn_certificate_ocsp_server
fortios_vpn_certificate_remote:
redirect: fortinet.fortios.fortios_vpn_certificate_remote
fortios_vpn_certificate_setting:
redirect: fortinet.fortios.fortios_vpn_certificate_setting
fortios_vpn_ipsec_concentrator:
redirect: fortinet.fortios.fortios_vpn_ipsec_concentrator
fortios_vpn_ipsec_forticlient:
redirect: fortinet.fortios.fortios_vpn_ipsec_forticlient
fortios_vpn_ipsec_manualkey:
redirect: fortinet.fortios.fortios_vpn_ipsec_manualkey
fortios_vpn_ipsec_manualkey_interface:
redirect: fortinet.fortios.fortios_vpn_ipsec_manualkey_interface
fortios_vpn_ipsec_phase1:
redirect: fortinet.fortios.fortios_vpn_ipsec_phase1
fortios_vpn_ipsec_phase1_interface:
redirect: fortinet.fortios.fortios_vpn_ipsec_phase1_interface
fortios_vpn_ipsec_phase2:
redirect: fortinet.fortios.fortios_vpn_ipsec_phase2
fortios_vpn_ipsec_phase2_interface:
redirect: fortinet.fortios.fortios_vpn_ipsec_phase2_interface
fortios_vpn_l2tp:
redirect: fortinet.fortios.fortios_vpn_l2tp
fortios_vpn_pptp:
redirect: fortinet.fortios.fortios_vpn_pptp
fortios_vpn_ssl_settings:
redirect: fortinet.fortios.fortios_vpn_ssl_settings
fortios_vpn_ssl_web_host_check_software:
redirect: fortinet.fortios.fortios_vpn_ssl_web_host_check_software
fortios_vpn_ssl_web_portal:
redirect: fortinet.fortios.fortios_vpn_ssl_web_portal
fortios_vpn_ssl_web_realm:
redirect: fortinet.fortios.fortios_vpn_ssl_web_realm
fortios_vpn_ssl_web_user_bookmark:
redirect: fortinet.fortios.fortios_vpn_ssl_web_user_bookmark
fortios_vpn_ssl_web_user_group_bookmark:
redirect: fortinet.fortios.fortios_vpn_ssl_web_user_group_bookmark
fortios_waf_main_class:
redirect: fortinet.fortios.fortios_waf_main_class
fortios_waf_profile:
redirect: fortinet.fortios.fortios_waf_profile
fortios_waf_signature:
redirect: fortinet.fortios.fortios_waf_signature
fortios_waf_sub_class:
redirect: fortinet.fortios.fortios_waf_sub_class
fortios_wanopt_auth_group:
redirect: fortinet.fortios.fortios_wanopt_auth_group
fortios_wanopt_cache_service:
redirect: fortinet.fortios.fortios_wanopt_cache_service
fortios_wanopt_content_delivery_network_rule:
redirect: fortinet.fortios.fortios_wanopt_content_delivery_network_rule
fortios_wanopt_peer:
redirect: fortinet.fortios.fortios_wanopt_peer
fortios_wanopt_profile:
redirect: fortinet.fortios.fortios_wanopt_profile
fortios_wanopt_remote_storage:
redirect: fortinet.fortios.fortios_wanopt_remote_storage
fortios_wanopt_settings:
redirect: fortinet.fortios.fortios_wanopt_settings
fortios_wanopt_webcache:
redirect: fortinet.fortios.fortios_wanopt_webcache
fortios_web_proxy_debug_url:
redirect: fortinet.fortios.fortios_web_proxy_debug_url
fortios_web_proxy_explicit:
redirect: fortinet.fortios.fortios_web_proxy_explicit
fortios_web_proxy_forward_server:
redirect: fortinet.fortios.fortios_web_proxy_forward_server
fortios_web_proxy_forward_server_group:
redirect: fortinet.fortios.fortios_web_proxy_forward_server_group
fortios_web_proxy_global:
redirect: fortinet.fortios.fortios_web_proxy_global
fortios_web_proxy_profile:
redirect: fortinet.fortios.fortios_web_proxy_profile
fortios_web_proxy_url_match:
redirect: fortinet.fortios.fortios_web_proxy_url_match
fortios_web_proxy_wisp:
redirect: fortinet.fortios.fortios_web_proxy_wisp
fortios_webfilter:
redirect: fortinet.fortios.fortios_webfilter
fortios_webfilter_content:
redirect: fortinet.fortios.fortios_webfilter_content
fortios_webfilter_content_header:
redirect: fortinet.fortios.fortios_webfilter_content_header
fortios_webfilter_fortiguard:
redirect: fortinet.fortios.fortios_webfilter_fortiguard
fortios_webfilter_ftgd_local_cat:
redirect: fortinet.fortios.fortios_webfilter_ftgd_local_cat
fortios_webfilter_ftgd_local_rating:
redirect: fortinet.fortios.fortios_webfilter_ftgd_local_rating
fortios_webfilter_ips_urlfilter_cache_setting:
redirect: fortinet.fortios.fortios_webfilter_ips_urlfilter_cache_setting
fortios_webfilter_ips_urlfilter_setting:
redirect: fortinet.fortios.fortios_webfilter_ips_urlfilter_setting
fortios_webfilter_ips_urlfilter_setting6:
redirect: fortinet.fortios.fortios_webfilter_ips_urlfilter_setting6
fortios_webfilter_override:
redirect: fortinet.fortios.fortios_webfilter_override
fortios_webfilter_profile:
redirect: fortinet.fortios.fortios_webfilter_profile
fortios_webfilter_search_engine:
redirect: fortinet.fortios.fortios_webfilter_search_engine
fortios_webfilter_urlfilter:
redirect: fortinet.fortios.fortios_webfilter_urlfilter
fortios_wireless_controller_ap_status:
redirect: fortinet.fortios.fortios_wireless_controller_ap_status
fortios_wireless_controller_ble_profile:
redirect: fortinet.fortios.fortios_wireless_controller_ble_profile
fortios_wireless_controller_bonjour_profile:
redirect: fortinet.fortios.fortios_wireless_controller_bonjour_profile
fortios_wireless_controller_global:
redirect: fortinet.fortios.fortios_wireless_controller_global
fortios_wireless_controller_hotspot20_anqp_3gpp_cellular:
redirect: fortinet.fortios.fortios_wireless_controller_hotspot20_anqp_3gpp_cellular
fortios_wireless_controller_hotspot20_anqp_ip_address_type:
redirect: fortinet.fortios.fortios_wireless_controller_hotspot20_anqp_ip_address_type
fortios_wireless_controller_hotspot20_anqp_nai_realm:
redirect: fortinet.fortios.fortios_wireless_controller_hotspot20_anqp_nai_realm
fortios_wireless_controller_hotspot20_anqp_network_auth_type:
redirect: fortinet.fortios.fortios_wireless_controller_hotspot20_anqp_network_auth_type
fortios_wireless_controller_hotspot20_anqp_roaming_consortium:
redirect: fortinet.fortios.fortios_wireless_controller_hotspot20_anqp_roaming_consortium
fortios_wireless_controller_hotspot20_anqp_venue_name:
redirect: fortinet.fortios.fortios_wireless_controller_hotspot20_anqp_venue_name
fortios_wireless_controller_hotspot20_h2qp_conn_capability:
redirect: fortinet.fortios.fortios_wireless_controller_hotspot20_h2qp_conn_capability
fortios_wireless_controller_hotspot20_h2qp_operator_name:
redirect: fortinet.fortios.fortios_wireless_controller_hotspot20_h2qp_operator_name
fortios_wireless_controller_hotspot20_h2qp_osu_provider:
redirect: fortinet.fortios.fortios_wireless_controller_hotspot20_h2qp_osu_provider
fortios_wireless_controller_hotspot20_h2qp_wan_metric:
redirect: fortinet.fortios.fortios_wireless_controller_hotspot20_h2qp_wan_metric
fortios_wireless_controller_hotspot20_hs_profile:
redirect: fortinet.fortios.fortios_wireless_controller_hotspot20_hs_profile
fortios_wireless_controller_hotspot20_icon:
redirect: fortinet.fortios.fortios_wireless_controller_hotspot20_icon
fortios_wireless_controller_hotspot20_qos_map:
redirect: fortinet.fortios.fortios_wireless_controller_hotspot20_qos_map
fortios_wireless_controller_inter_controller:
redirect: fortinet.fortios.fortios_wireless_controller_inter_controller
fortios_wireless_controller_qos_profile:
redirect: fortinet.fortios.fortios_wireless_controller_qos_profile
fortios_wireless_controller_setting:
redirect: fortinet.fortios.fortios_wireless_controller_setting
fortios_wireless_controller_timers:
redirect: fortinet.fortios.fortios_wireless_controller_timers
fortios_wireless_controller_utm_profile:
redirect: fortinet.fortios.fortios_wireless_controller_utm_profile
fortios_wireless_controller_vap:
redirect: fortinet.fortios.fortios_wireless_controller_vap
fortios_wireless_controller_vap_group:
redirect: fortinet.fortios.fortios_wireless_controller_vap_group
fortios_wireless_controller_wids_profile:
redirect: fortinet.fortios.fortios_wireless_controller_wids_profile
fortios_wireless_controller_wtp:
redirect: fortinet.fortios.fortios_wireless_controller_wtp
fortios_wireless_controller_wtp_group:
redirect: fortinet.fortios.fortios_wireless_controller_wtp_group
fortios_wireless_controller_wtp_profile:
redirect: fortinet.fortios.fortios_wireless_controller_wtp_profile
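    # NetBox modules redirect to the netbox.netbox collection.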
netbox_device:
redirect: netbox.netbox.netbox_device
netbox_ip_address:
redirect: netbox.netbox.netbox_ip_address
netbox_interface:
redirect: netbox.netbox.netbox_interface
netbox_prefix:
redirect: netbox.netbox.netbox_prefix
netbox_site:
redirect: netbox.netbox.netbox_site
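    # NetApp modules redirect to the netapp.aws, netapp.elementsw, and netapp.ontap collections.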
aws_netapp_cvs_FileSystems:
redirect: netapp.aws.aws_netapp_cvs_FileSystems
aws_netapp_cvs_active_directory:
redirect: netapp.aws.aws_netapp_cvs_active_directory
aws_netapp_cvs_pool:
redirect: netapp.aws.aws_netapp_cvs_pool
aws_netapp_cvs_snapshots:
redirect: netapp.aws.aws_netapp_cvs_snapshots
na_elementsw_access_group:
redirect: netapp.elementsw.na_elementsw_access_group
na_elementsw_account:
redirect: netapp.elementsw.na_elementsw_account
na_elementsw_admin_users:
redirect: netapp.elementsw.na_elementsw_admin_users
na_elementsw_backup:
redirect: netapp.elementsw.na_elementsw_backup
na_elementsw_check_connections:
redirect: netapp.elementsw.na_elementsw_check_connections
na_elementsw_cluster:
redirect: netapp.elementsw.na_elementsw_cluster
na_elementsw_cluster_config:
redirect: netapp.elementsw.na_elementsw_cluster_config
na_elementsw_cluster_pair:
redirect: netapp.elementsw.na_elementsw_cluster_pair
na_elementsw_cluster_snmp:
redirect: netapp.elementsw.na_elementsw_cluster_snmp
na_elementsw_drive:
redirect: netapp.elementsw.na_elementsw_drive
na_elementsw_initiators:
redirect: netapp.elementsw.na_elementsw_initiators
na_elementsw_ldap:
redirect: netapp.elementsw.na_elementsw_ldap
na_elementsw_network_interfaces:
redirect: netapp.elementsw.na_elementsw_network_interfaces
na_elementsw_node:
redirect: netapp.elementsw.na_elementsw_node
na_elementsw_snapshot:
redirect: netapp.elementsw.na_elementsw_snapshot
na_elementsw_snapshot_restore:
redirect: netapp.elementsw.na_elementsw_snapshot_restore
na_elementsw_snapshot_schedule:
redirect: netapp.elementsw.na_elementsw_snapshot_schedule
na_elementsw_vlan:
redirect: netapp.elementsw.na_elementsw_vlan
na_elementsw_volume:
redirect: netapp.elementsw.na_elementsw_volume
na_elementsw_volume_clone:
redirect: netapp.elementsw.na_elementsw_volume_clone
na_elementsw_volume_pair:
redirect: netapp.elementsw.na_elementsw_volume_pair
na_ontap_aggregate:
redirect: netapp.ontap.na_ontap_aggregate
na_ontap_autosupport:
redirect: netapp.ontap.na_ontap_autosupport
na_ontap_broadcast_domain:
redirect: netapp.ontap.na_ontap_broadcast_domain
na_ontap_broadcast_domain_ports:
redirect: netapp.ontap.na_ontap_broadcast_domain_ports
na_ontap_cg_snapshot:
redirect: netapp.ontap.na_ontap_cg_snapshot
na_ontap_cifs:
redirect: netapp.ontap.na_ontap_cifs
na_ontap_cifs_acl:
redirect: netapp.ontap.na_ontap_cifs_acl
na_ontap_cifs_server:
redirect: netapp.ontap.na_ontap_cifs_server
na_ontap_cluster:
redirect: netapp.ontap.na_ontap_cluster
na_ontap_cluster_ha:
redirect: netapp.ontap.na_ontap_cluster_ha
na_ontap_cluster_peer:
redirect: netapp.ontap.na_ontap_cluster_peer
na_ontap_command:
redirect: netapp.ontap.na_ontap_command
na_ontap_disks:
redirect: netapp.ontap.na_ontap_disks
na_ontap_dns:
redirect: netapp.ontap.na_ontap_dns
na_ontap_export_policy:
redirect: netapp.ontap.na_ontap_export_policy
na_ontap_export_policy_rule:
redirect: netapp.ontap.na_ontap_export_policy_rule
na_ontap_fcp:
redirect: netapp.ontap.na_ontap_fcp
na_ontap_firewall_policy:
redirect: netapp.ontap.na_ontap_firewall_policy
na_ontap_firmware_upgrade:
redirect: netapp.ontap.na_ontap_firmware_upgrade
na_ontap_flexcache:
redirect: netapp.ontap.na_ontap_flexcache
na_ontap_igroup:
redirect: netapp.ontap.na_ontap_igroup
na_ontap_igroup_initiator:
redirect: netapp.ontap.na_ontap_igroup_initiator
na_ontap_info:
redirect: netapp.ontap.na_ontap_info
na_ontap_interface:
redirect: netapp.ontap.na_ontap_interface
na_ontap_ipspace:
redirect: netapp.ontap.na_ontap_ipspace
na_ontap_iscsi:
redirect: netapp.ontap.na_ontap_iscsi
na_ontap_job_schedule:
redirect: netapp.ontap.na_ontap_job_schedule
na_ontap_kerberos_realm:
redirect: netapp.ontap.na_ontap_kerberos_realm
na_ontap_ldap:
redirect: netapp.ontap.na_ontap_ldap
na_ontap_ldap_client:
redirect: netapp.ontap.na_ontap_ldap_client
na_ontap_license:
redirect: netapp.ontap.na_ontap_license
na_ontap_lun:
redirect: netapp.ontap.na_ontap_lun
na_ontap_lun_copy:
redirect: netapp.ontap.na_ontap_lun_copy
na_ontap_lun_map:
redirect: netapp.ontap.na_ontap_lun_map
na_ontap_motd:
redirect: netapp.ontap.na_ontap_motd
na_ontap_ndmp:
redirect: netapp.ontap.na_ontap_ndmp
na_ontap_net_ifgrp:
redirect: netapp.ontap.na_ontap_net_ifgrp
na_ontap_net_port:
redirect: netapp.ontap.na_ontap_net_port
na_ontap_net_routes:
redirect: netapp.ontap.na_ontap_net_routes
na_ontap_net_subnet:
redirect: netapp.ontap.na_ontap_net_subnet
na_ontap_net_vlan:
redirect: netapp.ontap.na_ontap_net_vlan
na_ontap_nfs:
redirect: netapp.ontap.na_ontap_nfs
na_ontap_node:
redirect: netapp.ontap.na_ontap_node
na_ontap_ntp:
redirect: netapp.ontap.na_ontap_ntp
na_ontap_nvme:
redirect: netapp.ontap.na_ontap_nvme
na_ontap_nvme_namespace:
redirect: netapp.ontap.na_ontap_nvme_namespace
na_ontap_nvme_subsystem:
redirect: netapp.ontap.na_ontap_nvme_subsystem
na_ontap_object_store:
redirect: netapp.ontap.na_ontap_object_store
na_ontap_ports:
redirect: netapp.ontap.na_ontap_ports
na_ontap_portset:
redirect: netapp.ontap.na_ontap_portset
na_ontap_qos_adaptive_policy_group:
redirect: netapp.ontap.na_ontap_qos_adaptive_policy_group
na_ontap_qos_policy_group:
redirect: netapp.ontap.na_ontap_qos_policy_group
na_ontap_qtree:
redirect: netapp.ontap.na_ontap_qtree
na_ontap_quotas:
redirect: netapp.ontap.na_ontap_quotas
na_ontap_security_key_manager:
redirect: netapp.ontap.na_ontap_security_key_manager
na_ontap_service_processor_network:
redirect: netapp.ontap.na_ontap_service_processor_network
na_ontap_snapmirror:
redirect: netapp.ontap.na_ontap_snapmirror
na_ontap_snapshot:
redirect: netapp.ontap.na_ontap_snapshot
na_ontap_snapshot_policy:
redirect: netapp.ontap.na_ontap_snapshot_policy
na_ontap_snmp:
redirect: netapp.ontap.na_ontap_snmp
na_ontap_software_update:
redirect: netapp.ontap.na_ontap_software_update
na_ontap_svm:
redirect: netapp.ontap.na_ontap_svm
na_ontap_svm_options:
redirect: netapp.ontap.na_ontap_svm_options
na_ontap_ucadapter:
redirect: netapp.ontap.na_ontap_ucadapter
na_ontap_unix_group:
redirect: netapp.ontap.na_ontap_unix_group
na_ontap_unix_user:
redirect: netapp.ontap.na_ontap_unix_user
na_ontap_user:
redirect: netapp.ontap.na_ontap_user
na_ontap_user_role:
redirect: netapp.ontap.na_ontap_user_role
na_ontap_volume:
redirect: netapp.ontap.na_ontap_volume
na_ontap_volume_autosize:
redirect: netapp.ontap.na_ontap_volume_autosize
na_ontap_volume_clone:
redirect: netapp.ontap.na_ontap_volume_clone
na_ontap_vscan:
redirect: netapp.ontap.na_ontap_vscan
na_ontap_vscan_on_access_policy:
redirect: netapp.ontap.na_ontap_vscan_on_access_policy
na_ontap_vscan_on_demand_task:
redirect: netapp.ontap.na_ontap_vscan_on_demand_task
na_ontap_vscan_scanner_pool:
redirect: netapp.ontap.na_ontap_vscan_scanner_pool
na_ontap_vserver_cifs_security:
redirect: netapp.ontap.na_ontap_vserver_cifs_security
na_ontap_vserver_peer:
redirect: netapp.ontap.na_ontap_vserver_peer
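    # Check Point management modules redirect to the check_point.mgmt collection.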
cp_mgmt_access_layer:
redirect: check_point.mgmt.cp_mgmt_access_layer
cp_mgmt_access_layer_facts:
redirect: check_point.mgmt.cp_mgmt_access_layer_facts
cp_mgmt_access_role:
redirect: check_point.mgmt.cp_mgmt_access_role
cp_mgmt_access_role_facts:
redirect: check_point.mgmt.cp_mgmt_access_role_facts
cp_mgmt_access_rule:
redirect: check_point.mgmt.cp_mgmt_access_rule
cp_mgmt_access_rule_facts:
redirect: check_point.mgmt.cp_mgmt_access_rule_facts
cp_mgmt_address_range:
redirect: check_point.mgmt.cp_mgmt_address_range
cp_mgmt_address_range_facts:
redirect: check_point.mgmt.cp_mgmt_address_range_facts
cp_mgmt_administrator:
redirect: check_point.mgmt.cp_mgmt_administrator
cp_mgmt_administrator_facts:
redirect: check_point.mgmt.cp_mgmt_administrator_facts
cp_mgmt_application_site:
redirect: check_point.mgmt.cp_mgmt_application_site
cp_mgmt_application_site_category:
redirect: check_point.mgmt.cp_mgmt_application_site_category
cp_mgmt_application_site_category_facts:
redirect: check_point.mgmt.cp_mgmt_application_site_category_facts
cp_mgmt_application_site_facts:
redirect: check_point.mgmt.cp_mgmt_application_site_facts
cp_mgmt_application_site_group:
redirect: check_point.mgmt.cp_mgmt_application_site_group
cp_mgmt_application_site_group_facts:
redirect: check_point.mgmt.cp_mgmt_application_site_group_facts
cp_mgmt_assign_global_assignment:
redirect: check_point.mgmt.cp_mgmt_assign_global_assignment
cp_mgmt_discard:
redirect: check_point.mgmt.cp_mgmt_discard
cp_mgmt_dns_domain:
redirect: check_point.mgmt.cp_mgmt_dns_domain
cp_mgmt_dns_domain_facts:
redirect: check_point.mgmt.cp_mgmt_dns_domain_facts
cp_mgmt_dynamic_object:
redirect: check_point.mgmt.cp_mgmt_dynamic_object
cp_mgmt_dynamic_object_facts:
redirect: check_point.mgmt.cp_mgmt_dynamic_object_facts
cp_mgmt_exception_group:
redirect: check_point.mgmt.cp_mgmt_exception_group
cp_mgmt_exception_group_facts:
redirect: check_point.mgmt.cp_mgmt_exception_group_facts
cp_mgmt_global_assignment:
redirect: check_point.mgmt.cp_mgmt_global_assignment
cp_mgmt_global_assignment_facts:
redirect: check_point.mgmt.cp_mgmt_global_assignment_facts
cp_mgmt_group:
redirect: check_point.mgmt.cp_mgmt_group
cp_mgmt_group_facts:
redirect: check_point.mgmt.cp_mgmt_group_facts
cp_mgmt_group_with_exclusion:
redirect: check_point.mgmt.cp_mgmt_group_with_exclusion
cp_mgmt_group_with_exclusion_facts:
redirect: check_point.mgmt.cp_mgmt_group_with_exclusion_facts
cp_mgmt_host:
redirect: check_point.mgmt.cp_mgmt_host
cp_mgmt_host_facts:
redirect: check_point.mgmt.cp_mgmt_host_facts
cp_mgmt_install_policy:
redirect: check_point.mgmt.cp_mgmt_install_policy
cp_mgmt_mds_facts:
redirect: check_point.mgmt.cp_mgmt_mds_facts
cp_mgmt_multicast_address_range:
redirect: check_point.mgmt.cp_mgmt_multicast_address_range
cp_mgmt_multicast_address_range_facts:
redirect: check_point.mgmt.cp_mgmt_multicast_address_range_facts
cp_mgmt_network:
redirect: check_point.mgmt.cp_mgmt_network
cp_mgmt_network_facts:
redirect: check_point.mgmt.cp_mgmt_network_facts
cp_mgmt_package:
redirect: check_point.mgmt.cp_mgmt_package
cp_mgmt_package_facts:
redirect: check_point.mgmt.cp_mgmt_package_facts
cp_mgmt_publish:
redirect: check_point.mgmt.cp_mgmt_publish
cp_mgmt_put_file:
redirect: check_point.mgmt.cp_mgmt_put_file
cp_mgmt_run_ips_update:
redirect: check_point.mgmt.cp_mgmt_run_ips_update
cp_mgmt_run_script:
redirect: check_point.mgmt.cp_mgmt_run_script
cp_mgmt_security_zone:
redirect: check_point.mgmt.cp_mgmt_security_zone
cp_mgmt_security_zone_facts:
redirect: check_point.mgmt.cp_mgmt_security_zone_facts
cp_mgmt_service_dce_rpc:
redirect: check_point.mgmt.cp_mgmt_service_dce_rpc
cp_mgmt_service_dce_rpc_facts:
redirect: check_point.mgmt.cp_mgmt_service_dce_rpc_facts
cp_mgmt_service_group:
redirect: check_point.mgmt.cp_mgmt_service_group
cp_mgmt_service_group_facts:
redirect: check_point.mgmt.cp_mgmt_service_group_facts
cp_mgmt_service_icmp:
redirect: check_point.mgmt.cp_mgmt_service_icmp
cp_mgmt_service_icmp6:
redirect: check_point.mgmt.cp_mgmt_service_icmp6
cp_mgmt_service_icmp6_facts:
redirect: check_point.mgmt.cp_mgmt_service_icmp6_facts
cp_mgmt_service_icmp_facts:
redirect: check_point.mgmt.cp_mgmt_service_icmp_facts
cp_mgmt_service_other:
redirect: check_point.mgmt.cp_mgmt_service_other
cp_mgmt_service_other_facts:
redirect: check_point.mgmt.cp_mgmt_service_other_facts
cp_mgmt_service_rpc:
redirect: check_point.mgmt.cp_mgmt_service_rpc
cp_mgmt_service_rpc_facts:
redirect: check_point.mgmt.cp_mgmt_service_rpc_facts
cp_mgmt_service_sctp:
redirect: check_point.mgmt.cp_mgmt_service_sctp
cp_mgmt_service_sctp_facts:
redirect: check_point.mgmt.cp_mgmt_service_sctp_facts
cp_mgmt_service_tcp:
redirect: check_point.mgmt.cp_mgmt_service_tcp
cp_mgmt_service_tcp_facts:
redirect: check_point.mgmt.cp_mgmt_service_tcp_facts
cp_mgmt_service_udp:
redirect: check_point.mgmt.cp_mgmt_service_udp
cp_mgmt_service_udp_facts:
redirect: check_point.mgmt.cp_mgmt_service_udp_facts
cp_mgmt_session_facts:
redirect: check_point.mgmt.cp_mgmt_session_facts
cp_mgmt_simple_gateway:
redirect: check_point.mgmt.cp_mgmt_simple_gateway
cp_mgmt_simple_gateway_facts:
redirect: check_point.mgmt.cp_mgmt_simple_gateway_facts
cp_mgmt_tag:
redirect: check_point.mgmt.cp_mgmt_tag
cp_mgmt_tag_facts:
redirect: check_point.mgmt.cp_mgmt_tag_facts
cp_mgmt_threat_exception:
redirect: check_point.mgmt.cp_mgmt_threat_exception
cp_mgmt_threat_exception_facts:
redirect: check_point.mgmt.cp_mgmt_threat_exception_facts
cp_mgmt_threat_indicator:
redirect: check_point.mgmt.cp_mgmt_threat_indicator
cp_mgmt_threat_indicator_facts:
redirect: check_point.mgmt.cp_mgmt_threat_indicator_facts
cp_mgmt_threat_layer:
redirect: check_point.mgmt.cp_mgmt_threat_layer
cp_mgmt_threat_layer_facts:
redirect: check_point.mgmt.cp_mgmt_threat_layer_facts
cp_mgmt_threat_profile:
redirect: check_point.mgmt.cp_mgmt_threat_profile
cp_mgmt_threat_profile_facts:
redirect: check_point.mgmt.cp_mgmt_threat_profile_facts
cp_mgmt_threat_protection_override:
redirect: check_point.mgmt.cp_mgmt_threat_protection_override
cp_mgmt_threat_rule:
redirect: check_point.mgmt.cp_mgmt_threat_rule
cp_mgmt_threat_rule_facts:
redirect: check_point.mgmt.cp_mgmt_threat_rule_facts
cp_mgmt_time:
redirect: check_point.mgmt.cp_mgmt_time
cp_mgmt_time_facts:
redirect: check_point.mgmt.cp_mgmt_time_facts
cp_mgmt_verify_policy:
redirect: check_point.mgmt.cp_mgmt_verify_policy
cp_mgmt_vpn_community_meshed:
redirect: check_point.mgmt.cp_mgmt_vpn_community_meshed
cp_mgmt_vpn_community_meshed_facts:
redirect: check_point.mgmt.cp_mgmt_vpn_community_meshed_facts
cp_mgmt_vpn_community_star:
redirect: check_point.mgmt.cp_mgmt_vpn_community_star
cp_mgmt_vpn_community_star_facts:
redirect: check_point.mgmt.cp_mgmt_vpn_community_star_facts
cp_mgmt_wildcard:
redirect: check_point.mgmt.cp_mgmt_wildcard
cp_mgmt_wildcard_facts:
redirect: check_point.mgmt.cp_mgmt_wildcard_facts
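    # Arista EOS modules redirect to the arista.eos collection.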
eos_static_route:
redirect: arista.eos.eos_static_route
eos_acls:
redirect: arista.eos.eos_acls
eos_interfaces:
redirect: arista.eos.eos_interfaces
eos_facts:
redirect: arista.eos.eos_facts
eos_logging:
redirect: arista.eos.eos_logging
eos_lag_interfaces:
redirect: arista.eos.eos_lag_interfaces
eos_l2_interfaces:
redirect: arista.eos.eos_l2_interfaces
eos_l3_interface:
redirect: arista.eos.eos_l3_interface
eos_lacp:
redirect: arista.eos.eos_lacp
eos_lldp_global:
redirect: arista.eos.eos_lldp_global
eos_static_routes:
redirect: arista.eos.eos_static_routes
eos_lacp_interfaces:
redirect: arista.eos.eos_lacp_interfaces
eos_system:
redirect: arista.eos.eos_system
eos_vlan:
redirect: arista.eos.eos_vlan
eos_eapi:
redirect: arista.eos.eos_eapi
eos_acl_interfaces:
redirect: arista.eos.eos_acl_interfaces
eos_l2_interface:
redirect: arista.eos.eos_l2_interface
eos_lldp_interfaces:
redirect: arista.eos.eos_lldp_interfaces
eos_command:
redirect: arista.eos.eos_command
eos_linkagg:
redirect: arista.eos.eos_linkagg
eos_l3_interfaces:
redirect: arista.eos.eos_l3_interfaces
eos_vlans:
redirect: arista.eos.eos_vlans
eos_user:
redirect: arista.eos.eos_user
eos_banner:
redirect: arista.eos.eos_banner
eos_lldp:
redirect: arista.eos.eos_lldp
eos_interface:
redirect: arista.eos.eos_interface
eos_config:
redirect: arista.eos.eos_config
eos_bgp:
redirect: arista.eos.eos_bgp
eos_vrf:
redirect: arista.eos.eos_vrf
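    # Cisco modules redirect to their respective cisco.* collections (aci, asa, intersight, ios, iosxr, meraki, mso, nxos, ucs).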
aci_aaa_user:
redirect: cisco.aci.aci_aaa_user
aci_aaa_user_certificate:
redirect: cisco.aci.aci_aaa_user_certificate
aci_access_port_block_to_access_port:
redirect: cisco.aci.aci_access_port_block_to_access_port
aci_access_port_to_interface_policy_leaf_profile:
redirect: cisco.aci.aci_access_port_to_interface_policy_leaf_profile
aci_access_sub_port_block_to_access_port:
redirect: cisco.aci.aci_access_sub_port_block_to_access_port
aci_aep:
redirect: cisco.aci.aci_aep
aci_aep_to_domain:
redirect: cisco.aci.aci_aep_to_domain
aci_ap:
redirect: cisco.aci.aci_ap
aci_bd:
redirect: cisco.aci.aci_bd
aci_bd_subnet:
redirect: cisco.aci.aci_bd_subnet
aci_bd_to_l3out:
redirect: cisco.aci.aci_bd_to_l3out
aci_config_rollback:
redirect: cisco.aci.aci_config_rollback
aci_config_snapshot:
redirect: cisco.aci.aci_config_snapshot
aci_contract:
redirect: cisco.aci.aci_contract
aci_contract_subject:
redirect: cisco.aci.aci_contract_subject
aci_contract_subject_to_filter:
redirect: cisco.aci.aci_contract_subject_to_filter
aci_domain:
redirect: cisco.aci.aci_domain
aci_domain_to_encap_pool:
redirect: cisco.aci.aci_domain_to_encap_pool
aci_domain_to_vlan_pool:
redirect: cisco.aci.aci_domain_to_vlan_pool
aci_encap_pool:
redirect: cisco.aci.aci_encap_pool
aci_encap_pool_range:
redirect: cisco.aci.aci_encap_pool_range
aci_epg:
redirect: cisco.aci.aci_epg
aci_epg_monitoring_policy:
redirect: cisco.aci.aci_epg_monitoring_policy
aci_epg_to_contract:
redirect: cisco.aci.aci_epg_to_contract
aci_epg_to_domain:
redirect: cisco.aci.aci_epg_to_domain
aci_fabric_node:
redirect: cisco.aci.aci_fabric_node
aci_fabric_scheduler:
redirect: cisco.aci.aci_fabric_scheduler
aci_filter:
redirect: cisco.aci.aci_filter
aci_filter_entry:
redirect: cisco.aci.aci_filter_entry
aci_firmware_group:
redirect: cisco.aci.aci_firmware_group
aci_firmware_group_node:
redirect: cisco.aci.aci_firmware_group_node
aci_firmware_policy:
redirect: cisco.aci.aci_firmware_policy
aci_firmware_source:
redirect: cisco.aci.aci_firmware_source
aci_interface_policy_cdp:
redirect: cisco.aci.aci_interface_policy_cdp
aci_interface_policy_fc:
redirect: cisco.aci.aci_interface_policy_fc
aci_interface_policy_l2:
redirect: cisco.aci.aci_interface_policy_l2
aci_interface_policy_leaf_policy_group:
redirect: cisco.aci.aci_interface_policy_leaf_policy_group
aci_interface_policy_leaf_profile:
redirect: cisco.aci.aci_interface_policy_leaf_profile
aci_interface_policy_lldp:
redirect: cisco.aci.aci_interface_policy_lldp
aci_interface_policy_mcp:
redirect: cisco.aci.aci_interface_policy_mcp
aci_interface_policy_ospf:
redirect: cisco.aci.aci_interface_policy_ospf
aci_interface_policy_port_channel:
redirect: cisco.aci.aci_interface_policy_port_channel
aci_interface_policy_port_security:
redirect: cisco.aci.aci_interface_policy_port_security
aci_interface_selector_to_switch_policy_leaf_profile:
redirect: cisco.aci.aci_interface_selector_to_switch_policy_leaf_profile
aci_l3out:
redirect: cisco.aci.aci_l3out
aci_l3out_extepg:
redirect: cisco.aci.aci_l3out_extepg
aci_l3out_extsubnet:
redirect: cisco.aci.aci_l3out_extsubnet
aci_l3out_route_tag_policy:
redirect: cisco.aci.aci_l3out_route_tag_policy
aci_maintenance_group:
redirect: cisco.aci.aci_maintenance_group
aci_maintenance_group_node:
redirect: cisco.aci.aci_maintenance_group_node
aci_maintenance_policy:
redirect: cisco.aci.aci_maintenance_policy
aci_rest:
redirect: cisco.aci.aci_rest
aci_static_binding_to_epg:
redirect: cisco.aci.aci_static_binding_to_epg
aci_switch_leaf_selector:
redirect: cisco.aci.aci_switch_leaf_selector
aci_switch_policy_leaf_profile:
redirect: cisco.aci.aci_switch_policy_leaf_profile
aci_switch_policy_vpc_protection_group:
redirect: cisco.aci.aci_switch_policy_vpc_protection_group
aci_taboo_contract:
redirect: cisco.aci.aci_taboo_contract
aci_tenant:
redirect: cisco.aci.aci_tenant
aci_tenant_action_rule_profile:
redirect: cisco.aci.aci_tenant_action_rule_profile
aci_tenant_ep_retention_policy:
redirect: cisco.aci.aci_tenant_ep_retention_policy
aci_tenant_span_dst_group:
redirect: cisco.aci.aci_tenant_span_dst_group
aci_tenant_span_src_group:
redirect: cisco.aci.aci_tenant_span_src_group
aci_tenant_span_src_group_to_dst_group:
redirect: cisco.aci.aci_tenant_span_src_group_to_dst_group
aci_vlan_pool:
redirect: cisco.aci.aci_vlan_pool
aci_vlan_pool_encap_block:
redirect: cisco.aci.aci_vlan_pool_encap_block
aci_vmm_credential:
redirect: cisco.aci.aci_vmm_credential
aci_vrf:
redirect: cisco.aci.aci_vrf
asa_acl:
redirect: cisco.asa.asa_acl
asa_config:
redirect: cisco.asa.asa_config
asa_og:
redirect: cisco.asa.asa_og
asa_command:
redirect: cisco.asa.asa_command
intersight_facts:
redirect: cisco.intersight.intersight_facts
intersight_rest_api:
redirect: cisco.intersight.intersight_rest_api
ios_l3_interfaces:
redirect: cisco.ios.ios_l3_interfaces
ios_lldp:
redirect: cisco.ios.ios_lldp
ios_interface:
redirect: cisco.ios.ios_interface
ios_lldp_interfaces:
redirect: cisco.ios.ios_lldp_interfaces
ios_l3_interface:
redirect: cisco.ios.ios_l3_interface
ios_acl_interfaces:
redirect: cisco.ios.ios_acl_interfaces
ios_static_routes:
redirect: cisco.ios.ios_static_routes
ios_l2_interfaces:
redirect: cisco.ios.ios_l2_interfaces
ios_logging:
redirect: cisco.ios.ios_logging
ios_vlan:
redirect: cisco.ios.ios_vlan
ios_command:
redirect: cisco.ios.ios_command
ios_static_route:
redirect: cisco.ios.ios_static_route
ios_lldp_global:
redirect: cisco.ios.ios_lldp_global
ios_banner:
redirect: cisco.ios.ios_banner
ios_lag_interfaces:
redirect: cisco.ios.ios_lag_interfaces
ios_linkagg:
redirect: cisco.ios.ios_linkagg
ios_user:
redirect: cisco.ios.ios_user
ios_system:
redirect: cisco.ios.ios_system
ios_facts:
redirect: cisco.ios.ios_facts
ios_ping:
redirect: cisco.ios.ios_ping
ios_vlans:
redirect: cisco.ios.ios_vlans
ios_vrf:
redirect: cisco.ios.ios_vrf
ios_bgp:
redirect: cisco.ios.ios_bgp
ios_ntp:
redirect: cisco.ios.ios_ntp
ios_lacp_interfaces:
redirect: cisco.ios.ios_lacp_interfaces
ios_lacp:
redirect: cisco.ios.ios_lacp
ios_config:
redirect: cisco.ios.ios_config
ios_l2_interface:
redirect: cisco.ios.ios_l2_interface
ios_acls:
redirect: cisco.ios.ios_acls
ios_interfaces:
redirect: cisco.ios.ios_interfaces
iosxr_bgp:
redirect: cisco.iosxr.iosxr_bgp
iosxr_lldp_interfaces:
redirect: cisco.iosxr.iosxr_lldp_interfaces
iosxr_l3_interfaces:
redirect: cisco.iosxr.iosxr_l3_interfaces
iosxr_netconf:
redirect: cisco.iosxr.iosxr_netconf
iosxr_static_routes:
redirect: cisco.iosxr.iosxr_static_routes
iosxr_lldp_global:
redirect: cisco.iosxr.iosxr_lldp_global
iosxr_config:
redirect: cisco.iosxr.iosxr_config
iosxr_lag_interfaces:
redirect: cisco.iosxr.iosxr_lag_interfaces
iosxr_interface:
redirect: cisco.iosxr.iosxr_interface
iosxr_user:
redirect: cisco.iosxr.iosxr_user
iosxr_facts:
redirect: cisco.iosxr.iosxr_facts
iosxr_interfaces:
redirect: cisco.iosxr.iosxr_interfaces
iosxr_acl_interfaces:
redirect: cisco.iosxr.iosxr_acl_interfaces
iosxr_l2_interfaces:
redirect: cisco.iosxr.iosxr_l2_interfaces
iosxr_logging:
redirect: cisco.iosxr.iosxr_logging
iosxr_lacp:
redirect: cisco.iosxr.iosxr_lacp
iosxr_acls:
redirect: cisco.iosxr.iosxr_acls
iosxr_system:
redirect: cisco.iosxr.iosxr_system
iosxr_command:
redirect: cisco.iosxr.iosxr_command
iosxr_lacp_interfaces:
redirect: cisco.iosxr.iosxr_lacp_interfaces
iosxr_banner:
redirect: cisco.iosxr.iosxr_banner
meraki_admin:
redirect: cisco.meraki.meraki_admin
meraki_config_template:
redirect: cisco.meraki.meraki_config_template
meraki_content_filtering:
redirect: cisco.meraki.meraki_content_filtering
meraki_device:
redirect: cisco.meraki.meraki_device
meraki_firewalled_services:
redirect: cisco.meraki.meraki_firewalled_services
meraki_malware:
redirect: cisco.meraki.meraki_malware
meraki_mr_l3_firewall:
redirect: cisco.meraki.meraki_mr_l3_firewall
meraki_mx_l3_firewall:
redirect: cisco.meraki.meraki_mx_l3_firewall
meraki_mx_l7_firewall:
redirect: cisco.meraki.meraki_mx_l7_firewall
meraki_nat:
redirect: cisco.meraki.meraki_nat
meraki_network:
redirect: cisco.meraki.meraki_network
meraki_organization:
redirect: cisco.meraki.meraki_organization
meraki_snmp:
redirect: cisco.meraki.meraki_snmp
meraki_ssid:
redirect: cisco.meraki.meraki_ssid
meraki_static_route:
redirect: cisco.meraki.meraki_static_route
meraki_switchport:
redirect: cisco.meraki.meraki_switchport
meraki_syslog:
redirect: cisco.meraki.meraki_syslog
meraki_vlan:
redirect: cisco.meraki.meraki_vlan
meraki_webhook:
redirect: cisco.meraki.meraki_webhook
mso_label:
redirect: cisco.mso.mso_label
mso_role:
redirect: cisco.mso.mso_role
mso_schema:
redirect: cisco.mso.mso_schema
mso_schema_site:
redirect: cisco.mso.mso_schema_site
mso_schema_site_anp:
redirect: cisco.mso.mso_schema_site_anp
mso_schema_site_anp_epg:
redirect: cisco.mso.mso_schema_site_anp_epg
mso_schema_site_anp_epg_domain:
redirect: cisco.mso.mso_schema_site_anp_epg_domain
mso_schema_site_anp_epg_staticleaf:
redirect: cisco.mso.mso_schema_site_anp_epg_staticleaf
mso_schema_site_anp_epg_staticport:
redirect: cisco.mso.mso_schema_site_anp_epg_staticport
mso_schema_site_anp_epg_subnet:
redirect: cisco.mso.mso_schema_site_anp_epg_subnet
mso_schema_site_bd:
redirect: cisco.mso.mso_schema_site_bd
mso_schema_site_bd_l3out:
redirect: cisco.mso.mso_schema_site_bd_l3out
mso_schema_site_bd_subnet:
redirect: cisco.mso.mso_schema_site_bd_subnet
mso_schema_site_vrf:
redirect: cisco.mso.mso_schema_site_vrf
mso_schema_site_vrf_region:
redirect: cisco.mso.mso_schema_site_vrf_region
mso_schema_site_vrf_region_cidr:
redirect: cisco.mso.mso_schema_site_vrf_region_cidr
mso_schema_site_vrf_region_cidr_subnet:
redirect: cisco.mso.mso_schema_site_vrf_region_cidr_subnet
mso_schema_template:
redirect: cisco.mso.mso_schema_template
mso_schema_template_anp:
redirect: cisco.mso.mso_schema_template_anp
mso_schema_template_anp_epg:
redirect: cisco.mso.mso_schema_template_anp_epg
mso_schema_template_anp_epg_contract:
redirect: cisco.mso.mso_schema_template_anp_epg_contract
mso_schema_template_anp_epg_subnet:
redirect: cisco.mso.mso_schema_template_anp_epg_subnet
mso_schema_template_bd:
redirect: cisco.mso.mso_schema_template_bd
mso_schema_template_bd_subnet:
redirect: cisco.mso.mso_schema_template_bd_subnet
mso_schema_template_contract_filter:
redirect: cisco.mso.mso_schema_template_contract_filter
mso_schema_template_deploy:
redirect: cisco.mso.mso_schema_template_deploy
mso_schema_template_externalepg:
redirect: cisco.mso.mso_schema_template_externalepg
mso_schema_template_filter_entry:
redirect: cisco.mso.mso_schema_template_filter_entry
mso_schema_template_l3out:
redirect: cisco.mso.mso_schema_template_l3out
mso_schema_template_vrf:
redirect: cisco.mso.mso_schema_template_vrf
mso_site:
redirect: cisco.mso.mso_site
mso_tenant:
redirect: cisco.mso.mso_tenant
mso_user:
redirect: cisco.mso.mso_user
nxos_telemetry:
redirect: cisco.nxos.nxos_telemetry
nxos_user:
redirect: cisco.nxos.nxos_user
nxos_bfd_interfaces:
redirect: cisco.nxos.nxos_bfd_interfaces
nxos_ospf:
redirect: cisco.nxos.nxos_ospf
nxos_system:
redirect: cisco.nxos.nxos_system
nxos_l3_interface:
redirect: cisco.nxos.nxos_l3_interface
nxos_smu:
redirect: cisco.nxos.nxos_smu
nxos_reboot:
redirect: cisco.nxos.nxos_reboot
nxos_static_route:
redirect: cisco.nxos.nxos_static_route
nxos_acl_interfaces:
redirect: cisco.nxos.nxos_acl_interfaces
nxos_vpc:
redirect: cisco.nxos.nxos_vpc
nxos_linkagg:
redirect: cisco.nxos.nxos_linkagg
nxos_vxlan_vtep_vni:
redirect: cisco.nxos.nxos_vxlan_vtep_vni
nxos_vrrp:
redirect: cisco.nxos.nxos_vrrp
nxos_lldp:
redirect: cisco.nxos.nxos_lldp
nxos_interface:
redirect: cisco.nxos.nxos_interface
nxos_lacp_interfaces:
redirect: cisco.nxos.nxos_lacp_interfaces
nxos_gir_profile_management:
redirect: cisco.nxos.nxos_gir_profile_management
nxos_snmp_community:
redirect: cisco.nxos.nxos_snmp_community
nxos_lag_interfaces:
redirect: cisco.nxos.nxos_lag_interfaces
nxos_acl:
redirect: cisco.nxos.nxos_acl
nxos_hsrp_interfaces:
redirect: cisco.nxos.nxos_hsrp_interfaces
nxos_lldp_global:
redirect: cisco.nxos.nxos_lldp_global
nxos_snmp_contact:
redirect: cisco.nxos.nxos_snmp_contact
nxos_vrf_interface:
redirect: cisco.nxos.nxos_vrf_interface
nxos_rpm:
redirect: cisco.nxos.nxos_rpm
nxos_ntp_options:
redirect: cisco.nxos.nxos_ntp_options
nxos_ospf_vrf:
redirect: cisco.nxos.nxos_ospf_vrf
nxos_vtp_version:
redirect: cisco.nxos.nxos_vtp_version
nxos_igmp_interface:
redirect: cisco.nxos.nxos_igmp_interface
nxos_bgp_neighbor:
redirect: cisco.nxos.nxos_bgp_neighbor
nxos_bgp:
redirect: cisco.nxos.nxos_bgp
nxos_rollback:
redirect: cisco.nxos.nxos_rollback
nxos_aaa_server:
redirect: cisco.nxos.nxos_aaa_server
nxos_udld_interface:
redirect: cisco.nxos.nxos_udld_interface
nxos_bgp_af:
redirect: cisco.nxos.nxos_bgp_af
nxos_feature:
redirect: cisco.nxos.nxos_feature
nxos_snmp_traps:
redirect: cisco.nxos.nxos_snmp_traps
nxos_evpn_global:
redirect: cisco.nxos.nxos_evpn_global
nxos_igmp:
redirect: cisco.nxos.nxos_igmp
nxos_aaa_server_host:
redirect: cisco.nxos.nxos_aaa_server_host
nxos_vrf_af:
redirect: cisco.nxos.nxos_vrf_af
nxos_snapshot:
redirect: cisco.nxos.nxos_snapshot
nxos_gir:
redirect: cisco.nxos.nxos_gir
nxos_command:
redirect: cisco.nxos.nxos_command
nxos_vxlan_vtep:
redirect: cisco.nxos.nxos_vxlan_vtep
nxos_snmp_location:
redirect: cisco.nxos.nxos_snmp_location
nxos_evpn_vni:
redirect: cisco.nxos.nxos_evpn_vni
nxos_vpc_interface:
redirect: cisco.nxos.nxos_vpc_interface
nxos_logging:
redirect: cisco.nxos.nxos_logging
nxos_pim:
redirect: cisco.nxos.nxos_pim
nxos_ping:
redirect: cisco.nxos.nxos_ping
nxos_pim_rp_address:
redirect: cisco.nxos.nxos_pim_rp_address
nxos_pim_interface:
redirect: cisco.nxos.nxos_pim_interface
nxos_install_os:
redirect: cisco.nxos.nxos_install_os
nxos_nxapi:
redirect: cisco.nxos.nxos_nxapi
nxos_l2_interface:
redirect: cisco.nxos.nxos_l2_interface
nxos_bgp_neighbor_af:
redirect: cisco.nxos.nxos_bgp_neighbor_af
nxos_lacp:
redirect: cisco.nxos.nxos_lacp
nxos_lldp_interfaces:
redirect: cisco.nxos.nxos_lldp_interfaces
nxos_acl_interface:
redirect: cisco.nxos.nxos_acl_interface
nxos_vrf:
redirect: cisco.nxos.nxos_vrf
nxos_interface_ospf:
redirect: cisco.nxos.nxos_interface_ospf
nxos_acls:
redirect: cisco.nxos.nxos_acls
nxos_vtp_password:
redirect: cisco.nxos.nxos_vtp_password
nxos_l3_interfaces:
redirect: cisco.nxos.nxos_l3_interfaces
nxos_igmp_snooping:
redirect: cisco.nxos.nxos_igmp_snooping
nxos_banner:
redirect: cisco.nxos.nxos_banner
nxos_bfd_global:
redirect: cisco.nxos.nxos_bfd_global
nxos_udld:
redirect: cisco.nxos.nxos_udld
nxos_vtp_domain:
redirect: cisco.nxos.nxos_vtp_domain
nxos_snmp_host:
redirect: cisco.nxos.nxos_snmp_host
nxos_l2_interfaces:
redirect: cisco.nxos.nxos_l2_interfaces
nxos_hsrp:
redirect: cisco.nxos.nxos_hsrp
nxos_interfaces:
redirect: cisco.nxos.nxos_interfaces
nxos_overlay_global:
redirect: cisco.nxos.nxos_overlay_global
nxos_snmp_user:
redirect: cisco.nxos.nxos_snmp_user
nxos_vlans:
redirect: cisco.nxos.nxos_vlans
nxos_ntp:
redirect: cisco.nxos.nxos_ntp
nxos_file_copy:
redirect: cisco.nxos.nxos_file_copy
nxos_ntp_auth:
redirect: cisco.nxos.nxos_ntp_auth
nxos_config:
redirect: cisco.nxos.nxos_config
nxos_vlan:
redirect: cisco.nxos.nxos_vlan
nxos_facts:
redirect: cisco.nxos.nxos_facts
nxos_zone_zoneset:
redirect: cisco.nxos.nxos_zone_zoneset
nxos_vsan:
redirect: cisco.nxos.nxos_vsan
nxos_devicealias:
redirect: cisco.nxos.nxos_devicealias
ucs_managed_objects:
redirect: cisco.ucs.ucs_managed_objects
ucs_vnic_template:
redirect: cisco.ucs.ucs_vnic_template
ucs_query:
redirect: cisco.ucs.ucs_query
ucs_dns_server:
redirect: cisco.ucs.ucs_dns_server
ucs_lan_connectivity:
redirect: cisco.ucs.ucs_lan_connectivity
ucs_vhba_template:
redirect: cisco.ucs.ucs_vhba_template
ucs_san_connectivity:
redirect: cisco.ucs.ucs_san_connectivity
ucs_disk_group_policy:
redirect: cisco.ucs.ucs_disk_group_policy
ucs_uuid_pool:
redirect: cisco.ucs.ucs_uuid_pool
ucs_vlan_find:
redirect: cisco.ucs.ucs_vlan_find
ucs_vlans:
redirect: cisco.ucs.ucs_vlans
ucs_service_profile_template:
redirect: cisco.ucs.ucs_service_profile_template
ucs_ip_pool:
redirect: cisco.ucs.ucs_ip_pool
ucs_timezone:
redirect: cisco.ucs.ucs_timezone
ucs_ntp_server:
redirect: cisco.ucs.ucs_ntp_server
ucs_mac_pool:
redirect: cisco.ucs.ucs_mac_pool
ucs_storage_profile:
redirect: cisco.ucs.ucs_storage_profile
ucs_org:
redirect: cisco.ucs.ucs_org
ucs_vsans:
redirect: cisco.ucs.ucs_vsans
ucs_wwn_pool:
redirect: cisco.ucs.ucs_wwn_pool
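    # F5 BIG-IP and BIG-IQ modules redirect to the f5networks.f5_modules collection.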
bigip_apm_acl:
redirect: f5networks.f5_modules.bigip_apm_acl
bigip_apm_network_access:
redirect: f5networks.f5_modules.bigip_apm_network_access
bigip_apm_policy_fetch:
redirect: f5networks.f5_modules.bigip_apm_policy_fetch
bigip_apm_policy_import:
redirect: f5networks.f5_modules.bigip_apm_policy_import
bigip_appsvcs_extension:
redirect: f5networks.f5_modules.bigip_appsvcs_extension
bigip_asm_dos_application:
redirect: f5networks.f5_modules.bigip_asm_dos_application
bigip_asm_policy_fetch:
redirect: f5networks.f5_modules.bigip_asm_policy_fetch
bigip_asm_policy_import:
redirect: f5networks.f5_modules.bigip_asm_policy_import
bigip_asm_policy_manage:
redirect: f5networks.f5_modules.bigip_asm_policy_manage
bigip_asm_policy_server_technology:
redirect: f5networks.f5_modules.bigip_asm_policy_server_technology
bigip_asm_policy_signature_set:
redirect: f5networks.f5_modules.bigip_asm_policy_signature_set
bigip_cli_alias:
redirect: f5networks.f5_modules.bigip_cli_alias
bigip_cli_script:
redirect: f5networks.f5_modules.bigip_cli_script
bigip_command:
redirect: f5networks.f5_modules.bigip_command
bigip_config:
redirect: f5networks.f5_modules.bigip_config
bigip_configsync_action:
redirect: f5networks.f5_modules.bigip_configsync_action
bigip_data_group:
redirect: f5networks.f5_modules.bigip_data_group
bigip_device_auth:
redirect: f5networks.f5_modules.bigip_device_auth
bigip_device_auth_ldap:
redirect: f5networks.f5_modules.bigip_device_auth_ldap
bigip_device_certificate:
redirect: f5networks.f5_modules.bigip_device_certificate
bigip_device_connectivity:
redirect: f5networks.f5_modules.bigip_device_connectivity
bigip_device_dns:
redirect: f5networks.f5_modules.bigip_device_dns
bigip_device_group:
redirect: f5networks.f5_modules.bigip_device_group
bigip_device_group_member:
redirect: f5networks.f5_modules.bigip_device_group_member
bigip_device_ha_group:
redirect: f5networks.f5_modules.bigip_device_ha_group
bigip_device_httpd:
redirect: f5networks.f5_modules.bigip_device_httpd
bigip_device_info:
redirect: f5networks.f5_modules.bigip_device_info
bigip_device_license:
redirect: f5networks.f5_modules.bigip_device_license
bigip_device_ntp:
redirect: f5networks.f5_modules.bigip_device_ntp
bigip_device_sshd:
redirect: f5networks.f5_modules.bigip_device_sshd
bigip_device_syslog:
redirect: f5networks.f5_modules.bigip_device_syslog
bigip_device_traffic_group:
redirect: f5networks.f5_modules.bigip_device_traffic_group
bigip_device_trust:
redirect: f5networks.f5_modules.bigip_device_trust
bigip_dns_cache_resolver:
redirect: f5networks.f5_modules.bigip_dns_cache_resolver
bigip_dns_nameserver:
redirect: f5networks.f5_modules.bigip_dns_nameserver
bigip_dns_resolver:
redirect: f5networks.f5_modules.bigip_dns_resolver
bigip_dns_zone:
redirect: f5networks.f5_modules.bigip_dns_zone
bigip_file_copy:
redirect: f5networks.f5_modules.bigip_file_copy
bigip_firewall_address_list:
redirect: f5networks.f5_modules.bigip_firewall_address_list
bigip_firewall_dos_profile:
redirect: f5networks.f5_modules.bigip_firewall_dos_profile
bigip_firewall_dos_vector:
redirect: f5networks.f5_modules.bigip_firewall_dos_vector
bigip_firewall_global_rules:
redirect: f5networks.f5_modules.bigip_firewall_global_rules
bigip_firewall_log_profile:
redirect: f5networks.f5_modules.bigip_firewall_log_profile
bigip_firewall_log_profile_network:
redirect: f5networks.f5_modules.bigip_firewall_log_profile_network
bigip_firewall_policy:
redirect: f5networks.f5_modules.bigip_firewall_policy
bigip_firewall_port_list:
redirect: f5networks.f5_modules.bigip_firewall_port_list
bigip_firewall_rule:
redirect: f5networks.f5_modules.bigip_firewall_rule
bigip_firewall_rule_list:
redirect: f5networks.f5_modules.bigip_firewall_rule_list
bigip_firewall_schedule:
redirect: f5networks.f5_modules.bigip_firewall_schedule
bigip_gtm_datacenter:
redirect: f5networks.f5_modules.bigip_gtm_datacenter
bigip_gtm_global:
redirect: f5networks.f5_modules.bigip_gtm_global
bigip_gtm_monitor_bigip:
redirect: f5networks.f5_modules.bigip_gtm_monitor_bigip
bigip_gtm_monitor_external:
redirect: f5networks.f5_modules.bigip_gtm_monitor_external
bigip_gtm_monitor_firepass:
redirect: f5networks.f5_modules.bigip_gtm_monitor_firepass
bigip_gtm_monitor_http:
redirect: f5networks.f5_modules.bigip_gtm_monitor_http
bigip_gtm_monitor_https:
redirect: f5networks.f5_modules.bigip_gtm_monitor_https
bigip_gtm_monitor_tcp:
redirect: f5networks.f5_modules.bigip_gtm_monitor_tcp
bigip_gtm_monitor_tcp_half_open:
redirect: f5networks.f5_modules.bigip_gtm_monitor_tcp_half_open
bigip_gtm_pool:
redirect: f5networks.f5_modules.bigip_gtm_pool
bigip_gtm_pool_member:
redirect: f5networks.f5_modules.bigip_gtm_pool_member
bigip_gtm_server:
redirect: f5networks.f5_modules.bigip_gtm_server
bigip_gtm_topology_record:
redirect: f5networks.f5_modules.bigip_gtm_topology_record
bigip_gtm_topology_region:
redirect: f5networks.f5_modules.bigip_gtm_topology_region
bigip_gtm_virtual_server:
redirect: f5networks.f5_modules.bigip_gtm_virtual_server
bigip_gtm_wide_ip:
redirect: f5networks.f5_modules.bigip_gtm_wide_ip
bigip_hostname:
redirect: f5networks.f5_modules.bigip_hostname
bigip_iapp_service:
redirect: f5networks.f5_modules.bigip_iapp_service
bigip_iapp_template:
redirect: f5networks.f5_modules.bigip_iapp_template
bigip_ike_peer:
redirect: f5networks.f5_modules.bigip_ike_peer
bigip_imish_config:
redirect: f5networks.f5_modules.bigip_imish_config
bigip_ipsec_policy:
redirect: f5networks.f5_modules.bigip_ipsec_policy
bigip_irule:
redirect: f5networks.f5_modules.bigip_irule
bigip_log_destination:
redirect: f5networks.f5_modules.bigip_log_destination
bigip_log_publisher:
redirect: f5networks.f5_modules.bigip_log_publisher
bigip_lx_package:
redirect: f5networks.f5_modules.bigip_lx_package
bigip_management_route:
redirect: f5networks.f5_modules.bigip_management_route
bigip_message_routing_peer:
redirect: f5networks.f5_modules.bigip_message_routing_peer
bigip_message_routing_protocol:
redirect: f5networks.f5_modules.bigip_message_routing_protocol
bigip_message_routing_route:
redirect: f5networks.f5_modules.bigip_message_routing_route
bigip_message_routing_router:
redirect: f5networks.f5_modules.bigip_message_routing_router
bigip_message_routing_transport_config:
redirect: f5networks.f5_modules.bigip_message_routing_transport_config
bigip_monitor_dns:
redirect: f5networks.f5_modules.bigip_monitor_dns
bigip_monitor_external:
redirect: f5networks.f5_modules.bigip_monitor_external
bigip_monitor_gateway_icmp:
redirect: f5networks.f5_modules.bigip_monitor_gateway_icmp
bigip_monitor_http:
redirect: f5networks.f5_modules.bigip_monitor_http
bigip_monitor_https:
redirect: f5networks.f5_modules.bigip_monitor_https
bigip_monitor_ldap:
redirect: f5networks.f5_modules.bigip_monitor_ldap
bigip_monitor_snmp_dca:
redirect: f5networks.f5_modules.bigip_monitor_snmp_dca
bigip_monitor_tcp:
redirect: f5networks.f5_modules.bigip_monitor_tcp
bigip_monitor_tcp_echo:
redirect: f5networks.f5_modules.bigip_monitor_tcp_echo
bigip_monitor_tcp_half_open:
redirect: f5networks.f5_modules.bigip_monitor_tcp_half_open
bigip_monitor_udp:
redirect: f5networks.f5_modules.bigip_monitor_udp
bigip_node:
redirect: f5networks.f5_modules.bigip_node
bigip_partition:
redirect: f5networks.f5_modules.bigip_partition
bigip_password_policy:
redirect: f5networks.f5_modules.bigip_password_policy
bigip_policy:
redirect: f5networks.f5_modules.bigip_policy
bigip_policy_rule:
redirect: f5networks.f5_modules.bigip_policy_rule
bigip_pool:
redirect: f5networks.f5_modules.bigip_pool
bigip_pool_member:
redirect: f5networks.f5_modules.bigip_pool_member
bigip_profile_analytics:
redirect: f5networks.f5_modules.bigip_profile_analytics
bigip_profile_client_ssl:
redirect: f5networks.f5_modules.bigip_profile_client_ssl
bigip_profile_dns:
redirect: f5networks.f5_modules.bigip_profile_dns
bigip_profile_fastl4:
redirect: f5networks.f5_modules.bigip_profile_fastl4
bigip_profile_http:
redirect: f5networks.f5_modules.bigip_profile_http
bigip_profile_http2:
redirect: f5networks.f5_modules.bigip_profile_http2
bigip_profile_http_compression:
redirect: f5networks.f5_modules.bigip_profile_http_compression
bigip_profile_oneconnect:
redirect: f5networks.f5_modules.bigip_profile_oneconnect
bigip_profile_persistence_cookie:
redirect: f5networks.f5_modules.bigip_profile_persistence_cookie
bigip_profile_persistence_src_addr:
redirect: f5networks.f5_modules.bigip_profile_persistence_src_addr
bigip_profile_server_ssl:
redirect: f5networks.f5_modules.bigip_profile_server_ssl
bigip_profile_tcp:
redirect: f5networks.f5_modules.bigip_profile_tcp
bigip_profile_udp:
redirect: f5networks.f5_modules.bigip_profile_udp
bigip_provision:
redirect: f5networks.f5_modules.bigip_provision
bigip_qkview:
redirect: f5networks.f5_modules.bigip_qkview
bigip_remote_role:
redirect: f5networks.f5_modules.bigip_remote_role
bigip_remote_syslog:
redirect: f5networks.f5_modules.bigip_remote_syslog
bigip_remote_user:
redirect: f5networks.f5_modules.bigip_remote_user
bigip_routedomain:
redirect: f5networks.f5_modules.bigip_routedomain
bigip_selfip:
redirect: f5networks.f5_modules.bigip_selfip
bigip_service_policy:
redirect: f5networks.f5_modules.bigip_service_policy
bigip_smtp:
redirect: f5networks.f5_modules.bigip_smtp
bigip_snat_pool:
redirect: f5networks.f5_modules.bigip_snat_pool
bigip_snat_translation:
redirect: f5networks.f5_modules.bigip_snat_translation
bigip_snmp:
redirect: f5networks.f5_modules.bigip_snmp
bigip_snmp_community:
redirect: f5networks.f5_modules.bigip_snmp_community
bigip_snmp_trap:
redirect: f5networks.f5_modules.bigip_snmp_trap
bigip_software_image:
redirect: f5networks.f5_modules.bigip_software_image
bigip_software_install:
redirect: f5networks.f5_modules.bigip_software_install
bigip_software_update:
redirect: f5networks.f5_modules.bigip_software_update
bigip_ssl_certificate:
redirect: f5networks.f5_modules.bigip_ssl_certificate
bigip_ssl_key:
redirect: f5networks.f5_modules.bigip_ssl_key
bigip_ssl_ocsp:
redirect: f5networks.f5_modules.bigip_ssl_ocsp
bigip_static_route:
redirect: f5networks.f5_modules.bigip_static_route
bigip_sys_daemon_log_tmm:
redirect: f5networks.f5_modules.bigip_sys_daemon_log_tmm
bigip_sys_db:
redirect: f5networks.f5_modules.bigip_sys_db
bigip_sys_global:
redirect: f5networks.f5_modules.bigip_sys_global
bigip_timer_policy:
redirect: f5networks.f5_modules.bigip_timer_policy
bigip_traffic_selector:
redirect: f5networks.f5_modules.bigip_traffic_selector
bigip_trunk:
redirect: f5networks.f5_modules.bigip_trunk
bigip_tunnel:
redirect: f5networks.f5_modules.bigip_tunnel
bigip_ucs:
redirect: f5networks.f5_modules.bigip_ucs
bigip_ucs_fetch:
redirect: f5networks.f5_modules.bigip_ucs_fetch
bigip_user:
redirect: f5networks.f5_modules.bigip_user
bigip_vcmp_guest:
redirect: f5networks.f5_modules.bigip_vcmp_guest
bigip_virtual_address:
redirect: f5networks.f5_modules.bigip_virtual_address
bigip_virtual_server:
redirect: f5networks.f5_modules.bigip_virtual_server
bigip_vlan:
redirect: f5networks.f5_modules.bigip_vlan
bigip_wait:
redirect: f5networks.f5_modules.bigip_wait
bigiq_application_fasthttp:
redirect: f5networks.f5_modules.bigiq_application_fasthttp
bigiq_application_fastl4_tcp:
redirect: f5networks.f5_modules.bigiq_application_fastl4_tcp
bigiq_application_fastl4_udp:
redirect: f5networks.f5_modules.bigiq_application_fastl4_udp
bigiq_application_http:
redirect: f5networks.f5_modules.bigiq_application_http
bigiq_application_https_offload:
redirect: f5networks.f5_modules.bigiq_application_https_offload
bigiq_application_https_waf:
redirect: f5networks.f5_modules.bigiq_application_https_waf
bigiq_device_discovery:
redirect: f5networks.f5_modules.bigiq_device_discovery
bigiq_device_info:
redirect: f5networks.f5_modules.bigiq_device_info
bigiq_regkey_license:
redirect: f5networks.f5_modules.bigiq_regkey_license
bigiq_regkey_license_assignment:
redirect: f5networks.f5_modules.bigiq_regkey_license_assignment
bigiq_regkey_pool:
redirect: f5networks.f5_modules.bigiq_regkey_pool
bigiq_utility_license:
redirect: f5networks.f5_modules.bigiq_utility_license
bigiq_utility_license_assignment:
redirect: f5networks.f5_modules.bigiq_utility_license_assignment
os_auth:
redirect: openstack.cloud.os_auth
os_client_config:
redirect: openstack.cloud.os_client_config
os_coe_cluster:
redirect: openstack.cloud.os_coe_cluster
os_coe_cluster_template:
redirect: openstack.cloud.os_coe_cluster_template
os_flavor_info:
redirect: openstack.cloud.os_flavor_info
os_floating_ip:
redirect: openstack.cloud.os_floating_ip
os_group:
redirect: openstack.cloud.os_group
os_group_info:
redirect: openstack.cloud.os_group_info
os_image:
redirect: openstack.cloud.os_image
os_image_info:
redirect: openstack.cloud.os_image_info
os_ironic:
redirect: openstack.cloud.os_ironic
os_ironic_inspect:
redirect: openstack.cloud.os_ironic_inspect
os_ironic_node:
redirect: openstack.cloud.os_ironic_node
os_keypair:
redirect: openstack.cloud.os_keypair
os_keystone_domain:
redirect: openstack.cloud.os_keystone_domain
os_keystone_domain_info:
redirect: openstack.cloud.os_keystone_domain_info
os_keystone_endpoint:
redirect: openstack.cloud.os_keystone_endpoint
os_keystone_role:
redirect: openstack.cloud.os_keystone_role
os_keystone_service:
redirect: openstack.cloud.os_keystone_service
os_listener:
redirect: openstack.cloud.os_listener
os_loadbalancer:
redirect: openstack.cloud.os_loadbalancer
os_member:
redirect: openstack.cloud.os_member
os_network:
redirect: openstack.cloud.os_network
os_networks_info:
redirect: openstack.cloud.os_networks_info
os_nova_flavor:
redirect: openstack.cloud.os_nova_flavor
os_nova_host_aggregate:
redirect: openstack.cloud.os_nova_host_aggregate
os_object:
redirect: openstack.cloud.os_object
os_pool:
redirect: openstack.cloud.os_pool
os_port:
redirect: openstack.cloud.os_port
os_port_info:
redirect: openstack.cloud.os_port_info
os_project:
redirect: openstack.cloud.os_project
os_project_access:
redirect: openstack.cloud.os_project_access
os_project_info:
redirect: openstack.cloud.os_project_info
os_quota:
redirect: openstack.cloud.os_quota
os_recordset:
redirect: openstack.cloud.os_recordset
os_router:
redirect: openstack.cloud.os_router
os_security_group:
redirect: openstack.cloud.os_security_group
os_security_group_rule:
redirect: openstack.cloud.os_security_group_rule
os_server:
redirect: openstack.cloud.os_server
os_server_action:
redirect: openstack.cloud.os_server_action
os_server_group:
redirect: openstack.cloud.os_server_group
os_server_info:
redirect: openstack.cloud.os_server_info
os_server_metadata:
redirect: openstack.cloud.os_server_metadata
os_server_volume:
redirect: openstack.cloud.os_server_volume
os_stack:
redirect: openstack.cloud.os_stack
os_subnet:
redirect: openstack.cloud.os_subnet
os_subnets_info:
redirect: openstack.cloud.os_subnets_info
os_user:
redirect: openstack.cloud.os_user
os_user_group:
redirect: openstack.cloud.os_user_group
os_user_info:
redirect: openstack.cloud.os_user_info
os_user_role:
redirect: openstack.cloud.os_user_role
os_volume:
redirect: openstack.cloud.os_volume
os_volume_snapshot:
redirect: openstack.cloud.os_volume_snapshot
os_zone:
redirect: openstack.cloud.os_zone
junos_user:
redirect: junipernetworks.junos.junos_user
junos_l2_interface:
redirect: junipernetworks.junos.junos_l2_interface
junos_lldp:
redirect: junipernetworks.junos.junos_lldp
junos_rpc:
redirect: junipernetworks.junos.junos_rpc
junos_l2_interfaces:
redirect: junipernetworks.junos.junos_l2_interfaces
junos_lldp_interface:
redirect: junipernetworks.junos.junos_lldp_interface
junos_static_route:
redirect: junipernetworks.junos.junos_static_route
junos_lacp:
redirect: junipernetworks.junos.junos_lacp
junos_lacp_interfaces:
redirect: junipernetworks.junos.junos_lacp_interfaces
junos_vlans:
redirect: junipernetworks.junos.junos_vlans
junos_linkagg:
redirect: junipernetworks.junos.junos_linkagg
junos_scp:
redirect: junipernetworks.junos.junos_scp
junos_banner:
redirect: junipernetworks.junos.junos_banner
junos_l3_interface:
redirect: junipernetworks.junos.junos_l3_interface
junos_logging:
redirect: junipernetworks.junos.junos_logging
junos_package:
redirect: junipernetworks.junos.junos_package
junos_netconf:
redirect: junipernetworks.junos.junos_netconf
junos_facts:
redirect: junipernetworks.junos.junos_facts
junos_ping:
redirect: junipernetworks.junos.junos_ping
junos_interface:
redirect: junipernetworks.junos.junos_interface
junos_lldp_global:
redirect: junipernetworks.junos.junos_lldp_global
junos_config:
redirect: junipernetworks.junos.junos_config
junos_static_routes:
redirect: junipernetworks.junos.junos_static_routes
junos_command:
redirect: junipernetworks.junos.junos_command
junos_lag_interfaces:
redirect: junipernetworks.junos.junos_lag_interfaces
junos_l3_interfaces:
redirect: junipernetworks.junos.junos_l3_interfaces
junos_lldp_interfaces:
redirect: junipernetworks.junos.junos_lldp_interfaces
junos_vlan:
redirect: junipernetworks.junos.junos_vlan
junos_system:
redirect: junipernetworks.junos.junos_system
junos_interfaces:
redirect: junipernetworks.junos.junos_interfaces
junos_vrf:
redirect: junipernetworks.junos.junos_vrf
tower_credential:
redirect: awx.awx.tower_credential
tower_credential_type:
redirect: awx.awx.tower_credential_type
tower_group:
redirect: awx.awx.tower_group
tower_host:
redirect: awx.awx.tower_host
tower_inventory:
redirect: awx.awx.tower_inventory
tower_inventory_source:
redirect: awx.awx.tower_inventory_source
tower_job_cancel:
redirect: awx.awx.tower_job_cancel
tower_job_launch:
redirect: awx.awx.tower_job_launch
tower_job_list:
redirect: awx.awx.tower_job_list
tower_job_template:
redirect: awx.awx.tower_job_template
tower_job_wait:
redirect: awx.awx.tower_job_wait
tower_label:
redirect: awx.awx.tower_label
tower_notification:
redirect: awx.awx.tower_notification
tower_organization:
redirect: awx.awx.tower_organization
tower_project:
redirect: awx.awx.tower_project
tower_receive:
redirect: awx.awx.tower_receive
tower_role:
redirect: awx.awx.tower_role
tower_send:
redirect: awx.awx.tower_send
tower_settings:
redirect: awx.awx.tower_settings
tower_team:
redirect: awx.awx.tower_team
tower_user:
redirect: awx.awx.tower_user
tower_workflow_launch:
redirect: awx.awx.tower_workflow_launch
tower_workflow_template:
redirect: awx.awx.tower_workflow_template
ovirt_affinity_group:
redirect: ovirt.ovirt.ovirt_affinity_group
ovirt_affinity_label:
redirect: ovirt.ovirt.ovirt_affinity_label
ovirt_affinity_label_info:
redirect: ovirt.ovirt.ovirt_affinity_label_info
ovirt_api_info:
redirect: ovirt.ovirt.ovirt_api_info
ovirt_auth:
redirect: ovirt.ovirt.ovirt_auth
ovirt_cluster:
redirect: ovirt.ovirt.ovirt_cluster
ovirt_cluster_info:
redirect: ovirt.ovirt.ovirt_cluster_info
ovirt_datacenter:
redirect: ovirt.ovirt.ovirt_datacenter
ovirt_datacenter_info:
redirect: ovirt.ovirt.ovirt_datacenter_info
ovirt_disk:
redirect: ovirt.ovirt.ovirt_disk
ovirt_disk_info:
redirect: ovirt.ovirt.ovirt_disk_info
ovirt_event:
redirect: ovirt.ovirt.ovirt_event
ovirt_event_info:
redirect: ovirt.ovirt.ovirt_event_info
ovirt_external_provider:
redirect: ovirt.ovirt.ovirt_external_provider
ovirt_external_provider_info:
redirect: ovirt.ovirt.ovirt_external_provider_info
ovirt_group:
redirect: ovirt.ovirt.ovirt_group
ovirt_group_info:
redirect: ovirt.ovirt.ovirt_group_info
ovirt_host:
redirect: ovirt.ovirt.ovirt_host
ovirt_host_info:
redirect: ovirt.ovirt.ovirt_host_info
ovirt_host_network:
redirect: ovirt.ovirt.ovirt_host_network
ovirt_host_pm:
redirect: ovirt.ovirt.ovirt_host_pm
ovirt_host_storage_info:
redirect: ovirt.ovirt.ovirt_host_storage_info
ovirt_instance_type:
redirect: ovirt.ovirt.ovirt_instance_type
ovirt_job:
redirect: ovirt.ovirt.ovirt_job
ovirt_mac_pool:
redirect: ovirt.ovirt.ovirt_mac_pool
ovirt_network:
redirect: ovirt.ovirt.ovirt_network
ovirt_network_info:
redirect: ovirt.ovirt.ovirt_network_info
ovirt_nic:
redirect: ovirt.ovirt.ovirt_nic
ovirt_nic_info:
redirect: ovirt.ovirt.ovirt_nic_info
ovirt_permission:
redirect: ovirt.ovirt.ovirt_permission
ovirt_permission_info:
redirect: ovirt.ovirt.ovirt_permission_info
ovirt_quota:
redirect: ovirt.ovirt.ovirt_quota
ovirt_quota_info:
redirect: ovirt.ovirt.ovirt_quota_info
ovirt_role:
redirect: ovirt.ovirt.ovirt_role
ovirt_scheduling_policy_info:
redirect: ovirt.ovirt.ovirt_scheduling_policy_info
ovirt_snapshot:
redirect: ovirt.ovirt.ovirt_snapshot
ovirt_snapshot_info:
redirect: ovirt.ovirt.ovirt_snapshot_info
ovirt_storage_connection:
redirect: ovirt.ovirt.ovirt_storage_connection
ovirt_storage_domain:
redirect: ovirt.ovirt.ovirt_storage_domain
ovirt_storage_domain_info:
redirect: ovirt.ovirt.ovirt_storage_domain_info
ovirt_storage_template_info:
redirect: ovirt.ovirt.ovirt_storage_template_info
ovirt_storage_vm_info:
redirect: ovirt.ovirt.ovirt_storage_vm_info
ovirt_tag:
redirect: ovirt.ovirt.ovirt_tag
ovirt_tag_info:
redirect: ovirt.ovirt.ovirt_tag_info
ovirt_template:
redirect: ovirt.ovirt.ovirt_template
ovirt_template_info:
redirect: ovirt.ovirt.ovirt_template_info
ovirt_user:
redirect: ovirt.ovirt.ovirt_user
ovirt_user_info:
redirect: ovirt.ovirt.ovirt_user_info
ovirt_vm:
redirect: ovirt.ovirt.ovirt_vm
ovirt_vm_info:
redirect: ovirt.ovirt.ovirt_vm_info
ovirt_vmpool:
redirect: ovirt.ovirt.ovirt_vmpool
ovirt_vmpool_info:
redirect: ovirt.ovirt.ovirt_vmpool_info
ovirt_vnic_profile:
redirect: ovirt.ovirt.ovirt_vnic_profile
ovirt_vnic_profile_info:
redirect: ovirt.ovirt.ovirt_vnic_profile_info
dellos10_command:
redirect: dellemc_networking.os10.dellos10_command
dellos10_facts:
redirect: dellemc_networking.os10.dellos10_facts
dellos10_config:
redirect: dellemc_networking.os10.dellos10_config
dellos9_facts:
redirect: dellemc_networking.os9.dellos9_facts
dellos9_command:
redirect: dellemc_networking.os9.dellos9_command
dellos9_config:
redirect: dellemc_networking.os9.dellos9_config
dellos6_facts:
redirect: dellemc_networking.os6.dellos6_facts
dellos6_config:
redirect: dellemc_networking.os6.dellos6_config
dellos6_command:
redirect: dellemc_networking.os6.dellos6_command
hcloud_location_facts:
redirect: hetzner.hcloud.hcloud_location_facts
hcloud_server_info:
redirect: hetzner.hcloud.hcloud_server_info
hcloud_server_network:
redirect: hetzner.hcloud.hcloud_server_network
hcloud_server_type_info:
redirect: hetzner.hcloud.hcloud_server_type_info
hcloud_route:
redirect: hetzner.hcloud.hcloud_route
hcloud_server:
redirect: hetzner.hcloud.hcloud_server
hcloud_volume_info:
redirect: hetzner.hcloud.hcloud_volume_info
hcloud_server_type_facts:
redirect: hetzner.hcloud.hcloud_server_type_facts
hcloud_ssh_key_info:
redirect: hetzner.hcloud.hcloud_ssh_key_info
hcloud_network_info:
redirect: hetzner.hcloud.hcloud_network_info
hcloud_datacenter_info:
redirect: hetzner.hcloud.hcloud_datacenter_info
hcloud_image_facts:
redirect: hetzner.hcloud.hcloud_image_facts
hcloud_volume_facts:
redirect: hetzner.hcloud.hcloud_volume_facts
hcloud_floating_ip_info:
redirect: hetzner.hcloud.hcloud_floating_ip_info
hcloud_floating_ip_facts:
redirect: hetzner.hcloud.hcloud_floating_ip_facts
hcloud_image_info:
redirect: hetzner.hcloud.hcloud_image_info
hcloud_ssh_key_facts:
redirect: hetzner.hcloud.hcloud_ssh_key_facts
hcloud_location_info:
redirect: hetzner.hcloud.hcloud_location_info
hcloud_network:
redirect: hetzner.hcloud.hcloud_network
hcloud_volume:
redirect: hetzner.hcloud.hcloud_volume
hcloud_ssh_key:
redirect: hetzner.hcloud.hcloud_ssh_key
hcloud_datacenter_facts:
redirect: hetzner.hcloud.hcloud_datacenter_facts
hcloud_rdns:
redirect: hetzner.hcloud.hcloud_rdns
hcloud_floating_ip:
redirect: hetzner.hcloud.hcloud_floating_ip
hcloud_server_facts:
redirect: hetzner.hcloud.hcloud_server_facts
hcloud_subnetwork:
redirect: hetzner.hcloud.hcloud_subnetwork
skydive_capture:
redirect: skydive.skydive.skydive_capture
skydive_edge:
redirect: skydive.skydive.skydive_edge
skydive_node:
redirect: skydive.skydive.skydive_node
cyberark_authentication:
redirect: cyberark.bizdev.cyberark_authentication
cyberark_user:
redirect: cyberark.bizdev.cyberark_user
gcp_appengine_firewall_rule:
redirect: google.cloud.gcp_appengine_firewall_rule
gcp_appengine_firewall_rule_info:
redirect: google.cloud.gcp_appengine_firewall_rule_info
gcp_bigquery_dataset:
redirect: google.cloud.gcp_bigquery_dataset
gcp_bigquery_dataset_info:
redirect: google.cloud.gcp_bigquery_dataset_info
gcp_bigquery_table:
redirect: google.cloud.gcp_bigquery_table
gcp_bigquery_table_info:
redirect: google.cloud.gcp_bigquery_table_info
gcp_cloudbuild_trigger:
redirect: google.cloud.gcp_cloudbuild_trigger
gcp_cloudbuild_trigger_info:
redirect: google.cloud.gcp_cloudbuild_trigger_info
gcp_cloudfunctions_cloud_function:
redirect: google.cloud.gcp_cloudfunctions_cloud_function
gcp_cloudfunctions_cloud_function_info:
redirect: google.cloud.gcp_cloudfunctions_cloud_function_info
gcp_cloudscheduler_job:
redirect: google.cloud.gcp_cloudscheduler_job
gcp_cloudscheduler_job_info:
redirect: google.cloud.gcp_cloudscheduler_job_info
gcp_cloudtasks_queue:
redirect: google.cloud.gcp_cloudtasks_queue
gcp_cloudtasks_queue_info:
redirect: google.cloud.gcp_cloudtasks_queue_info
gcp_compute_address:
redirect: google.cloud.gcp_compute_address
gcp_compute_address_info:
redirect: google.cloud.gcp_compute_address_info
gcp_compute_autoscaler:
redirect: google.cloud.gcp_compute_autoscaler
gcp_compute_autoscaler_info:
redirect: google.cloud.gcp_compute_autoscaler_info
gcp_compute_backend_bucket:
redirect: google.cloud.gcp_compute_backend_bucket
gcp_compute_backend_bucket_info:
redirect: google.cloud.gcp_compute_backend_bucket_info
gcp_compute_backend_service:
redirect: google.cloud.gcp_compute_backend_service
gcp_compute_backend_service_info:
redirect: google.cloud.gcp_compute_backend_service_info
gcp_compute_disk:
redirect: google.cloud.gcp_compute_disk
gcp_compute_disk_info:
redirect: google.cloud.gcp_compute_disk_info
gcp_compute_firewall:
redirect: google.cloud.gcp_compute_firewall
gcp_compute_firewall_info:
redirect: google.cloud.gcp_compute_firewall_info
gcp_compute_forwarding_rule:
redirect: google.cloud.gcp_compute_forwarding_rule
gcp_compute_forwarding_rule_info:
redirect: google.cloud.gcp_compute_forwarding_rule_info
gcp_compute_global_address:
redirect: google.cloud.gcp_compute_global_address
gcp_compute_global_address_info:
redirect: google.cloud.gcp_compute_global_address_info
gcp_compute_global_forwarding_rule:
redirect: google.cloud.gcp_compute_global_forwarding_rule
gcp_compute_global_forwarding_rule_info:
redirect: google.cloud.gcp_compute_global_forwarding_rule_info
gcp_compute_health_check:
redirect: google.cloud.gcp_compute_health_check
gcp_compute_health_check_info:
redirect: google.cloud.gcp_compute_health_check_info
gcp_compute_http_health_check:
redirect: google.cloud.gcp_compute_http_health_check
gcp_compute_http_health_check_info:
redirect: google.cloud.gcp_compute_http_health_check_info
gcp_compute_https_health_check:
redirect: google.cloud.gcp_compute_https_health_check
gcp_compute_https_health_check_info:
redirect: google.cloud.gcp_compute_https_health_check_info
gcp_compute_image:
redirect: google.cloud.gcp_compute_image
gcp_compute_image_info:
redirect: google.cloud.gcp_compute_image_info
gcp_compute_instance:
redirect: google.cloud.gcp_compute_instance
gcp_compute_instance_group:
redirect: google.cloud.gcp_compute_instance_group
gcp_compute_instance_group_info:
redirect: google.cloud.gcp_compute_instance_group_info
gcp_compute_instance_group_manager:
redirect: google.cloud.gcp_compute_instance_group_manager
gcp_compute_instance_group_manager_info:
redirect: google.cloud.gcp_compute_instance_group_manager_info
gcp_compute_instance_info:
redirect: google.cloud.gcp_compute_instance_info
gcp_compute_instance_template:
redirect: google.cloud.gcp_compute_instance_template
gcp_compute_instance_template_info:
redirect: google.cloud.gcp_compute_instance_template_info
gcp_compute_interconnect_attachment:
redirect: google.cloud.gcp_compute_interconnect_attachment
gcp_compute_interconnect_attachment_info:
redirect: google.cloud.gcp_compute_interconnect_attachment_info
gcp_compute_network:
redirect: google.cloud.gcp_compute_network
gcp_compute_network_endpoint_group:
redirect: google.cloud.gcp_compute_network_endpoint_group
gcp_compute_network_endpoint_group_info:
redirect: google.cloud.gcp_compute_network_endpoint_group_info
gcp_compute_network_info:
redirect: google.cloud.gcp_compute_network_info
gcp_compute_node_group:
redirect: google.cloud.gcp_compute_node_group
gcp_compute_node_group_info:
redirect: google.cloud.gcp_compute_node_group_info
gcp_compute_node_template:
redirect: google.cloud.gcp_compute_node_template
gcp_compute_node_template_info:
redirect: google.cloud.gcp_compute_node_template_info
gcp_compute_region_backend_service:
redirect: google.cloud.gcp_compute_region_backend_service
gcp_compute_region_backend_service_info:
redirect: google.cloud.gcp_compute_region_backend_service_info
gcp_compute_region_disk:
redirect: google.cloud.gcp_compute_region_disk
gcp_compute_region_disk_info:
redirect: google.cloud.gcp_compute_region_disk_info
gcp_compute_reservation:
redirect: google.cloud.gcp_compute_reservation
gcp_compute_reservation_info:
redirect: google.cloud.gcp_compute_reservation_info
gcp_compute_route:
redirect: google.cloud.gcp_compute_route
gcp_compute_route_info:
redirect: google.cloud.gcp_compute_route_info
gcp_compute_router:
redirect: google.cloud.gcp_compute_router
gcp_compute_router_info:
redirect: google.cloud.gcp_compute_router_info
gcp_compute_snapshot:
redirect: google.cloud.gcp_compute_snapshot
gcp_compute_snapshot_info:
redirect: google.cloud.gcp_compute_snapshot_info
gcp_compute_ssl_certificate:
redirect: google.cloud.gcp_compute_ssl_certificate
gcp_compute_ssl_certificate_info:
redirect: google.cloud.gcp_compute_ssl_certificate_info
gcp_compute_ssl_policy:
redirect: google.cloud.gcp_compute_ssl_policy
gcp_compute_ssl_policy_info:
redirect: google.cloud.gcp_compute_ssl_policy_info
gcp_compute_subnetwork:
redirect: google.cloud.gcp_compute_subnetwork
gcp_compute_subnetwork_info:
redirect: google.cloud.gcp_compute_subnetwork_info
gcp_compute_target_http_proxy:
redirect: google.cloud.gcp_compute_target_http_proxy
gcp_compute_target_http_proxy_info:
redirect: google.cloud.gcp_compute_target_http_proxy_info
gcp_compute_target_https_proxy:
redirect: google.cloud.gcp_compute_target_https_proxy
gcp_compute_target_https_proxy_info:
redirect: google.cloud.gcp_compute_target_https_proxy_info
gcp_compute_target_instance:
redirect: google.cloud.gcp_compute_target_instance
gcp_compute_target_instance_info:
redirect: google.cloud.gcp_compute_target_instance_info
gcp_compute_target_pool:
redirect: google.cloud.gcp_compute_target_pool
gcp_compute_target_pool_info:
redirect: google.cloud.gcp_compute_target_pool_info
gcp_compute_target_ssl_proxy:
redirect: google.cloud.gcp_compute_target_ssl_proxy
gcp_compute_target_ssl_proxy_info:
redirect: google.cloud.gcp_compute_target_ssl_proxy_info
gcp_compute_target_tcp_proxy:
redirect: google.cloud.gcp_compute_target_tcp_proxy
gcp_compute_target_tcp_proxy_info:
redirect: google.cloud.gcp_compute_target_tcp_proxy_info
gcp_compute_target_vpn_gateway:
redirect: google.cloud.gcp_compute_target_vpn_gateway
gcp_compute_target_vpn_gateway_info:
redirect: google.cloud.gcp_compute_target_vpn_gateway_info
gcp_compute_url_map:
redirect: google.cloud.gcp_compute_url_map
gcp_compute_url_map_info:
redirect: google.cloud.gcp_compute_url_map_info
gcp_compute_vpn_tunnel:
redirect: google.cloud.gcp_compute_vpn_tunnel
gcp_compute_vpn_tunnel_info:
redirect: google.cloud.gcp_compute_vpn_tunnel_info
gcp_container_cluster:
redirect: google.cloud.gcp_container_cluster
gcp_container_cluster_info:
redirect: google.cloud.gcp_container_cluster_info
gcp_container_node_pool:
redirect: google.cloud.gcp_container_node_pool
gcp_container_node_pool_info:
redirect: google.cloud.gcp_container_node_pool_info
gcp_dns_managed_zone:
redirect: google.cloud.gcp_dns_managed_zone
gcp_dns_managed_zone_info:
redirect: google.cloud.gcp_dns_managed_zone_info
gcp_dns_resource_record_set:
redirect: google.cloud.gcp_dns_resource_record_set
gcp_dns_resource_record_set_info:
redirect: google.cloud.gcp_dns_resource_record_set_info
gcp_filestore_instance:
redirect: google.cloud.gcp_filestore_instance
gcp_filestore_instance_info:
redirect: google.cloud.gcp_filestore_instance_info
gcp_iam_role:
redirect: google.cloud.gcp_iam_role
gcp_iam_role_info:
redirect: google.cloud.gcp_iam_role_info
gcp_iam_service_account:
redirect: google.cloud.gcp_iam_service_account
gcp_iam_service_account_info:
redirect: google.cloud.gcp_iam_service_account_info
gcp_iam_service_account_key:
redirect: google.cloud.gcp_iam_service_account_key
gcp_kms_crypto_key:
redirect: google.cloud.gcp_kms_crypto_key
gcp_kms_crypto_key_info:
redirect: google.cloud.gcp_kms_crypto_key_info
gcp_kms_key_ring:
redirect: google.cloud.gcp_kms_key_ring
gcp_kms_key_ring_info:
redirect: google.cloud.gcp_kms_key_ring_info
gcp_logging_metric:
redirect: google.cloud.gcp_logging_metric
gcp_logging_metric_info:
redirect: google.cloud.gcp_logging_metric_info
gcp_mlengine_model:
redirect: google.cloud.gcp_mlengine_model
gcp_mlengine_model_info:
redirect: google.cloud.gcp_mlengine_model_info
gcp_mlengine_version:
redirect: google.cloud.gcp_mlengine_version
gcp_mlengine_version_info:
redirect: google.cloud.gcp_mlengine_version_info
gcp_pubsub_subscription:
redirect: google.cloud.gcp_pubsub_subscription
gcp_pubsub_subscription_info:
redirect: google.cloud.gcp_pubsub_subscription_info
gcp_pubsub_topic:
redirect: google.cloud.gcp_pubsub_topic
gcp_pubsub_topic_info:
redirect: google.cloud.gcp_pubsub_topic_info
gcp_redis_instance:
redirect: google.cloud.gcp_redis_instance
gcp_redis_instance_info:
redirect: google.cloud.gcp_redis_instance_info
gcp_resourcemanager_project:
redirect: google.cloud.gcp_resourcemanager_project
gcp_resourcemanager_project_info:
redirect: google.cloud.gcp_resourcemanager_project_info
gcp_runtimeconfig_config:
redirect: google.cloud.gcp_runtimeconfig_config
gcp_runtimeconfig_config_info:
redirect: google.cloud.gcp_runtimeconfig_config_info
gcp_runtimeconfig_variable:
redirect: google.cloud.gcp_runtimeconfig_variable
gcp_runtimeconfig_variable_info:
redirect: google.cloud.gcp_runtimeconfig_variable_info
gcp_serviceusage_service:
redirect: google.cloud.gcp_serviceusage_service
gcp_serviceusage_service_info:
redirect: google.cloud.gcp_serviceusage_service_info
gcp_sourcerepo_repository:
redirect: google.cloud.gcp_sourcerepo_repository
gcp_sourcerepo_repository_info:
redirect: google.cloud.gcp_sourcerepo_repository_info
gcp_spanner_database:
redirect: google.cloud.gcp_spanner_database
gcp_spanner_database_info:
redirect: google.cloud.gcp_spanner_database_info
gcp_spanner_instance:
redirect: google.cloud.gcp_spanner_instance
gcp_spanner_instance_info:
redirect: google.cloud.gcp_spanner_instance_info
gcp_sql_database:
redirect: google.cloud.gcp_sql_database
gcp_sql_database_info:
redirect: google.cloud.gcp_sql_database_info
gcp_sql_instance:
redirect: google.cloud.gcp_sql_instance
gcp_sql_instance_info:
redirect: google.cloud.gcp_sql_instance_info
gcp_sql_user:
redirect: google.cloud.gcp_sql_user
gcp_sql_user_info:
redirect: google.cloud.gcp_sql_user_info
gcp_storage_bucket:
redirect: google.cloud.gcp_storage_bucket
gcp_storage_bucket_access_control:
redirect: google.cloud.gcp_storage_bucket_access_control
gcp_storage_object:
redirect: google.cloud.gcp_storage_object
gcp_tpu_node:
redirect: google.cloud.gcp_tpu_node
gcp_tpu_node_info:
redirect: google.cloud.gcp_tpu_node_info
purefa_alert:
redirect: purestorage.flasharray.purefa_alert
purefa_arrayname:
redirect: purestorage.flasharray.purefa_arrayname
purefa_banner:
redirect: purestorage.flasharray.purefa_banner
purefa_connect:
redirect: purestorage.flasharray.purefa_connect
purefa_dns:
redirect: purestorage.flasharray.purefa_dns
purefa_ds:
redirect: purestorage.flasharray.purefa_ds
purefa_dsrole:
redirect: purestorage.flasharray.purefa_dsrole
purefa_hg:
redirect: purestorage.flasharray.purefa_hg
purefa_host:
redirect: purestorage.flasharray.purefa_host
purefa_info:
redirect: purestorage.flasharray.purefa_info
purefa_ntp:
redirect: purestorage.flasharray.purefa_ntp
purefa_offload:
redirect: purestorage.flasharray.purefa_offload
purefa_pg:
redirect: purestorage.flasharray.purefa_pg
purefa_pgsnap:
redirect: purestorage.flasharray.purefa_pgsnap
purefa_phonehome:
redirect: purestorage.flasharray.purefa_phonehome
purefa_ra:
redirect: purestorage.flasharray.purefa_ra
purefa_smtp:
redirect: purestorage.flasharray.purefa_smtp
purefa_snap:
redirect: purestorage.flasharray.purefa_snap
purefa_snmp:
redirect: purestorage.flasharray.purefa_snmp
purefa_syslog:
redirect: purestorage.flasharray.purefa_syslog
purefa_user:
redirect: purestorage.flasharray.purefa_user
purefa_vg:
redirect: purestorage.flasharray.purefa_vg
purefa_volume:
redirect: purestorage.flasharray.purefa_volume
purefb_bucket:
redirect: purestorage.flashblade.purefb_bucket
purefb_ds:
redirect: purestorage.flashblade.purefb_ds
purefb_dsrole:
redirect: purestorage.flashblade.purefb_dsrole
purefb_fs:
redirect: purestorage.flashblade.purefb_fs
purefb_info:
redirect: purestorage.flashblade.purefb_info
purefb_network:
redirect: purestorage.flashblade.purefb_network
purefb_ra:
redirect: purestorage.flashblade.purefb_ra
purefb_s3acc:
redirect: purestorage.flashblade.purefb_s3acc
purefb_s3user:
redirect: purestorage.flashblade.purefb_s3user
purefb_smtp:
redirect: purestorage.flashblade.purefb_smtp
purefb_snap:
redirect: purestorage.flashblade.purefb_snap
purefb_subnet:
redirect: purestorage.flashblade.purefb_subnet
azure_rm_acs:
redirect: azure.azcollection.azure_rm_acs
azure_rm_virtualmachine_info:
redirect: azure.azcollection.azure_rm_virtualmachine_info
azure_rm_dnsrecordset_info:
redirect: azure.azcollection.azure_rm_dnsrecordset_info
azure_rm_dnszone_info:
redirect: azure.azcollection.azure_rm_dnszone_info
azure_rm_networkinterface_info:
redirect: azure.azcollection.azure_rm_networkinterface_info
azure_rm_publicipaddress_info:
redirect: azure.azcollection.azure_rm_publicipaddress_info
azure_rm_securitygroup_info:
redirect: azure.azcollection.azure_rm_securitygroup_info
azure_rm_storageaccount_info:
redirect: azure.azcollection.azure_rm_storageaccount_info
azure_rm_virtualnetwork_info:
redirect: azure.azcollection.azure_rm_virtualnetwork_info
azure_rm_deployment:
redirect: azure.azcollection.azure_rm_deployment
azure_rm_dnsrecordset:
redirect: azure.azcollection.azure_rm_dnsrecordset
azure_rm_dnszone:
redirect: azure.azcollection.azure_rm_dnszone
azure_rm_networkinterface:
redirect: azure.azcollection.azure_rm_networkinterface
azure_rm_publicipaddress:
redirect: azure.azcollection.azure_rm_publicipaddress
azure_rm_securitygroup:
redirect: azure.azcollection.azure_rm_securitygroup
azure_rm_storageaccount:
redirect: azure.azcollection.azure_rm_storageaccount
azure_rm_subnet:
redirect: azure.azcollection.azure_rm_subnet
azure_rm_virtualmachine:
redirect: azure.azcollection.azure_rm_virtualmachine
azure_rm_virtualnetwork:
redirect: azure.azcollection.azure_rm_virtualnetwork
azure_rm_aks:
redirect: azure.azcollection.azure_rm_aks
azure_rm_aks_info:
redirect: azure.azcollection.azure_rm_aks_info
azure_rm_aksversion_info:
redirect: azure.azcollection.azure_rm_aksversion_info
azure_rm_appgateway:
redirect: azure.azcollection.azure_rm_appgateway
azure_rm_applicationsecuritygroup:
redirect: azure.azcollection.azure_rm_applicationsecuritygroup
azure_rm_applicationsecuritygroup_info:
redirect: azure.azcollection.azure_rm_applicationsecuritygroup_info
azure_rm_appserviceplan:
redirect: azure.azcollection.azure_rm_appserviceplan
azure_rm_appserviceplan_info:
redirect: azure.azcollection.azure_rm_appserviceplan_info
azure_rm_availabilityset:
redirect: azure.azcollection.azure_rm_availabilityset
azure_rm_availabilityset_info:
redirect: azure.azcollection.azure_rm_availabilityset_info
azure_rm_containerinstance:
redirect: azure.azcollection.azure_rm_containerinstance
azure_rm_containerinstance_info:
redirect: azure.azcollection.azure_rm_containerinstance_info
azure_rm_containerregistry:
redirect: azure.azcollection.azure_rm_containerregistry
azure_rm_containerregistry_info:
redirect: azure.azcollection.azure_rm_containerregistry_info
azure_rm_deployment_info:
redirect: azure.azcollection.azure_rm_deployment_info
azure_rm_functionapp:
redirect: azure.azcollection.azure_rm_functionapp
azure_rm_functionapp_info:
redirect: azure.azcollection.azure_rm_functionapp_info
azure_rm_gallery:
redirect: azure.azcollection.azure_rm_gallery
azure_rm_gallery_info:
redirect: azure.azcollection.azure_rm_gallery_info
azure_rm_galleryimage:
redirect: azure.azcollection.azure_rm_galleryimage
azure_rm_galleryimage_info:
redirect: azure.azcollection.azure_rm_galleryimage_info
azure_rm_galleryimageversion:
redirect: azure.azcollection.azure_rm_galleryimageversion
azure_rm_galleryimageversion_info:
redirect: azure.azcollection.azure_rm_galleryimageversion_info
azure_rm_image:
redirect: azure.azcollection.azure_rm_image
azure_rm_image_info:
redirect: azure.azcollection.azure_rm_image_info
azure_rm_keyvault:
redirect: azure.azcollection.azure_rm_keyvault
azure_rm_keyvault_info:
redirect: azure.azcollection.azure_rm_keyvault_info
azure_rm_keyvaultkey:
redirect: azure.azcollection.azure_rm_keyvaultkey
azure_rm_keyvaultkey_info:
redirect: azure.azcollection.azure_rm_keyvaultkey_info
azure_rm_keyvaultsecret:
redirect: azure.azcollection.azure_rm_keyvaultsecret
azure_rm_manageddisk:
redirect: azure.azcollection.azure_rm_manageddisk
azure_rm_manageddisk_info:
redirect: azure.azcollection.azure_rm_manageddisk_info
azure_rm_resource:
redirect: azure.azcollection.azure_rm_resource
azure_rm_resource_info:
redirect: azure.azcollection.azure_rm_resource_info
azure_rm_resourcegroup:
redirect: azure.azcollection.azure_rm_resourcegroup
azure_rm_resourcegroup_info:
redirect: azure.azcollection.azure_rm_resourcegroup_info
azure_rm_snapshot:
redirect: azure.azcollection.azure_rm_snapshot
azure_rm_storageblob:
redirect: azure.azcollection.azure_rm_storageblob
azure_rm_subnet_info:
redirect: azure.azcollection.azure_rm_subnet_info
azure_rm_virtualmachineextension:
redirect: azure.azcollection.azure_rm_virtualmachineextension
azure_rm_virtualmachineextension_info:
redirect: azure.azcollection.azure_rm_virtualmachineextension_info
azure_rm_virtualmachineimage_info:
redirect: azure.azcollection.azure_rm_virtualmachineimage_info
azure_rm_virtualmachinescaleset:
redirect: azure.azcollection.azure_rm_virtualmachinescaleset
azure_rm_virtualmachinescaleset_info:
redirect: azure.azcollection.azure_rm_virtualmachinescaleset_info
azure_rm_virtualmachinescalesetextension:
redirect: azure.azcollection.azure_rm_virtualmachinescalesetextension
azure_rm_virtualmachinescalesetextension_info:
redirect: azure.azcollection.azure_rm_virtualmachinescalesetextension_info
azure_rm_virtualmachinescalesetinstance:
redirect: azure.azcollection.azure_rm_virtualmachinescalesetinstance
azure_rm_virtualmachinescalesetinstance_info:
redirect: azure.azcollection.azure_rm_virtualmachinescalesetinstance_info
azure_rm_webapp:
redirect: azure.azcollection.azure_rm_webapp
azure_rm_webapp_info:
redirect: azure.azcollection.azure_rm_webapp_info
azure_rm_webappslot:
redirect: azure.azcollection.azure_rm_webappslot
azure_rm_automationaccount:
redirect: azure.azcollection.azure_rm_automationaccount
azure_rm_automationaccount_info:
redirect: azure.azcollection.azure_rm_automationaccount_info
azure_rm_autoscale:
redirect: azure.azcollection.azure_rm_autoscale
azure_rm_autoscale_info:
redirect: azure.azcollection.azure_rm_autoscale_info
azure_rm_azurefirewall:
redirect: azure.azcollection.azure_rm_azurefirewall
azure_rm_azurefirewall_info:
redirect: azure.azcollection.azure_rm_azurefirewall_info
azure_rm_batchaccount:
redirect: azure.azcollection.azure_rm_batchaccount
azure_rm_cdnendpoint:
redirect: azure.azcollection.azure_rm_cdnendpoint
azure_rm_cdnendpoint_info:
redirect: azure.azcollection.azure_rm_cdnendpoint_info
azure_rm_cdnprofile:
redirect: azure.azcollection.azure_rm_cdnprofile
azure_rm_cdnprofile_info:
redirect: azure.azcollection.azure_rm_cdnprofile_info
azure_rm_iotdevice:
redirect: azure.azcollection.azure_rm_iotdevice
azure_rm_iotdevice_info:
redirect: azure.azcollection.azure_rm_iotdevice_info
azure_rm_iotdevicemodule:
redirect: azure.azcollection.azure_rm_iotdevicemodule
azure_rm_iothub:
redirect: azure.azcollection.azure_rm_iothub
azure_rm_iothub_info:
redirect: azure.azcollection.azure_rm_iothub_info
azure_rm_iothubconsumergroup:
redirect: azure.azcollection.azure_rm_iothubconsumergroup
azure_rm_loadbalancer:
redirect: azure.azcollection.azure_rm_loadbalancer
azure_rm_loadbalancer_info:
redirect: azure.azcollection.azure_rm_loadbalancer_info
azure_rm_lock:
redirect: azure.azcollection.azure_rm_lock
azure_rm_lock_info:
redirect: azure.azcollection.azure_rm_lock_info
azure_rm_loganalyticsworkspace:
redirect: azure.azcollection.azure_rm_loganalyticsworkspace
azure_rm_loganalyticsworkspace_info:
redirect: azure.azcollection.azure_rm_loganalyticsworkspace_info
azure_rm_monitorlogprofile:
redirect: azure.azcollection.azure_rm_monitorlogprofile
azure_rm_rediscache:
redirect: azure.azcollection.azure_rm_rediscache
azure_rm_rediscache_info:
redirect: azure.azcollection.azure_rm_rediscache_info
azure_rm_rediscachefirewallrule:
redirect: azure.azcollection.azure_rm_rediscachefirewallrule
azure_rm_roleassignment:
redirect: azure.azcollection.azure_rm_roleassignment
azure_rm_roleassignment_info:
redirect: azure.azcollection.azure_rm_roleassignment_info
azure_rm_roledefinition:
redirect: azure.azcollection.azure_rm_roledefinition
azure_rm_roledefinition_info:
redirect: azure.azcollection.azure_rm_roledefinition_info
azure_rm_route:
redirect: azure.azcollection.azure_rm_route
azure_rm_routetable:
redirect: azure.azcollection.azure_rm_routetable
azure_rm_routetable_info:
redirect: azure.azcollection.azure_rm_routetable_info
azure_rm_servicebus:
redirect: azure.azcollection.azure_rm_servicebus
azure_rm_servicebus_info:
redirect: azure.azcollection.azure_rm_servicebus_info
azure_rm_servicebusqueue:
redirect: azure.azcollection.azure_rm_servicebusqueue
azure_rm_servicebussaspolicy:
redirect: azure.azcollection.azure_rm_servicebussaspolicy
azure_rm_servicebustopic:
redirect: azure.azcollection.azure_rm_servicebustopic
azure_rm_servicebustopicsubscription:
redirect: azure.azcollection.azure_rm_servicebustopicsubscription
azure_rm_trafficmanagerendpoint:
redirect: azure.azcollection.azure_rm_trafficmanagerendpoint
azure_rm_trafficmanagerendpoint_info:
redirect: azure.azcollection.azure_rm_trafficmanagerendpoint_info
azure_rm_trafficmanagerprofile:
redirect: azure.azcollection.azure_rm_trafficmanagerprofile
azure_rm_trafficmanagerprofile_info:
redirect: azure.azcollection.azure_rm_trafficmanagerprofile_info
azure_rm_virtualnetworkgateway:
redirect: azure.azcollection.azure_rm_virtualnetworkgateway
azure_rm_virtualnetworkpeering:
redirect: azure.azcollection.azure_rm_virtualnetworkpeering
azure_rm_virtualnetworkpeering_info:
redirect: azure.azcollection.azure_rm_virtualnetworkpeering_info
azure_rm_cosmosdbaccount:
redirect: azure.azcollection.azure_rm_cosmosdbaccount
azure_rm_cosmosdbaccount_info:
redirect: azure.azcollection.azure_rm_cosmosdbaccount_info
azure_rm_devtestlab:
redirect: azure.azcollection.azure_rm_devtestlab
azure_rm_devtestlab_info:
redirect: azure.azcollection.azure_rm_devtestlab_info
azure_rm_devtestlabarmtemplate_info:
redirect: azure.azcollection.azure_rm_devtestlabarmtemplate_info
azure_rm_devtestlabartifact_info:
redirect: azure.azcollection.azure_rm_devtestlabartifact_info
azure_rm_devtestlabartifactsource:
redirect: azure.azcollection.azure_rm_devtestlabartifactsource
azure_rm_devtestlabartifactsource_info:
redirect: azure.azcollection.azure_rm_devtestlabartifactsource_info
azure_rm_devtestlabcustomimage:
redirect: azure.azcollection.azure_rm_devtestlabcustomimage
azure_rm_devtestlabcustomimage_info:
redirect: azure.azcollection.azure_rm_devtestlabcustomimage_info
azure_rm_devtestlabenvironment:
redirect: azure.azcollection.azure_rm_devtestlabenvironment
azure_rm_devtestlabenvironment_info:
redirect: azure.azcollection.azure_rm_devtestlabenvironment_info
azure_rm_devtestlabpolicy:
redirect: azure.azcollection.azure_rm_devtestlabpolicy
azure_rm_devtestlabpolicy_info:
redirect: azure.azcollection.azure_rm_devtestlabpolicy_info
azure_rm_devtestlabschedule:
redirect: azure.azcollection.azure_rm_devtestlabschedule
azure_rm_devtestlabschedule_info:
redirect: azure.azcollection.azure_rm_devtestlabschedule_info
azure_rm_devtestlabvirtualmachine:
redirect: azure.azcollection.azure_rm_devtestlabvirtualmachine
azure_rm_devtestlabvirtualmachine_info:
redirect: azure.azcollection.azure_rm_devtestlabvirtualmachine_info
azure_rm_devtestlabvirtualnetwork:
redirect: azure.azcollection.azure_rm_devtestlabvirtualnetwork
azure_rm_devtestlabvirtualnetwork_info:
redirect: azure.azcollection.azure_rm_devtestlabvirtualnetwork_info
azure_rm_hdinsightcluster:
redirect: azure.azcollection.azure_rm_hdinsightcluster
azure_rm_hdinsightcluster_info:
redirect: azure.azcollection.azure_rm_hdinsightcluster_info
azure_rm_mariadbconfiguration:
redirect: azure.azcollection.azure_rm_mariadbconfiguration
azure_rm_mariadbconfiguration_info:
redirect: azure.azcollection.azure_rm_mariadbconfiguration_info
azure_rm_mariadbdatabase:
redirect: azure.azcollection.azure_rm_mariadbdatabase
azure_rm_mariadbdatabase_info:
redirect: azure.azcollection.azure_rm_mariadbdatabase_info
azure_rm_mariadbfirewallrule:
redirect: azure.azcollection.azure_rm_mariadbfirewallrule
azure_rm_mariadbfirewallrule_info:
redirect: azure.azcollection.azure_rm_mariadbfirewallrule_info
azure_rm_mariadbserver:
redirect: azure.azcollection.azure_rm_mariadbserver
azure_rm_mariadbserver_info:
redirect: azure.azcollection.azure_rm_mariadbserver_info
azure_rm_mysqlconfiguration:
redirect: azure.azcollection.azure_rm_mysqlconfiguration
azure_rm_mysqlconfiguration_info:
redirect: azure.azcollection.azure_rm_mysqlconfiguration_info
azure_rm_mysqldatabase:
redirect: azure.azcollection.azure_rm_mysqldatabase
azure_rm_mysqldatabase_info:
redirect: azure.azcollection.azure_rm_mysqldatabase_info
azure_rm_mysqlfirewallrule:
redirect: azure.azcollection.azure_rm_mysqlfirewallrule
azure_rm_mysqlfirewallrule_info:
redirect: azure.azcollection.azure_rm_mysqlfirewallrule_info
azure_rm_mysqlserver:
redirect: azure.azcollection.azure_rm_mysqlserver
azure_rm_mysqlserver_info:
redirect: azure.azcollection.azure_rm_mysqlserver_info
azure_rm_postgresqlconfiguration:
redirect: azure.azcollection.azure_rm_postgresqlconfiguration
azure_rm_postgresqlconfiguration_info:
redirect: azure.azcollection.azure_rm_postgresqlconfiguration_info
azure_rm_postgresqldatabase:
redirect: azure.azcollection.azure_rm_postgresqldatabase
azure_rm_postgresqldatabase_info:
redirect: azure.azcollection.azure_rm_postgresqldatabase_info
azure_rm_postgresqlfirewallrule:
redirect: azure.azcollection.azure_rm_postgresqlfirewallrule
azure_rm_postgresqlfirewallrule_info:
redirect: azure.azcollection.azure_rm_postgresqlfirewallrule_info
azure_rm_postgresqlserver:
redirect: azure.azcollection.azure_rm_postgresqlserver
azure_rm_postgresqlserver_info:
redirect: azure.azcollection.azure_rm_postgresqlserver_info
azure_rm_sqldatabase:
redirect: azure.azcollection.azure_rm_sqldatabase
azure_rm_sqldatabase_info:
redirect: azure.azcollection.azure_rm_sqldatabase_info
azure_rm_sqlfirewallrule:
redirect: azure.azcollection.azure_rm_sqlfirewallrule
azure_rm_sqlfirewallrule_info:
redirect: azure.azcollection.azure_rm_sqlfirewallrule_info
azure_rm_sqlserver:
redirect: azure.azcollection.azure_rm_sqlserver
azure_rm_sqlserver_info:
redirect: azure.azcollection.azure_rm_sqlserver_info
openvswitch_port:
redirect: openvswitch.openvswitch.openvswitch_port
openvswitch_db:
redirect: openvswitch.openvswitch.openvswitch_db
openvswitch_bridge:
redirect: openvswitch.openvswitch.openvswitch_bridge
vyos_l3_interface:
redirect: vyos.vyos.vyos_l3_interface
vyos_banner:
redirect: vyos.vyos.vyos_banner
vyos_firewall_rules:
redirect: vyos.vyos.vyos_firewall_rules
vyos_static_route:
redirect: vyos.vyos.vyos_static_route
vyos_lldp_interface:
redirect: vyos.vyos.vyos_lldp_interface
vyos_vlan:
redirect: vyos.vyos.vyos_vlan
vyos_user:
redirect: vyos.vyos.vyos_user
vyos_firewall_interfaces:
redirect: vyos.vyos.vyos_firewall_interfaces
vyos_interface:
redirect: vyos.vyos.vyos_interface
vyos_firewall_global:
redirect: vyos.vyos.vyos_firewall_global
vyos_config:
redirect: vyos.vyos.vyos_config
vyos_facts:
redirect: vyos.vyos.vyos_facts
vyos_linkagg:
redirect: vyos.vyos.vyos_linkagg
vyos_ping:
redirect: vyos.vyos.vyos_ping
vyos_lag_interfaces:
redirect: vyos.vyos.vyos_lag_interfaces
vyos_lldp:
redirect: vyos.vyos.vyos_lldp
vyos_lldp_global:
redirect: vyos.vyos.vyos_lldp_global
vyos_l3_interfaces:
redirect: vyos.vyos.vyos_l3_interfaces
vyos_lldp_interfaces:
redirect: vyos.vyos.vyos_lldp_interfaces
vyos_interfaces:
redirect: vyos.vyos.vyos_interfaces
vyos_logging:
redirect: vyos.vyos.vyos_logging
vyos_static_routes:
redirect: vyos.vyos.vyos_static_routes
vyos_command:
redirect: vyos.vyos.vyos_command
vyos_system:
redirect: vyos.vyos.vyos_system
cpm_plugconfig:
redirect: wti.remote.cpm_plugconfig
cpm_plugcontrol:
redirect: wti.remote.cpm_plugcontrol
cpm_serial_port_config:
redirect: wti.remote.cpm_serial_port_config
cpm_serial_port_info:
redirect: wti.remote.cpm_serial_port_info
cpm_user:
redirect: wti.remote.cpm_user
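# module_utils routing: legacy shared-library import paths are redirected to the
# collections that now ship them, e.g. the F5 "common" helpers resolve to
# f5networks.f5_modules.common.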
module_utils:
common:
redirect: f5networks.f5_modules.common
frr:
redirect: frr.frr.frr
module:
redirect: cisco.iosxr.module
providers:
redirect: cisco.iosxr.providers
base:
redirect: vyos.vyos.base
neighbors:
redirect: cisco.iosxr.neighbors
process:
redirect: cisco.iosxr.process
address_family:
redirect: cisco.iosxr.address_family
alicloud_ecs:
redirect: community.general.alicloud_ecs
cloud:
redirect: community.general.cloud
cloudscale:
redirect: community.general.cloudscale
cloudstack:
redirect: community.general.cloudstack
database:
redirect: community.general.database
digital_ocean:
redirect: community.general.digital_ocean
dimensiondata:
redirect: community.general.dimensiondata
swarm:
redirect: community.general.swarm
exoscale:
redirect: community.general.exoscale
f5_utils:
redirect: community.general.f5_utils
firewalld:
redirect: community.general.firewalld
gcdns:
redirect: community.general.gcdns
gce:
redirect: community.general.gce
gcp:
redirect: community.general.gcp
gitlab:
redirect: community.general.gitlab
heroku:
redirect: community.general.heroku
hetzner:
redirect: community.general.hetzner
hwc_utils:
redirect: community.general.hwc_utils
ibm_sa_utils:
redirect: community.general.ibm_sa_utils
keycloak:
redirect: community.general.keycloak
infinibox:
redirect: community.general.infinibox
influxdb:
redirect: community.general.influxdb
ipa:
redirect: community.general.ipa
known_hosts:
redirect: community.general.known_hosts
kubevirt:
redirect: community.general.kubevirt
ldap:
redirect: community.general.ldap
linode:
redirect: community.general.linode
lxd:
redirect: community.general.lxd
manageiq:
redirect: community.general.manageiq
memset:
redirect: community.general.memset
mysql:
redirect: community.general.mysql
api:
redirect: skydive.skydive.api
a10:
redirect: community.general.a10
aireos:
redirect: community.general.aireos
aos:
redirect: community.general.aos
apconos:
redirect: community.general.apconos
aruba:
redirect: community.general.aruba
ansible_utils:
redirect: community.general.ansible_utils
avi:
redirect: community.general.avi
avi_api:
redirect: community.general.avi_api
bigswitch:
redirect: community.general.bigswitch
ce:
redirect: community.general.ce
cnos:
redirect: community.general.cnos
cnos_devicerules:
redirect: community.general.cnos_devicerules
cnos_errorcodes:
redirect: community.general.cnos_errorcodes
edgeos:
redirect: community.general.edgeos
edgeswitch:
redirect: community.general.edgeswitch
edgeswitch_interface:
redirect: community.general.edgeswitch_interface
enos:
redirect: community.general.enos
eric_eccli:
redirect: community.general.eric_eccli
facts:
redirect: vyos.vyos.facts
l2_interfaces:
redirect: junipernetworks.junos.l2_interfaces
lldp_global:
redirect: vyos.vyos.lldp_global
lldp_interfaces:
redirect: vyos.vyos.lldp_interfaces
vlans:
redirect: junipernetworks.junos.vlans
exos:
redirect: community.general.exos
utils:
redirect: vyos.vyos.utils
iworkflow:
redirect: community.general.iworkflow
legacy:
redirect: community.general.legacy
urls:
redirect: amazon.aws.urls
fortianalyzer:
redirect: community.general.fortianalyzer
configuration:
redirect: community.general.configuration
device:
redirect: community.general.device
fdm_swagger_client:
redirect: community.general.fdm_swagger_client
operation:
redirect: community.general.operation
icx:
redirect: community.general.icx
ironware:
redirect: community.general.ironware
netscaler:
redirect: community.general.netscaler
netvisor:
redirect: community.general.netvisor
pn_nvos:
redirect: community.general.pn_nvos
nos:
redirect: community.general.nos
nso:
redirect: community.general.nso
onyx:
redirect: community.general.onyx
ordnance:
redirect: community.general.ordnance
panos:
redirect: community.general.panos
routeros:
redirect: community.general.routeros
slxos:
redirect: community.general.slxos
sros:
redirect: community.general.sros
voss:
redirect: community.general.voss
oneandone:
redirect: community.general.oneandone
oneview:
redirect: community.general.oneview
online:
redirect: community.general.online
opennebula:
redirect: community.general.opennebula
oci_utils:
redirect: community.general.oci_utils
postgres:
redirect: community.general.postgres
pure:
redirect: community.general.pure
rabbitmq:
redirect: community.general.rabbitmq
rax:
redirect: community.general.rax
redfish_utils:
redirect: community.general.redfish_utils
redhat:
redirect: community.general.redhat
dellemc_idrac:
redirect: community.general.dellemc_idrac
ome:
redirect: community.general.ome
scaleway:
redirect: community.general.scaleway
bitbucket:
redirect: community.general.bitbucket
emc_vnx:
redirect: community.general.emc_vnx
hpe3par:
redirect: community.general.hpe3par
univention_umc:
redirect: community.general.univention_umc
utm_utils:
redirect: community.general.utm_utils
vexata:
redirect: community.general.vexata
vultr:
redirect: community.general.vultr
xenserver:
redirect: community.general.xenserver
raw:
redirect: community.kubernetes.raw
scale:
redirect: community.kubernetes.scale
acme:
redirect: community.crypto.acme
crypto:
redirect: community.crypto.crypto
VmwareRestModule:
redirect: community.vmware_rest.VmwareRestModule
vca:
redirect: community.vmware.vca
vmware:
redirect: community.vmware.vmware
vmware_rest_client:
redirect: community.vmware.vmware_rest_client
vmware_spbm:
redirect: community.vmware.vmware_spbm
service_now:
redirect: servicenow.servicenow.service_now
acm:
redirect: amazon.aws.acm
batch:
redirect: amazon.aws.batch
cloudfront_facts:
redirect: amazon.aws.cloudfront_facts
core:
redirect: amazon.aws.core
direct_connect:
redirect: amazon.aws.direct_connect
elb_utils:
redirect: amazon.aws.elb_utils
elbv2:
redirect: amazon.aws.elbv2
iam:
redirect: amazon.aws.iam
rds:
redirect: amazon.aws.rds
s3:
redirect: amazon.aws.s3
waf:
redirect: amazon.aws.waf
waiters:
redirect: amazon.aws.waiters
ec2:
redirect: amazon.aws.ec2
ipaddress:
redirect: f5networks.f5_modules.ipaddress
network:
redirect: ansible.netcommon.network
parsing:
redirect: ansible.netcommon.parsing
netconf:
redirect: ansible.netcommon.netconf
config:
redirect: ansible.netcommon.config
restconf:
redirect: ansible.netcommon.restconf
ismount:
redirect: ansible.posix.ismount
Ansible.Service:
redirect: ansible.windows.Ansible.Service
fortimanager:
redirect: fortinet.fortios.fortimanager
system:
redirect: fortinet.fortios.system
fortios:
redirect: fortinet.fortios.fortios
netbox_utils:
redirect: netbox.netbox.netbox_utils
netapp:
redirect: netapp.ontap.netapp
netapp_elementsw_module:
redirect: netapp.ontap.netapp_elementsw_module
netapp_module:
redirect: netapp.ontap.netapp_module
checkpoint:
redirect: check_point.mgmt.checkpoint
eos:
redirect: arista.eos.eos
acl_interfaces:
redirect: cisco.nxos.acl_interfaces
static_routes:
redirect: vyos.vyos.static_routes
l3_interfaces:
redirect: vyos.vyos.l3_interfaces
lacp_interfaces:
redirect: junipernetworks.junos.lacp_interfaces
lag_interfaces:
redirect: vyos.vyos.lag_interfaces
interfaces:
redirect: vyos.vyos.interfaces
lacp:
redirect: junipernetworks.junos.lacp
acls:
redirect: cisco.nxos.acls
aci:
redirect: cisco.aci.aci
asa:
redirect: cisco.asa.asa
intersight:
redirect: cisco.intersight.intersight
ios:
redirect: cisco.ios.ios
iosxr:
redirect: cisco.iosxr.iosxr
meraki:
redirect: cisco.meraki.meraki
mso:
redirect: cisco.mso.mso
nxos:
redirect: cisco.nxos.nxos
bfd_interfaces:
redirect: cisco.nxos.bfd_interfaces
telemetry:
redirect: cisco.nxos.telemetry
hsrp_interfaces:
redirect: cisco.nxos.hsrp_interfaces
ucs:
redirect: cisco.ucs.ucs
bigip:
redirect: f5networks.f5_modules.bigip
bigiq:
redirect: f5networks.f5_modules.bigiq
compare:
redirect: f5networks.f5_modules.compare
icontrol:
redirect: f5networks.f5_modules.icontrol
openstack:
redirect: openstack.cloud.openstack
junos:
redirect: junipernetworks.junos.junos
ansible_tower:
redirect: awx.awx.ansible_tower
ovirt:
redirect: ovirt.ovirt.ovirt
dellos10:
redirect: dellemc_networking.os10.dellos10
dellos9:
redirect: dellemc_networking.os9.dellos9
dellos6:
redirect: dellemc_networking.os6.dellos6
hcloud:
redirect: hetzner.hcloud.hcloud
gcp_utils:
redirect: google.cloud.gcp_utils
azure_rm_common:
redirect: azure.azcollection.azure_rm_common
azure_rm_common_ext:
redirect: azure.azcollection.azure_rm_common_ext
azure_rm_common_rest:
redirect: azure.azcollection.azure_rm_common_rest
vyos:
redirect: vyos.vyos.vyos
firewall_global:
redirect: vyos.vyos.firewall_global
firewall_rules:
redirect: vyos.vyos.firewall_rules
firewall_interfaces:
redirect: vyos.vyos.firewall_interfaces
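# cliconf plugin routing: CLI configuration plugins for network platforms,
# redirected to their platform collections.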
cliconf:
frr:
redirect: frr.frr.frr
aireos:
redirect: community.general.aireos
apconos:
redirect: community.general.apconos
aruba:
redirect: community.general.aruba
ce:
redirect: community.general.ce
cnos:
redirect: community.general.cnos
edgeos:
redirect: community.general.edgeos
edgeswitch:
redirect: community.general.edgeswitch
enos:
redirect: community.general.enos
eric_eccli:
redirect: community.general.eric_eccli
exos:
redirect: community.general.exos
icx:
redirect: community.general.icx
ironware:
redirect: community.general.ironware
netvisor:
redirect: community.general.netvisor
nos:
redirect: community.general.nos
onyx:
redirect: community.general.onyx
routeros:
redirect: community.general.routeros
slxos:
redirect: community.general.slxos
voss:
redirect: community.general.voss
eos:
redirect: arista.eos.eos
asa:
redirect: cisco.asa.asa
ios:
redirect: cisco.ios.ios
iosxr:
redirect: cisco.iosxr.iosxr
nxos:
redirect: cisco.nxos.nxos
junos:
redirect: junipernetworks.junos.junos
dellos10:
redirect: dellemc_networking.os10.dellos10
dellos9:
redirect: dellemc_networking.os9.dellos9
dellos6:
redirect: dellemc_networking.os6.dellos6
vyos:
redirect: vyos.vyos.vyos
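# terminal plugin routing: terminal prompt/paging handlers for the same network
# platforms, redirected alongside their cliconf counterparts.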
terminal:
frr:
redirect: frr.frr.frr
aireos:
redirect: community.general.aireos
apconos:
redirect: community.general.apconos
aruba:
redirect: community.general.aruba
ce:
redirect: community.general.ce
cnos:
redirect: community.general.cnos
edgeos:
redirect: community.general.edgeos
edgeswitch:
redirect: community.general.edgeswitch
enos:
redirect: community.general.enos
eric_eccli:
redirect: community.general.eric_eccli
exos:
redirect: community.general.exos
icx:
redirect: community.general.icx
ironware:
redirect: community.general.ironware
netvisor:
redirect: community.general.netvisor
nos:
redirect: community.general.nos
onyx:
redirect: community.general.onyx
routeros:
redirect: community.general.routeros
slxos:
redirect: community.general.slxos
sros:
redirect: community.general.sros
voss:
redirect: community.general.voss
eos:
redirect: arista.eos.eos
asa:
redirect: cisco.asa.asa
ios:
redirect: cisco.ios.ios
iosxr:
redirect: cisco.iosxr.iosxr
nxos:
redirect: cisco.nxos.nxos
bigip:
redirect: f5networks.f5_modules.bigip
junos:
redirect: junipernetworks.junos.junos
dellos10:
redirect: dellemc_networking.os10.dellos10
dellos9:
redirect: dellemc_networking.os9.dellos9
dellos6:
redirect: dellemc_networking.os6.dellos6
vyos:
redirect: vyos.vyos.vyos
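# action plugin routing: controller-side action plugins backing the modules above,
# redirected to the collections that host them.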
action:
aireos:
redirect: community.general.aireos
aruba:
redirect: community.general.aruba
ce:
redirect: community.general.ce
ce_template:
redirect: community.general.ce_template
cnos:
redirect: community.general.cnos
edgeos_config:
redirect: community.general.edgeos_config
enos:
redirect: community.general.enos
exos:
redirect: community.general.exos
ironware:
redirect: community.general.ironware
nos_config:
redirect: community.general.nos_config
onyx_config:
redirect: community.general.onyx_config
slxos:
redirect: community.general.slxos
sros:
redirect: community.general.sros
voss:
redirect: community.general.voss
aws_s3:
redirect: amazon.aws.aws_s3
cli_command:
redirect: ansible.netcommon.cli_command
cli_config:
redirect: ansible.netcommon.cli_config
net_base:
redirect: ansible.netcommon.net_base
net_user:
redirect: ansible.netcommon.net_user
net_vlan:
redirect: ansible.netcommon.net_vlan
net_static_route:
redirect: ansible.netcommon.net_static_route
net_lldp:
redirect: ansible.netcommon.net_lldp
net_vrf:
redirect: ansible.netcommon.net_vrf
net_ping:
redirect: ansible.netcommon.net_ping
net_l3_interface:
redirect: ansible.netcommon.net_l3_interface
net_l2_interface:
redirect: ansible.netcommon.net_l2_interface
net_interface:
redirect: ansible.netcommon.net_interface
net_system:
redirect: ansible.netcommon.net_system
net_lldp_interface:
redirect: ansible.netcommon.net_lldp_interface
net_put:
redirect: ansible.netcommon.net_put
net_get:
redirect: ansible.netcommon.net_get
net_logging:
redirect: ansible.netcommon.net_logging
net_banner:
redirect: ansible.netcommon.net_banner
net_linkagg:
redirect: ansible.netcommon.net_linkagg
netconf:
redirect: ansible.netcommon.netconf
network:
redirect: ansible.netcommon.network
telnet:
redirect: ansible.netcommon.telnet
patch:
redirect: ansible.posix.patch
synchronize:
redirect: ansible.posix.synchronize
win_copy:
redirect: ansible.windows.win_copy
win_reboot:
redirect: ansible.windows.win_reboot
win_template:
redirect: ansible.windows.win_template
win_updates:
redirect: ansible.windows.win_updates
fortios_config:
redirect: fortinet.fortios.fortios_config
eos:
redirect: arista.eos.eos
asa:
redirect: cisco.asa.asa
ios:
redirect: cisco.ios.ios
iosxr:
redirect: cisco.iosxr.iosxr
nxos:
redirect: cisco.nxos.nxos
nxos_file_copy:
redirect: cisco.nxos.nxos_file_copy
bigip:
redirect: f5networks.f5_modules.bigip
bigiq:
redirect: f5networks.f5_modules.bigiq
junos:
redirect: junipernetworks.junos.junos
dellos10:
redirect: dellemc_networking.os10.dellos10
dellos9:
redirect: dellemc_networking.os9.dellos9
dellos6:
redirect: dellemc_networking.os6.dellos6
vyos:
redirect: vyos.vyos.vyos
become:
doas:
redirect: community.general.doas
dzdo:
redirect: community.general.dzdo
ksu:
redirect: community.general.ksu
machinectl:
redirect: community.general.machinectl
pbrun:
redirect: community.general.pbrun
pfexec:
redirect: community.general.pfexec
pmrun:
redirect: community.general.pmrun
sesu:
redirect: community.general.sesu
enable:
redirect: ansible.netcommon.enable
cache:
jsonfile:
redirect: community.general.jsonfile
memcached:
redirect: community.general.memcached
pickle:
redirect: community.general.pickle
redis:
redirect: community.general.redis
yaml:
redirect: community.general.yaml
mongodb:
redirect: community.mongo.mongodb
callback:
actionable:
redirect: community.general.actionable
cgroup_memory_recap:
redirect: community.general.cgroup_memory_recap
context_demo:
redirect: community.general.context_demo
counter_enabled:
redirect: community.general.counter_enabled
dense:
redirect: community.general.dense
full_skip:
redirect: community.general.full_skip
hipchat:
redirect: community.general.hipchat
jabber:
redirect: community.general.jabber
log_plays:
redirect: community.general.log_plays
logdna:
redirect: community.general.logdna
logentries:
redirect: community.general.logentries
logstash:
redirect: community.general.logstash
mail:
redirect: community.general.mail
nrdp:
redirect: community.general.nrdp
'null':
redirect: community.general.null
osx_say:
redirect: community.general.osx_say
say:
redirect: community.general.say
selective:
redirect: community.general.selective
slack:
redirect: community.general.slack
splunk:
redirect: community.general.splunk
stderr:
redirect: community.general.stderr
sumologic:
redirect: community.general.sumologic
syslog_json:
redirect: community.general.syslog_json
unixy:
redirect: community.general.unixy
yaml:
redirect: community.general.yaml
grafana_annotations:
redirect: community.grafana.grafana_annotations
aws_resource_actions:
redirect: amazon.aws.aws_resource_actions
cgroup_perf_recap:
redirect: ansible.posix.cgroup_perf_recap
debug:
redirect: ansible.posix.debug
json:
redirect: ansible.posix.json
profile_roles:
redirect: ansible.posix.profile_roles
profile_tasks:
redirect: ansible.posix.profile_tasks
skippy:
redirect: ansible.posix.skippy
timer:
redirect: ansible.posix.timer
foreman:
redirect: theforeman.foreman.foreman
doc_fragments:
a10:
redirect: community.general.a10
aireos:
redirect: community.general.aireos
alicloud:
redirect: community.general.alicloud
aruba:
redirect: community.general.aruba
auth_basic:
redirect: community.general.auth_basic
avi:
redirect: community.general.avi
ce:
redirect: community.general.ce
cloudscale:
redirect: community.general.cloudscale
cloudstack:
redirect: community.general.cloudstack
cnos:
redirect: community.general.cnos
digital_ocean:
redirect: community.general.digital_ocean
dimensiondata:
redirect: community.general.dimensiondata
dimensiondata_wait:
redirect: community.general.dimensiondata_wait
docker:
redirect: community.general.docker
emc:
redirect: community.general.emc
enos:
redirect: community.general.enos
exoscale:
redirect: community.general.exoscale
gcp:
redirect: community.general.gcp
hetzner:
redirect: community.general.hetzner
hpe3par:
redirect: community.general.hpe3par
hwc:
redirect: community.general.hwc
ibm_storage:
redirect: community.general.ibm_storage
infinibox:
redirect: community.general.infinibox
influxdb:
redirect: community.general.influxdb
ingate:
redirect: community.general.ingate
ipa:
redirect: community.general.ipa
ironware:
redirect: community.general.ironware
keycloak:
redirect: community.general.keycloak
kubevirt_common_options:
redirect: community.general.kubevirt_common_options
kubevirt_vm_options:
redirect: community.general.kubevirt_vm_options
ldap:
redirect: community.general.ldap
lxca_common:
redirect: community.general.lxca_common
manageiq:
redirect: community.general.manageiq
mysql:
redirect: community.general.mysql
netscaler:
redirect: community.general.netscaler
nios:
redirect: community.general.nios
nso:
redirect: community.general.nso
oneview:
redirect: community.general.oneview
online:
redirect: community.general.online
onyx:
redirect: community.general.onyx
opennebula:
redirect: community.general.opennebula
openswitch:
redirect: community.general.openswitch
oracle:
redirect: community.general.oracle
oracle_creatable_resource:
redirect: community.general.oracle_creatable_resource
oracle_display_name_option:
redirect: community.general.oracle_display_name_option
oracle_name_option:
redirect: community.general.oracle_name_option
oracle_tags:
redirect: community.general.oracle_tags
oracle_wait_options:
redirect: community.general.oracle_wait_options
ovirt_facts:
redirect: community.general.ovirt_facts
panos:
redirect: community.general.panos
postgres:
redirect: community.general.postgres
proxysql:
redirect: community.general.proxysql
purestorage:
redirect: community.general.purestorage
rabbitmq:
redirect: community.general.rabbitmq
rackspace:
redirect: community.general.rackspace
scaleway:
redirect: community.general.scaleway
sros:
redirect: community.general.sros
utm:
redirect: community.general.utm
vexata:
redirect: community.general.vexata
vultr:
redirect: community.general.vultr
xenserver:
redirect: community.general.xenserver
zabbix:
redirect: community.general.zabbix
k8s_auth_options:
redirect: community.kubernetes.k8s_auth_options
k8s_name_options:
redirect: community.kubernetes.k8s_name_options
k8s_resource_options:
redirect: community.kubernetes.k8s_resource_options
k8s_scale_options:
redirect: community.kubernetes.k8s_scale_options
k8s_state_options:
redirect: community.kubernetes.k8s_state_options
acme:
redirect: community.crypto.acme
ecs_credential:
redirect: community.crypto.ecs_credential
VmwareRestModule:
redirect: community.vmware_rest.VmwareRestModule
VmwareRestModule_filters:
redirect: community.vmware_rest.VmwareRestModule_filters
VmwareRestModule_full:
redirect: community.vmware_rest.VmwareRestModule_full
VmwareRestModule_state:
redirect: community.vmware_rest.VmwareRestModule_state
vca:
redirect: community.vmware.vca
vmware:
redirect: community.vmware.vmware
vmware_rest_client:
redirect: community.vmware.vmware_rest_client
service_now:
redirect: servicenow.servicenow.service_now
aws:
redirect: amazon.aws.aws
aws_credentials:
redirect: amazon.aws.aws_credentials
aws_region:
redirect: amazon.aws.aws_region
ec2:
redirect: amazon.aws.ec2
netconf:
redirect: ansible.netcommon.netconf
network_agnostic:
redirect: ansible.netcommon.network_agnostic
fortios:
redirect: fortinet.fortios.fortios
netapp:
redirect: netapp.ontap.netapp
checkpoint_commands:
redirect: check_point.mgmt.checkpoint_commands
checkpoint_facts:
redirect: check_point.mgmt.checkpoint_facts
checkpoint_objects:
redirect: check_point.mgmt.checkpoint_objects
eos:
redirect: arista.eos.eos
aci:
redirect: cisco.aci.aci
asa:
redirect: cisco.asa.asa
intersight:
redirect: cisco.intersight.intersight
ios:
redirect: cisco.ios.ios
iosxr:
redirect: cisco.iosxr.iosxr
meraki:
redirect: cisco.meraki.meraki
mso:
redirect: cisco.mso.mso
nxos:
redirect: cisco.nxos.nxos
ucs:
redirect: cisco.ucs.ucs
f5:
redirect: f5networks.f5_modules.f5
openstack:
redirect: openstack.cloud.openstack
junos:
redirect: junipernetworks.junos.junos
tower:
redirect: awx.awx.tower
ovirt:
redirect: ovirt.ovirt.ovirt
ovirt_info:
redirect: ovirt.ovirt.ovirt_info
dellos10:
redirect: dellemc_networking.os10.dellos10
dellos9:
redirect: dellemc_networking.os9.dellos9
dellos6:
redirect: dellemc_networking.os6.dellos6
hcloud:
redirect: hetzner.hcloud.hcloud
skydive:
redirect: skydive.skydive.skydive
azure:
redirect: azure.azcollection.azure
azure_tags:
redirect: azure.azcollection.azure_tags
vyos:
redirect: vyos.vyos.vyos
filter:
gcp_kms_encrypt:
redirect: community.general.gcp_kms_encrypt
gcp_kms_decrypt:
redirect: community.general.gcp_kms_decrypt
json_query:
redirect: community.general.json_query
random_mac:
redirect: community.general.random_mac
k8s_config_resource_name:
redirect: community.kubernetes.k8s_config_resource_name
cidr_merge:
redirect: ansible.netcommon.cidr_merge
ipaddr:
redirect: ansible.netcommon.ipaddr
ipmath:
redirect: ansible.netcommon.ipmath
ipwrap:
redirect: ansible.netcommon.ipwrap
ip4_hex:
redirect: ansible.netcommon.ip4_hex
ipv4:
redirect: ansible.netcommon.ipv4
ipv6:
redirect: ansible.netcommon.ipv6
ipsubnet:
redirect: ansible.netcommon.ipsubnet
next_nth_usable:
redirect: ansible.netcommon.next_nth_usable
network_in_network:
redirect: ansible.netcommon.network_in_network
network_in_usable:
redirect: ansible.netcommon.network_in_usable
reduce_on_network:
redirect: ansible.netcommon.reduce_on_network
nthhost:
redirect: ansible.netcommon.nthhost
previous_nth_usable:
redirect: ansible.netcommon.previous_nth_usable
slaac:
redirect: ansible.netcommon.slaac
hwaddr:
redirect: ansible.netcommon.hwaddr
parse_cli:
redirect: ansible.netcommon.parse_cli
    parse_cli_textfsm:
      redirect: ansible.netcommon.parse_cli_textfsm
parse_xml:
redirect: ansible.netcommon.parse_xml
type5_pw:
redirect: ansible.netcommon.type5_pw
hash_salt:
redirect: ansible.netcommon.hash_salt
comp_type5:
redirect: ansible.netcommon.comp_type5
vlan_parser:
redirect: ansible.netcommon.vlan_parser
httpapi:
exos:
redirect: community.general.exos
fortianalyzer:
redirect: community.general.fortianalyzer
fortimanager:
redirect: community.general.fortimanager
ftd:
redirect: community.general.ftd
vmware:
redirect: community.vmware.vmware
restconf:
redirect: ansible.netcommon.restconf
fortios:
redirect: fortinet.fortios.fortios
checkpoint:
redirect: check_point.mgmt.checkpoint
eos:
redirect: arista.eos.eos
nxos:
redirect: cisco.nxos.nxos
splunk:
redirect: splunk.enterprise_security.splunk
qradar:
redirect: ibm.qradar.qradar
inventory:
cloudscale:
redirect: community.general.cloudscale
docker_machine:
redirect: community.general.docker_machine
docker_swarm:
redirect: community.general.docker_swarm
gitlab_runners:
redirect: community.general.gitlab_runners
kubevirt:
redirect: community.general.kubevirt
linode:
redirect: community.general.linode
nmap:
redirect: community.general.nmap
online:
redirect: community.general.online
scaleway:
redirect: community.general.scaleway
virtualbox:
redirect: community.general.virtualbox
vultr:
redirect: community.general.vultr
k8s:
redirect: community.kubernetes.k8s
openshift:
redirect: community.kubernetes.openshift
vmware_vm_inventory:
redirect: community.vmware.vmware_vm_inventory
aws_ec2:
redirect: amazon.aws.aws_ec2
aws_rds:
redirect: amazon.aws.aws_rds
foreman:
redirect: theforeman.foreman.foreman
netbox:
redirect: netbox.netbox.netbox
openstack:
redirect: openstack.cloud.openstack
tower:
redirect: awx.awx.tower
hcloud:
redirect: hetzner.hcloud.hcloud
gcp_compute:
redirect: google.cloud.gcp_compute
azure_rm:
redirect: azure.azcollection.azure_rm
lookup:
avi:
redirect: community.general.avi
cartesian:
redirect: community.general.cartesian
chef_databag:
redirect: community.general.chef_databag
conjur_variable:
redirect: community.general.conjur_variable
consul_kv:
redirect: community.general.consul_kv
credstash:
redirect: community.general.credstash
cyberarkpassword:
redirect: community.general.cyberarkpassword
dig:
redirect: community.general.dig
dnstxt:
redirect: community.general.dnstxt
etcd:
redirect: community.general.etcd
filetree:
redirect: community.general.filetree
flattened:
redirect: community.general.flattened
gcp_storage_file:
redirect: community.general.gcp_storage_file
hashi_vault:
redirect: community.general.hashi_vault
hiera:
redirect: community.general.hiera
keyring:
redirect: community.general.keyring
lastpass:
redirect: community.general.lastpass
lmdb_kv:
redirect: community.general.lmdb_kv
manifold:
redirect: community.general.manifold
nios:
redirect: community.general.nios
nios_next_ip:
redirect: community.general.nios_next_ip
nios_next_network:
redirect: community.general.nios_next_network
onepassword:
redirect: community.general.onepassword
onepassword_raw:
redirect: community.general.onepassword_raw
passwordstore:
redirect: community.general.passwordstore
rabbitmq:
redirect: community.general.rabbitmq
redis:
redirect: community.general.redis
shelvefile:
redirect: community.general.shelvefile
grafana_dashboard:
redirect: community.grafana.grafana_dashboard
openshift:
redirect: community.kubernetes.openshift
k8s:
redirect: community.kubernetes.k8s
mongodb:
redirect: community.mongo.mongodb
laps_password:
redirect: community.windows.laps_password
aws_account_attribute:
redirect: amazon.aws.aws_account_attribute
aws_secret:
redirect: amazon.aws.aws_secret
aws_service_ip_ranges:
redirect: amazon.aws.aws_service_ip_ranges
aws_ssm:
redirect: amazon.aws.aws_ssm
skydive:
redirect: skydive.skydive.skydive
cpm_metering:
redirect: wti.remote.cpm_metering
cpm_status:
redirect: wti.remote.cpm_status
netconf:
ce:
redirect: community.general.ce
sros:
redirect: community.general.sros
default:
redirect: ansible.netcommon.default
iosxr:
redirect: cisco.iosxr.iosxr
junos:
redirect: junipernetworks.junos.junos
shell:
csh:
redirect: ansible.posix.csh
fish:
redirect: ansible.posix.fish
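Everything in the routing table above is plain data: for each plugin type, a short legacy name maps to a `redirect` entry holding the fully qualified collection name that now provides it. A minimal sketch of how such a table can be consumed follows; this is not Ansible's actual resolver, and the small routing dict and plugin names in it are made-up inputs for illustration only.
```python
# Illustrative only: resolve a short plugin name against a routing table shaped
# like the YAML above, i.e. {plugin_type: {name: {"redirect": "ns.coll.name"}}}.
# The routing dict and names below are assumptions for the demo, not real config.
def resolve_redirect(routing, plugin_type, name, max_hops=10):
    """Follow 'redirect' entries until the name has no further redirect."""
    for _ in range(max_hops):
        entry = routing.get(plugin_type, {}).get(name)
        if not entry or "redirect" not in entry:
            return name
        name = entry["redirect"]
    raise RuntimeError("too many redirect hops for %r" % name)


demo_routing = {
    "lookup": {"grafana_dashboard": {"redirect": "community.grafana.grafana_dashboard"}},
    "filter": {"ipaddr": {"redirect": "ansible.netcommon.ipaddr"}},
}
print(resolve_redirect(demo_routing, "lookup", "grafana_dashboard"))  # community.grafana.grafana_dashboard
print(resolve_redirect(demo_routing, "filter", "my_local_filter"))    # my_local_filter (no redirect entry)
```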
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,637 |
apt_repo should be moved to a collection
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
I've talked about this with @gundalow in https://github.com/theforeman/foreman-ansible-modules/pull/591#issuecomment-561712408, but it seems to have been forgotten.
`apt_repo` is for managing ALT Linux repositories, unlike `apt_repository`, which is for Debian repositories. As we aim to support only Debian and Red Hat family OSes in "base", `apt_repo` should be moved to a collection.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
apt_repo
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
```
|
https://github.com/ansible/ansible/issues/68637
|
https://github.com/ansible/ansible/pull/68641
|
ae1cd27b575a759e9d2477042fc5dbbb3275cd84
|
40d9650f20133cd6942990df205300fec802511f
| 2020-04-02T13:37:22Z |
python
| 2020-04-02T16:06:12Z |
lib/ansible/modules/packaging/os/apt_repo.py
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright: (c) 2018, Mikhail Gordeev
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
ANSIBLE_METADATA = {'metadata_version': '1.1',
'status': ['preview'],
'supported_by': 'core'}
DOCUMENTATION = '''
---
module: apt_repo
short_description: Manage APT repositories via apt-repo
description:
    - Manages APT repositories using the apt-repo tool.
    - See U(https://www.altlinux.org/Apt-repo) for details about apt-repo.
notes:
    - This module works on ALT-based distros.
    - Does NOT support check mode, due to a limitation in the apt-repo tool.
version_added: "2.8"
options:
repo:
description:
- Name of the repository to add or remove.
required: true
state:
description:
- Indicates the desired repository state.
choices: [ absent, present ]
default: present
remove_others:
description:
      - Remove repositories other than the added one.
- Used if I(state=present)
type: bool
default: 'no'
update:
description:
- Update the package database after changing repositories.
type: bool
default: 'no'
author:
- Mikhail Gordeev (@obirvalger)
'''
EXAMPLES = '''
- name: Remove all repositories
apt_repo:
repo: all
state: absent
- name: Add repository `Sisyphus` and remove other repositories
  apt_repo:
    repo: Sisyphus
state: present
remove_others: yes
- name: Add local repository `/space/ALT/Sisyphus` and update package cache
apt_repo:
repo: copy:///space/ALT/Sisyphus
state: present
update: yes
'''
RETURN = ''' # '''
import os
from ansible.module_utils.basic import AnsibleModule
APT_REPO_PATH = "/usr/bin/apt-repo"
def apt_repo(module, *args):
"""run apt-repo with args and return its output"""
# make args list to use in concatenation
args = list(args)
rc, out, err = module.run_command([APT_REPO_PATH] + args)
if rc != 0:
module.fail_json(msg="'%s' failed: %s" % (' '.join(['apt-repo'] + args), err))
return out
def add_repo(module, repo):
"""add a repository"""
apt_repo(module, 'add', repo)
def rm_repo(module, repo):
"""remove a repository"""
apt_repo(module, 'rm', repo)
def set_repo(module, repo):
"""add a repository and remove other repositories"""
# first add to validate repository
apt_repo(module, 'add', repo)
apt_repo(module, 'rm', 'all')
apt_repo(module, 'add', repo)
def update(module):
"""update package cache"""
apt_repo(module, 'update')
def main():
module = AnsibleModule(
argument_spec=dict(
repo=dict(type='str', required=True),
state=dict(type='str', default='present', choices=['absent', 'present']),
remove_others=dict(type='bool', default=False),
update=dict(type='bool', default=False),
),
)
if not os.path.exists(APT_REPO_PATH):
module.fail_json(msg='cannot find /usr/bin/apt-repo')
params = module.params
repo = params['repo']
state = params['state']
old_repositories = apt_repo(module)
if state == 'present':
if params['remove_others']:
set_repo(module, repo)
else:
add_repo(module, repo)
elif state == 'absent':
rm_repo(module, repo)
if params['update']:
update(module)
new_repositories = apt_repo(module)
changed = old_repositories != new_repositories
module.exit_json(changed=changed, repo=repo, state=state)
if __name__ == '__main__':
main()
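The module's change reporting works by capturing the full `apt-repo` listing before and after the requested operation and comparing the two strings, rather than parsing individual repositories. A standalone sketch of that before/after pattern is shown below; it assumes an ALT host where `/usr/bin/apt-repo` exists and Python 3.7+, the repository name is purely illustrative, and it is not part of the module itself.
```python
# Standalone sketch of the before/after change-detection pattern used by the
# module above; assumes /usr/bin/apt-repo is present (ALT Linux) and Python 3.7+.
import subprocess

APT_REPO = "/usr/bin/apt-repo"


def repo_list():
    """Return the current repository configuration as one string."""
    return subprocess.run([APT_REPO], check=True, capture_output=True, text=True).stdout


def add_repo_and_report(repo):
    """Add a repository and report whether the configuration actually changed."""
    before = repo_list()
    subprocess.run([APT_REPO, "add", repo], check=True, capture_output=True, text=True)
    after = repo_list()
    return {"changed": before != after, "repo": repo, "state": "present"}


if __name__ == "__main__":
    print(add_repo_and_report("Sisyphus"))  # illustrative repository name
```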
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 68,637 |
apt_repo should be moved to a collection
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
I've talked about this with @gundalow in https://github.com/theforeman/foreman-ansible-modules/pull/591#issuecomment-561712408, but it seems to have been forgotten.
`apt_repo` is for managing ALT Linux repositories, unlike `apt_repository`, which is for Debian repositories. As we aim to support only Debian and Red Hat family OSes in "base", `apt_repo` should be moved to a collection.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
apt_repo
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
```
|
https://github.com/ansible/ansible/issues/68637
|
https://github.com/ansible/ansible/pull/68641
|
ae1cd27b575a759e9d2477042fc5dbbb3275cd84
|
40d9650f20133cd6942990df205300fec802511f
| 2020-04-02T13:37:22Z |
python
| 2020-04-02T16:06:12Z |
test/sanity/ignore.txt
|
docs/bin/find-plugin-refs.py future-import-boilerplate
docs/bin/find-plugin-refs.py metaclass-boilerplate
docs/docsite/_extensions/pygments_lexer.py future-import-boilerplate
docs/docsite/_extensions/pygments_lexer.py metaclass-boilerplate
docs/docsite/_themes/sphinx_rtd_theme/__init__.py future-import-boilerplate
docs/docsite/_themes/sphinx_rtd_theme/__init__.py metaclass-boilerplate
docs/docsite/rst/conf.py future-import-boilerplate
docs/docsite/rst/conf.py metaclass-boilerplate
docs/docsite/rst/dev_guide/testing/sanity/no-smart-quotes.rst no-smart-quotes
examples/scripts/ConfigureRemotingForAnsible.ps1 pslint:PSCustomUseLiteralPath
examples/scripts/upgrade_to_ps3.ps1 pslint:PSCustomUseLiteralPath
examples/scripts/upgrade_to_ps3.ps1 pslint:PSUseApprovedVerbs
examples/scripts/uptime.py future-import-boilerplate
examples/scripts/uptime.py metaclass-boilerplate
hacking/build-ansible.py shebang # only run by release engineers, Python 3.6+ required
hacking/build_library/build_ansible/announce.py compile-2.6!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/announce.py compile-2.7!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/announce.py compile-3.5!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/command_plugins/dump_config.py compile-2.6!skip # docs build only, 2.7+ required
hacking/build_library/build_ansible/command_plugins/dump_keywords.py compile-2.6!skip # docs build only, 2.7+ required
hacking/build_library/build_ansible/command_plugins/generate_man.py compile-2.6!skip # docs build only, 2.7+ required
hacking/build_library/build_ansible/command_plugins/plugin_formatter.py compile-2.6!skip # docs build only, 2.7+ required
hacking/build_library/build_ansible/command_plugins/porting_guide.py compile-2.6!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/command_plugins/porting_guide.py compile-2.7!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/command_plugins/porting_guide.py compile-3.5!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/command_plugins/release_announcement.py compile-2.6!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/command_plugins/release_announcement.py compile-2.7!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/command_plugins/release_announcement.py compile-3.5!skip # release process only, 3.6+ required
hacking/build_library/build_ansible/command_plugins/update_intersphinx.py compile-2.6!skip # release process and docs build only, 3.5+ required
hacking/build_library/build_ansible/command_plugins/update_intersphinx.py compile-2.7!skip # release process and docs build only, 3.5+ required
hacking/fix_test_syntax.py future-import-boilerplate
hacking/fix_test_syntax.py metaclass-boilerplate
hacking/get_library.py future-import-boilerplate
hacking/get_library.py metaclass-boilerplate
hacking/report.py future-import-boilerplate
hacking/report.py metaclass-boilerplate
hacking/return_skeleton_generator.py future-import-boilerplate
hacking/return_skeleton_generator.py metaclass-boilerplate
hacking/test-module.py future-import-boilerplate
hacking/test-module.py metaclass-boilerplate
hacking/tests/gen_distribution_version_testcase.py future-import-boilerplate
hacking/tests/gen_distribution_version_testcase.py metaclass-boilerplate
lib/ansible/cli/console.py pylint:blacklisted-name
lib/ansible/cli/scripts/ansible_cli_stub.py shebang
lib/ansible/cli/scripts/ansible_connection_cli_stub.py shebang
lib/ansible/compat/selectors/_selectors2.py future-import-boilerplate # ignore bundled
lib/ansible/compat/selectors/_selectors2.py metaclass-boilerplate # ignore bundled
lib/ansible/compat/selectors/_selectors2.py pylint:blacklisted-name
lib/ansible/config/base.yml no-unwanted-files
lib/ansible/config/module_defaults.yml no-unwanted-files
lib/ansible/executor/playbook_executor.py pylint:blacklisted-name
lib/ansible/executor/powershell/async_watchdog.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/executor/powershell/async_wrapper.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/executor/powershell/exec_wrapper.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/executor/task_queue_manager.py pylint:blacklisted-name
lib/ansible/module_utils/_text.py future-import-boilerplate
lib/ansible/module_utils/_text.py metaclass-boilerplate
lib/ansible/module_utils/api.py future-import-boilerplate
lib/ansible/module_utils/api.py metaclass-boilerplate
lib/ansible/module_utils/basic.py metaclass-boilerplate
lib/ansible/module_utils/common/network.py future-import-boilerplate
lib/ansible/module_utils/common/network.py metaclass-boilerplate
lib/ansible/module_utils/connection.py future-import-boilerplate
lib/ansible/module_utils/connection.py metaclass-boilerplate
lib/ansible/module_utils/distro/__init__.py empty-init # breaks namespacing, bundled, do not override
lib/ansible/module_utils/distro/_distro.py future-import-boilerplate # ignore bundled
lib/ansible/module_utils/distro/_distro.py metaclass-boilerplate # ignore bundled
lib/ansible/module_utils/distro/_distro.py no-assert
lib/ansible/module_utils/distro/_distro.py pep8!skip # bundled code we don't want to modify
lib/ansible/module_utils/facts/__init__.py empty-init # breaks namespacing, deprecate and eventually remove
lib/ansible/module_utils/facts/network/linux.py pylint:blacklisted-name
lib/ansible/module_utils/facts/sysctl.py future-import-boilerplate
lib/ansible/module_utils/facts/sysctl.py metaclass-boilerplate
lib/ansible/module_utils/facts/system/distribution.py pylint:ansible-bad-function
lib/ansible/module_utils/facts/utils.py future-import-boilerplate
lib/ansible/module_utils/facts/utils.py metaclass-boilerplate
lib/ansible/module_utils/json_utils.py future-import-boilerplate
lib/ansible/module_utils/json_utils.py metaclass-boilerplate
lib/ansible/module_utils/parsing/convert_bool.py future-import-boilerplate
lib/ansible/module_utils/parsing/convert_bool.py metaclass-boilerplate
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.ArgvParser.psm1 pslint:PSUseApprovedVerbs
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.CommandUtil.psm1 pslint:PSProvideCommentHelp # need to agree on best format for comment location
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.CommandUtil.psm1 pslint:PSUseApprovedVerbs
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.FileUtil.psm1 pslint:PSCustomUseLiteralPath
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.FileUtil.psm1 pslint:PSProvideCommentHelp
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.Legacy.psm1 pslint:PSCustomUseLiteralPath
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.Legacy.psm1 pslint:PSUseApprovedVerbs
lib/ansible/module_utils/powershell/Ansible.ModuleUtils.LinkUtil.psm1 pslint:PSUseApprovedVerbs
lib/ansible/module_utils/pycompat24.py future-import-boilerplate
lib/ansible/module_utils/pycompat24.py metaclass-boilerplate
lib/ansible/module_utils/pycompat24.py no-get-exception
lib/ansible/module_utils/service.py future-import-boilerplate
lib/ansible/module_utils/service.py metaclass-boilerplate
lib/ansible/module_utils/six/__init__.py empty-init # breaks namespacing, bundled, do not override
lib/ansible/module_utils/six/__init__.py future-import-boilerplate # ignore bundled
lib/ansible/module_utils/six/__init__.py metaclass-boilerplate # ignore bundled
lib/ansible/module_utils/six/__init__.py no-basestring
lib/ansible/module_utils/six/__init__.py no-dict-iteritems
lib/ansible/module_utils/six/__init__.py no-dict-iterkeys
lib/ansible/module_utils/six/__init__.py no-dict-itervalues
lib/ansible/module_utils/six/__init__.py replace-urlopen
lib/ansible/module_utils/splitter.py future-import-boilerplate
lib/ansible/module_utils/splitter.py metaclass-boilerplate
lib/ansible/module_utils/urls.py future-import-boilerplate
lib/ansible/module_utils/urls.py metaclass-boilerplate
lib/ansible/module_utils/urls.py pylint:blacklisted-name
lib/ansible/module_utils/urls.py replace-urlopen
lib/ansible/module_utils/yumdnf.py future-import-boilerplate
lib/ansible/module_utils/yumdnf.py metaclass-boilerplate
lib/ansible/modules/commands/command.py validate-modules:doc-missing-type
lib/ansible/modules/commands/command.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/commands/command.py validate-modules:parameter-list-no-elements
lib/ansible/modules/commands/command.py validate-modules:undocumented-parameter
lib/ansible/modules/commands/expect.py validate-modules:doc-missing-type
lib/ansible/modules/files/assemble.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/files/blockinfile.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/files/blockinfile.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/files/copy.py pylint:blacklisted-name
lib/ansible/modules/files/copy.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/files/copy.py validate-modules:doc-type-does-not-match-spec
lib/ansible/modules/files/copy.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/files/copy.py validate-modules:undocumented-parameter
lib/ansible/modules/files/file.py pylint:ansible-bad-function
lib/ansible/modules/files/file.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/files/file.py validate-modules:undocumented-parameter
lib/ansible/modules/files/find.py use-argspec-type-path # fix needed
lib/ansible/modules/files/find.py validate-modules:parameter-list-no-elements
lib/ansible/modules/files/find.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/files/lineinfile.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/files/lineinfile.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/files/lineinfile.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/files/replace.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/files/stat.py validate-modules:parameter-invalid
lib/ansible/modules/files/stat.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/files/stat.py validate-modules:undocumented-parameter
lib/ansible/modules/files/unarchive.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/files/unarchive.py validate-modules:parameter-list-no-elements
lib/ansible/modules/net_tools/basics/get_url.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/net_tools/basics/uri.py pylint:blacklisted-name
lib/ansible/modules/net_tools/basics/uri.py validate-modules:doc-required-mismatch
lib/ansible/modules/net_tools/basics/uri.py validate-modules:parameter-list-no-elements
lib/ansible/modules/net_tools/basics/uri.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/language/pip.py pylint:blacklisted-name
lib/ansible/modules/packaging/language/pip.py validate-modules:doc-elements-mismatch
lib/ansible/modules/packaging/language/pip.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/packaging/os/apt.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/os/apt.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/apt.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/apt.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/apt_key.py validate-modules:mutually_exclusive-unknown
lib/ansible/modules/packaging/os/apt_key.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/apt_key.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/apt_repo.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/apt_repository.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/os/apt_repository.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/apt_repository.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/apt_repository.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/dnf.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/dnf.py validate-modules:doc-required-mismatch
lib/ansible/modules/packaging/os/dnf.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/dnf.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/dnf.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/dpkg_selections.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/dpkg_selections.py validate-modules:doc-required-mismatch
lib/ansible/modules/packaging/os/package_facts.py validate-modules:doc-choices-do-not-match-spec
lib/ansible/modules/packaging/os/package_facts.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/package_facts.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/package_facts.py validate-modules:return-syntax-error
lib/ansible/modules/packaging/os/rpm_key.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/yum.py pylint:blacklisted-name
lib/ansible/modules/packaging/os/yum.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/os/yum.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/yum.py validate-modules:parameter-invalid
lib/ansible/modules/packaging/os/yum.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/yum.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/yum.py validate-modules:undocumented-parameter
lib/ansible/modules/packaging/os/yum_repository.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/packaging/os/yum_repository.py validate-modules:doc-missing-type
lib/ansible/modules/packaging/os/yum_repository.py validate-modules:parameter-list-no-elements
lib/ansible/modules/packaging/os/yum_repository.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/packaging/os/yum_repository.py validate-modules:undocumented-parameter
lib/ansible/modules/source_control/git.py pylint:blacklisted-name
lib/ansible/modules/source_control/git.py use-argspec-type-path
lib/ansible/modules/source_control/git.py validate-modules:doc-missing-type
lib/ansible/modules/source_control/git.py validate-modules:doc-required-mismatch
lib/ansible/modules/source_control/git.py validate-modules:parameter-list-no-elements
lib/ansible/modules/source_control/git.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/source_control/subversion.py validate-modules:doc-required-mismatch
lib/ansible/modules/source_control/subversion.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/source_control/subversion.py validate-modules:undocumented-parameter
lib/ansible/modules/system/getent.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/hostname.py validate-modules:invalid-ansiblemodule-schema
lib/ansible/modules/system/hostname.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/iptables.py pylint:blacklisted-name
lib/ansible/modules/system/iptables.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/known_hosts.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/system/known_hosts.py validate-modules:doc-missing-type
lib/ansible/modules/system/known_hosts.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/service.py validate-modules:nonexistent-parameter-documented
lib/ansible/modules/system/service.py validate-modules:use-run-command-not-popen
lib/ansible/modules/system/setup.py validate-modules:doc-missing-type
lib/ansible/modules/system/setup.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/setup.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/systemd.py validate-modules:parameter-invalid
lib/ansible/modules/system/systemd.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/systemd.py validate-modules:return-syntax-error
lib/ansible/modules/system/sysvinit.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/sysvinit.py validate-modules:parameter-type-not-in-doc
lib/ansible/modules/system/sysvinit.py validate-modules:return-syntax-error
lib/ansible/modules/system/user.py validate-modules:doc-default-does-not-match-spec
lib/ansible/modules/system/user.py validate-modules:doc-default-incompatible-type
lib/ansible/modules/system/user.py validate-modules:parameter-list-no-elements
lib/ansible/modules/system/user.py validate-modules:use-run-command-not-popen
lib/ansible/modules/utilities/logic/async_status.py use-argspec-type-path
lib/ansible/modules/utilities/logic/async_status.py validate-modules!skip
lib/ansible/modules/utilities/logic/async_wrapper.py ansible-doc!skip # not an actual module
lib/ansible/modules/utilities/logic/async_wrapper.py pylint:ansible-bad-function
lib/ansible/modules/utilities/logic/async_wrapper.py use-argspec-type-path
lib/ansible/modules/utilities/logic/wait_for.py validate-modules:parameter-list-no-elements
lib/ansible/parsing/vault/__init__.py pylint:blacklisted-name
lib/ansible/playbook/base.py pylint:blacklisted-name
lib/ansible/playbook/collectionsearch.py required-and-default-attributes # https://github.com/ansible/ansible/issues/61460
lib/ansible/playbook/helpers.py pylint:blacklisted-name
lib/ansible/playbook/role/__init__.py pylint:blacklisted-name
lib/ansible/plugins/action/normal.py action-plugin-docs # default action plugin for modules without a dedicated action plugin
lib/ansible/plugins/cache/base.py ansible-doc!skip # not a plugin, but a stub for backwards compatibility
lib/ansible/plugins/doc_fragments/backup.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/backup.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/constructed.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/constructed.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/decrypt.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/decrypt.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/default_callback.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/default_callback.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/files.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/files.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/inventory_cache.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/inventory_cache.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/return_common.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/return_common.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/shell_common.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/shell_common.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/shell_windows.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/shell_windows.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/url.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/url.py metaclass-boilerplate
lib/ansible/plugins/doc_fragments/validate.py future-import-boilerplate
lib/ansible/plugins/doc_fragments/validate.py metaclass-boilerplate
lib/ansible/plugins/lookup/sequence.py pylint:blacklisted-name
lib/ansible/plugins/strategy/__init__.py pylint:blacklisted-name
lib/ansible/plugins/strategy/linear.py pylint:blacklisted-name
lib/ansible/vars/hostvars.py pylint:blacklisted-name
setup.py future-import-boilerplate
setup.py metaclass-boilerplate
test/integration/targets/ansible-runner/files/adhoc_example1.py future-import-boilerplate
test/integration/targets/ansible-runner/files/adhoc_example1.py metaclass-boilerplate
test/integration/targets/ansible-runner/files/playbook_example1.py future-import-boilerplate
test/integration/targets/ansible-runner/files/playbook_example1.py metaclass-boilerplate
test/integration/targets/ansible-test/ansible_collections/ns/col/plugins/modules/hello.py pylint:relative-beyond-top-level
test/integration/targets/ansible-test/ansible_collections/ns/col/tests/unit/plugins/module_utils/test_my_util.py pylint:relative-beyond-top-level
test/integration/targets/ansible-test/ansible_collections/ns/col/tests/unit/plugins/modules/test_hello.py pylint:relative-beyond-top-level
test/integration/targets/async/library/async_test.py future-import-boilerplate
test/integration/targets/async/library/async_test.py metaclass-boilerplate
test/integration/targets/async_fail/library/async_test.py future-import-boilerplate
test/integration/targets/async_fail/library/async_test.py metaclass-boilerplate
test/integration/targets/collections_plugin_namespace/collection_root/ansible_collections/my_ns/my_col/plugins/lookup/lookup_no_future_boilerplate.py future-import-boilerplate
test/integration/targets/collections_relative_imports/collection_root/ansible_collections/my_ns/my_col/plugins/module_utils/my_util2.py pylint:relative-beyond-top-level
test/integration/targets/collections_relative_imports/collection_root/ansible_collections/my_ns/my_col/plugins/module_utils/my_util3.py pylint:relative-beyond-top-level
test/integration/targets/collections_relative_imports/collection_root/ansible_collections/my_ns/my_col/plugins/modules/my_module.py pylint:relative-beyond-top-level
test/integration/targets/expect/files/test_command.py future-import-boilerplate
test/integration/targets/expect/files/test_command.py metaclass-boilerplate
test/integration/targets/gathering_facts/library/bogus_facts shebang
test/integration/targets/get_url/files/testserver.py future-import-boilerplate
test/integration/targets/get_url/files/testserver.py metaclass-boilerplate
test/integration/targets/group/files/gidget.py future-import-boilerplate
test/integration/targets/group/files/gidget.py metaclass-boilerplate
test/integration/targets/ignore_unreachable/fake_connectors/bad_exec.py future-import-boilerplate
test/integration/targets/ignore_unreachable/fake_connectors/bad_exec.py metaclass-boilerplate
test/integration/targets/ignore_unreachable/fake_connectors/bad_put_file.py future-import-boilerplate
test/integration/targets/ignore_unreachable/fake_connectors/bad_put_file.py metaclass-boilerplate
test/integration/targets/incidental_script_inventory_vmware_inventory/vmware_inventory.py future-import-boilerplate
test/integration/targets/incidental_script_inventory_vmware_inventory/vmware_inventory.py metaclass-boilerplate
test/integration/targets/incidental_win_dsc/files/xTestDsc/1.0.0/DSCResources/ANSIBLE_xSetReboot/ANSIBLE_xSetReboot.psm1 pslint!skip
test/integration/targets/incidental_win_dsc/files/xTestDsc/1.0.0/DSCResources/ANSIBLE_xTestResource/ANSIBLE_xTestResource.psm1 pslint!skip
test/integration/targets/incidental_win_dsc/files/xTestDsc/1.0.0/xTestDsc.psd1 pslint!skip
test/integration/targets/incidental_win_dsc/files/xTestDsc/1.0.1/DSCResources/ANSIBLE_xTestResource/ANSIBLE_xTestResource.psm1 pslint!skip
test/integration/targets/incidental_win_dsc/files/xTestDsc/1.0.1/xTestDsc.psd1 pslint!skip
test/integration/targets/incidental_win_ping/library/win_ping_syntax_error.ps1 pslint!skip
test/integration/targets/incidental_win_reboot/templates/post_reboot.ps1 pslint!skip
test/integration/targets/jinja2_native_types/filter_plugins/native_plugins.py future-import-boilerplate
test/integration/targets/jinja2_native_types/filter_plugins/native_plugins.py metaclass-boilerplate
test/integration/targets/lookup_ini/lookup-8859-15.ini no-smart-quotes
test/integration/targets/module_precedence/lib_with_extension/ping.py future-import-boilerplate
test/integration/targets/module_precedence/lib_with_extension/ping.py metaclass-boilerplate
test/integration/targets/module_precedence/multiple_roles/bar/library/ping.py future-import-boilerplate
test/integration/targets/module_precedence/multiple_roles/bar/library/ping.py metaclass-boilerplate
test/integration/targets/module_precedence/multiple_roles/foo/library/ping.py future-import-boilerplate
test/integration/targets/module_precedence/multiple_roles/foo/library/ping.py metaclass-boilerplate
test/integration/targets/module_precedence/roles_with_extension/foo/library/ping.py future-import-boilerplate
test/integration/targets/module_precedence/roles_with_extension/foo/library/ping.py metaclass-boilerplate
test/integration/targets/module_utils/library/test.py future-import-boilerplate
test/integration/targets/module_utils/library/test.py metaclass-boilerplate
test/integration/targets/module_utils/library/test_env_override.py future-import-boilerplate
test/integration/targets/module_utils/library/test_env_override.py metaclass-boilerplate
test/integration/targets/module_utils/library/test_failure.py future-import-boilerplate
test/integration/targets/module_utils/library/test_failure.py metaclass-boilerplate
test/integration/targets/module_utils/library/test_override.py future-import-boilerplate
test/integration/targets/module_utils/library/test_override.py metaclass-boilerplate
test/integration/targets/module_utils/module_utils/bar0/foo.py pylint:blacklisted-name
test/integration/targets/module_utils/module_utils/foo.py pylint:blacklisted-name
test/integration/targets/module_utils/module_utils/sub/bar/__init__.py pylint:blacklisted-name
test/integration/targets/module_utils/module_utils/sub/bar/bar.py pylint:blacklisted-name
test/integration/targets/module_utils/module_utils/yak/zebra/foo.py pylint:blacklisted-name
test/integration/targets/old_style_modules_posix/library/helloworld.sh shebang
test/integration/targets/pause/test-pause.py future-import-boilerplate
test/integration/targets/pause/test-pause.py metaclass-boilerplate
test/integration/targets/pip/files/ansible_test_pip_chdir/__init__.py future-import-boilerplate
test/integration/targets/pip/files/ansible_test_pip_chdir/__init__.py metaclass-boilerplate
test/integration/targets/pip/files/setup.py future-import-boilerplate
test/integration/targets/pip/files/setup.py metaclass-boilerplate
test/integration/targets/run_modules/library/test.py future-import-boilerplate
test/integration/targets/run_modules/library/test.py metaclass-boilerplate
test/integration/targets/script/files/no_shebang.py future-import-boilerplate
test/integration/targets/script/files/no_shebang.py metaclass-boilerplate
test/integration/targets/service/files/ansible_test_service.py future-import-boilerplate
test/integration/targets/service/files/ansible_test_service.py metaclass-boilerplate
test/integration/targets/setup_rpm_repo/files/create-repo.py future-import-boilerplate
test/integration/targets/setup_rpm_repo/files/create-repo.py metaclass-boilerplate
test/integration/targets/template/files/encoding_1252_utf-8.expected no-smart-quotes
test/integration/targets/template/files/encoding_1252_windows-1252.expected no-smart-quotes
test/integration/targets/template/files/foo.dos.txt line-endings
test/integration/targets/template/role_filter/filter_plugins/myplugin.py future-import-boilerplate
test/integration/targets/template/role_filter/filter_plugins/myplugin.py metaclass-boilerplate
test/integration/targets/template/templates/encoding_1252.j2 no-smart-quotes
test/integration/targets/infra/library/test.py future-import-boilerplate
test/integration/targets/infra/library/test.py metaclass-boilerplate
test/integration/targets/unicode/unicode.yml no-smart-quotes
test/integration/targets/uri/files/testserver.py future-import-boilerplate
test/integration/targets/uri/files/testserver.py metaclass-boilerplate
test/integration/targets/var_precedence/ansible-var-precedence-check.py future-import-boilerplate
test/integration/targets/var_precedence/ansible-var-precedence-check.py metaclass-boilerplate
test/integration/targets/builtin_vars_prompt/test-vars_prompt.py future-import-boilerplate
test/integration/targets/builtin_vars_prompt/test-vars_prompt.py metaclass-boilerplate
test/integration/targets/vault/test-vault-client.py future-import-boilerplate
test/integration/targets/vault/test-vault-client.py metaclass-boilerplate
test/integration/targets/wait_for/files/testserver.py future-import-boilerplate
test/integration/targets/wait_for/files/testserver.py metaclass-boilerplate
test/integration/targets/want_json_modules_posix/library/helloworld.py future-import-boilerplate
test/integration/targets/want_json_modules_posix/library/helloworld.py metaclass-boilerplate
test/integration/targets/win_exec_wrapper/library/test_fail.ps1 pslint:PSCustomUseLiteralPath
test/integration/targets/win_module_utils/library/legacy_only_new_way_win_line_ending.ps1 line-endings # Explicitly tests that we still work with Windows line endings
test/integration/targets/win_module_utils/library/legacy_only_old_way_win_line_ending.ps1 line-endings # Explicitly tests that we still work with Windows line endings
test/integration/targets/win_script/files/test_script.ps1 pslint:PSAvoidUsingWriteHost # Keep
test/integration/targets/win_script/files/test_script_creates_file.ps1 pslint:PSAvoidUsingCmdletAliases
test/integration/targets/win_script/files/test_script_removes_file.ps1 pslint:PSCustomUseLiteralPath
test/integration/targets/win_script/files/test_script_with_args.ps1 pslint:PSAvoidUsingWriteHost # Keep
test/integration/targets/win_script/files/test_script_with_splatting.ps1 pslint:PSAvoidUsingWriteHost # Keep
test/integration/targets/windows-minimal/library/win_ping_syntax_error.ps1 pslint!skip
test/lib/ansible_test/_data/requirements/constraints.txt test-constraints
test/lib/ansible_test/_data/requirements/integration.cloud.azure.txt test-constraints
test/lib/ansible_test/_data/sanity/pylint/plugins/string_format.py use-compat-six
test/lib/ansible_test/_data/setup/ConfigureRemotingForAnsible.ps1 pslint:PSCustomUseLiteralPath
test/lib/ansible_test/_data/setup/windows-httptester.ps1 pslint:PSCustomUseLiteralPath
test/support/integration/plugins/module_utils/ansible_tower.py future-import-boilerplate
test/support/integration/plugins/module_utils/ansible_tower.py metaclass-boilerplate
test/support/integration/plugins/module_utils/azure_rm_common.py future-import-boilerplate
test/support/integration/plugins/module_utils/azure_rm_common.py metaclass-boilerplate
test/support/integration/plugins/module_utils/azure_rm_common_rest.py future-import-boilerplate
test/support/integration/plugins/module_utils/azure_rm_common_rest.py metaclass-boilerplate
test/support/integration/plugins/module_utils/cloud.py future-import-boilerplate
test/support/integration/plugins/module_utils/cloud.py metaclass-boilerplate
test/support/integration/plugins/module_utils/common/network.py future-import-boilerplate
test/support/integration/plugins/module_utils/common/network.py metaclass-boilerplate
test/support/integration/plugins/module_utils/compat/ipaddress.py future-import-boilerplate
test/support/integration/plugins/module_utils/compat/ipaddress.py metaclass-boilerplate
test/support/integration/plugins/module_utils/compat/ipaddress.py no-unicode-literals
test/support/integration/plugins/module_utils/database.py future-import-boilerplate
test/support/integration/plugins/module_utils/database.py metaclass-boilerplate
test/support/integration/plugins/module_utils/k8s/common.py metaclass-boilerplate
test/support/integration/plugins/module_utils/k8s/raw.py metaclass-boilerplate
test/support/integration/plugins/module_utils/mysql.py future-import-boilerplate
test/support/integration/plugins/module_utils/mysql.py metaclass-boilerplate
test/support/integration/plugins/module_utils/net_tools/nios/api.py future-import-boilerplate
test/support/integration/plugins/module_utils/net_tools/nios/api.py metaclass-boilerplate
test/support/integration/plugins/module_utils/network/common/utils.py future-import-boilerplate
test/support/integration/plugins/module_utils/network/common/utils.py metaclass-boilerplate
test/support/integration/plugins/module_utils/postgres.py future-import-boilerplate
test/support/integration/plugins/module_utils/postgres.py metaclass-boilerplate
test/support/integration/plugins/modules/lvg.py pylint:blacklisted-name
test/support/integration/plugins/modules/synchronize.py pylint:blacklisted-name
test/support/integration/plugins/modules/timezone.py pylint:blacklisted-name
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/doc_fragments/netconf.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/doc_fragments/netconf.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/doc_fragments/network_agnostic.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/doc_fragments/network_agnostic.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/compat/ipaddress.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/compat/ipaddress.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/compat/ipaddress.py no-unicode-literals
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/compat/ipaddress.py pep8:E203
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/common/cfg/base.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/common/cfg/base.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/common/config.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/common/config.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/common/facts/facts.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/common/facts/facts.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/common/netconf.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/common/netconf.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/common/network.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/common/network.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/common/parsing.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/common/parsing.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/common/utils.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/common/utils.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/netconf/netconf.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/netconf/netconf.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/restconf/restconf.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/ansible/netcommon/plugins/module_utils/network/restconf/restconf.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/cisco/ios/plugins/doc_fragments/ios.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/cisco/ios/plugins/doc_fragments/ios.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/cisco/ios/plugins/module_utils/network/ios/ios.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/cisco/ios/plugins/module_utils/network/ios/ios.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/cisco/ios/plugins/modules/ios_command.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/cisco/ios/plugins/modules/ios_command.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/cisco/ios/plugins/modules/ios_config.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/cisco/ios/plugins/modules/ios_config.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/cisco/ios/plugins/modules/ios_config.py pep8:E501
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/doc_fragments/vyos.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/doc_fragments/vyos.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/module_utils/network/vyos/vyos.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/module_utils/network/vyos/vyos.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/modules/vyos_command.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/modules/vyos_command.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/modules/vyos_command.py pep8:E231
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/modules/vyos_command.py pylint:blacklisted-name
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/modules/vyos_config.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/modules/vyos_config.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/modules/vyos_facts.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/modules/vyos_facts.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/modules/vyos_logging.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/modules/vyos_logging.py metaclass-boilerplate
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/modules/vyos_static_route.py future-import-boilerplate
test/support/network-integration/collections/ansible_collections/vyos/vyos/plugins/modules/vyos_static_route.py metaclass-boilerplate
test/support/windows-integration/plugins/modules/async_status.ps1 pslint!skip
test/support/windows-integration/plugins/modules/setup.ps1 pslint!skip
test/support/windows-integration/plugins/modules/win_copy.ps1 pslint!skip
test/support/windows-integration/plugins/modules/win_dsc.ps1 pslint!skip
test/support/windows-integration/plugins/modules/win_feature.ps1 pslint!skip
test/support/windows-integration/plugins/modules/win_find.ps1 pslint!skip
test/support/windows-integration/plugins/modules/win_lineinfile.ps1 pslint!skip
test/support/windows-integration/plugins/modules/win_regedit.ps1 pslint!skip
test/support/windows-integration/plugins/modules/win_security_policy.ps1 pslint!skip
test/support/windows-integration/plugins/modules/win_shell.ps1 pslint!skip
test/support/windows-integration/plugins/modules/win_wait_for.ps1 pslint!skip
test/units/config/manager/test_find_ini_config_file.py future-import-boilerplate
test/units/executor/test_play_iterator.py pylint:blacklisted-name
test/units/inventory/test_group.py future-import-boilerplate
test/units/inventory/test_group.py metaclass-boilerplate
test/units/inventory/test_host.py future-import-boilerplate
test/units/inventory/test_host.py metaclass-boilerplate
test/units/mock/path.py future-import-boilerplate
test/units/mock/path.py metaclass-boilerplate
test/units/mock/yaml_helper.py future-import-boilerplate
test/units/mock/yaml_helper.py metaclass-boilerplate
test/units/module_utils/basic/test__symbolic_mode_to_octal.py future-import-boilerplate
test/units/module_utils/basic/test_deprecate_warn.py future-import-boilerplate
test/units/module_utils/basic/test_deprecate_warn.py metaclass-boilerplate
test/units/module_utils/basic/test_deprecate_warn.py pylint:ansible-deprecated-no-version
test/units/module_utils/basic/test_exit_json.py future-import-boilerplate
test/units/module_utils/basic/test_get_file_attributes.py future-import-boilerplate
test/units/module_utils/basic/test_heuristic_log_sanitize.py future-import-boilerplate
test/units/module_utils/basic/test_run_command.py future-import-boilerplate
test/units/module_utils/basic/test_run_command.py pylint:blacklisted-name
test/units/module_utils/basic/test_safe_eval.py future-import-boilerplate
test/units/module_utils/basic/test_tmpdir.py future-import-boilerplate
test/units/module_utils/common/test_dict_transformations.py future-import-boilerplate
test/units/module_utils/common/test_dict_transformations.py metaclass-boilerplate
test/units/module_utils/conftest.py future-import-boilerplate
test/units/module_utils/conftest.py metaclass-boilerplate
test/units/module_utils/facts/base.py future-import-boilerplate
test/units/module_utils/facts/hardware/test_sunos_get_uptime_facts.py future-import-boilerplate
test/units/module_utils/facts/hardware/test_sunos_get_uptime_facts.py metaclass-boilerplate
test/units/module_utils/facts/network/test_generic_bsd.py future-import-boilerplate
test/units/module_utils/facts/other/test_facter.py future-import-boilerplate
test/units/module_utils/facts/other/test_ohai.py future-import-boilerplate
test/units/module_utils/facts/system/test_lsb.py future-import-boilerplate
test/units/module_utils/facts/test_ansible_collector.py future-import-boilerplate
test/units/module_utils/facts/test_collector.py future-import-boilerplate
test/units/module_utils/facts/test_collectors.py future-import-boilerplate
test/units/module_utils/facts/test_facts.py future-import-boilerplate
test/units/module_utils/facts/test_timeout.py future-import-boilerplate
test/units/module_utils/facts/test_utils.py future-import-boilerplate
test/units/module_utils/json_utils/test_filter_non_json_lines.py future-import-boilerplate
test/units/module_utils/parsing/test_convert_bool.py future-import-boilerplate
test/units/module_utils/test_distro.py future-import-boilerplate
test/units/module_utils/test_distro.py metaclass-boilerplate
test/units/module_utils/test_text.py future-import-boilerplate
test/units/module_utils/urls/test_Request.py replace-urlopen
test/units/module_utils/urls/test_fetch_url.py replace-urlopen
test/units/modules/conftest.py future-import-boilerplate
test/units/modules/conftest.py metaclass-boilerplate
test/units/modules/files/test_copy.py future-import-boilerplate
test/units/modules/packaging/language/test_pip.py future-import-boilerplate
test/units/modules/packaging/language/test_pip.py metaclass-boilerplate
test/units/modules/packaging/os/test_apt.py future-import-boilerplate
test/units/modules/packaging/os/test_apt.py metaclass-boilerplate
test/units/modules/packaging/os/test_apt.py pylint:blacklisted-name
test/units/modules/packaging/os/test_yum.py future-import-boilerplate
test/units/modules/packaging/os/test_yum.py metaclass-boilerplate
test/units/modules/system/test_iptables.py future-import-boilerplate
test/units/modules/system/test_iptables.py metaclass-boilerplate
test/units/modules/system/test_known_hosts.py future-import-boilerplate
test/units/modules/system/test_known_hosts.py metaclass-boilerplate
test/units/modules/system/test_known_hosts.py pylint:ansible-bad-function
test/units/modules/system/test_systemd.py future-import-boilerplate
test/units/modules/system/test_systemd.py metaclass-boilerplate
test/units/modules/utils.py future-import-boilerplate
test/units/modules/utils.py metaclass-boilerplate
test/units/parsing/utils/test_addresses.py future-import-boilerplate
test/units/parsing/utils/test_addresses.py metaclass-boilerplate
test/units/parsing/vault/test_vault.py pylint:blacklisted-name
test/units/playbook/role/test_role.py pylint:blacklisted-name
test/units/playbook/test_attribute.py future-import-boilerplate
test/units/playbook/test_attribute.py metaclass-boilerplate
test/units/playbook/test_conditional.py future-import-boilerplate
test/units/playbook/test_conditional.py metaclass-boilerplate
test/units/plugins/inventory/test_constructed.py future-import-boilerplate
test/units/plugins/inventory/test_constructed.py metaclass-boilerplate
test/units/plugins/loader_fixtures/import_fixture.py future-import-boilerplate
test/units/plugins/shell/test_cmd.py future-import-boilerplate
test/units/plugins/shell/test_cmd.py metaclass-boilerplate
test/units/plugins/shell/test_powershell.py future-import-boilerplate
test/units/plugins/shell/test_powershell.py metaclass-boilerplate
test/units/plugins/test_plugins.py pylint:blacklisted-name
test/units/template/test_templar.py pylint:blacklisted-name
test/units/test_constants.py future-import-boilerplate
test/units/test_context.py future-import-boilerplate
test/units/utils/fixtures/collections/ansible_collections/my_namespace/my_collection/plugins/action/my_action.py future-import-boilerplate
test/units/utils/fixtures/collections/ansible_collections/my_namespace/my_collection/plugins/action/my_action.py metaclass-boilerplate
test/units/utils/fixtures/collections/ansible_collections/my_namespace/my_collection/plugins/module_utils/my_other_util.py future-import-boilerplate
test/units/utils/fixtures/collections/ansible_collections/my_namespace/my_collection/plugins/module_utils/my_other_util.py metaclass-boilerplate
test/units/utils/fixtures/collections/ansible_collections/my_namespace/my_collection/plugins/module_utils/my_util.py future-import-boilerplate
test/units/utils/fixtures/collections/ansible_collections/my_namespace/my_collection/plugins/module_utils/my_util.py metaclass-boilerplate
test/units/utils/test_cleanup_tmp_file.py future-import-boilerplate
test/units/utils/test_encrypt.py future-import-boilerplate
test/units/utils/test_encrypt.py metaclass-boilerplate
test/units/utils/test_helpers.py future-import-boilerplate
test/units/utils/test_helpers.py metaclass-boilerplate
test/units/utils/test_shlex.py future-import-boilerplate
test/units/utils/test_shlex.py metaclass-boilerplate
test/utils/shippable/check_matrix.py replace-urlopen
test/utils/shippable/timing.py shebang
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 61,002 |
discovered_interpreter_python is not fetched for delegated hosts
|
##### SUMMARY
I'm running a playbook on an F30 host where `discovered_interpreter_python` is found to be `/usr/bin/python3`. Later in the playbook I delegate tasks to another host, and those tasks end up running with the same interpreter, without redoing the discovery and contrary to the currently cached value.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
discovered_interpreter_python
##### ANSIBLE VERSION
```paste below
ansible 2.8.4
config file = /home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg
configured module search path = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/library']
ansible python module location = /home/duck/.local/lib/python3.7/site-packages/ansible
executable location = /home/duck/.local/bin/ansible
python version = 3.7.4 (default, Jul 11 2019, 10:43:21) [GCC 8.3.0]
```
##### CONFIGURATION
```paste below
ANSIBLE_FORCE_COLOR(env: ANSIBLE_FORCE_COLOR) = True
ANSIBLE_PIPELINING(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = True
ANSIBLE_SSH_ARGS(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = -o ControlMaster=auto -o ControlPersist=30m
CACHE_PLUGIN(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = jsonfile
CACHE_PLUGIN_CONNECTION(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ./facts_cache
CACHE_PLUGIN_TIMEOUT(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = 86400
DEFAULT_ACTION_PLUGIN_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/plugins/a
DEFAULT_CALLBACK_PLUGIN_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/plugins
DEFAULT_CONNECTION_PLUGIN_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/plugi
DEFAULT_FILTER_PLUGIN_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/plugins/f
DEFAULT_FORKS(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = 10
DEFAULT_GATHERING(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = smart
DEFAULT_HASH_BEHAVIOUR(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = merge
DEFAULT_HOST_LIST(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/hosts.yml']
DEFAULT_LOOKUP_PLUGIN_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/plugins/l
DEFAULT_MANAGED_STR(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = Ansible managed
DEFAULT_MODULE_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/library']
DEFAULT_ROLES_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/roles']
DEFAULT_STRATEGY_PLUGIN_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/plugins
DEFAULT_TIMEOUT(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = 40
DEFAULT_VARS_PLUGIN_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/plugins/var
DEFAULT_VAULT_PASSWORD_FILE(env: ANSIBLE_VAULT_PASSWORD_FILE) = /somewhere
ERROR_ON_MISSING_HANDLER(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = True
INTERPRETER_PYTHON(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = auto
PARAMIKO_HOST_KEY_AUTO_ADD(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = True
PERSISTENT_CONNECT_TIMEOUT(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = 1800
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
##### STEPS TO REPRODUCE
```yaml
---
- hosts: ovirt-web-builder.int.osci.io
tasks:
- name: Get facts for the other side
setup:
delegate_to: www.ovirt.org
delegate_facts: True
```
##### EXPECTED RESULTS
Ansible should use the value in my cache (see below) or, if it has expired, initiate a discovery for www.ovirt.org, instead of reusing the interpreter discovered for the play's target host.
##### ACTUAL RESULTS
```paste below
PLAY [ovirt-web-builder.int.osci.io] **************************************************************************************************************
TASK [Get facts for the other side] ***************************************************************************************************************
fatal: [ovirt-web-builder.int.osci.io -> 8.43.85.224]: FAILED! => {"changed": false, "module_stderr": "/bin/sh: /usr/bin/python3: No such file or directory\n", "module_stdout": "", "msg": "The module failed to execute correctly, you probably need to set the interpreter.\nSee stdout/stderr for the exact error", "rc": 127}
PLAY RECAP ****************************************************************************************************************************************
ovirt-web-builder.int.osci.io : ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
```
Please note my cache:
```
$ grep discovered_interpreter_python facts_cache/www.ovirt.org
"discovered_interpreter_python": "/usr/bin/python",
```
Moving the cache file aside does not change the behavior.
Running setup manually works and discovers the right interpreter:
```
$ ansible www.ovirt.org -m setup | grep discovered_interpreter_python
"discovered_interpreter_python": "/usr/bin/python",
```
Also, the value of `interpreter_python` in `ansible.cfg` does not change anything.
Of course I can force the value of `ansible_python_interpreter` and everything works fine, but why would I do that now that we have automatic interpreter detection? This is also a regression for a playbook that previously worked fine.
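For reference, a minimal sketch of that task-level workaround (hypothetical, assuming `/usr/bin/python` really is the right interpreter for www.ovirt.org) would look something like:
```yaml
- hosts: ovirt-web-builder.int.osci.io
  tasks:
    - name: Get facts for the other side
      setup:
      delegate_to: www.ovirt.org
      delegate_facts: True
      vars:
        # hypothetical pin: bypasses interpreter discovery for the delegated call
        ansible_python_interpreter: /usr/bin/python
```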
|
https://github.com/ansible/ansible/issues/61002
|
https://github.com/ansible/ansible/pull/64906
|
318d5606c1d7e8be69b647a77f044b3532f30d7e
|
123c624b28398c10864be5238dfdd44c524564a0
| 2019-08-21T09:43:41Z |
python
| 2020-04-06T20:09:00Z |
changelogs/fragments/64906-always-delegate-fact-prefixes.yml
| |
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 61,002 |
discovered_interpreter_python is not fetched for delegated hosts
|
##### SUMMARY
I'm running a playbook on an F30 host where `discovered_interpreter_python` is found to be `/usr/bin/python3`. Later in the playbook I delegate tasks to another host, and those tasks end up running with the same interpreter, without redoing the discovery and contrary to the currently cached value.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
discovered_interpreter_python
##### ANSIBLE VERSION
```paste below
ansible 2.8.4
config file = /home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg
configured module search path = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/library']
ansible python module location = /home/duck/.local/lib/python3.7/site-packages/ansible
executable location = /home/duck/.local/bin/ansible
python version = 3.7.4 (default, Jul 11 2019, 10:43:21) [GCC 8.3.0]
```
##### CONFIGURATION
```paste below
ANSIBLE_FORCE_COLOR(env: ANSIBLE_FORCE_COLOR) = True
ANSIBLE_PIPELINING(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = True
ANSIBLE_SSH_ARGS(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = -o ControlMaster=auto -o ControlPersist=30m
CACHE_PLUGIN(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = jsonfile
CACHE_PLUGIN_CONNECTION(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ./facts_cache
CACHE_PLUGIN_TIMEOUT(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = 86400
DEFAULT_ACTION_PLUGIN_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/plugins/a
DEFAULT_CALLBACK_PLUGIN_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/plugins
DEFAULT_CONNECTION_PLUGIN_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/plugi
DEFAULT_FILTER_PLUGIN_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/plugins/f
DEFAULT_FORKS(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = 10
DEFAULT_GATHERING(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = smart
DEFAULT_HASH_BEHAVIOUR(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = merge
DEFAULT_HOST_LIST(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/hosts.yml']
DEFAULT_LOOKUP_PLUGIN_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/plugins/l
DEFAULT_MANAGED_STR(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = Ansible managed
DEFAULT_MODULE_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/library']
DEFAULT_ROLES_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/roles']
DEFAULT_STRATEGY_PLUGIN_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/plugins
DEFAULT_TIMEOUT(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = 40
DEFAULT_VARS_PLUGIN_PATH(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = ['/home/duck/Projects/oVirt/gerrit_infra-ansible/plugins/var
DEFAULT_VAULT_PASSWORD_FILE(env: ANSIBLE_VAULT_PASSWORD_FILE) = /somewhere
ERROR_ON_MISSING_HANDLER(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = True
INTERPRETER_PYTHON(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = auto
PARAMIKO_HOST_KEY_AUTO_ADD(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = True
PERSISTENT_CONNECT_TIMEOUT(/home/duck/Projects/oVirt/gerrit_infra-ansible/ansible.cfg) = 1800
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
##### STEPS TO REPRODUCE
```yaml
---
- hosts: ovirt-web-builder.int.osci.io
tasks:
- name: Get facts for the other side
setup:
delegate_to: www.ovirt.org
delegate_facts: True
```
##### EXPECTED RESULTS
Ansible should use the value in my cache (see below) or, if it has expired, initiate a discovery for www.ovirt.org, instead of reusing the interpreter discovered for the play's target host.
##### ACTUAL RESULTS
```paste below
PLAY [ovirt-web-builder.int.osci.io] **************************************************************************************************************
TASK [Get facts for the other side] ***************************************************************************************************************
fatal: [ovirt-web-builder.int.osci.io -> 8.43.85.224]: FAILED! => {"changed": false, "module_stderr": "/bin/sh: /usr/bin/python3: No such file or directory\n", "module_stdout": "", "msg": "The module failed to execute correctly, you probably need to set the interpreter.\nSee stdout/stderr for the exact error", "rc": 127}
PLAY RECAP ****************************************************************************************************************************************
ovirt-web-builder.int.osci.io : ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
```
Please note my cache:
```
$ grep discovered_interpreter_python facts_cache/www.ovirt.org
"discovered_interpreter_python": "/usr/bin/python",
```
Moving the cache file aside does not change the behavior.
Running setup manually works and discovers the right interpreter:
```
$ ansible www.ovirt.org -m setup | grep discovered_interpreter_python
"discovered_interpreter_python": "/usr/bin/python",
```
Also, the value of `interpreter_python` in `ansible.cfg` does not change anything.
Of course I can force the value of `ansible_python_interpreter` and everything works fine, but why would I do that now that we have automatic interpreter detection? This is also a regression for a playbook that previously worked fine.
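For reference, the other obvious stop-gap is pinning the interpreter at the host level; a minimal sketch (hypothetical file name, assuming `/usr/bin/python` is correct for that host):
```yaml
# host_vars/www.ovirt.org.yml (hypothetical)
ansible_python_interpreter: /usr/bin/python
```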
|
https://github.com/ansible/ansible/issues/61002
|
https://github.com/ansible/ansible/pull/64906
|
318d5606c1d7e8be69b647a77f044b3532f30d7e
|
123c624b28398c10864be5238dfdd44c524564a0
| 2019-08-21T09:43:41Z |
python
| 2020-04-06T20:09:00Z |
lib/ansible/plugins/strategy/__init__.py
|
# (c) 2012-2014, Michael DeHaan <[email protected]>
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import cmd
import functools
import os
import pprint
import sys
import threading
import time
from collections import deque
from multiprocessing import Lock
from jinja2.exceptions import UndefinedError
from ansible import constants as C
from ansible import context
from ansible.errors import AnsibleError, AnsibleFileNotFound, AnsibleParserError, AnsibleUndefinedVariable
from ansible.executor import action_write_locks
from ansible.executor.process.worker import WorkerProcess
from ansible.executor.task_result import TaskResult
from ansible.inventory.host import Host
from ansible.module_utils.six.moves import queue as Queue
from ansible.module_utils.six import iteritems, itervalues, string_types
from ansible.module_utils._text import to_text
from ansible.module_utils.connection import Connection, ConnectionError
from ansible.playbook.helpers import load_list_of_blocks
from ansible.playbook.included_file import IncludedFile
from ansible.playbook.task_include import TaskInclude
from ansible.plugins import loader as plugin_loader
from ansible.template import Templar
from ansible.utils.display import Display
from ansible.utils.vars import combine_vars
from ansible.vars.clean import strip_internal_keys, module_response_deepcopy
display = Display()
__all__ = ['StrategyBase']
class StrategySentinel:
pass
def SharedPluginLoaderObj():
'''This only exists for backwards compat, do not use.
'''
display.deprecated('SharedPluginLoaderObj is deprecated, please directly use ansible.plugins.loader',
version='2.11')
return plugin_loader
_sentinel = StrategySentinel()
def results_thread_main(strategy):
while True:
try:
result = strategy._final_q.get()
if isinstance(result, StrategySentinel):
break
else:
strategy._results_lock.acquire()
strategy._results.append(result)
strategy._results_lock.release()
except (IOError, EOFError):
break
except Queue.Empty:
pass
def debug_closure(func):
"""Closure to wrap ``StrategyBase._process_pending_results`` and invoke the task debugger"""
@functools.wraps(func)
def inner(self, iterator, one_pass=False, max_passes=None):
status_to_stats_map = (
('is_failed', 'failures'),
('is_unreachable', 'dark'),
('is_changed', 'changed'),
('is_skipped', 'skipped'),
)
# We don't know the host yet, copy the previous states, for lookup after we process new results
prev_host_states = iterator._host_states.copy()
results = func(self, iterator, one_pass=one_pass, max_passes=max_passes)
_processed_results = []
for result in results:
task = result._task
host = result._host
_queued_task_args = self._queued_task_cache.pop((host.name, task._uuid), None)
task_vars = _queued_task_args['task_vars']
play_context = _queued_task_args['play_context']
# Try to grab the previous host state, if it doesn't exist use get_host_state to generate an empty state
try:
prev_host_state = prev_host_states[host.name]
except KeyError:
prev_host_state = iterator.get_host_state(host)
while result.needs_debugger(globally_enabled=self.debugger_active):
next_action = NextAction()
dbg = Debugger(task, host, task_vars, play_context, result, next_action)
dbg.cmdloop()
if next_action.result == NextAction.REDO:
# rollback host state
self._tqm.clear_failed_hosts()
iterator._host_states[host.name] = prev_host_state
for method, what in status_to_stats_map:
if getattr(result, method)():
self._tqm._stats.decrement(what, host.name)
self._tqm._stats.decrement('ok', host.name)
# redo
self._queue_task(host, task, task_vars, play_context)
_processed_results.extend(debug_closure(func)(self, iterator, one_pass))
break
elif next_action.result == NextAction.CONTINUE:
_processed_results.append(result)
break
elif next_action.result == NextAction.EXIT:
# Matches KeyboardInterrupt from bin/ansible
sys.exit(99)
else:
_processed_results.append(result)
return _processed_results
return inner
class StrategyBase:
'''
This is the base class for strategy plugins, which contains some common
code useful to all strategies like running handlers, cleanup actions, etc.
'''
# by default, strategies should support throttling but we allow individual
# strategies to disable this and either forego supporting it or managing
# the throttling internally (as `free` does)
ALLOW_BASE_THROTTLING = True
def __init__(self, tqm):
self._tqm = tqm
self._inventory = tqm.get_inventory()
self._workers = tqm._workers
self._variable_manager = tqm.get_variable_manager()
self._loader = tqm.get_loader()
self._final_q = tqm._final_q
self._step = context.CLIARGS.get('step', False)
self._diff = context.CLIARGS.get('diff', False)
self.flush_cache = context.CLIARGS.get('flush_cache', False)
# the task cache is a dictionary of tuples of (host.name, task._uuid)
# used to find the original task object of in-flight tasks and to store
# the task args/vars and play context info used to queue the task.
self._queued_task_cache = {}
# Backwards compat: self._display isn't really needed, just import the global display and use that.
self._display = display
# internal counters
self._pending_results = 0
self._cur_worker = 0
# this dictionary is used to keep track of hosts that have
# outstanding tasks still in queue
self._blocked_hosts = dict()
# this dictionary is used to keep track of hosts that have
# flushed handlers
self._flushed_hosts = dict()
self._results = deque()
self._results_lock = threading.Condition(threading.Lock())
# create the result processing thread for reading results in the background
self._results_thread = threading.Thread(target=results_thread_main, args=(self,))
self._results_thread.daemon = True
self._results_thread.start()
# holds the list of active (persistent) connections to be shutdown at
# play completion
self._active_connections = dict()
# Caches for get_host calls, to avoid calling excessively
# These values should be set at the top of the ``run`` method of each
# strategy plugin. Use ``_set_hosts_cache`` to set these values
self._hosts_cache = []
self._hosts_cache_all = []
self.debugger_active = C.ENABLE_TASK_DEBUGGER
def _set_hosts_cache(self, play, refresh=True):
"""Responsible for setting _hosts_cache and _hosts_cache_all
See comment in ``__init__`` for the purpose of these caches
"""
if not refresh and all((self._hosts_cache, self._hosts_cache_all)):
return
if Templar(None).is_template(play.hosts):
_pattern = 'all'
else:
_pattern = play.hosts or 'all'
self._hosts_cache_all = [h.name for h in self._inventory.get_hosts(pattern=_pattern, ignore_restrictions=True)]
self._hosts_cache = [h.name for h in self._inventory.get_hosts(play.hosts, order=play.order)]
def cleanup(self):
# close active persistent connections
for sock in itervalues(self._active_connections):
try:
conn = Connection(sock)
conn.reset()
except ConnectionError as e:
# most likely socket is already closed
display.debug("got an error while closing persistent connection: %s" % e)
self._final_q.put(_sentinel)
self._results_thread.join()
def run(self, iterator, play_context, result=0):
# execute one more pass through the iterator without peeking, to
# make sure that all of the hosts are advanced to their final task.
# This should be safe, as everything should be ITERATING_COMPLETE by
# this point, though the strategy may not advance the hosts itself.
for host in self._hosts_cache:
if host not in self._tqm._unreachable_hosts:
try:
iterator.get_next_task_for_host(self._inventory.hosts[host])
except KeyError:
iterator.get_next_task_for_host(self._inventory.get_host(host))
# save the failed/unreachable hosts, as the run_handlers()
# method will clear that information during its execution
failed_hosts = iterator.get_failed_hosts()
unreachable_hosts = self._tqm._unreachable_hosts.keys()
display.debug("running handlers")
handler_result = self.run_handlers(iterator, play_context)
if isinstance(handler_result, bool) and not handler_result:
result |= self._tqm.RUN_ERROR
elif not handler_result:
result |= handler_result
# now update with the hosts (if any) that failed or were
# unreachable during the handler execution phase
failed_hosts = set(failed_hosts).union(iterator.get_failed_hosts())
unreachable_hosts = set(unreachable_hosts).union(self._tqm._unreachable_hosts.keys())
# return the appropriate code, depending on the status hosts after the run
if not isinstance(result, bool) and result != self._tqm.RUN_OK:
return result
elif len(unreachable_hosts) > 0:
return self._tqm.RUN_UNREACHABLE_HOSTS
elif len(failed_hosts) > 0:
return self._tqm.RUN_FAILED_HOSTS
else:
return self._tqm.RUN_OK
def get_hosts_remaining(self, play):
self._set_hosts_cache(play, refresh=False)
ignore = set(self._tqm._failed_hosts).union(self._tqm._unreachable_hosts)
return [host for host in self._hosts_cache if host not in ignore]
def get_failed_hosts(self, play):
self._set_hosts_cache(play, refresh=False)
return [host for host in self._hosts_cache if host in self._tqm._failed_hosts]
def add_tqm_variables(self, vars, play):
'''
Base class method to add extra variables/information to the list of task
vars sent through the executor engine regarding the task queue manager state.
'''
vars['ansible_current_hosts'] = self.get_hosts_remaining(play)
vars['ansible_failed_hosts'] = self.get_failed_hosts(play)
def _queue_task(self, host, task, task_vars, play_context):
''' handles queueing the task up to be sent to a worker '''
display.debug("entering _queue_task() for %s/%s" % (host.name, task.action))
# Add a write lock for tasks.
# Maybe this should be added somewhere further up the call stack but
# this is the earliest in the code where we have task (1) extracted
# into its own variable and (2) there's only a single code path
# leading to the module being run. This is called by three
# functions: __init__.py::_do_handler_run(), linear.py::run(), and
# free.py::run() so we'd have to add to all three to do it there.
# The next common higher level is __init__.py::run() and that has
# tasks inside of play_iterator so we'd have to extract them to do it
# there.
if task.action not in action_write_locks.action_write_locks:
display.debug('Creating lock for %s' % task.action)
action_write_locks.action_write_locks[task.action] = Lock()
# create a templar and template things we need later for the queuing process
templar = Templar(loader=self._loader, variables=task_vars)
try:
throttle = int(templar.template(task.throttle))
except Exception as e:
raise AnsibleError("Failed to convert the throttle value to an integer.", obj=task._ds, orig_exc=e)
# and then queue the new task
try:
# Determine the "rewind point" of the worker list. This means we start
# iterating over the list of workers until the end of the list is found.
# Normally, that is simply the length of the workers list (as determined
# by the forks or serial setting), however a task/block/play may "throttle"
# that limit down.
rewind_point = len(self._workers)
if throttle > 0 and self.ALLOW_BASE_THROTTLING:
if task.run_once:
display.debug("Ignoring 'throttle' as 'run_once' is also set for '%s'" % task.get_name())
else:
if throttle <= rewind_point:
display.debug("task: %s, throttle: %d" % (task.get_name(), throttle))
rewind_point = throttle
queued = False
starting_worker = self._cur_worker
while True:
if self._cur_worker >= rewind_point:
self._cur_worker = 0
worker_prc = self._workers[self._cur_worker]
if worker_prc is None or not worker_prc.is_alive():
self._queued_task_cache[(host.name, task._uuid)] = {
'host': host,
'task': task,
'task_vars': task_vars,
'play_context': play_context
}
worker_prc = WorkerProcess(self._final_q, task_vars, host, task, play_context, self._loader, self._variable_manager, plugin_loader)
self._workers[self._cur_worker] = worker_prc
self._tqm.send_callback('v2_runner_on_start', host, task)
worker_prc.start()
display.debug("worker is %d (out of %d available)" % (self._cur_worker + 1, len(self._workers)))
queued = True
self._cur_worker += 1
if self._cur_worker >= rewind_point:
self._cur_worker = 0
if queued:
break
elif self._cur_worker == starting_worker:
time.sleep(0.0001)
self._pending_results += 1
except (EOFError, IOError, AssertionError) as e:
# most likely an abort
display.debug("got an error while queuing: %s" % e)
return
display.debug("exiting _queue_task() for %s/%s" % (host.name, task.action))
def get_task_hosts(self, iterator, task_host, task):
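        # For run_once tasks, the result applies to every reachable host in the play;
        # otherwise it applies only to the host that actually ran the task.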
if task.run_once:
host_list = [host for host in self._hosts_cache if host not in self._tqm._unreachable_hosts]
else:
host_list = [task_host.name]
return host_list
def get_delegated_hosts(self, result, task):
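        # Prefer the delegated host name resolved into the task result; fall back to
        # the raw delegate_to value from the task definition.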
host_name = result.get('_ansible_delegated_vars', {}).get('ansible_delegated_host', None)
return [host_name or task.delegate_to]
@debug_closure
def _process_pending_results(self, iterator, one_pass=False, max_passes=None):
'''
Reads results off the final queue and takes appropriate action
based on the result (executing callbacks, updating state, etc.).
'''
ret_results = []
handler_templar = Templar(self._loader)
def get_original_host(host_name):
# FIXME: this should not need x2 _inventory
host_name = to_text(host_name)
if host_name in self._inventory.hosts:
return self._inventory.hosts[host_name]
else:
return self._inventory.get_host(host_name)
def search_handler_blocks_by_name(handler_name, handler_blocks):
# iterate in reversed order since last handler loaded with the same name wins
for handler_block in reversed(handler_blocks):
for handler_task in handler_block.block:
if handler_task.name:
if not handler_task.cached_name:
if handler_templar.is_template(handler_task.name):
handler_templar.available_variables = self._variable_manager.get_vars(play=iterator._play,
task=handler_task,
_hosts=self._hosts_cache,
_hosts_all=self._hosts_cache_all)
handler_task.name = handler_templar.template(handler_task.name)
handler_task.cached_name = True
try:
# first we check with the full result of get_name(), which may
# include the role name (if the handler is from a role). If that
# is not found, we resort to the simple name field, which doesn't
# have anything extra added to it.
if handler_task.name == handler_name:
return handler_task
else:
if handler_task.get_name() == handler_name:
return handler_task
except (UndefinedError, AnsibleUndefinedVariable):
# We skip this handler due to the fact that it may be using
# a variable in the name that was conditionally included via
# set_fact or some other method, and we don't want to error
# out unnecessarily
continue
return None
cur_pass = 0
while True:
try:
self._results_lock.acquire()
task_result = self._results.popleft()
except IndexError:
break
finally:
self._results_lock.release()
# get the original host and task. We then assign them to the TaskResult for use in callbacks/etc.
original_host = get_original_host(task_result._host)
queue_cache_entry = (original_host.name, task_result._task)
found_task = self._queued_task_cache.get(queue_cache_entry)['task']
original_task = found_task.copy(exclude_parent=True, exclude_tasks=True)
original_task._parent = found_task._parent
original_task.from_attrs(task_result._task_fields)
task_result._host = original_host
task_result._task = original_task
# send callbacks for 'non final' results
if '_ansible_retry' in task_result._result:
self._tqm.send_callback('v2_runner_retry', task_result)
continue
elif '_ansible_item_result' in task_result._result:
if task_result.is_failed() or task_result.is_unreachable():
self._tqm.send_callback('v2_runner_item_on_failed', task_result)
elif task_result.is_skipped():
self._tqm.send_callback('v2_runner_item_on_skipped', task_result)
else:
if 'diff' in task_result._result:
if self._diff or getattr(original_task, 'diff', False):
self._tqm.send_callback('v2_on_file_diff', task_result)
self._tqm.send_callback('v2_runner_item_on_ok', task_result)
continue
if original_task.register:
host_list = self.get_task_hosts(iterator, original_host, original_task)
clean_copy = strip_internal_keys(module_response_deepcopy(task_result._result))
if 'invocation' in clean_copy:
del clean_copy['invocation']
for target_host in host_list:
self._variable_manager.set_nonpersistent_facts(target_host, {original_task.register: clean_copy})
# all host status messages contain 2 entries: (msg, task_result)
role_ran = False
if task_result.is_failed():
role_ran = True
ignore_errors = original_task.ignore_errors
if not ignore_errors:
display.debug("marking %s as failed" % original_host.name)
if original_task.run_once:
# if we're using run_once, we have to fail every host here
for h in self._inventory.get_hosts(iterator._play.hosts):
if h.name not in self._tqm._unreachable_hosts:
state, _ = iterator.get_next_task_for_host(h, peek=True)
iterator.mark_host_failed(h)
state, new_task = iterator.get_next_task_for_host(h, peek=True)
else:
iterator.mark_host_failed(original_host)
# grab the current state and if we're iterating on the rescue portion
# of a block then we save the failed task in a special var for use
# within the rescue/always
state, _ = iterator.get_next_task_for_host(original_host, peek=True)
if iterator.is_failed(original_host) and state and state.run_state == iterator.ITERATING_COMPLETE:
self._tqm._failed_hosts[original_host.name] = True
if state and iterator.get_active_state(state).run_state == iterator.ITERATING_RESCUE:
self._tqm._stats.increment('rescued', original_host.name)
self._variable_manager.set_nonpersistent_facts(
original_host.name,
dict(
ansible_failed_task=original_task.serialize(),
ansible_failed_result=task_result._result,
),
)
else:
self._tqm._stats.increment('failures', original_host.name)
else:
self._tqm._stats.increment('ok', original_host.name)
self._tqm._stats.increment('ignored', original_host.name)
if 'changed' in task_result._result and task_result._result['changed']:
self._tqm._stats.increment('changed', original_host.name)
self._tqm.send_callback('v2_runner_on_failed', task_result, ignore_errors=ignore_errors)
elif task_result.is_unreachable():
ignore_unreachable = original_task.ignore_unreachable
if not ignore_unreachable:
self._tqm._unreachable_hosts[original_host.name] = True
iterator._play._removed_hosts.append(original_host.name)
else:
self._tqm._stats.increment('skipped', original_host.name)
task_result._result['skip_reason'] = 'Host %s is unreachable' % original_host.name
self._tqm._stats.increment('dark', original_host.name)
self._tqm.send_callback('v2_runner_on_unreachable', task_result)
elif task_result.is_skipped():
self._tqm._stats.increment('skipped', original_host.name)
self._tqm.send_callback('v2_runner_on_skipped', task_result)
else:
role_ran = True
if original_task.loop:
# this task had a loop, and has more than one result, so
# loop over all of them instead of a single result
result_items = task_result._result.get('results', [])
else:
result_items = [task_result._result]
for result_item in result_items:
if '_ansible_notify' in result_item:
if task_result.is_changed():
# The shared dictionary for notified handlers is a proxy, which
# does not detect when sub-objects within the proxy are modified.
# So, per the docs, we reassign the list so the proxy picks up and
# notifies all other threads
for handler_name in result_item['_ansible_notify']:
found = False
# Find the handler using the above helper. First we look up the
# dependency chain of the current task (if it's from a role), otherwise
# we just look through the list of handlers in the current play/all
# roles and use the first one that matches the notify name
target_handler = search_handler_blocks_by_name(handler_name, iterator._play.handlers)
if target_handler is not None:
found = True
if target_handler.notify_host(original_host):
self._tqm.send_callback('v2_playbook_on_notify', target_handler, original_host)
for listening_handler_block in iterator._play.handlers:
for listening_handler in listening_handler_block.block:
listeners = getattr(listening_handler, 'listen', []) or []
if not listeners:
continue
listeners = listening_handler.get_validated_value(
'listen', listening_handler._valid_attrs['listen'], listeners, handler_templar
)
if handler_name not in listeners:
continue
else:
found = True
if listening_handler.notify_host(original_host):
self._tqm.send_callback('v2_playbook_on_notify', listening_handler, original_host)
# and if none were found, then we raise an error
if not found:
msg = ("The requested handler '%s' was not found in either the main handlers list nor in the listening "
"handlers list" % handler_name)
if C.ERROR_ON_MISSING_HANDLER:
raise AnsibleError(msg)
else:
display.warning(msg)
if 'add_host' in result_item:
# this task added a new host (add_host module)
new_host_info = result_item.get('add_host', dict())
self._add_host(new_host_info, iterator)
elif 'add_group' in result_item:
# this task added a new group (group_by module)
self._add_group(original_host, result_item)
if 'ansible_facts' in result_item:
# if delegated fact and we are delegating facts, we need to change target host for them
if original_task.delegate_to is not None and original_task.delegate_facts:
host_list = self.get_delegated_hosts(result_item, original_task)
else:
host_list = self.get_task_hosts(iterator, original_host, original_task)
if original_task.action == 'include_vars':
for (var_name, var_value) in iteritems(result_item['ansible_facts']):
# find the host we're actually referring too here, which may
# be a host that is not really in inventory at all
for target_host in host_list:
self._variable_manager.set_host_variable(target_host, var_name, var_value)
else:
cacheable = result_item.pop('_ansible_facts_cacheable', False)
for target_host in host_list:
# so set_fact is a misnomer but 'cacheable = true' was meant to create an 'actual fact'
# to avoid issues with precedence and confusion with set_fact normal operation,
# we set BOTH fact and nonpersistent_facts (aka hostvar)
# when fact is retrieved from cache in subsequent operations it will have the lower precedence,
# but for playbook setting it the 'higher' precedence is kept
if original_task.action != 'set_fact' or cacheable:
self._variable_manager.set_host_facts(target_host, result_item['ansible_facts'].copy())
if original_task.action == 'set_fact':
self._variable_manager.set_nonpersistent_facts(target_host, result_item['ansible_facts'].copy())
if 'ansible_stats' in result_item and 'data' in result_item['ansible_stats'] and result_item['ansible_stats']['data']:
if 'per_host' not in result_item['ansible_stats'] or result_item['ansible_stats']['per_host']:
host_list = self.get_task_hosts(iterator, original_host, original_task)
else:
host_list = [None]
data = result_item['ansible_stats']['data']
aggregate = 'aggregate' in result_item['ansible_stats'] and result_item['ansible_stats']['aggregate']
for myhost in host_list:
for k in data.keys():
if aggregate:
self._tqm._stats.update_custom_stats(k, data[k], myhost)
else:
self._tqm._stats.set_custom_stats(k, data[k], myhost)
if 'diff' in task_result._result:
if self._diff or getattr(original_task, 'diff', False):
self._tqm.send_callback('v2_on_file_diff', task_result)
if not isinstance(original_task, TaskInclude):
self._tqm._stats.increment('ok', original_host.name)
if 'changed' in task_result._result and task_result._result['changed']:
self._tqm._stats.increment('changed', original_host.name)
# finally, send the ok for this task
self._tqm.send_callback('v2_runner_on_ok', task_result)
self._pending_results -= 1
if original_host.name in self._blocked_hosts:
del self._blocked_hosts[original_host.name]
# If this is a role task, mark the parent role as being run (if
# the task was ok or failed, but not skipped or unreachable)
if original_task._role is not None and role_ran: # TODO: and original_task.action != 'include_role':?
# lookup the role in the ROLE_CACHE to make sure we're dealing
# with the correct object and mark it as executed
for (entry, role_obj) in iteritems(iterator._play.ROLE_CACHE[original_task._role._role_name]):
if role_obj._uuid == original_task._role._uuid:
role_obj._had_task_run[original_host.name] = True
ret_results.append(task_result)
if one_pass or max_passes is not None and (cur_pass + 1) >= max_passes:
break
cur_pass += 1
return ret_results
def _wait_on_handler_results(self, iterator, handler, notified_hosts):
'''
Wait for the handler tasks to complete, using a short sleep
between checks to ensure we don't spin lock
'''
ret_results = []
handler_results = 0
display.debug("waiting for handler results...")
while (self._pending_results > 0 and
handler_results < len(notified_hosts) and
not self._tqm._terminated):
if self._tqm.has_dead_workers():
raise AnsibleError("A worker was found in a dead state")
results = self._process_pending_results(iterator)
ret_results.extend(results)
handler_results += len([
r._host for r in results if r._host in notified_hosts and
r.task_name == handler.name])
if self._pending_results > 0:
time.sleep(C.DEFAULT_INTERNAL_POLL_INTERVAL)
display.debug("no more pending handlers, returning what we have")
return ret_results
def _wait_on_pending_results(self, iterator):
'''
Wait for the shared counter to drop to zero, using a short sleep
between checks to ensure we don't spin lock
'''
ret_results = []
display.debug("waiting for pending results...")
while self._pending_results > 0 and not self._tqm._terminated:
if self._tqm.has_dead_workers():
raise AnsibleError("A worker was found in a dead state")
results = self._process_pending_results(iterator)
ret_results.extend(results)
if self._pending_results > 0:
time.sleep(C.DEFAULT_INTERNAL_POLL_INTERVAL)
display.debug("no more pending results, returning what we have")
return ret_results
def _add_host(self, host_info, iterator):
'''
Helper function to add a new host to inventory based on a task result.
'''
if host_info:
host_name = host_info.get('host_name')
# Check if host in inventory, add if not
if host_name not in self._inventory.hosts:
self._inventory.add_host(host_name, 'all')
self._hosts_cache_all.append(host_name)
new_host = self._inventory.hosts.get(host_name)
# Set/update the vars for this host
new_host.vars = combine_vars(new_host.get_vars(), host_info.get('host_vars', dict()))
new_groups = host_info.get('groups', [])
for group_name in new_groups:
if group_name not in self._inventory.groups:
group_name = self._inventory.add_group(group_name)
new_group = self._inventory.groups[group_name]
new_group.add_host(self._inventory.hosts[host_name])
# reconcile inventory, ensures inventory rules are followed
self._inventory.reconcile_inventory()
def _add_group(self, host, result_item):
'''
Helper function to add a group (if it does not exist), and to assign the
specified host to that group.
'''
changed = False
# the host here is from the executor side, which means it was a
# serialized/cloned copy and we'll need to look up the proper
# host object from the master inventory
real_host = self._inventory.hosts.get(host.name)
if real_host is None:
if host.name == self._inventory.localhost.name:
real_host = self._inventory.localhost
else:
raise AnsibleError('%s cannot be matched in inventory' % host.name)
group_name = result_item.get('add_group')
parent_group_names = result_item.get('parent_groups', [])
if group_name not in self._inventory.groups:
group_name = self._inventory.add_group(group_name)
for name in parent_group_names:
if name not in self._inventory.groups:
# create the new group and add it to inventory
self._inventory.add_group(name)
changed = True
group = self._inventory.groups[group_name]
for parent_group_name in parent_group_names:
parent_group = self._inventory.groups[parent_group_name]
parent_group.add_child_group(group)
if real_host.name not in group.get_hosts():
group.add_host(real_host)
changed = True
if group_name not in host.get_groups():
real_host.add_group(group)
changed = True
if changed:
self._inventory.reconcile_inventory()
return changed
def _copy_included_file(self, included_file):
'''
A proven safe and performant way to create a copy of an included file
'''
ti_copy = included_file._task.copy(exclude_parent=True)
ti_copy._parent = included_file._task._parent
temp_vars = ti_copy.vars.copy()
temp_vars.update(included_file._vars)
ti_copy.vars = temp_vars
return ti_copy
def _load_included_file(self, included_file, iterator, is_handler=False):
'''
Loads an included YAML file of tasks, applying the optional set of variables.
'''
display.debug("loading included file: %s" % included_file._filename)
try:
data = self._loader.load_from_file(included_file._filename)
if data is None:
return []
elif not isinstance(data, list):
raise AnsibleError("included task files must contain a list of tasks")
ti_copy = self._copy_included_file(included_file)
# pop tags out of the include args, if they were specified there, and assign
# them to the include. If the include already had tags specified, we raise an
# error so that users know not to specify them both ways
tags = included_file._task.vars.pop('tags', [])
if isinstance(tags, string_types):
tags = tags.split(',')
if len(tags) > 0:
if len(included_file._task.tags) > 0:
raise AnsibleParserError("Include tasks should not specify tags in more than one way (both via args and directly on the task). "
"Mixing tag specify styles is prohibited for whole import hierarchy, not only for single import statement",
obj=included_file._task._ds)
display.deprecated("You should not specify tags in the include parameters. All tags should be specified using the task-level option",
version='2.12')
included_file._task.tags = tags
block_list = load_list_of_blocks(
data,
play=iterator._play,
parent_block=ti_copy.build_parent_block(),
role=included_file._task._role,
use_handlers=is_handler,
loader=self._loader,
variable_manager=self._variable_manager,
)
# since we skip incrementing the stats when the task result is
# first processed, we do so now for each host in the list
for host in included_file._hosts:
self._tqm._stats.increment('ok', host.name)
except AnsibleError as e:
if isinstance(e, AnsibleFileNotFound):
reason = "Could not find or access '%s' on the Ansible Controller." % to_text(e.file_name)
else:
reason = to_text(e)
# mark all of the hosts including this file as failed, send callbacks,
# and increment the stats for this host
for host in included_file._hosts:
tr = TaskResult(host=host, task=included_file._task, return_data=dict(failed=True, reason=reason))
iterator.mark_host_failed(host)
self._tqm._failed_hosts[host.name] = True
self._tqm._stats.increment('failures', host.name)
self._tqm.send_callback('v2_runner_on_failed', tr)
return []
# finally, send the callback and return the list of blocks loaded
self._tqm.send_callback('v2_playbook_on_include', included_file)
display.debug("done processing included file")
return block_list
def run_handlers(self, iterator, play_context):
'''
Runs handlers on those hosts which have been notified.
'''
result = self._tqm.RUN_OK
for handler_block in iterator._play.handlers:
# FIXME: handlers need to support the rescue/always portions of blocks too,
# but this may take some work in the iterator and gets tricky when
# we consider the ability of meta tasks to flush handlers
for handler in handler_block.block:
if handler.notified_hosts:
result = self._do_handler_run(handler, handler.get_name(), iterator=iterator, play_context=play_context)
if not result:
break
return result
def _do_handler_run(self, handler, handler_name, iterator, play_context, notified_hosts=None):
# FIXME: need to use iterator.get_failed_hosts() instead?
# if not len(self.get_hosts_remaining(iterator._play)):
# self._tqm.send_callback('v2_playbook_on_no_hosts_remaining')
# result = False
# break
if notified_hosts is None:
notified_hosts = handler.notified_hosts[:]
# strategy plugins that filter hosts need access to the iterator to identify failed hosts
failed_hosts = self._filter_notified_failed_hosts(iterator, notified_hosts)
notified_hosts = self._filter_notified_hosts(notified_hosts)
notified_hosts += failed_hosts
if len(notified_hosts) > 0:
saved_name = handler.name
handler.name = handler_name
self._tqm.send_callback('v2_playbook_on_handler_task_start', handler)
handler.name = saved_name
bypass_host_loop = False
try:
action = plugin_loader.action_loader.get(handler.action, class_only=True)
if getattr(action, 'BYPASS_HOST_LOOP', False):
bypass_host_loop = True
except KeyError:
# we don't care here, because the action may simply not have a
# corresponding action plugin
pass
host_results = []
for host in notified_hosts:
if not iterator.is_failed(host) or iterator._play.force_handlers:
task_vars = self._variable_manager.get_vars(play=iterator._play, host=host, task=handler,
_hosts=self._hosts_cache, _hosts_all=self._hosts_cache_all)
self.add_tqm_variables(task_vars, play=iterator._play)
templar = Templar(loader=self._loader, variables=task_vars)
if not handler.cached_name:
handler.name = templar.template(handler.name)
handler.cached_name = True
self._queue_task(host, handler, task_vars, play_context)
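# a run_once handler (or an action that bypasses the host loop) only needs to be queued on a single host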
if templar.template(handler.run_once) or bypass_host_loop:
break
# collect the results from the handler run
host_results = self._wait_on_handler_results(iterator, handler, notified_hosts)
included_files = IncludedFile.process_include_results(
host_results,
iterator=iterator,
loader=self._loader,
variable_manager=self._variable_manager
)
result = True
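# a handler may itself be an include; load any included files as new handler blocks and run them recursively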
if len(included_files) > 0:
for included_file in included_files:
try:
new_blocks = self._load_included_file(included_file, iterator=iterator, is_handler=True)
# for every task in each block brought in by the include, add the list
# of hosts which included the file to the notified_handlers dict
for block in new_blocks:
iterator._play.handlers.append(block)
for task in block.block:
task_name = task.get_name()
display.debug("adding task '%s' included in handler '%s'" % (task_name, handler_name))
task.notified_hosts = included_file._hosts[:]
result = self._do_handler_run(
handler=task,
handler_name=task_name,
iterator=iterator,
play_context=play_context,
notified_hosts=included_file._hosts[:],
)
if not result:
break
except AnsibleError as e:
for host in included_file._hosts:
iterator.mark_host_failed(host)
self._tqm._failed_hosts[host.name] = True
display.warning(to_text(e))
continue
# remove hosts from notification list
handler.notified_hosts = [
h for h in handler.notified_hosts
if h not in notified_hosts]
display.debug("done running handlers, result is: %s" % result)
return result
def _filter_notified_failed_hosts(self, iterator, notified_hosts):
return []
def _filter_notified_hosts(self, notified_hosts):
'''
Filter notified hosts accordingly to strategy
'''
# As main strategy is linear, we do not filter hosts
# We return a copy to avoid race conditions
return notified_hosts[:]
def _take_step(self, task, host=None):
ret = False
msg = u'Perform task: %s ' % task
if host:
msg += u'on %s ' % host
msg += u'(N)o/(y)es/(c)ontinue: '
resp = display.prompt(msg)
if resp.lower() in ['y', 'yes']:
display.debug("User ran task")
ret = True
elif resp.lower() in ['c', 'continue']:
display.debug("User ran task and canceled step mode")
self._step = False
ret = True
else:
display.debug("User skipped task")
display.banner(msg)
return ret
def _cond_not_supported_warn(self, task_name):
display.warning("%s task does not support when conditional" % task_name)
def _execute_meta(self, task, play_context, iterator, target_host):
# meta tasks store their args in the _raw_params field of args,
# since they do not use k=v pairs, so get that
meta_action = task.args.get('_raw_params')
def _evaluate_conditional(h):
all_vars = self._variable_manager.get_vars(play=iterator._play, host=h, task=task,
_hosts=self._hosts_cache, _hosts_all=self._hosts_cache_all)
templar = Templar(loader=self._loader, variables=all_vars)
return task.evaluate_conditional(templar, all_vars)
skipped = False
msg = ''
if meta_action == 'noop':
# FIXME: issue a callback for the noop here?
if task.when:
self._cond_not_supported_warn(meta_action)
msg = "noop"
elif meta_action == 'flush_handlers':
if task.when:
self._cond_not_supported_warn(meta_action)
self._flushed_hosts[target_host] = True
self.run_handlers(iterator, play_context)
self._flushed_hosts[target_host] = False
msg = "ran handlers"
elif meta_action == 'refresh_inventory' or self.flush_cache:
if task.when:
self._cond_not_supported_warn(meta_action)
self._inventory.refresh_inventory()
self._set_hosts_cache(iterator._play)
msg = "inventory successfully refreshed"
elif meta_action == 'clear_facts':
if _evaluate_conditional(target_host):
for host in self._inventory.get_hosts(iterator._play.hosts):
hostname = host.get_name()
self._variable_manager.clear_facts(hostname)
msg = "facts cleared"
else:
skipped = True
elif meta_action == 'clear_host_errors':
if _evaluate_conditional(target_host):
for host in self._inventory.get_hosts(iterator._play.hosts):
self._tqm._failed_hosts.pop(host.name, False)
self._tqm._unreachable_hosts.pop(host.name, False)
iterator._host_states[host.name].fail_state = iterator.FAILED_NONE
msg = "cleared host errors"
else:
skipped = True
elif meta_action == 'end_play':
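# mark every still-reachable host in the play as complete so iteration stops after this task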
if _evaluate_conditional(target_host):
for host in self._inventory.get_hosts(iterator._play.hosts):
if host.name not in self._tqm._unreachable_hosts:
iterator._host_states[host.name].run_state = iterator.ITERATING_COMPLETE
msg = "ending play"
elif meta_action == 'end_host':
if _evaluate_conditional(target_host):
iterator._host_states[target_host.name].run_state = iterator.ITERATING_COMPLETE
iterator._play._removed_hosts.append(target_host.name)
msg = "ending play for %s" % target_host.name
else:
skipped = True
msg = "end_host conditional evaluated to false, continuing execution for %s" % target_host.name
elif meta_action == 'reset_connection':
all_vars = self._variable_manager.get_vars(play=iterator._play, host=target_host, task=task,
_hosts=self._hosts_cache, _hosts_all=self._hosts_cache_all)
templar = Templar(loader=self._loader, variables=all_vars)
# apply the given task's information to the connection info,
# which may override some fields already set by the play or
# the options specified on the command line
play_context = play_context.set_task_and_variable_override(task=task, variables=all_vars, templar=templar)
# fields set from the play/task may be based on variables, so we have to
# do the same kind of post validation step on it here before we use it.
play_context.post_validate(templar=templar)
# now that the play context is finalized, if the remote_addr is not set
# default to using the host's address field as the remote address
if not play_context.remote_addr:
play_context.remote_addr = target_host.address
# We also add "magic" variables back into the variables dict to make sure
# a certain subset of variables exist.
play_context.update_vars(all_vars)
if task.when:
self._cond_not_supported_warn(meta_action)
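# reuse the tracked persistent connection socket for this host if one exists, otherwise instantiate the connection plugin from the play context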
if target_host in self._active_connections:
connection = Connection(self._active_connections[target_host])
del self._active_connections[target_host]
else:
connection = plugin_loader.connection_loader.get(play_context.connection, play_context, os.devnull)
play_context.set_attributes_from_plugin(connection)
if connection:
try:
connection.reset()
msg = 'reset connection'
except ConnectionError as e:
# most likely socket is already closed
display.debug("got an error while closing persistent connection: %s" % e)
else:
msg = 'no connection, nothing to reset'
else:
raise AnsibleError("invalid meta action requested: %s" % meta_action, obj=task._ds)
result = {'msg': msg}
if skipped:
result['skipped'] = True
else:
result['changed'] = False
display.vv("META: %s" % msg)
return [TaskResult(target_host, task, result)]
def get_hosts_left(self, iterator):
''' returns list of available hosts for this iterator by filtering out unreachables '''
hosts_left = []
for host in self._hosts_cache:
if host not in self._tqm._unreachable_hosts:
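# use the direct hosts dict lookup when possible; get_host() handles names missing from it (such as implicit localhost)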
try:
hosts_left.append(self._inventory.hosts[host])
except KeyError:
hosts_left.append(self._inventory.get_host(host))
return hosts_left
def update_active_connections(self, results):
''' updates the current active persistent connections '''
for r in results:
if 'args' in r._task_fields:
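# persistent connections report their control socket path back through the _ansible_socket task arg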
socket_path = r._task_fields['args'].get('_ansible_socket')
if socket_path:
if r._host not in self._active_connections:
self._active_connections[r._host] = socket_path
class NextAction(object):
""" The next action after an interpreter's exit. """
REDO = 1
CONTINUE = 2
EXIT = 3
def __init__(self, result=EXIT):
self.result = result
class Debugger(cmd.Cmd):
prompt_continuous = '> ' # multiple lines
def __init__(self, task, host, task_vars, play_context, result, next_action):
# cmd.Cmd is old-style class
cmd.Cmd.__init__(self)
self.prompt = '[%s] %s (debug)> ' % (host, task)
self.intro = None
self.scope = {}
self.scope['task'] = task
self.scope['task_vars'] = task_vars
self.scope['host'] = host
self.scope['play_context'] = play_context
self.scope['result'] = result
self.next_action = next_action
def cmdloop(self):
try:
cmd.Cmd.cmdloop(self)
except KeyboardInterrupt:
pass
do_h = cmd.Cmd.do_help
def do_EOF(self, args):
"""Quit"""
return self.do_quit(args)
def do_quit(self, args):
"""Quit"""
display.display('User interrupted execution')
self.next_action.result = NextAction.EXIT
return True
do_q = do_quit
def do_continue(self, args):
"""Continue to next result"""
self.next_action.result = NextAction.CONTINUE
return True
do_c = do_continue
def do_redo(self, args):
"""Schedule task for re-execution. The re-execution may not be the next result"""
self.next_action.result = NextAction.REDO
return True
do_r = do_redo
def do_update_task(self, args):
"""Recreate the task from ``task._ds``, and template with updated ``task_vars``"""
templar = Templar(None, shared_loader_obj=None, variables=self.scope['task_vars'])
task = self.scope['task']
task = task.load_data(task._ds)
task.post_validate(templar)
self.scope['task'] = task
do_u = do_update_task
def evaluate(self, args):
try:
return eval(args, globals(), self.scope)
except Exception:
t, v = sys.exc_info()[:2]
if isinstance(t, str):
exc_type_name = t
else:
exc_type_name = t.__name__
display.display('***%s:%s' % (exc_type_name, repr(v)))
raise
def do_pprint(self, args):
"""Pretty Print"""
try:
result = self.evaluate(args)
display.display(pprint.pformat(result))
except Exception:
pass
do_p = do_pprint
def execute(self, args):
try:
code = compile(args + '\n', '<stdin>', 'single')
exec(code, globals(), self.scope)
except Exception:
t, v = sys.exc_info()[:2]
if isinstance(t, str):
exc_type_name = t
else:
exc_type_name = t.__name__
display.display('***%s:%s' % (exc_type_name, repr(v)))
raise
def default(self, line):
try:
self.execute(line)
except Exception:
pass
|
closed
|
ansible/ansible
|
https://github.com/ansible/ansible
| 67,972 |
Git module: Failed to download remote objects and refs
|
<!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
The git module runs the command `git fetch --tags`, which can fail under certain circumstances (for example, when a tag has been force-updated on the remote).
Even with the option `force: "yes"`, the fetch is executed without the **-f** flag.
As a workaround, `git fetch --tags -f` has to be run manually before the git module task (a sketch is shown below).
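The workaround currently looks roughly like this (a sketch only; it assumes the repository has already been cloned to `{{ staging_repo }}`, and the task name is made up):
```yaml
- name: "Workaround: force-refresh tags before the git module runs"
  command: git fetch --tags --force
  args:
    chdir: "{{ staging_repo }}"
```
With that pre-task in place, the subsequent git task no longer fails on the clobbered tag.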
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
Git Module
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.9.4
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
Generate a git tag:
```
git checkout bla
git tag -f DEV
git push origin DEV --force
```
Then, from a second clone, override the same tag remotely:
```
git checkout foo
git tag -f DEV
git push origin DEV --force
```
Then, back in the first local clone, run the Ansible git task:
<!--- Paste example playbooks or commands between quotes below -->
```yaml
- name: "Clone/refresh Staging Repo with SSH"
when: use_ssh == "yes"
git:
dest: "{{ staging_repo }}"
name: "some git repo"
version: "master"
force: "yes"
key_file: '~/.ssh/id_rsa'
ssh_opts: "-o StrictHostKeyChecking=no"
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
With `force: "yes"` set, I would expect the module to run `git fetch` with the `-f` option so the overridden tag can be updated.
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
TASK [Clone/refresh Staging Repo with SSH] ************************************************************************************
fatal: [localhost]: FAILED! => {"changed": false, "cmd": ["/usr/local/bin/git", "fetch", "--tags", "origin"], "msg": "Failed to download remote objects and refs: From XXXX some commit..other commit master -> origin/master\n
! [rejected] DEV -> DEV (would clobber existing tag)"}
```
|
https://github.com/ansible/ansible/issues/67972
|
https://github.com/ansible/ansible/pull/68691
|
123c624b28398c10864be5238dfdd44c524564a0
|
4916be24fd8be0c223bf5e5f641d676a8d56ad82
| 2020-03-03T14:48:44Z |
python
| 2020-04-06T20:25:24Z |
changelogs/fragments/67972-git-fetch-force.yml
|