Compare commits

...

13 commits

Author SHA1 Message Date
Zuul
fc2549a223 Merge "Calculate missing checksum for file:// based images" into bugfix/24.0 2025-01-10 10:46:31 +00:00
Riccardo Pittau
8ba74445b9 [bugfix only] Pin upper-constraints
This is to fix tox tests.
The bugfix branches should pin upper constraints when they are
created.

Change-Id: I175a8394c5cad3fb3f99534dc2cf15eb92654aa4
2025-01-08 16:37:48 +01:00
Steve Baker
a39a8ee134 Calculate missing checksum for file:// based images
The fix for CVE-2024-47211 results in image checksum being required in
all cases. However there is no requirement for checksums in
file:// based images.

This change checks for this situation. When checksum is missing for
file:// based image_source it is now calculated on-the-fly.

Change-Id: Ib2fd5ddcbee9a9d1c7e32770ec3d9b6cb20a2e2a
(cherry picked from commit b827c7bf72)
2025-01-08 10:19:00 +13:00
Julia Kreger
3aa45b7a9d Fix actual size calculation for storage fallback logic
When we were fixing the qemu-img related CVE, in our rush we didn't
realize that the logic for storage sizing, which only falls back to
actual size, didn't match the prior interface exactly. Instead of
disk_size, we have actual_size on the format inspector.

This was not discovered because all of the code handling that side
of the unit tests was mocked.

Anyhow, easy fix.

Closes-Bug: 2083520
Change-Id: Ic4390d578f564f245d7fb4013f2ba5531aee9ea9
(cherry picked from commit 90f9fa3eb0)
2024-10-09 07:09:23 +00:00
Julia Kreger
c164641b45 Checksum files before raw conversion
While working another issue, we discovered that, when support added to
the ironic-conductor process combined the image_download_source
option of "local" with the "force_raw" option, Ironic had no
mechanism to checksum the files *before* the conductor process
triggered an image format conversion and then recorded new
checksum values.

In essence, this left the user-requested image file
susceptible to a theoretical man-in-the-middle attack OR
to the remote server replacing the content with an unknown file,
such as a new major version.

This is at odds with Ironic's security model, where we do want to
ensure the end user of Ironic is asserting a known checksum for
the image artifact they are deploying, so they are aware of its
present state. Due to the risk, we chose to raise this as a CVE,
as infrastructure operators should likely apply this patch.

As a note, if you're *not* forcing all images to raw format
through the conductor, then this issue is likely not a major
issue for you, but you should still apply the patch.

This is being tracked as CVE-2024-47211.

Closes-Bug: 2076289
Change-Id: Id6185b317aa6e4f4363ee49f77e688701995323a
Signed-off-by: Julia Kreger <juliaashleykreger@gmail.com>
2024-09-25 14:57:26 -07:00
Dmitry Tantsur
091a0e8512 Fix inspection if bmc_address or bmc_v6address is None
IPA started sending None when the device is not found.

Change-Id: Ibeef33ff9a0acdb7c605bc46ef9e5d203c7aaa6d
(cherry picked from commit ad03a4c32d)
2024-09-12 17:08:38 +02:00
Dmitry Tantsur
d6dbf988f6 Try limiting MTU to at least 1280
Change-Id: If8f9907df62019b3cf6d6df7d83d5ff421f6be65
(cherry picked from commit 510f87a033)
2024-09-12 17:05:38 +02:00
Zuul
3a94de6bea Merge "CVE-2024-44982: Harden all image handling and conversion code" into bugfix/24.0 2024-09-05 04:09:24 +00:00
Julia Kreger
07bb2caf3c CVE-2024-44982: Harden all image handling and conversion code
It was recently learned by the OpenStack community that running qemu-img
on un-trusted images without a format pre-specified can present a
security risk. Furthermore, some of these specific image formats have
inherently unsafe features. This is rooted in how qemu-img operates
where all image drivers are loaded and attempt to evaluate the input data.
This can result in several different vectors which this patch works to
close.

This change imports the qemu-img handling code from Ironic-Lib into
Ironic, along with image format inspection code, which has been developed
by the wider community to validate the general safety of images before
converting them for use in a deployment.

This patch contains functional changes related to the hardening of these
calls including how images are handled, and updates documentation to
provide context and guidance to operators.

Closes-Bug: 2071740
Change-Id: I7fac5c64f89aec39e9755f0930ee47ff8f7aed47
Signed-off-by: Julia Kreger <juliaashleykreger@gmail.com>
2024-09-04 15:19:49 -07:00
Zuul
78071be02a Merge "CI: Disable metal3-integration test job" into bugfix/24.0 2024-09-04 18:33:18 +00:00
Julia Kreger
ca4f4bf86e CI: Disable metal3-integration test job
The metal3-integration CI job is not smart enough to know which
branches to pull for it to correctly test the branch, and so it
should be disabled on this branch.

Change-Id: If04a5b97722cc1a8e125c3348e09339c3a7ce0eb
(cherry picked from commit 4cb0af7fd6)
2024-09-04 10:56:31 -07:00
Riccardo Pittau
a31a49eb07 [bugfix only] Remove deleted lextudio packages
The lextudio pyasn1 and pyasn1-modules packages have been yanked.
Just use the normal ones.

Change-Id: Ia63be2f04e2cd0438a4a14ac7b4a7cdb63bd8093
2024-08-12 09:19:34 +02:00
OpenStack Release Bot
3464aef661 Update .gitreview for bugfix/24.0
Change-Id: I36e0dd100cd5605a5a65fc599a5fa97a25f06af8
2024-02-01 11:20:50 +00:00
36 changed files with 5295 additions and 224 deletions


@ -2,3 +2,4 @@
host=review.opendev.org
port=29418
project=openstack/ironic.git
defaultbranch=bugfix/24.0


@ -22,9 +22,12 @@ if [ $local_mtu -gt $PUBLIC_BRIDGE_MTU ]; then
fi
# 50 bytes is overhead for vxlan (which is greater than GRE
# allowing us to use either overlay option with this MTU.
# allowing us to use either overlay option with this MTU).
# However, if traffic is flowing over IPv6 tunnels, then
# The overhead is essentially another 100 bytes. In order to
# The overhead is essentially another 78 bytes. In order to
# handle both cases, lets go ahead and drop the maximum by
# 100 bytes.
PUBLIC_BRIDGE_MTU=${OVERRIDE_PUBLIC_BRIDGE_MTU:-$((local_mtu - 100))}
# 78 bytes, while not going below 1280 to make IPv6 work at all.
PUBLIC_BRIDGE_MTU=${OVERRIDE_PUBLIC_BRIDGE_MTU:-$((local_mtu - 78))}
if [ $PUBLIC_BRIDGE_MTU -lt 1280 ]; then
PUBLIC_BRIDGE_MTU=1280
fi
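The clamping logic in the snippet above can be sketched as a standalone helper (illustrative Python, not part of the devstack script):

```python
# Illustrative sketch of the MTU clamp above: subtract 78 bytes of
# worst-case IPv6 tunnel overhead, but never drop below 1280, the
# minimum MTU required for IPv6 to function at all.
def clamp_bridge_mtu(local_mtu, overhead=78, ipv6_min_mtu=1280):
    return max(local_mtu - overhead, ipv6_min_mtu)
```

With ``local_mtu = 1300`` the raw result (1222) falls below the IPv6 floor, so the bridge MTU is clamped to 1280.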


@ -19,6 +19,40 @@ OpenStack deployment.
.. TODO: add "Multi-tenancy Considerations" section
Image Checksums
===============
Ironic has long provided the capability to supply and check a checksum for
disk images being deployed. However, one aspect which Ironic has not asserted
is "Why?" in terms of "Is it for security?" or "Is it for data integrity?".
The answer is both: to ensure a higher level of security with remote
image files, *and* to provide faster feedback should an image being
transferred happen to be corrupted.
Normally checksums are verified by the ``ironic-python-agent`` **OR** the
deployment interface responsible for overall deployment operation. That being
said, not *every* deployment interface relies on disk images which have
checksums, and those deployment interfaces are for specific use cases which
Ironic users leverage, outside of the "general" use case capabilities provided
by the ``direct`` deployment interface.
.. NOTE::
Use of the node ``instance_info/image_checksum`` field is discouraged
for integrated OpenStack Users as usage of the matching Glance Image
Service field is also deprecated. That being said, Ironic retains this
feature by popular demand, while also retaining simplified
operator interaction.
The newer field values supported by Glance are also specifically
supported by Ironic as ``instance_info/image_os_hash_value`` for
checksum values and ``instance_info/image_os_hash_algo`` field for
the checksum algorithm.
.. WARNING::
Setting a checksum value to a URL is supported; *however*, doing so trades
off some security, as the remote checksum *can* change. Conductor support
for this functionality can be disabled using the
:oslo.config:option:`conductor.disable_support_for_checksum_files` setting.
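As a sketch of what asserting a checksum involves, the value placed in ``instance_info/image_os_hash_value`` can be computed ahead of time; this is a minimal illustration, not Ironic's own implementation:

```python
# Hypothetical helper: stream a disk image and produce the SHA-256
# digest an operator would supply alongside image_os_hash_algo=sha256.
import hashlib


def sha256_of_image(path, chunk_size=65536):
    # Read in chunks so multi-gigabyte images never load fully into memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```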
REST API: user roles and policy settings
========================================
@ -275,3 +309,70 @@ An easy way to do this is to:
# Access IPA ramdisk functions
"baremetal:driver:ipa_lookup": "rule:is_admin"
Disk Images
===========
Ironic relies upon the ``qemu-img`` tool to convert images from a supplied
disk image format, to a ``raw`` format in order to write the contents of a
disk image to the remote device.
By default, only the ``qcow2`` format is supported for this operation;
however, there have been reports that other formats work when enabled
using the ``[conductor]permitted_image_formats`` configuration option.
Ironic takes several steps by default:
#. Ironic checks and compares supplied metadata with a remote authoritative
source, such as the Glance Image Service, if available.
#. Ironic attempts to "fingerprint" the file type based upon available
metadata and file structure. A file format which is not known to the image
format inspection code may be evaluated as "raw", which means the image
would not be passed through ``qemu-img``. When in doubt, use a ``raw``
image which you can verify is in the desirable and expected state.
#. The image then has a set of safety and sanity checks executed which look
for unknown or unsafe feature usage in the base format which could permit
an attacker to potentially leverage functionality in ``qemu-img`` which
should not be utilized. This check, by default, occurs only for images
which pass *through* the conductor.
#. Ironic then checks if the fingerprint values and metadata values match.
If they do not match, the requested image is rejected and the operation
fails.
#. The image is then provided to the ``ironic-python-agent``.
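The fingerprinting step above can be illustrated with a minimal magic-byte check; this is a simplified sketch, not Ironic's actual format inspector:

```python
# Simplified sketch: identify a qcow2 image by its leading magic bytes.
# Anything unrecognized is treated as "raw", which bypasses qemu-img,
# mirroring the behavior described above for unknown formats.
QCOW2_MAGIC = b"QFI\xfb"  # the four-byte signature of every qcow2 file


def sniff_format(path):
    with open(path, "rb") as f:
        header = f.read(4)
    return "qcow2" if header == QCOW2_MAGIC else "raw"
```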
Images which are considered "pass-through", as in they are supplied by an
API user as a URL, or are translated to a temporary URL via available
service configuration, are supplied as a URL to the
``ironic-python-agent``.
Ironic can be configured to intercept this interaction and have the conductor
download and inspect these items before the ``ironic-python-agent`` does so;
however, this can increase the temporary disk utilization of the conductor
along with network traffic to facilitate the transfer. This check is disabled
by default, but can be enabled using the
``[conductor]conductor_always_validates_images`` configuration option.
An option also exists which forces all files to be served from the conductor,
and thus forces image inspection before involvement of the
``ironic-python-agent``: setting the ``[agent]image_download_source``
configuration parameter to ``local``, which proxies all disk images through
the conductor.
This setting is also available in the node ``driver_info`` and
``instance_info`` fields.
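Taken together, the interception behaviors described above map to configuration along these lines (a sketch of ``ironic.conf``; the values shown are illustrative choices, not the defaults):

```ini
[conductor]
# Have the conductor download and inspect pass-through images before
# the ironic-python-agent does (off by default).
conductor_always_validates_images = True

[agent]
# Alternatively, proxy all disk images through the conductor, which
# forces inspection there; also settable per-node via driver_info or
# instance_info.
image_download_source = local
```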
Mitigating Factors to disk images
---------------------------------
In a fully integrated OpenStack context, Ironic requires images to be set to
"public" in the Image Service.
A direct API user with sufficient elevated access rights *can* submit a URL
for the baremetal node ``instance_info`` dictionary field with an
``image_source`` key value set to a URL. To do so explicitly requires
elevated (trusted) access rights of a System scoped Member,
or Project scoped Owner-Member, or a Project scoped Lessee-Admin via
the ``baremetal:node:update_instance_info`` policy permission rule.
Before the Wallaby release of OpenStack, this was restricted to
``admin`` and ``baremetal_admin`` roles and remains similarly restrictive
in the newer "Secure RBAC" model.


@ -1127,3 +1127,27 @@ or manual cleaning with the ``clean`` command, or the next appropriate action
in the workflow process you are attempting to follow, which may ultimately
be decommissioning the node because it could have failed and is
being removed or replaced.
Ironic says my Image is Invalid
===============================
As a result of security fixes which were added to Ironic, resulting from the
security posture of the ``qemu-img`` utility, Ironic enforces certain aspects
related to image files.
* Enforces that the file format of a disk image matches what Ironic is
told by an API user. Any mismatch will result in the image being declared
as invalid. A mismatch between the file contents and what is stored in the
Image service will necessitate uploading a new image, as that property
cannot be changed in the image service *after* creation of an image.
* Enforces that the input file format to be written is ``qcow2`` or ``raw``.
This can be extended by modifying ``[conductor]permitted_image_formats`` in
``ironic.conf``.
* Performs safety and sanity check assessment against the file, which can be
disabled by modifying ``[conductor]disable_deep_image_inspection`` and
setting it to ``True``. Doing so is not considered safe and should only
be done by operators accepting the inherent risk that the image they
are attempting to use may have a bad or malicious structure.
Image safety checks are generally performed as the deployment process begins
and stages artifacts; however, a late-stage check is performed when
needed by the ironic-python-agent.


@ -3,6 +3,24 @@
Add images to the Image service
===============================
Supported Image Formats
~~~~~~~~~~~~~~~~~~~~~~~
Ironic officially supports and tests use of ``qcow2`` formatted images as well
as ``raw`` format images. Other types of disk images, such as ``vdi`` and
single-file ``vmdk``, have been reported by users as working in their specific
cases, but are not tested upstream. We advise operators to convert such
images to a supported format and upload them to Glance accordingly.
Ironic enforces the list of supported and permitted image formats utilizing
the ``[conductor]permitted_image_formats`` option in ironic.conf. This setting
defaults to "raw" and "qcow2".
A mismatch between the format declared in Glance and the detected contents
of the disk image file will result in a failed deployment.
To correct such a situation, the image must be re-uploaded with either the
declared ``--disk-format`` or the actual image file format corrected.
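For example, an operator might check the actual format of a local, trusted file and re-upload accordingly (file and image names here are illustrative):

```console
$ qemu-img info my-image.img
$ openstack image create --disk-format qcow2 --container-format bare \
      --file my-image.qcow2 my-image
```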
Instance (end-user) images
~~~~~~~~~~~~~~~~~~~~~~~~~~
@ -11,6 +29,10 @@ Build or download the user images as described in :doc:`/user/creating-images`.
Load all the created images into the Image service, and note the image UUIDs in
the Image service for each one as it is generated.
.. note::
Images from Glance used by Ironic must be flagged as ``public``, which
requires administrative privileges with the Glance image service to set.
- For *whole disk images* just upload the image:
.. code-block:: console


@ -27,6 +27,39 @@ Many distributions publish their own cloud images. These are usually whole disk
images that are built for legacy boot mode (not UEFI), with Ubuntu being an
exception (they publish images that work in both modes).
Supported Disk Image Formats
----------------------------
The following formats are tested by Ironic and are expected to work as
long as no unknown or unsafe special features are being used:
* raw - A file containing bytes as they would exist on a disk or other
block storage device. This is the simplest format.
* qcow2 - An updated file format based upon the `QEMU <https://www.qemu.org>`_
Copy-on-Write format.
A special mention exists for ``iso`` formatted "CD" images. While Ironic uses
the ISO9660 filesystem in some of its processes for aspects such as virtual
media, it does *not* support writing them to the remote block storage device.
Image formats we believe may work due to third party reports, but do not test:
* vmdk - A file format derived from the image format originally created
by VMware for their hypervisor product line. Specifically, we believe
a single-file VMDK formatted image should work; however, there are
several subformats, some of which will not work and may result
in unexpected behavior such as failed deployments.
* vdi - A file format used by
`Oracle VM Virtualbox <https://www.virtualbox.org>`_ hypervisor.
As Ironic does not officially support these formats, their usage is normally
blocked by default due to security considerations. Please consult with your
Ironic operator.
It is important to highlight that Ironic enforces and matches the file type
based upon signature, not file extension. If there is a mismatch,
the input and/or the remote service records, such as those in the Image
service, must be corrected.
disk-image-builder
------------------


@ -6,8 +6,8 @@
# These are available on pypi
proliantutils>=2.16.0
pysnmp-lextudio>=5.0.0 # BSD
pyasn1-lextudio>=1.1.0 # BSD
pyasn1-modules-lextudio>=0.2.0 # BSD
pyasn1>=0.5.1 # BSD
pyasn1-modules>=0.3.0 # BSD
python-scciclient>=0.15.0
python-dracclient>=5.1.0,<9.0.0
python-xclarityclient>=0.1.6


@ -60,6 +60,8 @@ def config(token):
# and behavior into place.
'agent_token_required': True,
'agent_md5_checksum_enable': CONF.agent.allow_md5_checksum,
'disable_deep_image_inspection': CONF.conductor.disable_deep_image_inspection, # noqa
'permitted_image_formats': CONF.conductor.permitted_image_formats,
}
@ -271,8 +273,8 @@ DATA_VALIDATOR = args.schema({
'inventory': {
'type': 'object',
'properties': {
'bmc_address': {'type': 'string'},
'bmc_v6address': {'type': 'string'},
'bmc_address': {'type': ['string', 'null']},
'bmc_v6address': {'type': ['string', 'null']},
'interfaces': {
'type': 'array',
'items': {


@ -0,0 +1,269 @@
# Copyright (c) 2024 Red Hat, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import os
import re
import time
from urllib import parse as urlparse
from oslo_log import log as logging
from oslo_utils import fileutils
from ironic.common import exception
from ironic.common.i18n import _
from ironic.common import image_service
from ironic.conf import CONF
LOG = logging.getLogger(__name__)
# REGEX matches for Checksum file payloads
# If this list requires changes, it should be changed in
# ironic-python-agent (extensions/standby.py) as well.
MD5_MATCH = r"^([a-fA-F\d]{32})\s" # MD5 at beginning of line
MD5_MATCH_END = r"\s([a-fA-F\d]{32})$" # MD5 at end of line
MD5_MATCH_ONLY = r"^([a-fA-F\d]{32})$" # MD5 only
SHA256_MATCH = r"^([a-fA-F\d]{64})\s" # SHA256 at beginning of line
SHA256_MATCH_END = r"\s([a-fA-F\d]{64})$" # SHA256 at end of line
SHA256_MATCH_ONLY = r"^([a-fA-F\d]{64})$" # SHA256 only
SHA512_MATCH = r"^([a-fA-F\d]{128})\s" # SHA512 at beginning of line
SHA512_MATCH_END = r"\s([a-fA-F\d]{128})$" # SHA512 at end of line
SHA512_MATCH_ONLY = r"^([a-fA-F\d]{128})$" # SHA512 only
FILENAME_MATCH_END = r"\s[*]?{filename}$"  # Filename binary/text end of line
FILENAME_MATCH_PARENTHESES = r"\s\({filename}\)\s"  # CentOS images
CHECKSUM_MATCHERS = (MD5_MATCH, MD5_MATCH_END, SHA256_MATCH, SHA256_MATCH_END,
SHA512_MATCH, SHA512_MATCH_END)
CHECKSUM_ONLY_MATCHERS = (MD5_MATCH_ONLY, SHA256_MATCH_ONLY, SHA512_MATCH_ONLY)
FILENAME_MATCHERS = (FILENAME_MATCH_END, FILENAME_MATCH_PARENTHESES)
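A sketch of applying these matchers to a typical ``sha256sum``-style line (the regex mirrors ``SHA256_MATCH`` above):

```python
# Illustrative use of the checksum-line matchers defined above.
import re

SHA256_MATCH = r"^([a-fA-F\d]{64})\s"  # SHA256 at beginning of line

line = "a" * 64 + "  ubuntu-22.04.img"
digests = re.findall(SHA256_MATCH, line)
# digests[0] holds the 64-character hex digest when the line matches.
```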
def validate_checksum(path, checksum, checksum_algo=None):
"""Validate image checksum.
:param path: File path in the form of a string to calculate a checksum
which is compared to the checksum field.
:param checksum: The supplied checksum value, a string, which will be
compared to the file.
:param checksum_algo: The checksum type of the algorithm.
:raises: ImageChecksumError if the supplied data cannot be parsed or
if the supplied value does not match the supplied checksum
value.
"""
# TODO(TheJulia): At some point, we likely need to compare
# the incoming checksum algorithm upfront, but if one is invoked which
# is not supported, hashlib will raise ValueError.
use_checksum_algo = None
if ":" in checksum:
# A form of communicating the checksum algorithm is to delimit the
# type from the value. See ansible deploy interface where this
# is most evident.
split_checksum = checksum.split(":")
use_checksum = split_checksum[1]
use_checksum_algo = split_checksum[0]
else:
use_checksum = checksum
if not use_checksum_algo:
use_checksum_algo = checksum_algo
# If we have a zero length value, but we split it, we have
# invalid input. Also, checksum is what we expect, algorithm is
# optional. This guards against the split of a value which is
# image_checksum = "sha256:" which is a potential side effect of
# splitting the string.
if use_checksum == '':
raise exception.ImageChecksumError()
# Make everything lower case since we don't expect mixed case,
# but we may have human originated input on the supplied algorithm.
try:
if not use_checksum_algo:
# This is backwards compatible support for a bare checksum.
calculated = compute_image_checksum(path)
else:
calculated = compute_image_checksum(path,
use_checksum_algo.lower())
except ValueError:
# ValueError is raised when an invalid/unsupported/unknown
# checksum algorithm is invoked.
LOG.error("Failed to generate checksum for file %(path)s, possible "
"invalid checksum algorithm: %(algo)s",
{"path": path,
"algo": use_checksum_algo})
raise exception.ImageChecksumAlgorithmFailure()
except OSError:
LOG.error("Failed to read file %(path)s to compute checksum.",
{"path": path})
raise exception.ImageChecksumFileReadFailure()
if (use_checksum is not None
and calculated.lower() != use_checksum.lower()):
LOG.error("We were supplied a checksum value of %(supplied)s, but "
"calculated a value of %(value)s. This is a fatal error.",
{"supplied": use_checksum,
"value": calculated})
raise exception.ImageChecksumError()
def compute_image_checksum(image_path, algorithm='md5'):
"""Compute checksum by given image path and algorithm.
:param image_path: The path to the file to undergo checksum calculation.
:param algorithm: The checksum algorithm to utilize. Defaults
to 'md5' due to historical support reasons in Ironic.
:returns: The calculated checksum value.
:raises: ValueError when the checksum algorithm is not supported
by the system.
"""
time_start = time.time()
LOG.debug('Start computing %(algo)s checksum for image %(image)s.',
{'algo': algorithm, 'image': image_path})
checksum = fileutils.compute_file_checksum(image_path,
algorithm=algorithm)
time_elapsed = time.time() - time_start
LOG.debug('Computed %(algo)s checksum for image %(image)s in '
'%(delta).2f seconds, checksum value: %(checksum)s.',
{'algo': algorithm, 'image': image_path, 'delta': time_elapsed,
'checksum': checksum})
return checksum
def get_checksum_and_algo(instance_info):
"""Get and return the image checksum and algo.
:param instance_info: The node instance info, or newly updated/generated
instance_info value.
:returns: A tuple containing two values, a checksum and algorithm,
if available.
"""
checksum_algo = None
if 'image_os_hash_value' in instance_info.keys():
# A value set by image_os_hash_value supersedes other
# possible uses as it is specific.
checksum = instance_info.get('image_os_hash_value')
checksum_algo = instance_info.get('image_os_hash_algo')
else:
checksum = instance_info.get('image_checksum')
image_source = instance_info.get('image_source')
# NOTE(stevebaker): file:// images have no requirement to supply
# checksums but they are now mandatory for validation as part
# of the fix for CVE-2024-47211.
# The only practical option is to calculate it here.
if checksum is None and image_source.startswith('file:'):
checksum_algo = "sha256"
image_path = urlparse.urlparse(image_source).path
checksum = fileutils.compute_file_checksum(
image_path, algorithm=checksum_algo)
elif is_checksum_url(checksum):
checksum = get_checksum_from_url(checksum, image_source)
# NOTE(TheJulia): This is all based on SHA-2 lengths.
# SHA-3 would require a hint and it would not be a fixed length.
# That said, SHA-2 is still valid and has not been withdrawn.
checksum_len = len(checksum)
if checksum_len == 128:
# SHA2-512 is 512 bits, 128 characters.
checksum_algo = "sha512"
elif checksum_len == 64:
checksum_algo = "sha256"
if checksum_len == 32 and not CONF.agent.allow_md5_checksum:
# MD5 not permitted and the checksum is the length of MD5
# and not otherwise defined.
LOG.error('Cannot compute the checksum as it uses MD5 '
'and is disabled by configuration. If the checksum '
'is *not* MD5, please specify the algorithm.')
raise exception.ImageChecksumAlgorithmFailure()
return checksum, checksum_algo
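The length-based inference in ``get_checksum_and_algo`` can be sketched as a small standalone helper (illustrative only, not part of Ironic):

```python
# Bare SHA-2 digests are identified purely by hex length: 128 chars for
# SHA-512, 64 for SHA-256, 32 for MD5 (which may be disallowed by config).
def infer_checksum_algo(checksum):
    return {128: "sha512", 64: "sha256", 32: "md5"}.get(len(checksum))
```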
def is_checksum_url(checksum):
"""Identify if checksum is not a url.
:param checksum: The user supplied checksum value.
:returns: True if the checksum is a url, otherwise False.
:raises: ImageChecksumURLNotSupported should the conductor have this
support disabled.
"""
if (checksum.startswith('http://') or checksum.startswith('https://')):
if CONF.conductor.disable_support_for_checksum_files:
raise exception.ImageChecksumURLNotSupported()
return True
else:
return False
def get_checksum_from_url(checksum, image_source):
"""Gets a checksum value based upon a remote checksum URL file.
:param checksum: The URL to the checksum URL content.
:param image_source: The image source utilized to match with
the contents of the URL payload file.
:raises: ImageDownloadFailed when the checksum file cannot be
accessed or cannot be parsed.
"""
LOG.debug('Attempting to download checksum from: %(checksum)s.',
{'checksum': checksum})
# Directly invoke the image service and get the checksum data.
resp = image_service.HttpImageService.get(checksum)
checksum_url = str(checksum)
# NOTE(TheJulia): The rest of this method is taken from
# ironic-python-agent. If a change is required here, it may
# be required in ironic-python-agent (extensions/standby.py).
lines = [line.strip() for line in resp.split('\n') if line.strip()]
if not lines:
raise exception.ImageDownloadFailed(image_href=checksum,
reason=_('Checksum file empty.'))
elif len(lines) == 1:
# Special case - checksums file with only the checksum itself
if ' ' not in lines[0]:
for matcher in CHECKSUM_ONLY_MATCHERS:
checksum = re.findall(matcher, lines[0])
if checksum:
return checksum[0]
raise exception.ImageDownloadFailed(
image_href=checksum_url,
reason=(
_("Invalid checksum file (No valid checksum found)")))
# FIXME(dtantsur): can we assume the same name for all images?
expected_fname = os.path.basename(urlparse.urlparse(
image_source).path)
for line in lines:
# Ignore comment lines
if line.startswith("#"):
continue
# Ignore checksums for other files
for matcher in FILENAME_MATCHERS:
if re.findall(matcher.format(filename=expected_fname), line):
break
else:
continue
for matcher in CHECKSUM_MATCHERS:
checksum = re.findall(matcher, line)
if checksum:
return checksum[0]
raise exception.ImageDownloadFailed(
image_href=checksum_url,
reason=(_("Checksum file does not contain name %s")
% expected_fname))


@ -889,3 +889,29 @@ class UnsupportedHardwareFeature(Invalid):
_msg_fmt = _("Node %(node)s hardware does not support feature "
"%(feature)s, which is required based upon the "
"requested configuration.")
class InvalidImage(ImageUnacceptable):
_msg_fmt = _("The requested image is not valid for use.")
class ImageChecksumError(InvalidImage):
"""Exception indicating checksum failed to match."""
_msg_fmt = _("The supplied image checksum is invalid or does not match.")
class ImageChecksumAlgorithmFailure(InvalidImage):
"""Cannot load the requested or required checksum algorithm."""
_msg_fmt = _("The requested image checksum algorithm cannot be loaded.")
class ImageChecksumURLNotSupported(InvalidImage):
"""Exception indicating we cannot support the remote checksum file."""
_msg_fmt = _("Use of remote checksum files is not supported.")
class ImageChecksumFileReadFailure(InvalidImage):
"""An OSError was raised when trying to read the file."""
_msg_fmt = _("Failed to read the file from local storage "
"to perform a checksum operation.")
code = http_client.SERVICE_UNAVAILABLE

File diff suppressed because it is too large


@ -287,6 +287,43 @@ class HttpImageService(BaseImageService):
'no_cache': no_cache,
}
@staticmethod
def get(image_href):
"""Downloads content and returns the response text.
:param image_href: Image reference.
:raises: exception.ImageRefValidationFailed if GET request returned
response code not equal to 200.
:raises: exception.ImageDownloadFailed if:
* IOError happened during file write;
* GET request failed.
"""
try:
verify = strutils.bool_from_string(CONF.webserver_verify_ca,
strict=True)
except ValueError:
verify = CONF.webserver_verify_ca
try:
auth = HttpImageService.gen_auth_from_conf_user_pass(image_href)
response = requests.get(image_href, stream=False, verify=verify,
timeout=CONF.webserver_connection_timeout,
auth=auth)
if response.status_code != http_client.OK:
raise exception.ImageRefValidationFailed(
image_href=image_href,
reason=_("Got HTTP code %s instead of 200 in response "
"to GET request.") % response.status_code)
return response.text
except (OSError, requests.ConnectionError, requests.RequestException,
IOError) as e:
raise exception.ImageDownloadFailed(image_href=image_href,
reason=str(e))
class FileImageService(BaseImageService):
"""Provides retrieval of disk images available locally on the conductor."""


@ -23,16 +23,18 @@ import os
import shutil
import time
from ironic_lib import disk_utils
from oslo_concurrency import processutils
from oslo_log import log as logging
from oslo_utils import fileutils
import pycdlib
from ironic.common import checksum_utils
from ironic.common import exception
from ironic.common.glance_service import service_utils as glance_utils
from ironic.common.i18n import _
from ironic.common import image_format_inspector
from ironic.common import image_service as service
from ironic.common import qemu_img
from ironic.common import utils
from ironic.conf import CONF
@ -369,31 +371,25 @@ def fetch_into(context, image_href, image_file):
{'image_href': image_href, 'time': time.time() - start})
def fetch(context, image_href, path, force_raw=False):
def fetch(context, image_href, path, force_raw=False,
checksum=None, checksum_algo=None):
with fileutils.remove_path_on_error(path):
fetch_into(context, image_href, path)
if (not CONF.conductor.disable_file_checksum
and checksum):
checksum_utils.validate_checksum(path, checksum, checksum_algo)
if force_raw:
image_to_raw(image_href, path, "%s.part" % path)
def get_source_format(image_href, path):
data = disk_utils.qemu_img_info(path)
fmt = data.file_format
if fmt is None:
try:
img_format = image_format_inspector.detect_file_format(path)
except image_format_inspector.ImageFormatError:
raise exception.ImageUnacceptable(
reason=_("'qemu-img info' parsing failed."),
reason=_("parsing of the image failed."),
image_id=image_href)
backing_file = data.backing_file
if backing_file is not None:
raise exception.ImageUnacceptable(
image_id=image_href,
reason=_("fmt=%(fmt)s backed by: %(backing_file)s") %
{'fmt': fmt, 'backing_file': backing_file})
return fmt
return str(img_format)
def force_raw_will_convert(image_href, path_tmp):
@ -406,24 +402,46 @@ def force_raw_will_convert(image_href, path_tmp):
def image_to_raw(image_href, path, path_tmp):
with fileutils.remove_path_on_error(path_tmp):
fmt = get_source_format(image_href, path_tmp)
if not CONF.conductor.disable_deep_image_inspection:
fmt = safety_check_image(path_tmp)
if fmt != "raw":
if fmt not in CONF.conductor.permitted_image_formats:
LOG.error("Security: The requested image %(image_href)s "
"is of format image %(format)s and is not in "
"the [conductor]permitted_image_formats list.",
{'image_href': image_href,
'format': fmt})
raise exception.InvalidImage()
else:
fmt = get_source_format(image_href, path)
LOG.warning("Security: Image safety checking has been disabled. "
"This is an unsafe operation. Attempting to continue "
"with the detected format %(img_fmt)s for %(path)s.",
{'img_fmt': fmt,
'path': path})
if fmt != "raw" and fmt != "iso":
# When the target format is NOT raw, we need to convert it.
# However, we neither need nor want to do that when we have
# an ISO image. If we have an ISO, it is because one was
# requested, and we have correctly fingerprinted it. Prior to
# proper image detection, we thought we had a raw image, and
# we would end up asking for a raw image to be made a raw image.
staged = "%s.converted" % path
utils.is_memory_insufficient(raise_if_fail=True)
LOG.debug("%(image)s was %(format)s, converting to raw",
{'image': image_href, 'format': fmt})
with fileutils.remove_path_on_error(staged):
disk_utils.convert_image(path_tmp, staged, 'raw')
qemu_img.convert_image(path_tmp, staged, 'raw',
source_format=fmt)
os.unlink(path_tmp)
data = disk_utils.qemu_img_info(staged)
if data.file_format != "raw":
new_fmt = get_source_format(image_href, staged)
if new_fmt != "raw":
raise exception.ImageConvertFailed(
image_id=image_href,
reason=_("Converted to raw, but format is "
"now %s") % data.file_format)
"now %s") % new_fmt)
os.rename(staged, path)
else:
@@ -454,11 +472,11 @@ def converted_size(path, estimate=False):
the original image scaled by the configuration value
`raw_image_growth_factor`.
"""
data = disk_utils.qemu_img_info(path)
data = image_format_inspector.detect_file_format(path)
if not estimate:
return data.virtual_size
growth_factor = CONF.raw_image_growth_factor
return int(min(data.disk_size * growth_factor, data.virtual_size))
return int(min(data.actual_size * growth_factor, data.virtual_size))
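The corrected fallback uses the inspector's `actual_size` (bytes on disk) rather than the removed `disk_size` attribute. The estimate itself is just a clamp, which can be sketched standalone:

```python
def estimate_raw_size(actual_size, virtual_size, growth_factor=2.0):
    """Estimate post-conversion raw size: scale the on-disk size by a
    growth factor, but never exceed the image's virtual size.
    growth_factor here is a hypothetical default; the real value comes
    from the raw_image_growth_factor configuration option."""
    return int(min(actual_size * growth_factor, virtual_size))
```

For a 1 GiB qcow2 with a 10 GiB virtual disk this estimates 2 GiB; once the scaled size exceeds the virtual size, the virtual size wins.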
def get_image_properties(context, image_href, properties="all"):
@@ -767,3 +785,92 @@ def _get_deploy_iso_files(deploy_iso, mountdir):
# present in deploy iso. This path varies for different OS vendors.
# e_img_rel_path: is required by mkisofs to generate boot iso.
return uefi_path_info, e_img_rel_path, grub_rel_path
def __node_or_image_cache(node):
"""A helper for logging to determine if image cache or node uuid."""
if not node:
return 'image cache'
else:
return node.uuid
def safety_check_image(image_path, node=None):
"""Performs a safety check on the supplied image.
This method triggers the image format inspector to both identify the
type of the supplied file and run its safety check logic, which
identifies whether any known unsafe features are being leveraged, and
returns the detected file format as a string to the caller.
:param image_path: A fully qualified path to an image which needs to
be evaluated for safety.
:param node: A Node object, optional. When supplied logging indicates the
node which triggered this issue, but the node is not
available in all invocation cases.
:returns: a string representing the image type which is used.
:raises: InvalidImage when the supplied image is detected as unsafe,
or the image format inspector has failed to parse the supplied
image's contents.
"""
id_string = __node_or_image_cache(node)
try:
img_class = image_format_inspector.detect_file_format(image_path)
if not img_class.safety_check():
LOG.error("Security: The requested image for "
"deployment of node %(node)s fails safety sanity "
"checking.",
{'node': id_string})
raise exception.InvalidImage()
image_format_name = str(img_class)
except image_format_inspector.ImageFormatError:
LOG.error("Security: The requested user image for the "
"deployment node %(node)s could not be parsed "
"by the image format checker.",
{'node': id_string})
raise exception.InvalidImage()
return image_format_name
def check_if_image_format_is_permitted(img_format,
expected_format=None,
node=None):
"""Checks image format consistency.
:params img_format: The determined image format by name.
:params expected_format: Optional, the expected format based upon
supplied configuration values.
:params node: A node object or None implying image cache.
:raises: InvalidImage if the requested image format is not permitted
by configuration, or the expected_format does not match the
determined format.
"""
id_string = __node_or_image_cache(node)
if img_format not in CONF.conductor.permitted_image_formats:
LOG.error("Security: The requested deploy image for node %(node)s "
"is of format image %(format)s and is not in the "
"[conductor]permitted_image_formats list.",
{'node': id_string,
'format': img_format})
raise exception.InvalidImage()
if expected_format is not None and img_format != expected_format:
if expected_format in ['ari', 'aki']:
# In this case, we have an ari or aki, meaning we're pulling
# down a kernel/ramdisk, and this is rooted in a misunderstanding.
# They should be raw. The detector should be detecting this *as*
# raw anyway, so the data just mismatches from a common
# misunderstanding, and that is okay in this case as they are not
# passed to qemu-img.
# TODO(TheJulia): Add a log entry to warn here at some point in
# the future as we begin to shift the perception around this.
# See: https://bugs.launchpad.net/ironic/+bug/2074090
return
LOG.error("Security: The requested deploy image for node %(node)s "
"has a format (%(format)s) which does not match the "
"expected image format (%(expected)s) based upon "
"supplied or retrieved information.",
{'node': id_string,
'format': img_format,
'expected': expected_format})
raise exception.InvalidImage()
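The allow-list check above reduces to set membership plus the ari/aki carve-out. A minimal sketch of the same decision (a hypothetical helper returning a boolean instead of raising InvalidImage):

```python
def format_is_acceptable(detected, expected=None,
                         permitted=('raw', 'qcow2', 'iso')):
    """Mirror the permitted-format logic: reject formats outside the
    allow-list; tolerate ari/aki kernel/ramdisk label mismatches."""
    if detected not in permitted:
        return False
    if expected is not None and detected != expected:
        # Kernels/ramdisks are historically labelled ari/aki but are
        # really raw; they are never fed to qemu-img, so allow them.
        return expected in ('ari', 'aki')
    return True
```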

ironic/common/qemu_img.py Normal file

@@ -0,0 +1,89 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import logging
from oslo_concurrency import processutils
from oslo_utils import units
import tenacity
from ironic.common import utils
from ironic.conf import CONF
LOG = logging.getLogger(__name__)
# Limit the memory address space to 1 GiB when running qemu-img
QEMU_IMG_LIMITS = None
def _qemu_img_limits():
# NOTE(TheJulia): If you make *any* change to this code, you may need
# to make an identical or similar change to ironic-python-agent.
global QEMU_IMG_LIMITS
if QEMU_IMG_LIMITS is None:
QEMU_IMG_LIMITS = processutils.ProcessLimits(
address_space=CONF.disk_utils.image_convert_memory_limit
* units.Mi)
return QEMU_IMG_LIMITS
def _retry_on_res_temp_unavailable(exc):
if (isinstance(exc, processutils.ProcessExecutionError)
and ('Resource temporarily unavailable' in exc.stderr
or 'Cannot allocate memory' in exc.stderr)):
return True
return False
@tenacity.retry(
retry=tenacity.retry_if_exception(_retry_on_res_temp_unavailable),
stop=tenacity.stop_after_attempt(CONF.disk_utils.image_convert_attempts),
reraise=True)
def convert_image(source, dest, out_format, run_as_root=False, cache=None,
out_of_order=False, sparse_size=None,
source_format='qcow2'):
# NOTE(TheJulia): If you make *any* change to this code, you may need
# to make an identical or similar change to ironic-python-agent.
"""Convert image to other format."""
cmd = ['qemu-img', 'convert', '-f', source_format, '-O', out_format]
if cache is not None:
cmd += ['-t', cache]
if sparse_size is not None:
cmd += ['-S', sparse_size]
if out_of_order:
cmd.append('-W')
cmd += [source, dest]
# NOTE(TheJulia): Statically set the MALLOC_ARENA_MAX to prevent leaking
# and the creation of new malloc arenas which will consume the system
# memory. If limited to 1, qemu-img consumes ~250 MB of RAM, but when
# another thread tries to access a locked section of memory in use with
# another thread, then by default a new malloc arena is created,
# which essentially balloons the memory requirement of the machine.
# Default for qemu-img is 8 * nCPU * ~250MB (based on defaults +
# thread/code/process/library overhead). In other words, 64 GB. Limiting
# this to 3 keeps the memory utilization in happy cases below the overall
# threshold which is in place in case a malicious image is attempted to
# be passed through qemu-img.
env_vars = {'MALLOC_ARENA_MAX': '3'}
try:
utils.execute(*cmd, run_as_root=run_as_root,
prlimit=_qemu_img_limits(),
use_standard_locale=True,
env_variables=env_vars)
except processutils.ProcessExecutionError as e:
if ('Resource temporarily unavailable' in e.stderr
or 'Cannot allocate memory' in e.stderr):
LOG.debug('Failed to convert image, retrying. Error: %s', e)
# Sync disk caches before the next attempt
utils.execute('sync')
raise
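The tenacity decorator above retries `convert_image` only on allocation failures (the address-space rlimit makes qemu-img fail fast on hostile inputs). The same retry policy written as a plain loop, for illustration only:

```python
def run_with_memory_retries(fn, attempts=3):
    """Retry a callable when it fails due to memory pressure,
    mirroring convert_image's tenacity policy: retry only on
    allocation errors, give up after a fixed attempt count."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except OSError as e:
            transient = ('Cannot allocate memory' in str(e)
                         or 'Resource temporarily unavailable' in str(e))
            if not transient or attempt == attempts:
                raise
            # The real code runs 'sync' here to flush disk caches
            # before the next attempt.
```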


@@ -27,6 +27,7 @@ from ironic.conf import database
from ironic.conf import default
from ironic.conf import deploy
from ironic.conf import dhcp
from ironic.conf import disk_utils
from ironic.conf import dnsmasq
from ironic.conf import drac
from ironic.conf import fake
@@ -66,6 +67,7 @@ default.register_opts(CONF)
deploy.register_opts(CONF)
drac.register_opts(CONF)
dhcp.register_opts(CONF)
disk_utils.register_opts(CONF)
dnsmasq.register_opts(CONF)
fake.register_opts(CONF)
glance.register_opts(CONF)


@@ -411,6 +411,71 @@ opts = [
'seconds, or 30 minutes. If you need to wait longer '
'than the maximum value, we recommend exploring '
'hold steps.')),
cfg.BoolOpt('disable_deep_image_inspection',
default=False,
# Normally such an option would be mutable, but this is
# a security guard and operators should not expect to change
# this option under normal circumstances.
mutable=False,
help=_('Security Option to permit an operator to disable '
'file content inspections. Under normal conditions, '
'the conductor will inspect requested image contents '
'which are transferred through the conductor. '
'Disabling this option is not advisable and opens '
'the risk of unsafe images being processed which may '
'allow an attacker to leverage unsafe features in '
'various disk image formats to perform a variety of '
'unsafe and potentially compromising actions. '
'This option is *not* mutable, and '
'requires a service restart to change.')),
cfg.BoolOpt('conductor_always_validates_images',
default=False,
# Normally mutable, however from a security context we do want
# any change to this option to be deliberate and logged, and as
# such mutable is set to False to force a conductor restart.
mutable=False,
help=_('Security Option to enable the conductor to *always* '
'inspect the image content of any requested deploy, '
'even if the deployment would have normally bypassed '
'the conductor\'s cache. When this is set to False, '
'the Ironic-Python-Agent is responsible '
'for any necessary image checks. Setting this to '
'True will result in a higher utilization of '
'resources (disk space, network traffic) '
'as the conductor will evaluate *all* images. '
'This option is *not* mutable, and requires a '
'service restart to change. This option requires '
'[conductor]disable_deep_image_inspection to be set '
'to False.')),
cfg.ListOpt('permitted_image_formats',
default=['raw', 'qcow2', 'iso'],
mutable=True,
help=_('The supported list of image formats which are '
'permitted for deployment with Ironic. If an image '
'format outside of this list is detected, the image '
'validation logic will fail the deployment process.')),
cfg.BoolOpt('disable_file_checksum',
default=False,
mutable=False,
help=_('Deprecated Security option: In the default case, '
'image files have their checksums verified before '
'undergoing additional conductor side actions such '
'as image conversion. '
'Enabling this option opens the risk of files being '
'replaced at the source without the user\'s '
'knowledge.'),
deprecated_for_removal=True),
cfg.BoolOpt('disable_support_for_checksum_files',
default=False,
mutable=False,
help=_('Security option: By default Ironic will attempt to '
'retrieve a remote checksum file via HTTP(S) URL in '
'order to validate an image download. This is '
'functionality aligning with ironic-python-agent '
'support for standalone users. Disabling this '
'functionality by setting this option to True will '
'create a more secure environment, however it may '
'break users in an unexpected fashion.')),
]

ironic/conf/disk_utils.py Normal file

@@ -0,0 +1,33 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from oslo_config import cfg
# NOTE(TheJulia): If you make *any* change to this code, you may need
# to make an identical or similar change to ironic-python-agent.
# These options were originally taken from ironic-lib upon the decision
# to move the qemu-img image conversion calls into the projects in
# order to simplify fixes related to them.
opts = [
cfg.IntOpt('image_convert_memory_limit',
default=2048,
help='Memory limit for "qemu-img convert" in MiB. Implemented '
'via the address space resource limit.'),
cfg.IntOpt('image_convert_attempts',
default=3,
help='Number of attempts to convert an image.'),
]
def register_opts(conf):
conf.register_opts(opts, group='disk_utils')


@@ -27,6 +27,7 @@ _opts = [
('database', ironic.conf.database.opts),
('deploy', ironic.conf.deploy.opts),
('dhcp', ironic.conf.dhcp.opts),
('disk_utils', ironic.conf.disk_utils.opts),
('drac', ironic.conf.drac.opts),
('glance', ironic.conf.glance.list_opts()),
('healthcheck', ironic.conf.healthcheck.opts),


@@ -1,6 +1,6 @@
- name: convert and write
become: yes
command: qemu-img convert -t directsync -O host_device /tmp/{{ inventory_hostname }}.img {{ ironic_image_target }}
command: qemu-img convert -f {{ ironic.image.disk_format }} -t directsync -O host_device /tmp/{{ inventory_hostname }}.img {{ ironic_image_target }}
async: 1200
poll: 10
when: ironic.image.disk_format != 'raw'


@@ -16,7 +16,6 @@
import os
import re
import time
from ironic_lib import metrics_utils
from ironic_lib import utils as il_utils
@@ -25,6 +24,8 @@ from oslo_utils import excutils
from oslo_utils import fileutils
from oslo_utils import strutils
from ironic.common import checksum_utils
from ironic.common import context
from ironic.common import exception
from ironic.common import faults
from ironic.common.glance_service import service_utils
@@ -58,6 +59,7 @@ RESCUE_LIKE_STATES = (states.RESCUING, states.RESCUEWAIT, states.RESCUEFAIL,
DISK_LAYOUT_PARAMS = ('root_gb', 'swap_mb', 'ephemeral_gb')
# All functions are called from deploy() directly or indirectly.
# They are split for stub-out.
@@ -203,7 +205,9 @@ def check_for_missing_params(info_dict, error_msg, param_prefix=''):
'missing_info': missing_info})
def fetch_images(ctx, cache, images_info, force_raw=True):
def fetch_images(ctx, cache, images_info, force_raw=True,
expected_format=None, expected_checksum=None,
expected_checksum_algo=None):
"""Check for available disk space and fetch images using ImageCache.
:param ctx: context
@@ -211,7 +215,15 @@
:param images_info: list of tuples (image href, destination path)
:param force_raw: boolean value, whether to convert the image to raw
format
:param expected_format: The expected format of the image.
:param expected_checksum: The expected image checksum, to be used if we
need to convert the image to raw prior to deploying.
:param expected_checksum_algo: The checksum algo in use, if separately
set.
:raises: InstanceDeployFailure if unable to find enough disk space
:raises: InvalidImage if the supplied image metadata or contents are
deemed to be invalid, unsafe, or not matching the expectations
asserted by configuration supplied or set.
"""
try:
@@ -223,8 +235,17 @@
# if disk space is used between the check and actual download.
# This is probably unavoidable, as we can't control other
# (probably unrelated) processes
image_list = []
for href, path in images_info:
cache.fetch_image(href, path, ctx=ctx, force_raw=force_raw)
# NOTE(TheJulia): Href in this case can be an image UUID or a URL.
image_format = cache.fetch_image(
href, path, ctx=ctx,
force_raw=force_raw,
expected_format=expected_format,
expected_checksum=expected_checksum,
expected_checksum_algo=expected_checksum_algo)
image_list.append((href, path, image_format))
return image_list
def set_failed_state(task, msg, collect_logs=True):
@@ -1065,7 +1086,8 @@ class InstanceImageCache(image_cache.ImageCache):
@METRICS.timer('cache_instance_image')
def cache_instance_image(ctx, node, force_raw=None):
def cache_instance_image(ctx, node, force_raw=None, expected_format=None,
expected_checksum=None, expected_checksum_algo=None):
"""Fetch the instance's image from Glance
This method pulls the disk image and writes them to the appropriate
@@ -1074,8 +1096,16 @@ def cache_instance_image(ctx, node, force_raw=None):
:param ctx: context
:param node: an ironic node object
:param force_raw: whether convert image to raw format
:param expected_format: The expected format of the disk image contents.
:param expected_checksum: The expected image checksum, to be used if we
need to convert the image to raw prior to deploying.
:param expected_checksum_algo: The checksum algo in use, if separately
set.
:returns: a tuple containing the uuid of the image and the path in
the filesystem where image is cached.
:raises: InvalidImage if the requested image is invalid and cannot be
used for deployment based upon the contents of the image or the
metadata surrounding the image not matching the configured image.
"""
# NOTE(dtantsur): applying the default here to make the option mutable
if force_raw is None:
@@ -1089,10 +1119,11 @@
LOG.debug("Fetching image %(image)s for node %(uuid)s",
{'image': uuid, 'uuid': node.uuid})
fetch_images(ctx, InstanceImageCache(), [(uuid, image_path)],
force_raw)
return (uuid, image_path)
image_list = fetch_images(ctx, InstanceImageCache(), [(uuid, image_path)],
force_raw, expected_format=expected_format,
expected_checksum=expected_checksum,
expected_checksum_algo=expected_checksum_algo)
return (uuid, image_path, image_list[0][2])
@METRICS.timer('destroy_images')
@@ -1109,17 +1140,11 @@
@METRICS.timer('compute_image_checksum')
def compute_image_checksum(image_path, algorithm='md5'):
"""Compute checksum by given image path and algorithm."""
time_start = time.time()
LOG.debug('Start computing %(algo)s checksum for image %(image)s.',
{'algo': algorithm, 'image': image_path})
checksum = fileutils.compute_file_checksum(image_path,
algorithm=algorithm)
time_elapsed = time.time() - time_start
LOG.debug('Computed %(algo)s checksum for image %(image)s in '
'%(delta).2f seconds, checksum value: %(checksum)s.',
{'algo': algorithm, 'image': image_path, 'delta': time_elapsed,
'checksum': checksum})
return checksum
# NOTE(TheJulia): This likely wouldn't be removed, but if we do
# significant refactoring we could likely just change everything
# over to the images common code, if we don't need the metrics
# data anymore.
return checksum_utils.compute_image_checksum(image_path, algorithm)
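The checksum helper now delegates to `checksum_utils`, which ultimately streams the file through a digest rather than reading it whole. A hedged sketch of that underlying hashlib pattern (illustrative, not the `checksum_utils` API):

```python
import hashlib

def file_checksum(path, algorithm='sha256', chunk_size=65536):
    """Stream a file through hashlib in fixed-size chunks so that
    multi-gigabyte images are never loaded into memory at once."""
    digest = hashlib.new(algorithm)
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest()
```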
def remove_http_instance_symlink(node_uuid):
@@ -1133,13 +1158,39 @@ def destroy_http_instance_images(node):
destroy_images(node.uuid)
def _validate_image_url(node, url, secret=False):
def _validate_image_url(node, url, secret=False, inspect_image=None,
expected_format=None):
"""Validates image URL through the HEAD request.
:param url: URL to be validated
:param secret: if URL is secret (e.g. swift temp url),
it will not be shown in logs.
:param inspect_image: If the requested URL should have extensive
content checking applied. Defaults to the value provided by
the [conductor]conductor_always_validates_images configuration
parameter setting, but is also able to be turned off by supplying
False where needed to perform a redirect or URL head request only.
:param expected_format: The expected image format, if known, for
the image inspection logic.
:returns: Returns a dictionary with basic information about the
requested image if image inspection is performed, otherwise an
empty dictionary.
if inspect_image is not None:
# The caller has a bit more context and we can rely upon it,
# for example if it knows we cannot or should not inspect
# the image contents.
inspect = inspect_image
elif not CONF.conductor.disable_deep_image_inspection:
inspect = CONF.conductor.conductor_always_validates_images
else:
# If we're here, file inspection has been explicitly disabled.
inspect = False
# NOTE(TheJulia): This method gets used in two different ways.
# The first is "I did a thing, let me make sure my url works."
# The second is to validate that a remote URL is valid. In the remote case
# we will grab the file and proceed from there.
image_info = {}
try:
# NOTE(TheJulia): This method only validates that an exception
# is NOT raised. In other words, that the endpoint does not
@@ -1151,20 +1202,59 @@ def _validate_image_url(node, url, secret=False):
LOG.error("The specified URL is not a valid HTTP(S) URL or is "
"not reachable for node %(node)s: %(msg)s",
{'node': node.uuid, 'msg': e})
if inspect:
LOG.info("Inspecting image contents for %(node)s with url %(url)s. "
"Expecting user supplied format: %(expected)s",
{'node': node.uuid,
'expected': expected_format,
'url': url})
# Utilizes the file cache since it knows how to pull files down
# and handles pathing and caching and all that fun, however with
# force_raw set to False.
# The goal here being to get the file we would normally just point
# IPA at, be it via swift transfer *or* direct URL request, and
# perform the safety check on it before allowing it to proceed.
ctx = context.get_admin_context()
# NOTE(TheJulia): Because we're using the image cache here, we
# let it run the image validation checking as its normal course
# of action, and save what it tells us the image format is.
# If there *is* a mismatch, it will raise the error.
# NOTE(TheJulia): We don't need to supply the checksum here, because
# we are not converting the image. The net result is the deploy
# interface or remote agent has the responsibility to checksum the
# image.
_, image_path, img_format = cache_instance_image(
ctx,
node,
force_raw=False,
expected_format=expected_format)
# NOTE(TheJulia): We explicitly delete this file because it has no use
# in the cache after this point.
il_utils.unlink_without_raise(image_path)
image_info['disk_format'] = img_format
return image_info
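The inspect/no-inspect choice at the top of `_validate_image_url` is a three-way precedence: an explicit caller override wins, otherwise the always-validate knob applies only while deep inspection is enabled. As a standalone sketch (hypothetical helper name, plain booleans in place of CONF lookups):

```python
def should_inspect(inspect_image=None,
                   deep_inspection_disabled=False,
                   always_validates=False):
    """Precedence used by _validate_image_url: an explicit caller
    choice wins; otherwise inspect only when deep inspection is
    enabled and the conductor is configured to always validate."""
    if inspect_image is not None:
        return inspect_image
    if not deep_inspection_disabled:
        return always_validates
    return False
```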
def _cache_and_convert_image(task, instance_info, image_info=None):
"""Cache an image locally and covert it to RAW if needed."""
# Ironic cache and serve images from httpboot server
force_raw = direct_deploy_should_convert_raw_image(task.node)
_, image_path = cache_instance_image(task.context, task.node,
force_raw=force_raw)
if force_raw or image_info is None:
if image_info is None:
initial_format = instance_info.get('image_disk_format')
else:
initial_format = image_info.get('disk_format')
if image_info is None:
initial_format = instance_info.get('image_disk_format')
else:
initial_format = image_info.get('disk_format')
checksum, checksum_algo = checksum_utils.get_checksum_and_algo(
instance_info)
_, image_path, img_format = cache_instance_image(
task.context, task.node,
force_raw=force_raw,
expected_format=initial_format,
expected_checksum=checksum,
expected_checksum_algo=checksum_algo)
if force_raw or image_info is None:
if force_raw:
instance_info['image_disk_format'] = 'raw'
else:
@@ -1199,7 +1289,8 @@ def _cache_and_convert_image(task, instance_info, image_info=None):
'%(node)s due to image conversion',
{'image': image_path, 'node': task.node.uuid})
instance_info['image_checksum'] = None
hash_value = compute_image_checksum(image_path, os_hash_algo)
hash_value = checksum_utils.compute_image_checksum(image_path,
os_hash_algo)
else:
instance_info['image_checksum'] = old_checksum
@@ -1245,7 +1336,11 @@
task.node.uuid])
if file_extension:
http_image_url = http_image_url + file_extension
_validate_image_url(task.node, http_image_url, secret=False)
# We don't inspect the image in our url check because we just need to do
# a quick path validity check here; we should be checking contents way
# earlier on in this method.
_validate_image_url(task.node, http_image_url, secret=False,
inspect_image=False)
instance_info['image_url'] = http_image_url
@@ -1270,29 +1365,57 @@ def build_instance_info_for_deploy(task):
instance_info = node.instance_info
iwdi = node.driver_internal_info.get('is_whole_disk_image')
image_source = instance_info['image_source']
# Flag if we know the source is a path, used for Anaconda
# deploy interface where you can just tell anaconda to
# consume artifacts from a path. In this case, we are not
# doing any image conversions, we're just passing through
# a URL in the form of configuration.
isap = node.driver_internal_info.get('is_source_a_path')
# If our url ends with a /, i.e. we have been supplied with a path,
# we can only deploy this in limited cases for drivers and tools
# which are aware of such. i.e. anaconda.
image_download_source = get_image_download_source(node)
boot_option = get_boot_option(task.node)
# There is no valid reason this should already be set, and
# it gets replaced at various points in this sequence.
instance_info['image_url'] = None
if service_utils.is_glance_image(image_source):
glance = image_service.GlanceImageService(context=task.context)
image_info = glance.show(image_source)
LOG.debug('Got image info: %(info)s for node %(node)s.',
{'info': image_info, 'node': node.uuid})
# Values are explicitly set into the instance info field
# so IPA have the values available.
instance_info['image_checksum'] = image_info['checksum']
instance_info['image_os_hash_algo'] = image_info['os_hash_algo']
instance_info['image_os_hash_value'] = image_info['os_hash_value']
if image_download_source == 'swift':
# In this case, we are getting a file *from* swift for a glance
# image which is backed by swift. IPA downloads the file directly
# from swift, but cannot get any metadata related to it otherwise.
swift_temp_url = glance.swift_temp_url(image_info)
_validate_image_url(node, swift_temp_url, secret=True)
image_format = image_info.get('disk_format')
# In the process of validating the URL is valid, we will perform
# the requisite safety checking of the asset as we can end up
# converting it in the agent, or needing the disk format value
# to be correct for the Ansible deployment interface.
validate_results = _validate_image_url(
node, swift_temp_url, secret=True,
expected_format=image_format)
instance_info['image_url'] = swift_temp_url
instance_info['image_checksum'] = image_info['checksum']
instance_info['image_disk_format'] = image_info['disk_format']
instance_info['image_os_hash_algo'] = image_info['os_hash_algo']
instance_info['image_os_hash_value'] = image_info['os_hash_value']
instance_info['image_disk_format'] = \
validate_results.get('disk_format', image_format)
else:
# In this case, we're directly downloading the glance image and
# hosting it locally for retrieval by the IPA.
_cache_and_convert_image(task, instance_info, image_info)
# We're just populating extra information for a glance backed image in
# case a deployment interface driver needs them at some point.
instance_info['image_container_format'] = (
image_info['container_format'])
instance_info['image_tags'] = image_info.get('tags', [])
@@ -1303,20 +1426,80 @@
instance_info['ramdisk'] = image_info['properties']['ramdisk_id']
elif (image_source.startswith('file://')
or image_download_source == 'local'):
# In this case, we're explicitly downloading (or copying a file)
# hosted locally so IPA can download it directly from Ironic.
# NOTE(TheJulia): Intentionally only supporting file:/// as image
# based deploy source since we don't want to, nor should we be
# in the business of copying large numbers of files as it is a
# huge performance impact.
_cache_and_convert_image(task, instance_info)
else:
# This is the "all other cases" logic for aspects like the user
# has supplied us a direct URL to reference. In cases like the
# anaconda deployment interface where we might just have a path
# and not a file, or where a user may be supplying a full URL to
# a remotely hosted image, we at a minimum need to check if the url
# is valid, and address any redirects upfront.
try:
_validate_image_url(node, image_source)
# NOTE(TheJulia): In the case where we're here, we are not doing
# an integrated image based deploy, but we may also be doing
# a path based anaconda base deploy, in which case we have
# no backing image, but we need to check for a URL
# redirection. So, if the source is a path (i.e. isap),
# we don't need to inspect the image as there is no image
# in the case for the deployment to drive.
validated_results = {}
if isap:
# This is if the source is a path url, such as one used by
# anaconda templates to rely upon bootstrapping defaults.
_validate_image_url(node, image_source, inspect_image=False)
else:
# When not isap, we can just let _validate_image_url make
# the required decision on whether contents need to be sampled,
# or not. We try to pass the image_disk_format which may be
# declared by the user, and if not we set expected_format to
# None.
validated_results = _validate_image_url(
node,
image_source,
expected_format=instance_info.get('image_disk_format',
None))
# image_url is internal, and used by IPA and some boot templates.
# In most cases, it needs to come from image_source explicitly.
if 'disk_format' in validated_results:
# Ensure IPA has the value available, so write what we detect,
# if anything. This is also an item which might be needed with
# the ansible deploy interface, when used in standalone mode.
instance_info['image_disk_format'] = \
validated_results.get('disk_format')
instance_info['image_url'] = image_source
except exception.ImageRefIsARedirect as e:
# At this point, we've got a redirect response from the webserver,
# and we're going to try to handle it as a single redirect action,
# as requests, by default, only lets a single redirect to occur.
# This is likely a URL pathing fix, like a trailing / on a path,
# or move to HTTPS from a user supplied HTTP url.
if e.redirect_url:
# Since we've got a redirect, we need to carry the rest of the
# request logic as well, which includes recording a disk
# format, if applicable.
instance_info['image_url'] = e.redirect_url
# We need to save the image_source back out so it caches
instance_info['image_source'] = e.redirect_url
task.node.instance_info = instance_info
if not isap:
# The redirect doesn't relate to a path being used, so
# the target is a filename; the likely cause is the webserver
# telling the client to use HTTPS.
validated_results = _validate_image_url(
node, e.redirect_url,
expected_format=instance_info.get('image_disk_format',
None))
if 'disk_format' in validated_results:
instance_info['image_disk_format'] = \
validated_results.get('disk_format')
else:
raise
@@ -1331,7 +1514,6 @@
# Call central parsing so we retain things like config drives.
i_info = parse_instance_info(node, image_deploy=False)
instance_info.update(i_info)
return instance_info
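The redirect branch above follows exactly one hop, with the redirect target replacing both `image_url` and `image_source`. That single-hop policy, isolated from the instance_info bookkeeping, can be sketched as follows (simplified exception class standing in for the one the diff imports from `ironic.common.exception`):

```python
class ImageRefIsARedirect(Exception):
    """Simplified stand-in for ironic's redirect exception."""
    def __init__(self, redirect_url=None):
        super().__init__(redirect_url)
        self.redirect_url = redirect_url

def resolve_image_source(source, validate):
    """Validate a source URL, following at most one redirect, as
    build_instance_info_for_deploy does for user-supplied URLs."""
    try:
        validate(source)
        return source
    except ImageRefIsARedirect as e:
        if not e.redirect_url:
            raise
        # One hop only: a second redirect from the new URL propagates.
        validate(e.redirect_url)
        return e.redirect_url
```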


@@ -65,7 +65,9 @@ class ImageCache(object):
if master_dir is not None:
fileutils.ensure_tree(master_dir)
def fetch_image(self, href, dest_path, ctx=None, force_raw=True):
def fetch_image(self, href, dest_path, ctx=None, force_raw=True,
expected_format=None, expected_checksum=None,
expected_checksum_algo=None):
"""Fetch image by given href to the destination path.
Does nothing if destination path exists and is up to date with cache
@@ -80,16 +82,33 @@ class ImageCache(object):
:param ctx: context
:param force_raw: boolean value, whether to convert the image to raw
format
:param expected_format: The expected image format.
:param expected_checksum: The expected image checksum
:param expected_checksum_algo: The expected image checksum algorithm,
if needed/supplied.
"""
img_download_lock_name = 'download-image'
if self.master_dir is None:
# NOTE(ghe): We don't share images between instances/hosts
# NOTE(TheJulia): This is a weird code path, because master_dir
# has to be None, which by default it never should be unless
# an operator forces it to None, which is a path we just never
# expect.
# TODO(TheJulia): This may be dead-ish code and likely needs
# to be removed. Likely originated *out* of just the iscsi
# deployment interface and local image caching.
if not CONF.parallel_image_downloads:
with lockutils.lock(img_download_lock_name):
_fetch(ctx, href, dest_path, force_raw)
_fetch(ctx, href, dest_path, force_raw,
expected_format=expected_format,
expected_checksum=expected_checksum,
expected_checksum_algo=expected_checksum_algo)
else:
with _concurrency_semaphore:
_fetch(ctx, href, dest_path, force_raw)
_fetch(ctx, href, dest_path, force_raw,
expected_format=expected_format,
expected_checksum=expected_checksum,
expected_checksum_algo=expected_checksum_algo)
return
# TODO(ghe): have hard links and counts the same behaviour in all fs
@@ -140,13 +159,18 @@ class ImageCache(object):
{'href': href})
self._download_image(
href, master_path, dest_path, img_info,
ctx=ctx, force_raw=force_raw)
ctx=ctx, force_raw=force_raw,
expected_format=expected_format,
expected_checksum=expected_checksum,
expected_checksum_algo=expected_checksum_algo)
# NOTE(dtantsur): we increased cache size - time to clean up
self.clean_up()
def _download_image(self, href, master_path, dest_path, img_info,
ctx=None, force_raw=True):
ctx=None, force_raw=True, expected_format=None,
expected_checksum=None,
expected_checksum_algo=None):
"""Download image by href and store at a given path.
This method should be called with uuid-specific lock taken.
@@ -158,6 +182,9 @@ class ImageCache(object):
:param ctx: context
:param force_raw: boolean value, whether to convert the image to raw
format
:param expected_format: The expected original format for the image.
:param expected_checksum: The expected image checksum.
:param expected_checksum_algo: The expected image checksum algorithm.
:raise ImageDownloadFailed: when the image cache and the image HTTP or
TFTP location are on different file system,
causing hard link to fail.
@@ -169,7 +196,9 @@ class ImageCache(object):
try:
with _concurrency_semaphore:
_fetch(ctx, href, tmp_path, force_raw)
_fetch(ctx, href, tmp_path, force_raw, expected_format,
expected_checksum=expected_checksum,
expected_checksum_algo=expected_checksum_algo)
if img_info.get('no_cache'):
LOG.debug("Caching is disabled for image %s", href)
@@ -333,34 +362,62 @@ def _free_disk_space_for(path):
return stat.f_frsize * stat.f_bavail
def _fetch(context, image_href, path, force_raw=False):
def _fetch(context, image_href, path, force_raw=False, expected_format=None,
expected_checksum=None, expected_checksum_algo=None):
"""Fetch image and convert to raw format if needed."""
path_tmp = "%s.part" % path
if os.path.exists(path_tmp):
LOG.warning("%s exists, assuming it's stale", path_tmp)
os.remove(path_tmp)
images.fetch(context, image_href, path_tmp, force_raw=False)
images.fetch(context, image_href, path_tmp, force_raw=False,
checksum=expected_checksum,
checksum_algo=expected_checksum_algo)
# By default, the image format is unknown
image_format = None
disable_dii = CONF.conductor.disable_deep_image_inspection
if not disable_dii:
if not expected_format:
# Call of last resort to check the image format. Cached
# artifacts like kernels/ramdisks will not have a known
# expected format, even though they are never passed to qemu-img.
remote_image_format = images.image_show(
context,
image_href).get('disk_format')
else:
remote_image_format = expected_format
image_format = images.safety_check_image(path_tmp)
images.check_if_image_format_is_permitted(
image_format, remote_image_format)
# Notes(yjiang5): If glance can provide the virtual size information,
# then we can firstly clean cache and then invoke images.fetch().
if force_raw:
if images.force_raw_will_convert(image_href, path_tmp):
required_space = images.converted_size(path_tmp, estimate=False)
directory = os.path.dirname(path_tmp)
if (force_raw
and ((disable_dii
and images.force_raw_will_convert(image_href, path_tmp))
or (not disable_dii and image_format != 'raw'))):
# NOTE(TheJulia): What is happening here is that the rest of the
# logic is hinged on force_raw, but we also don't need to take the
# entire path *if* the image on disk is *already* raw. Depending on
# settings, the path differs slightly: with deep image inspection
# we can rely upon the inspected image format, otherwise we need
# to ask for the image format.
required_space = images.converted_size(path_tmp, estimate=False)
directory = os.path.dirname(path_tmp)
try:
_clean_up_caches(directory, required_space)
except exception.InsufficientDiskSpace:
# try again with an estimated raw size instead of the full size
required_space = images.converted_size(path_tmp, estimate=True)
try:
_clean_up_caches(directory, required_space)
except exception.InsufficientDiskSpace:
# try again with an estimated raw size instead of the full size
required_space = images.converted_size(path_tmp, estimate=True)
try:
_clean_up_caches(directory, required_space)
except exception.InsufficientDiskSpace:
LOG.warning('Not enough space for estimated image size. '
'Consider lowering '
'[DEFAULT]raw_image_growth_factor=%s',
CONF.raw_image_growth_factor)
raise
LOG.error('Not enough space for estimated image size. '
'Consider lowering '
'[DEFAULT]raw_image_growth_factor=%s',
CONF.raw_image_growth_factor)
raise
images.image_to_raw(image_href, path, path_tmp)
else:
os.rename(path_tmp, path)
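The disk-space handling in `_fetch` above follows a two-pass pattern: first try to free space for the exact converted size, then retry with a (usually smaller) estimated raw size before giving up. A minimal generic sketch of that pattern, using `InsufficientDiskSpace`, `ensure_space_for_conversion`, and the `clean_up_caches` callable as hypothetical stand-ins for the real Ironic helpers:

```python
class InsufficientDiskSpace(Exception):
    """Stand-in for ironic's exception.InsufficientDiskSpace."""


def ensure_space_for_conversion(exact_size, estimated_size, clean_up_caches):
    # First pass: try to free up the exact converted size.
    try:
        clean_up_caches(exact_size)
        return exact_size
    except InsufficientDiskSpace:
        # Second pass: retry with the estimated raw size before
        # failing outright; the caller logs and re-raises on failure.
        clean_up_caches(estimated_size)
        return estimated_size
```

A caller that cannot free the exact size but can satisfy the estimate succeeds on the second pass; only a second failure propagates to the operator.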


@@ -81,6 +81,8 @@ class TestLookup(test_api_base.BaseApiTest):
'agent_token': mock.ANY,
'agent_token_required': True,
'agent_md5_checksum_enable': CONF.agent.allow_md5_checksum,
'disable_deep_image_inspection': CONF.conductor.disable_deep_image_inspection, # noqa
'permitted_image_formats': CONF.conductor.permitted_image_formats,
}
self.assertEqual(expected_config, data['config'])
self.assertIsNotNone(data['config']['agent_token'])
@@ -457,6 +459,19 @@ class TestContinueInspection(test_api_base.BaseApiTest):
mock.ANY, mock.ANY, self.node.uuid, inventory=self.inventory,
plugin_data={'test': 42}, topic='test-topic')
def test_bmc_address_as_none(self, mock_lookup, mock_continue):
mock_lookup.return_value = self.node
self.inventory['bmc_address'] = None
self.inventory['bmc_v6address'] = None
response = self.post_json('/continue_inspection', self.data)
self.assertEqual(http_client.ACCEPTED, response.status_int)
self.assertEqual({'uuid': self.node.uuid}, response.json)
mock_lookup.assert_called_once_with(
mock.ANY, self.addresses, [], node_uuid=None)
mock_continue.assert_called_once_with(
mock.ANY, mock.ANY, self.node.uuid, inventory=self.inventory,
plugin_data={'test': 42}, topic='test-topic')
@mock.patch.object(rpcapi.ConductorAPI, 'get_node_with_token',
autospec=True)
def test_new_api(self, mock_get_node, mock_lookup, mock_continue):


@@ -0,0 +1,224 @@
# coding=utf-8
# Copyright 2024 Red Hat, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from unittest import mock
from oslo_config import cfg
from oslo_utils import fileutils
from ironic.common import checksum_utils
from ironic.common import exception
from ironic.common import image_service
from ironic.tests import base
CONF = cfg.CONF
@mock.patch.object(checksum_utils, 'compute_image_checksum',
autospec=True)
class IronicChecksumUtilsValidateTestCase(base.TestCase):
def test_validate_checksum(self, mock_compute):
mock_compute.return_value = 'f00'
checksum_utils.validate_checksum('path', 'f00', 'algo')
mock_compute.assert_called_once_with('path', 'algo')
def test_validate_checksum_mixed_case(self, mock_compute):
mock_compute.return_value = 'f00'
checksum_utils.validate_checksum('path', 'F00', 'ALGO')
mock_compute.assert_called_once_with('path', 'algo')
def test_validate_checksum_mixed_md5(self, mock_compute):
mock_compute.return_value = 'f00'
checksum_utils.validate_checksum('path', 'F00')
mock_compute.assert_called_once_with('path')
def test_validate_checksum_mismatch(self, mock_compute):
mock_compute.return_value = 'a00'
self.assertRaises(exception.ImageChecksumError,
checksum_utils.validate_checksum,
'path', 'f00', 'algo')
mock_compute.assert_called_once_with('path', 'algo')
def test_validate_checksum_hashlib_not_supports_algo(self, mock_compute):
mock_compute.side_effect = ValueError()
self.assertRaises(exception.ImageChecksumAlgorithmFailure,
checksum_utils.validate_checksum,
'path', 'f00', 'algo')
mock_compute.assert_called_once_with('path', 'algo')
def test_validate_checksum_file_not_found(self, mock_compute):
mock_compute.side_effect = OSError()
self.assertRaises(exception.ImageChecksumFileReadFailure,
checksum_utils.validate_checksum,
'path', 'f00', 'algo')
mock_compute.assert_called_once_with('path', 'algo')
def test_validate_checksum_mixed_case_delimited(self, mock_compute):
mock_compute.return_value = 'f00'
checksum_utils.validate_checksum('path', 'algo:F00')
mock_compute.assert_called_once_with('path', 'algo')
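The tests above pin down the `validate_checksum` contract: an optional `algo:checksum` delimited value, case-insensitive comparison, and low-level failures mapped onto checksum-specific errors. A hypothetical sketch of that contract, with `compute_image_checksum` injected as a stand-in for the real helper and plain `RuntimeError` standing in for ironic's exception classes:

```python
def validate_checksum_sketch(path, checksum, algorithm=None,
                             compute_image_checksum=None):
    if ':' in checksum:
        # A delimited value like "algo:F00" carries its own algorithm.
        algorithm, checksum = checksum.split(':', 1)
    args = [path]
    if algorithm:
        args.append(algorithm.lower())
    try:
        actual = compute_image_checksum(*args)
    except ValueError:
        # hashlib does not support the requested algorithm
        raise RuntimeError('checksum algorithm failure')
    except OSError:
        raise RuntimeError('checksum file read failure')
    # Checksums compare case-insensitively.
    if actual.lower() != checksum.lower():
        raise RuntimeError('checksum mismatch')
```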
class IronicChecksumUtilsTestCase(base.TestCase):
def test_is_checksum_url_string(self):
self.assertFalse(checksum_utils.is_checksum_url('f00'))
def test_is_checksum_url_file(self):
self.assertFalse(checksum_utils.is_checksum_url('file://foo'))
def test_is_checksum_url(self):
urls = ['http://foo.local/file',
'https://foo.local/file']
for url in urls:
self.assertTrue(checksum_utils.is_checksum_url(url))
def test_get_checksum_and_algo_image_checksum(self):
value = 'c46f2c98efe1cd246be1796cd842246e'
i_info = {'image_checksum': value}
csum, algo = checksum_utils.get_checksum_and_algo(i_info)
self.assertEqual(value, csum)
self.assertIsNone(algo)
def test_get_checksum_and_algo_image_checksum_not_allowed(self):
CONF.set_override('allow_md5_checksum', False, group='agent')
value = 'c46f2c98efe1cd246be1796cd842246e'
i_info = {'image_checksum': value}
self.assertRaises(exception.ImageChecksumAlgorithmFailure,
checksum_utils.get_checksum_and_algo,
i_info)
def test_get_checksum_and_algo_image_checksum_glance(self):
value = 'c46f2c98efe1cd246be1796cd842246e'
i_info = {'image_os_hash_value': value,
'image_os_hash_algo': 'foobar'}
csum, algo = checksum_utils.get_checksum_and_algo(i_info)
self.assertEqual(value, csum)
self.assertEqual('foobar', algo)
def test_get_checksum_and_algo_image_checksum_sha256(self):
value = 'a' * 64
i_info = {'image_checksum': value}
csum, algo = checksum_utils.get_checksum_and_algo(i_info)
self.assertEqual(value, csum)
self.assertEqual('sha256', algo)
def test_get_checksum_and_algo_image_checksum_sha512(self):
value = 'f' * 128
i_info = {'image_checksum': value}
csum, algo = checksum_utils.get_checksum_and_algo(i_info)
self.assertEqual(value, csum)
self.assertEqual('sha512', algo)
@mock.patch.object(checksum_utils, 'get_checksum_from_url', autospec=True)
def test_get_checksum_and_algo_image_checksum_http_url(self, mock_get):
value = 'http://checksum-url'
i_info = {
'image_checksum': value,
'image_source': 'image-ref'
}
mock_get.return_value = 'f' * 64
csum, algo = checksum_utils.get_checksum_and_algo(i_info)
mock_get.assert_called_once_with(value, 'image-ref')
self.assertEqual('f' * 64, csum)
self.assertEqual('sha256', algo)
@mock.patch.object(checksum_utils, 'get_checksum_from_url', autospec=True)
def test_get_checksum_and_algo_image_checksum_https_url(self, mock_get):
value = 'https://checksum-url'
i_info = {
'image_checksum': value,
'image_source': 'image-ref'
}
mock_get.return_value = 'f' * 128
csum, algo = checksum_utils.get_checksum_and_algo(i_info)
mock_get.assert_called_once_with(value, 'image-ref')
self.assertEqual('f' * 128, csum)
self.assertEqual('sha512', algo)
@mock.patch.object(fileutils, 'compute_file_checksum', autospec=True)
def test_get_checksum_and_algo_no_checksum_file_url(self, mock_cfc):
i_info = {
'image_source': 'file:///var/lib/ironic/images/foo.raw'
}
mock_cfc.return_value = 'f' * 64
csum, algo = checksum_utils.get_checksum_and_algo(i_info)
mock_cfc.assert_called_once_with('/var/lib/ironic/images/foo.raw',
algorithm='sha256')
self.assertEqual('f' * 64, csum)
self.assertEqual('sha256', algo)
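Two behaviors these tests exercise can be condensed into one hypothetical sketch of `get_checksum_and_algo`: inferring the algorithm from the hex-digest length (64 chars implies sha256, 128 implies sha512, a 32-char md5 value carries no explicit algorithm), and, per the "Calculate missing checksum for file:// based images" change, hashing a `file://` image on the fly when no checksum was supplied. `compute_file_checksum` stands in for `oslo_utils.fileutils.compute_file_checksum`:

```python
from urllib import parse as urlparse


def get_checksum_and_algo_sketch(i_info, compute_file_checksum=None):
    checksum = i_info.get('image_checksum')
    if checksum:
        # Length-based inference; 32-char md5 values return algo None.
        algo = {64: 'sha256', 128: 'sha512'}.get(len(checksum))
        return checksum, algo
    url = urlparse.urlparse(i_info.get('image_source', ''))
    if url.scheme == 'file':
        # No checksum for a file:// image: calculate one on the fly.
        return compute_file_checksum(url.path, algorithm='sha256'), 'sha256'
    return None, None
```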
@mock.patch.object(image_service.HttpImageService, 'get',
autospec=True)
class IronicChecksumUtilsGetChecksumTestCase(base.TestCase):
def test_get_checksum_from_url_empty_response(self, mock_get):
mock_get.return_value = ''
error = ('Failed to download image https://checksum-url, '
'reason: Checksum file empty.')
self.assertRaisesRegex(exception.ImageDownloadFailed,
error,
checksum_utils.get_checksum_from_url,
'https://checksum-url',
'https://image-url/file')
mock_get.assert_called_once_with('https://checksum-url')
def test_get_checksum_from_url_one_line(self, mock_get):
mock_get.return_value = 'a' * 32
csum = checksum_utils.get_checksum_from_url(
'https://checksum-url', 'https://image-url/file')
mock_get.assert_called_once_with('https://checksum-url')
self.assertEqual('a' * 32, csum)
def test_get_checksum_from_url_nomatch_line(self, mock_get):
mock_get.return_value = 'foobar'
# For some reason assertRaisesRegex really doesn't like
# the error. Easiest path is just to assertTrue the compare.
exc = self.assertRaises(exception.ImageDownloadFailed,
checksum_utils.get_checksum_from_url,
'https://checksum-url',
'https://image-url/file')
self.assertTrue(
'Invalid checksum file (No valid checksum found' in str(exc))
mock_get.assert_called_once_with('https://checksum-url')
def test_get_checksum_from_url_multiline(self, mock_get):
test_csum = ('f2ca1bb6c7e907d06dafe4687e579fce76b37e4e9'
'3b7605022da52e6ccc26fd2')
mock_get.return_value = ('fee f00\n%s file\nbar fee\nf00' % test_csum)
checksum = checksum_utils.get_checksum_from_url(
'https://checksum-url',
'https://image-url/file')
self.assertEqual(test_csum, checksum)
mock_get.assert_called_once_with('https://checksum-url')
def test_get_checksum_from_url_multiline_no_file(self, mock_get):
test_csum = 'a' * 64
error = ("Failed to download image https://checksum-url, reason: "
"Checksum file does not contain name file")
mock_get.return_value = ('f00\n%s\nbar\nf00' % test_csum)
self.assertRaisesRegex(exception.ImageDownloadFailed,
error,
checksum_utils.get_checksum_from_url,
'https://checksum-url',
'https://image-url/file')
mock_get.assert_called_once_with('https://checksum-url')
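The `get_checksum_from_url` tests above imply a parser for checksum files: either a bare digest on a single line, or coreutils-style `<digest> <filename>` lines from which the entry matching the image filename is selected. A hypothetical sketch of that parsing, raising plain `ValueError` where ironic raises `ImageDownloadFailed`:

```python
import os
import re

# Hex digests from md5 (32 chars) up to sha512 (128 chars).
CHECKSUM_RE = re.compile(r'^[a-fA-F0-9]{32,128}$')


def parse_checksum_file(contents, image_url):
    if not contents:
        raise ValueError('Checksum file empty.')
    lines = [ln.strip() for ln in contents.splitlines() if ln.strip()]
    if len(lines) == 1 and len(lines[0].split()) == 1:
        # A single bare token must itself be a valid digest.
        if not CHECKSUM_RE.match(lines[0]):
            raise ValueError('Invalid checksum file (No valid checksum found)')
        return lines[0]
    # Otherwise find the coreutils-style line naming our image file.
    expected_fname = os.path.basename(image_url)
    for line in lines:
        fields = line.split()
        if (len(fields) >= 2 and fields[1].lstrip('*') == expected_fname
                and CHECKSUM_RE.match(fields[0])):
            return fields[0]
    raise ValueError(
        'Checksum file does not contain name %s' % expected_fname)
```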


@@ -0,0 +1,668 @@
# Copyright 2020 Red Hat, Inc
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import io
import os
import re
import struct
import subprocess
import tempfile
from unittest import mock
from oslo_utils import units
from ironic.common import image_format_inspector as format_inspector
from ironic.tests import base as test_base
TEST_IMAGE_PREFIX = 'ironic-unittest-formatinspector-'
def get_size_from_qemu_img(filename):
output = subprocess.check_output('qemu-img info "%s"' % filename,
shell=True)
for line in output.split(b'\n'):
m = re.search(b'^virtual size: .* .([0-9]+) bytes', line.strip())
if m:
return int(m.group(1))
raise Exception('Could not find virtual size with qemu-img')
class TestFormatInspectors(test_base.TestCase):
block_execute = False
def setUp(self):
super(TestFormatInspectors, self).setUp()
self._created_files = []
def tearDown(self):
super(TestFormatInspectors, self).tearDown()
for fn in self._created_files:
try:
os.remove(fn)
except Exception:
pass
def _create_iso(self, image_size, subformat='9660'):
"""Create an ISO file of the given size.
:param image_size: The size of the image to create in bytes
:param subformat: The subformat to use, if any
"""
# these tests depend on mkisofs
# being installed and in the path,
# if it is not installed, skip
try:
subprocess.check_output('mkisofs --version', shell=True)
except Exception:
self.skipTest('mkisofs not installed')
size = image_size // units.Mi
base_cmd = "mkisofs"
if subformat == 'udf':
# depending on the distribution mkisofs may not support udf
# and may be provided by genisoimage instead. As a result we
# need to check if the command supports udf via help
# instead of checking the installed version.
# mkisofs --help outputs to stderr so we need to
# redirect it to stdout to use grep.
try:
subprocess.check_output(
'mkisofs --help 2>&1 | grep udf', shell=True)
except Exception:
self.skipTest('mkisofs does not support udf format')
base_cmd += " -udf"
prefix = TEST_IMAGE_PREFIX
prefix += '-%s-' % subformat
fn = tempfile.mktemp(prefix=prefix, suffix='.iso')
self._created_files.append(fn)
subprocess.check_output(
'dd if=/dev/zero of=%s bs=1M count=%i' % (fn, size),
shell=True)
# We need to use a different file for input and output because the
# behavior of mkisofs when both are the same file is version
# dependent and can cause test failures
out_fn = "%s.iso" % fn
subprocess.check_output(
'%s -V "TEST" -o %s %s' % (base_cmd, out_fn, fn),
shell=True)
self._created_files.append(out_fn)
return out_fn
def _create_img(
self, fmt, size, subformat=None, options=None,
backing_file=None):
"""Create an image file of the given format and size.
:param fmt: The format to create
:param size: The size of the image to create in bytes
:param subformat: The subformat to use, if any
:param options: A dictionary of options to pass to the format
:param backing_file: The backing file to use, if any
"""
if fmt == 'iso':
return self._create_iso(size, subformat)
if fmt == 'vhd':
# QEMU calls the vhd format vpc
fmt = 'vpc'
# these tests depend on qemu-img being installed and in the path,
# if it is not installed, skip. we also need to ensure that the
# format is supported by qemu-img, this can vary depending on the
# distribution so we need to check if the format is supported via
# the help output.
try:
subprocess.check_output(
'qemu-img --help | grep %s' % fmt, shell=True)
except Exception:
self.skipTest(
'qemu-img not installed or does not support %s format' % fmt)
if options is None:
options = {}
opt = ''
prefix = TEST_IMAGE_PREFIX
if subformat:
options['subformat'] = subformat
prefix += subformat + '-'
if options:
opt += '-o ' + ','.join('%s=%s' % (k, v)
for k, v in options.items())
if backing_file is not None:
opt += ' -b %s -F raw' % backing_file
fn = tempfile.mktemp(prefix=prefix,
suffix='.%s' % fmt)
self._created_files.append(fn)
subprocess.check_output(
'qemu-img create -f %s %s %s %i' % (fmt, opt, fn, size),
shell=True)
return fn
def _create_allocated_vmdk(self, size_mb, subformat=None):
# We need a "big" VMDK file to exercise some parts of the code of the
# format_inspector. A way to create one is to first create an empty
# file, and then to convert it with the -S 0 option.
if subformat is None:
# Matches qemu-img default, see `qemu-img convert -O vmdk -o help`
subformat = 'monolithicSparse'
prefix = TEST_IMAGE_PREFIX
prefix += '-%s-' % subformat
fn = tempfile.mktemp(prefix=prefix, suffix='.vmdk')
self._created_files.append(fn)
raw = tempfile.mktemp(prefix=prefix, suffix='.raw')
self._created_files.append(raw)
# Create a file with pseudo-random data, otherwise it will get
# compressed in the streamOptimized format
subprocess.check_output(
'dd if=/dev/urandom of=%s bs=1M count=%i' % (raw, size_mb),
shell=True)
# Convert it to VMDK
subprocess.check_output(
'qemu-img convert -f raw -O vmdk -o subformat=%s -S 0 %s %s' % (
subformat, raw, fn),
shell=True)
return fn
def _test_format_at_block_size(self, format_name, img, block_size):
fmt = format_inspector.get_inspector(format_name)()
self.assertIsNotNone(fmt,
'Did not get format inspector for %s' % (
format_name))
wrapper = format_inspector.InfoWrapper(open(img, 'rb'), fmt)
while True:
chunk = wrapper.read(block_size)
if not chunk:
break
wrapper.close()
return fmt
def _test_format_at_image_size(self, format_name, image_size,
subformat=None):
"""Test the format inspector for the given format at the given image size.
:param format_name: The format to test
:param image_size: The size of the image to create in bytes
:param subformat: The subformat to use, if any
""" # noqa
img = self._create_img(format_name, image_size, subformat=subformat)
# Some formats have internal alignment restrictions making this not
# always exactly like image_size, so get the real value for comparison
virtual_size = get_size_from_qemu_img(img)
# Read the format in various sizes, some of which will read whole
# sections in a single read, others will be completely unaligned, etc.
block_sizes = [64 * units.Ki, 1 * units.Mi]
# ISO images have a 32KB system area at the beginning of the image
# as a result reading that in 17 or 512 byte blocks takes too long,
# causing the test to fail. The 64KiB block size is enough to read
# the system area and header in a single read. The 1MiB block size
# adds very little time to the test so we include it.
if format_name != 'iso':
block_sizes.extend([17, 512])
for block_size in block_sizes:
fmt = self._test_format_at_block_size(format_name, img, block_size)
self.assertTrue(fmt.format_match,
'Failed to match %s at size %i block %i' % (
format_name, image_size, block_size))
self.assertEqual(virtual_size, fmt.virtual_size,
('Failed to calculate size for %s at size %i '
'block %i') % (format_name, image_size,
block_size))
memory = sum(fmt.context_info.values())
self.assertLess(memory, 512 * units.Ki,
'Format used more than 512KiB of memory: %s' % (
fmt.context_info))
def _test_format(self, format_name, subformat=None):
# Try a few different image sizes, including some odd and very small
# sizes
for image_size in (512, 513, 2057, 7):
self._test_format_at_image_size(format_name, image_size * units.Mi,
subformat=subformat)
def test_qcow2(self):
self._test_format('qcow2')
def test_iso_9660(self):
self._test_format('iso', subformat='9660')
def test_iso_udf(self):
self._test_format('iso', subformat='udf')
def _generate_bad_iso(self):
# we want to emulate a malicious user who uploads an
# ISO file that has a qcow2 header in the system area
# of the ISO file
# we will create a qcow2 image and an ISO file
# and then copy the qcow2 header to the ISO file
# e.g.
# mkisofs -o orig.iso /etc/resolv.conf
# qemu-img create orig.qcow2 -f qcow2 64M
# dd if=orig.qcow2 of=outcome bs=32K count=1
# dd if=orig.iso of=outcome bs=32K skip=1 seek=1
qcow = self._create_img('qcow2', 10 * units.Mi)
iso = self._create_iso(64 * units.Mi, subformat='9660')
# first ensure the files are valid
iso_fmt = self._test_format_at_block_size('iso', iso, 4 * units.Ki)
self.assertTrue(iso_fmt.format_match)
qcow_fmt = self._test_format_at_block_size('qcow2', qcow, 4 * units.Ki)
self.assertTrue(qcow_fmt.format_match)
# now copy the qcow2 header to an ISO file
prefix = TEST_IMAGE_PREFIX
prefix += '-bad-'
fn = tempfile.mktemp(prefix=prefix, suffix='.iso')
self._created_files.append(fn)
subprocess.check_output(
'dd if=%s of=%s bs=32K count=1' % (qcow, fn),
shell=True)
subprocess.check_output(
'dd if=%s of=%s bs=32K skip=1 seek=1' % (iso, fn),
shell=True)
return qcow, iso, fn
def test_bad_iso_qcow2(self):
_, _, fn = self._generate_bad_iso()
iso_check = self._test_format_at_block_size('iso', fn, 4 * units.Ki)
qcow_check = self._test_format_at_block_size('qcow2', fn, 4 * units.Ki)
# the system area of the ISO file is not considered part of the
# format; the qcow2 header is in the system area of the ISO file,
# so the ISO file is still valid
self.assertTrue(iso_check.format_match)
# the qcow2 header is in the system area of the ISO file
# but that will be parsed by the qcow2 format inspector
# and it will match
self.assertTrue(qcow_check.format_match)
# if we call format_inspector.detect_file_format it should detect
# and raise an exception because both match internally.
e = self.assertRaises(
format_inspector.ImageFormatError,
format_inspector.detect_file_format, fn)
self.assertIn('Multiple formats detected', str(e))
def test_vhd(self):
self._test_format('vhd')
# NOTE(TheJulia): This is not a supported format, and we know this
# test can timeout due to some of the inner workings. Overall the
# code covered by this is being moved to oslo in the future, so
# keeping this test in ironic is also not necessary.
# def test_vhdx(self):
# self._test_format('vhdx')
def test_vmdk(self):
self._test_format('vmdk')
def test_vmdk_stream_optimized(self):
self._test_format('vmdk', 'streamOptimized')
def test_from_file_reads_minimum(self):
img = self._create_img('qcow2', 10 * units.Mi)
file_size = os.stat(img).st_size
fmt = format_inspector.QcowInspector.from_file(img)
# We know everything we need from the first 512 bytes of a QCOW image,
# so make sure that we did not read the whole thing when we inspect
# a local file.
self.assertLess(fmt.actual_size, file_size)
def test_qed_always_unsafe(self):
img = self._create_img('qed', 10 * units.Mi)
fmt = format_inspector.get_inspector('qed').from_file(img)
self.assertTrue(fmt.format_match)
self.assertFalse(fmt.safety_check())
def _test_vmdk_bad_descriptor_offset(self, subformat=None):
format_name = 'vmdk'
image_size = 10 * units.Mi
descriptorOffsetAddr = 0x1c
BAD_ADDRESS = 0x400
img = self._create_img(format_name, image_size, subformat=subformat)
# Corrupt the header
fd = open(img, 'r+b')
fd.seek(descriptorOffsetAddr)
fd.write(struct.pack('<Q', BAD_ADDRESS // 512))
fd.close()
# Read the format in various sizes, some of which will read whole
# sections in a single read, others will be completely unaligned, etc.
for block_size in (64 * units.Ki, 512, 17, 1 * units.Mi):
fmt = self._test_format_at_block_size(format_name, img, block_size)
self.assertTrue(fmt.format_match,
'Failed to match %s at size %i block %i' % (
format_name, image_size, block_size))
self.assertEqual(0, fmt.virtual_size,
('Calculated a virtual size for a corrupt %s at '
'size %i block %i') % (format_name, image_size,
block_size))
def test_vmdk_bad_descriptor_offset(self):
self._test_vmdk_bad_descriptor_offset()
def test_vmdk_bad_descriptor_offset_stream_optimized(self):
self._test_vmdk_bad_descriptor_offset(subformat='streamOptimized')
def _test_vmdk_bad_descriptor_mem_limit(self, subformat=None):
format_name = 'vmdk'
image_size = 5 * units.Mi
virtual_size = 5 * units.Mi
descriptorOffsetAddr = 0x1c
descriptorSizeAddr = descriptorOffsetAddr + 8
twoMBInSectors = (2 << 20) // 512
# We need a big VMDK because otherwise we will not have enough data to
# fill-up the CaptureRegion.
img = self._create_allocated_vmdk(image_size // units.Mi,
subformat=subformat)
# Corrupt the end of descriptor address so it "ends" at 2MB
fd = open(img, 'r+b')
fd.seek(descriptorSizeAddr)
fd.write(struct.pack('<Q', twoMBInSectors))
fd.close()
# Read the format in various sizes, some of which will read whole
# sections in a single read, others will be completely unaligned, etc.
for block_size in (64 * units.Ki, 512, 17, 1 * units.Mi):
fmt = self._test_format_at_block_size(format_name, img, block_size)
self.assertTrue(fmt.format_match,
'Failed to match %s at size %i block %i' % (
format_name, image_size, block_size))
self.assertEqual(virtual_size, fmt.virtual_size,
('Failed to calculate size for %s at size %i '
'block %i') % (format_name, image_size,
block_size))
memory = sum(fmt.context_info.values())
self.assertLess(memory, 1.5 * units.Mi,
'Format used more than 1.5MiB of memory: %s' % (
fmt.context_info))
def test_vmdk_bad_descriptor_mem_limit(self):
self._test_vmdk_bad_descriptor_mem_limit()
def test_vmdk_bad_descriptor_mem_limit_stream_optimized(self):
self._test_vmdk_bad_descriptor_mem_limit(subformat='streamOptimized')
def test_qcow2_safety_checks(self):
# Create backing and data-file names (and initialize the backing file)
backing_fn = tempfile.mktemp(prefix='backing')
self._created_files.append(backing_fn)
with open(backing_fn, 'w') as f:
f.write('foobar')
data_fn = tempfile.mktemp(prefix='data')
self._created_files.append(data_fn)
# A qcow with no backing or data file is safe
fn = self._create_img('qcow2', 5 * units.Mi, None)
inspector = format_inspector.QcowInspector.from_file(fn)
self.assertTrue(inspector.safety_check())
# A backing file makes it unsafe
fn = self._create_img('qcow2', 5 * units.Mi, None,
backing_file=backing_fn)
inspector = format_inspector.QcowInspector.from_file(fn)
self.assertFalse(inspector.safety_check())
# A data-file makes it unsafe
fn = self._create_img('qcow2', 5 * units.Mi,
options={'data_file': data_fn,
'data_file_raw': 'on'})
inspector = format_inspector.QcowInspector.from_file(fn)
self.assertFalse(inspector.safety_check())
# Trying to load a non-QCOW file is an error
self.assertRaises(format_inspector.ImageFormatError,
format_inspector.QcowInspector.from_file,
backing_fn)
def test_qcow2_feature_flag_checks(self):
data = bytearray(512)
data[0:4] = b'QFI\xFB'
inspector = format_inspector.QcowInspector()
inspector.region('header').data = data
# All zeros, no feature flags - all good
self.assertFalse(inspector.has_unknown_features)
# A feature flag set in the first byte (highest-order) is not
# something we know about, so fail.
data[0x48] = 0x01
self.assertTrue(inspector.has_unknown_features)
# The first bit in the last byte (lowest-order) is known (the dirty
# bit) so that should pass
data[0x48] = 0x00
data[0x4F] = 0x01
self.assertFalse(inspector.has_unknown_features)
# Currently (as of 2024), the high-order feature flag bit in the low-
# order byte is not assigned, so make sure we reject it.
data[0x4F] = 0x80
self.assertTrue(inspector.has_unknown_features)
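The feature-flag test above documents the layout it relies on: the qcow2 header stores its incompatible-feature flags as a big-endian 64-bit value at bytes 0x48-0x4F, and only the low-order dirty bit is treated as known. A minimal standalone sketch of that check (the real inspector reads the flags out of its captured header region):

```python
import struct

# Only the qcow2 dirty bit (bit 0) is assumed known in this sketch.
KNOWN_INCOMPATIBLE_BITS = 0x01


def has_unknown_features(header_bytes):
    # Incompatible feature flags: big-endian u64 at header offset 0x48.
    flags = struct.unpack('>Q', header_bytes[0x48:0x50])[0]
    # Any bit outside the known mask is an unknown feature.
    return bool(flags & ~KNOWN_INCOMPATIBLE_BITS)
```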
def test_vdi(self):
self._test_format('vdi')
def _test_format_with_invalid_data(self, format_name):
fmt = format_inspector.get_inspector(format_name)()
wrapper = format_inspector.InfoWrapper(open(__file__, 'rb'), fmt)
while True:
chunk = wrapper.read(32)
if not chunk:
break
wrapper.close()
self.assertFalse(fmt.format_match)
self.assertEqual(0, fmt.virtual_size)
memory = sum(fmt.context_info.values())
self.assertLess(memory, 512 * units.Ki,
'Format used more than 512KiB of memory: %s' % (
fmt.context_info))
def test_qcow2_invalid(self):
self._test_format_with_invalid_data('qcow2')
def test_vhd_invalid(self):
self._test_format_with_invalid_data('vhd')
def test_vhdx_invalid(self):
self._test_format_with_invalid_data('vhdx')
def test_vmdk_invalid(self):
self._test_format_with_invalid_data('vmdk')
def test_vdi_invalid(self):
self._test_format_with_invalid_data('vdi')
def test_vmdk_invalid_type(self):
fmt = format_inspector.get_inspector('vmdk')()
wrapper = format_inspector.InfoWrapper(open(__file__, 'rb'), fmt)
while True:
chunk = wrapper.read(32)
if not chunk:
break
wrapper.close()
fake_rgn = mock.MagicMock()
fake_rgn.complete = True
fake_rgn.data = b'foocreateType="someunknownformat"bar'
with mock.patch.object(fmt, 'has_region', return_value=True,
autospec=True):
with mock.patch.object(fmt, 'region', return_value=fake_rgn,
autospec=True):
self.assertEqual(0, fmt.virtual_size)
class TestFormatInspectorInfra(test_base.TestCase):
def _test_capture_region_bs(self, bs):
data = b''.join(chr(x).encode() for x in range(ord('A'), ord('z')))
regions = [
format_inspector.CaptureRegion(3, 9),
format_inspector.CaptureRegion(0, 256),
format_inspector.CaptureRegion(32, 8),
]
for region in regions:
# None of them should be complete yet
self.assertFalse(region.complete)
pos = 0
for i in range(0, len(data), bs):
chunk = data[i:i + bs]
pos += len(chunk)
for region in regions:
region.capture(chunk, pos)
self.assertEqual(data[3:12], regions[0].data)
self.assertEqual(data[0:256], regions[1].data)
self.assertEqual(data[32:40], regions[2].data)
# The small regions should be complete
self.assertTrue(regions[0].complete)
self.assertTrue(regions[2].complete)
# This region extended past the available data, so not complete
self.assertFalse(regions[1].complete)
def test_capture_region(self):
for block_size in (1, 3, 7, 13, 32, 64):
self._test_capture_region_bs(block_size)
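The capture semantics exercised above — arbitrary block sizes, regions completing exactly when their window is filled — can be sketched as a minimal standalone class (an illustrative reimplementation, not Ironic's actual format_inspector.CaptureRegion):

```python
class CaptureRegion:
    """Accumulate the bytes of one [offset, offset + length) window as
    sequential chunks stream past (minimal illustrative sketch)."""

    def __init__(self, offset, length):
        self.offset = offset
        self.length = length
        self.data = b''
        self.complete = False

    def capture(self, chunk, current_pos):
        # current_pos is the stream position *after* chunk was read
        chunk_start = current_pos - len(chunk)
        region_end = self.offset + self.length
        if chunk_start >= region_end or current_pos <= self.offset:
            return  # chunk lies entirely outside the window
        start = max(self.offset - chunk_start, 0)
        end = min(region_end - chunk_start, len(chunk))
        self.data += chunk[start:end]
        self.complete = len(self.data) >= self.length


# Feed a byte stream in 7-byte chunks, as the test loop above does.
data = bytes(range(64))
region = CaptureRegion(3, 9)
pos = 0
for i in range(0, len(data), 7):
    chunk = data[i:i + 7]
    pos += len(chunk)
    region.capture(chunk, pos)
```

The key detail the varying block sizes probe is the overlap arithmetic: a chunk may cover the start of a region, its middle, its end, or all of it at once.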
def _get_wrapper(self, data):
source = io.BytesIO(data)
fake_fmt = mock.create_autospec(format_inspector.get_inspector('raw'))
return format_inspector.InfoWrapper(source, fake_fmt)
def test_info_wrapper_file_like(self):
data = b''.join(chr(x).encode() for x in range(ord('A'), ord('z')))
wrapper = self._get_wrapper(data)
read_data = b''
while True:
chunk = wrapper.read(8)
if not chunk:
break
read_data += chunk
self.assertEqual(data, read_data)
def test_info_wrapper_iter_like(self):
data = b''.join(chr(x).encode() for x in range(ord('A'), ord('z')))
wrapper = self._get_wrapper(data)
read_data = b''
for chunk in wrapper:
read_data += chunk
self.assertEqual(data, read_data)
def test_info_wrapper_file_like_eats_error(self):
wrapper = self._get_wrapper(b'123456')
wrapper._format.eat_chunk.side_effect = Exception('fail')
data = b''
while True:
chunk = wrapper.read(3)
if not chunk:
break
data += chunk
# Make sure we got all the data despite the error
self.assertEqual(b'123456', data)
# Make sure we only called this once and never again after
# the error was raised
wrapper._format.eat_chunk.assert_called_once_with(b'123')
def test_info_wrapper_iter_like_eats_error(self):
fake_fmt = mock.create_autospec(format_inspector.get_inspector('raw'))
wrapper = format_inspector.InfoWrapper(iter([b'123', b'456']),
fake_fmt)
fake_fmt.eat_chunk.side_effect = Exception('fail')
data = b''
for chunk in wrapper:
data += chunk
# Make sure we got all the data despite the error
self.assertEqual(b'123456', data)
# Make sure we only called this once and never again after
# the error was raised
fake_fmt.eat_chunk.assert_called_once_with(b'123')
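The two "eats_error" tests pin down a property worth stating plainly: an inspection failure must never interrupt the download itself, and the inspector is never called again after its first error. A minimal sketch of a wrapper with that behavior (illustrative, not Ironic's actual InfoWrapper):

```python
import io


class InfoWrapper:
    """Pass chunks through to the reader untouched; feed each chunk
    to the format inspector until its first error, then stop
    inspecting (minimal illustrative sketch)."""

    def __init__(self, source, fmt):
        self._source = source
        self._format = fmt
        self._enabled = True

    def _feed(self, chunk):
        if self._enabled and chunk:
            try:
                self._format.eat_chunk(chunk)
            except Exception:
                self._enabled = False  # swallow; never break the download

    def read(self, size):
        chunk = self._source.read(size)
        self._feed(chunk)
        return chunk


class FailingFormat:
    """Inspector double whose first chunk raises, like the mocks above."""

    def __init__(self):
        self.calls = []

    def eat_chunk(self, chunk):
        self.calls.append(chunk)
        raise Exception('fail')


fmt = FailingFormat()
wrapper = InfoWrapper(io.BytesIO(b'123456'), fmt)
data = b''
while True:
    chunk = wrapper.read(3)
    if not chunk:
        break
    data += chunk
```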
def test_get_inspector(self):
self.assertEqual(format_inspector.QcowInspector,
format_inspector.get_inspector('qcow2'))
self.assertIsNone(format_inspector.get_inspector('foo'))
class TestFormatInspectorsTargeted(test_base.TestCase):
def _make_vhd_meta(self, guid_raw, item_length):
# Meta region header, padded to 32 bytes
data = struct.pack('<8sHH', b'metadata', 0, 1)
data += b'0' * 20
# Metadata table entry, 16-byte GUID, 12-byte information,
# padded to 32-bytes
data += guid_raw
data += struct.pack('<III', 256, item_length, 0)
data += b'0' * 6
return data
def test_vhd_table_over_limit(self):
ins = format_inspector.VHDXInspector()
meta = format_inspector.CaptureRegion(0, 0)
desired = b'012345678ABCDEF0'
# This is a poorly-crafted image that specifies a larger table size
# than is allowed
meta.data = self._make_vhd_meta(desired, 33 * 2048)
ins.new_region('metadata', meta)
new_region = ins._find_meta_entry(ins._guid(desired))
# Make sure we clamp to our limit of 32 * 2048
self.assertEqual(
format_inspector.VHDXInspector.VHDX_METADATA_TABLE_MAX_SIZE,
new_region.length)
def test_vhd_table_under_limit(self):
ins = format_inspector.VHDXInspector()
meta = format_inspector.CaptureRegion(0, 0)
desired = b'012345678ABCDEF0'
meta.data = self._make_vhd_meta(desired, 16 * 2048)
ins.new_region('metadata', meta)
new_region = ins._find_meta_entry(ins._guid(desired))
# Table size was under the limit, make sure we get it back
self.assertEqual(16 * 2048, new_region.length)
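The over/under-limit pair above guards against a malicious image declaring an oversized metadata table. A sketch of the scan-and-clamp step, mirroring the byte layout `_make_vhd_meta` builds (illustrative only, not the inspector's actual `_find_meta_entry`):

```python
import struct

VHDX_METADATA_TABLE_MAX_SIZE = 32 * 2048  # the clamp the tests verify


def find_meta_entry(meta, guid):
    """Scan a VHDX metadata region for a table entry matching guid
    and clamp its declared length (illustrative sketch)."""
    offset = 32  # skip the 32-byte metadata region header
    while offset + 28 <= len(meta):
        entry_guid = meta[offset:offset + 16]
        item_offset, item_length, _flags = struct.unpack_from(
            '<III', meta, offset + 16)
        if entry_guid == guid:
            return item_offset, min(item_length,
                                    VHDX_METADATA_TABLE_MAX_SIZE)
        offset += 32
    return None


def make_meta(guid, item_length):
    # Same construction as _make_vhd_meta above.
    data = struct.pack('<8sHH', b'metadata', 0, 1) + b'0' * 20
    data += guid + struct.pack('<III', 256, item_length, 0) + b'0' * 6
    return data
```

Clamping rather than rejecting keeps inspection bounded in memory even when the header field lies.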


@@ -574,6 +574,40 @@ class HttpImageServiceTestCase(base.TestCase):
verify=True,
timeout=15, auth=None)
@mock.patch.object(requests, 'get', autospec=True)
def test_get_success(self, req_get_mock):
response_mock = req_get_mock.return_value
response_mock.status_code = http_client.OK
response_mock.text = 'value'
self.assertEqual('value', self.service.get('http://url'))
req_get_mock.assert_called_once_with('http://url', stream=False,
verify=True, timeout=60,
auth=None)
@mock.patch.object(requests, 'get', autospec=True)
def test_get_handles_exceptions(self, req_get_mock):
for exc in [OSError, requests.ConnectionError,
requests.RequestException, IOError]:
req_get_mock.reset_mock()
req_get_mock.side_effect = exc
self.assertRaises(exception.ImageDownloadFailed,
self.service.get,
'http://url')
req_get_mock.assert_called_once_with('http://url', stream=False,
verify=True, timeout=60,
auth=None)
@mock.patch.object(requests, 'get', autospec=True)
def test_get_success_verify_false(self, req_get_mock):
cfg.CONF.set_override('webserver_verify_ca', False)
response_mock = req_get_mock.return_value
response_mock.status_code = http_client.OK
response_mock.text = 'value'
self.assertEqual('value', self.service.get('http://url'))
req_get_mock.assert_called_once_with('http://url', stream=False,
verify=False, timeout=60,
auth=None)
class FileImageServiceTestCase(base.TestCase):
def setUp(self):


@@ -21,14 +21,16 @@ import os
import shutil
from unittest import mock
from ironic_lib import disk_utils
from oslo_concurrency import processutils
from oslo_config import cfg
from oslo_utils import fileutils
from ironic.common import exception
from ironic.common.glance_service import service_utils as glance_utils
from ironic.common import image_format_inspector
from ironic.common import image_service
from ironic.common import images
from ironic.common import qemu_img
from ironic.common import utils
from ironic.tests import base
@@ -72,87 +74,279 @@ class IronicImagesTestCase(base.TestCase):
image_to_raw_mock.assert_called_once_with(
'image_href', 'path', 'path.part')
@mock.patch.object(disk_utils, 'qemu_img_info', autospec=True)
def test_image_to_raw_no_file_format(self, qemu_img_info_mock):
info = self.FakeImgInfo()
info.file_format = None
qemu_img_info_mock.return_value = info
@mock.patch.object(fileutils, 'compute_file_checksum',
autospec=True)
@mock.patch.object(image_service, 'get_image_service', autospec=True)
@mock.patch.object(images, 'image_to_raw', autospec=True)
@mock.patch.object(builtins, 'open', autospec=True)
def test_fetch_image_service_force_raw_with_checksum(
self, open_mock, image_to_raw_mock,
image_service_mock, mock_checksum):
mock_file_handle = mock.MagicMock(spec=io.BytesIO)
mock_file_handle.__enter__.return_value = 'file'
open_mock.return_value = mock_file_handle
mock_checksum.return_value = 'f00'
images.fetch('context', 'image_href', 'path', force_raw=True,
checksum='f00', checksum_algo='sha256')
mock_checksum.assert_called_once_with('path', algorithm='sha256')
open_mock.assert_called_once_with('path', 'wb')
image_service_mock.return_value.download.assert_called_once_with(
'image_href', 'file')
image_to_raw_mock.assert_called_once_with(
'image_href', 'path', 'path.part')
@mock.patch.object(fileutils, 'compute_file_checksum',
autospec=True)
@mock.patch.object(image_service, 'get_image_service', autospec=True)
@mock.patch.object(images, 'image_to_raw', autospec=True)
@mock.patch.object(builtins, 'open', autospec=True)
def test_fetch_image_service_with_checksum_mismatch(
self, open_mock, image_to_raw_mock,
image_service_mock, mock_checksum):
mock_file_handle = mock.MagicMock(spec=io.BytesIO)
mock_file_handle.__enter__.return_value = 'file'
open_mock.return_value = mock_file_handle
mock_checksum.return_value = 'a00'
self.assertRaises(exception.ImageChecksumError,
images.fetch, 'context', 'image_href',
'path', force_raw=True,
checksum='f00', checksum_algo='sha256')
mock_checksum.assert_called_once_with('path', algorithm='sha256')
open_mock.assert_called_once_with('path', 'wb')
image_service_mock.return_value.download.assert_called_once_with(
'image_href', 'file')
# If the checksum fails, then we don't attempt to convert the image.
image_to_raw_mock.assert_not_called()
@mock.patch.object(fileutils, 'compute_file_checksum',
autospec=True)
@mock.patch.object(image_service, 'get_image_service', autospec=True)
@mock.patch.object(images, 'image_to_raw', autospec=True)
@mock.patch.object(builtins, 'open', autospec=True)
def test_fetch_image_service_force_raw_no_checksum_algo(
self, open_mock, image_to_raw_mock,
image_service_mock, mock_checksum):
mock_file_handle = mock.MagicMock(spec=io.BytesIO)
mock_file_handle.__enter__.return_value = 'file'
open_mock.return_value = mock_file_handle
mock_checksum.return_value = 'f00'
images.fetch('context', 'image_href', 'path', force_raw=True,
checksum='f00')
mock_checksum.assert_called_once_with('path', algorithm='md5')
open_mock.assert_called_once_with('path', 'wb')
image_service_mock.return_value.download.assert_called_once_with(
'image_href', 'file')
image_to_raw_mock.assert_called_once_with(
'image_href', 'path', 'path.part')
@mock.patch.object(fileutils, 'compute_file_checksum',
autospec=True)
@mock.patch.object(image_service, 'get_image_service', autospec=True)
@mock.patch.object(images, 'image_to_raw', autospec=True)
@mock.patch.object(builtins, 'open', autospec=True)
def test_fetch_image_service_force_raw_combined_algo(
self, open_mock, image_to_raw_mock,
image_service_mock, mock_checksum):
mock_file_handle = mock.MagicMock(spec=io.BytesIO)
mock_file_handle.__enter__.return_value = 'file'
open_mock.return_value = mock_file_handle
mock_checksum.return_value = 'f00'
images.fetch('context', 'image_href', 'path', force_raw=True,
checksum='sha512:f00')
mock_checksum.assert_called_once_with('path', algorithm='sha512')
open_mock.assert_called_once_with('path', 'wb')
image_service_mock.return_value.download.assert_called_once_with(
'image_href', 'file')
image_to_raw_mock.assert_called_once_with(
'image_href', 'path', 'path.part')
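The three fetch variants above pin down how the checksum argument resolves to an algorithm: an explicit `checksum_algo` is used, a combined `'sha512:f00'` value embeds its own algorithm, and md5 is the last-resort default. A sketch of that splitting behavior (helper name hypothetical, not Ironic's actual code):

```python
def resolve_checksum(checksum, checksum_algo=None):
    """Split a possibly-combined checksum value into (algorithm, digest).

    Illustrative sketch of the behavior the fetch tests above imply:
    'algo:digest' embeds the algorithm, an explicit checksum_algo is
    used otherwise, and md5 is the fallback default.
    """
    if ':' in checksum:
        algo, digest = checksum.split(':', 1)
        return algo, digest
    return checksum_algo or 'md5', checksum
```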
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
def test_image_to_raw_not_permitted_format(self, detect_format_mock):
info = mock.MagicMock()
# In the case the image looks okay, but it is not in our permitted
# format list, we need to ensure we still fail appropriately.
info.safety_check.return_value = True
info.__str__.return_value = 'vhd'
detect_format_mock.return_value = info
e = self.assertRaises(exception.ImageUnacceptable, images.image_to_raw,
'image_href', 'path', 'path_tmp')
qemu_img_info_mock.assert_called_once_with('path_tmp')
self.assertIn("'qemu-img info' parsing failed.", str(e))
info.safety_check.assert_called_once()
detect_format_mock.assert_called_once_with('path_tmp')
self.assertIn("The requested image is not valid for use.", str(e))
@mock.patch.object(disk_utils, 'qemu_img_info', autospec=True)
def test_image_to_raw_backing_file_present(self, qemu_img_info_mock):
info = self.FakeImgInfo()
info.file_format = 'raw'
info.backing_file = 'backing_file'
qemu_img_info_mock.return_value = info
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
def test_image_to_raw_fails_safety_check(self, detect_format_mock):
info = mock.MagicMock()
info.__str__.return_value = 'qcow2'
info.safety_check.return_value = False
detect_format_mock.return_value = info
e = self.assertRaises(exception.ImageUnacceptable, images.image_to_raw,
'image_href', 'path', 'path_tmp')
qemu_img_info_mock.assert_called_once_with('path_tmp')
self.assertIn("fmt=raw backed by: backing_file", str(e))
info.safety_check.assert_called_once()
detect_format_mock.assert_called_once_with('path_tmp')
self.assertIn("The requested image is not valid for use.", str(e))
@mock.patch.object(os, 'rename', autospec=True)
@mock.patch.object(os, 'unlink', autospec=True)
@mock.patch.object(disk_utils, 'convert_image', autospec=True)
@mock.patch.object(disk_utils, 'qemu_img_info', autospec=True)
def test_image_to_raw(self, qemu_img_info_mock, convert_image_mock,
@mock.patch.object(qemu_img, 'convert_image', autospec=True)
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
def test_image_to_raw(self, detect_format_mock, convert_image_mock,
unlink_mock, rename_mock):
CONF.set_override('force_raw_images', True)
info = self.FakeImgInfo()
info.file_format = 'fmt'
info = mock.MagicMock()
info.__str__.side_effect = iter(['qcow2', 'raw'])
info.backing_file = None
qemu_img_info_mock.return_value = info
info.safety_check.return_value = True
detect_format_mock.return_value = info
def convert_side_effect(source, dest, out_format):
def convert_side_effect(source, dest, out_format, source_format):
info.file_format = 'raw'
convert_image_mock.side_effect = convert_side_effect
images.image_to_raw('image_href', 'path', 'path_tmp')
qemu_img_info_mock.assert_has_calls([mock.call('path_tmp'),
mock.call('path.converted')])
info.safety_check.assert_called_once()
self.assertEqual(2, info.__str__.call_count)
detect_format_mock.assert_has_calls([
mock.call('path_tmp'),
mock.call('path.converted')])
convert_image_mock.assert_called_once_with('path_tmp',
'path.converted', 'raw')
'path.converted', 'raw',
source_format='qcow2')
unlink_mock.assert_called_once_with('path_tmp')
rename_mock.assert_called_once_with('path.converted', 'path')
@mock.patch.object(os, 'rename', autospec=True)
@mock.patch.object(os, 'unlink', autospec=True)
@mock.patch.object(disk_utils, 'convert_image', autospec=True)
@mock.patch.object(disk_utils, 'qemu_img_info', autospec=True)
def test_image_to_raw_not_raw_after_conversion(self, qemu_img_info_mock,
@mock.patch.object(qemu_img, 'convert_image', autospec=True)
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
def test_image_to_raw_safety_check_disabled(
self, detect_format_mock, convert_image_mock,
unlink_mock, rename_mock):
CONF.set_override('force_raw_images', True)
CONF.set_override('disable_deep_image_inspection', True,
group='conductor')
info = mock.MagicMock()
info.__str__.side_effect = iter(['vmdk', 'raw'])
info.backing_file = None
info.safety_check.return_value = None
detect_format_mock.return_value = info
def convert_side_effect(source, dest, out_format, source_format):
info.file_format = 'raw'
convert_image_mock.side_effect = convert_side_effect
images.image_to_raw('image_href', 'path', 'path_tmp')
info.safety_check.assert_not_called()
detect_format_mock.assert_has_calls([
mock.call('path')])
self.assertEqual(2, info.__str__.call_count)
convert_image_mock.assert_called_once_with('path_tmp',
'path.converted', 'raw',
source_format='vmdk')
unlink_mock.assert_called_once_with('path_tmp')
rename_mock.assert_called_once_with('path.converted', 'path')
@mock.patch.object(os, 'rename', autospec=True)
@mock.patch.object(os, 'unlink', autospec=True)
@mock.patch.object(qemu_img, 'convert_image', autospec=True)
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
def test_image_to_raw_safety_check_disabled_fails_to_convert(
self, detect_format_mock, convert_image_mock,
unlink_mock, rename_mock):
CONF.set_override('force_raw_images', True)
CONF.set_override('disable_deep_image_inspection', True,
group='conductor')
info = mock.MagicMock()
info.__str__.return_value = 'vmdk'
info.backing_file = None
info.safety_check.return_value = None
detect_format_mock.return_value = info
self.assertRaises(exception.ImageConvertFailed,
images.image_to_raw,
'image_href', 'path', 'path_tmp')
info.safety_check.assert_not_called()
self.assertEqual(2, info.__str__.call_count)
detect_format_mock.assert_has_calls([
mock.call('path')])
convert_image_mock.assert_called_once_with('path_tmp',
'path.converted', 'raw',
source_format='vmdk')
unlink_mock.assert_called_once_with('path_tmp')
rename_mock.assert_not_called()
@mock.patch.object(os, 'unlink', autospec=True)
@mock.patch.object(qemu_img, 'convert_image', autospec=True)
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
def test_image_to_raw_not_raw_after_conversion(self, detect_format_mock,
convert_image_mock,
unlink_mock):
CONF.set_override('force_raw_images', True)
info = self.FakeImgInfo()
info.file_format = 'fmt'
info.backing_file = None
qemu_img_info_mock.return_value = info
info = mock.MagicMock()
info.__str__.return_value = 'qcow2'
detect_format_mock.return_value = info
self.assertRaises(exception.ImageConvertFailed, images.image_to_raw,
'image_href', 'path', 'path_tmp')
qemu_img_info_mock.assert_has_calls([mock.call('path_tmp'),
mock.call('path.converted')])
convert_image_mock.assert_called_once_with('path_tmp',
'path.converted', 'raw')
'path.converted', 'raw',
source_format='qcow2')
unlink_mock.assert_called_once_with('path_tmp')
info.safety_check.assert_called_once()
self.assertEqual(2, info.__str__.call_count)
detect_format_mock.assert_has_calls([
mock.call('path_tmp'),
mock.call('path.converted')])
@mock.patch.object(os, 'rename', autospec=True)
@mock.patch.object(disk_utils, 'qemu_img_info', autospec=True)
def test_image_to_raw_already_raw_format(self, qemu_img_info_mock,
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
def test_image_to_raw_already_raw_format(self, detect_format_mock,
rename_mock):
info = self.FakeImgInfo()
info.file_format = 'raw'
info.backing_file = None
qemu_img_info_mock.return_value = info
info = mock.MagicMock()
info.__str__.return_value = 'raw'
detect_format_mock.return_value = info
images.image_to_raw('image_href', 'path', 'path_tmp')
qemu_img_info_mock.assert_called_once_with('path_tmp')
rename_mock.assert_called_once_with('path_tmp', 'path')
info.safety_check.assert_called_once()
self.assertEqual(1, info.__str__.call_count)
detect_format_mock.assert_called_once_with('path_tmp')
@mock.patch.object(os, 'rename', autospec=True)
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
def test_image_to_raw_already_iso(self, detect_format_mock,
rename_mock):
info = mock.MagicMock()
info.__str__.return_value = 'iso'
detect_format_mock.return_value = info
images.image_to_raw('image_href', 'path', 'path_tmp')
rename_mock.assert_called_once_with('path_tmp', 'path')
info.safety_check.assert_called_once()
self.assertEqual(1, info.__str__.call_count)
detect_format_mock.assert_called_once_with('path_tmp')
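Taken together, the image_to_raw tests above trace one decision flow: detect the format, run the safety check and a permitted-format check (unless deep inspection is disabled), keep raw and iso images as-is, and convert everything else. A condensed decision table, illustrative only (the permitted list is a guess; the real function inspects files on disk):

```python
PERMITTED_FORMATS = {'raw', 'qcow2', 'iso'}  # hypothetical permitted list


def image_to_raw_action(fmt, safety_ok, deep_inspection=True):
    """Condensed decision table for the image_to_raw tests above:
    reject unsafe or unknown formats, keep raw/iso as-is, convert
    the rest to raw (illustrative sketch)."""
    if deep_inspection:
        if not safety_ok or fmt not in PERMITTED_FORMATS:
            raise ValueError('The requested image is not valid for use.')
    if fmt in ('raw', 'iso'):
        return 'rename'   # already usable: path_tmp is renamed to path
    return 'convert'      # convert to raw, then re-detect the result
```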
@mock.patch.object(image_service, 'get_image_service', autospec=True)
def test_image_show_no_image_service(self, image_service_mock):
@@ -175,36 +369,39 @@ class IronicImagesTestCase(base.TestCase):
show_mock.assert_called_once_with('context', 'image_href',
'image_service')
@mock.patch.object(disk_utils, 'qemu_img_info', autospec=True)
def test_converted_size_estimate_default(self, qemu_img_info_mock):
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
def test_converted_size_estimate_default(self, image_info_mock):
info = self.FakeImgInfo()
info.disk_size = 2
info.actual_size = 2
info.virtual_size = 10 ** 10
qemu_img_info_mock.return_value = info
image_info_mock.return_value = info
size = images.converted_size('path', estimate=True)
qemu_img_info_mock.assert_called_once_with('path')
image_info_mock.assert_called_once_with('path')
self.assertEqual(4, size)
@mock.patch.object(disk_utils, 'qemu_img_info', autospec=True)
def test_converted_size_estimate_custom(self, qemu_img_info_mock):
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
def test_converted_size_estimate_custom(self, image_info_mock):
CONF.set_override('raw_image_growth_factor', 3)
info = self.FakeImgInfo()
info.disk_size = 2
info.actual_size = 2
info.virtual_size = 10 ** 10
qemu_img_info_mock.return_value = info
image_info_mock.return_value = info
size = images.converted_size('path', estimate=True)
qemu_img_info_mock.assert_called_once_with('path')
image_info_mock.assert_called_once_with('path')
self.assertEqual(6, size)
@mock.patch.object(disk_utils, 'qemu_img_info', autospec=True)
def test_converted_size_estimate_raw_smaller(self, qemu_img_info_mock):
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
def test_converted_size_estimate_raw_smaller(self, image_info_mock):
CONF.set_override('raw_image_growth_factor', 3)
info = self.FakeImgInfo()
info.disk_size = 2
info.actual_size = 2
info.virtual_size = 5
qemu_img_info_mock.return_value = info
image_info_mock.return_value = info
size = images.converted_size('path', estimate=True)
qemu_img_info_mock.assert_called_once_with('path')
image_info_mock.assert_called_once_with('path')
self.assertEqual(5, size)
@mock.patch.object(images, 'get_image_properties', autospec=True)


@@ -0,0 +1,146 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from unittest import mock
from oslo_concurrency import processutils
from oslo_config import cfg
from ironic.common import qemu_img
from ironic.common import utils
from ironic.tests import base
CONF = cfg.CONF
class ConvertImageTestCase(base.TestCase):
@mock.patch.object(utils, 'execute', autospec=True)
def test_convert_image(self, execute_mock):
qemu_img.convert_image('source', 'dest', 'out_format')
execute_mock.assert_called_once_with(
'qemu-img', 'convert', '-f', 'qcow2', '-O',
'out_format', 'source', 'dest',
run_as_root=False,
prlimit=mock.ANY,
use_standard_locale=True,
env_variables={'MALLOC_ARENA_MAX': '3'})
@mock.patch.object(utils, 'execute', autospec=True)
def test_convert_image_flags(self, execute_mock):
qemu_img.convert_image('source', 'dest', 'out_format',
cache='directsync', out_of_order=True,
sparse_size='0')
execute_mock.assert_called_once_with(
'qemu-img', 'convert', '-f', 'qcow2', '-O',
'out_format', '-t', 'directsync',
'-S', '0', '-W', 'source', 'dest',
run_as_root=False,
prlimit=mock.ANY,
use_standard_locale=True,
env_variables={'MALLOC_ARENA_MAX': '3'})
@mock.patch.object(utils, 'execute', autospec=True)
def test_convert_image_retries(self, execute_mock):
ret_err = 'qemu: qemu_thread_create: Resource temporarily unavailable'
execute_mock.side_effect = [
processutils.ProcessExecutionError(stderr=ret_err), ('', ''),
processutils.ProcessExecutionError(stderr=ret_err), ('', ''),
('', ''),
]
qemu_img.convert_image('source', 'dest', 'out_format',
source_format='raw')
convert_call = mock.call('qemu-img', 'convert', '-f', 'raw', '-O',
'out_format', 'source', 'dest',
run_as_root=False,
prlimit=mock.ANY,
use_standard_locale=True,
env_variables={'MALLOC_ARENA_MAX': '3'})
execute_mock.assert_has_calls([
convert_call,
mock.call('sync'),
convert_call,
mock.call('sync'),
convert_call,
])
@mock.patch.object(utils, 'execute', autospec=True)
def test_convert_image_retries_alternate_error(self, execute_mock):
ret_err = 'Failed to allocate memory: Cannot allocate memory\n'
execute_mock.side_effect = [
processutils.ProcessExecutionError(stderr=ret_err), ('', ''),
processutils.ProcessExecutionError(stderr=ret_err), ('', ''),
('', ''),
]
qemu_img.convert_image('source', 'dest', 'out_format')
convert_call = mock.call('qemu-img', 'convert', '-f', 'qcow2', '-O',
'out_format', 'source', 'dest',
run_as_root=False,
prlimit=mock.ANY,
use_standard_locale=True,
env_variables={'MALLOC_ARENA_MAX': '3'})
execute_mock.assert_has_calls([
convert_call,
mock.call('sync'),
convert_call,
mock.call('sync'),
convert_call,
])
@mock.patch.object(utils, 'execute', autospec=True)
def test_convert_image_retries_and_fails(self, execute_mock):
ret_err = 'qemu: qemu_thread_create: Resource temporarily unavailable'
execute_mock.side_effect = [
processutils.ProcessExecutionError(stderr=ret_err), ('', ''),
processutils.ProcessExecutionError(stderr=ret_err), ('', ''),
processutils.ProcessExecutionError(stderr=ret_err), ('', ''),
processutils.ProcessExecutionError(stderr=ret_err),
]
self.assertRaises(processutils.ProcessExecutionError,
qemu_img.convert_image,
'source', 'dest', 'out_format')
convert_call = mock.call('qemu-img', 'convert', '-f', 'qcow2', '-O',
'out_format', 'source', 'dest',
run_as_root=False,
prlimit=mock.ANY,
use_standard_locale=True,
env_variables={'MALLOC_ARENA_MAX': '3'})
execute_mock.assert_has_calls([
convert_call,
mock.call('sync'),
convert_call,
mock.call('sync'),
convert_call,
])
@mock.patch.object(utils, 'execute', autospec=True)
def test_convert_image_just_fails(self, execute_mock):
ret_err = 'Aliens'
execute_mock.side_effect = [
processutils.ProcessExecutionError(stderr=ret_err),
]
self.assertRaises(processutils.ProcessExecutionError,
qemu_img.convert_image,
'source', 'dest', 'out_format')
convert_call = mock.call('qemu-img', 'convert', '-f', 'qcow2', '-O',
'out_format', 'source', 'dest',
run_as_root=False,
prlimit=mock.ANY,
use_standard_locale=True,
env_variables={'MALLOC_ARENA_MAX': '3'})
execute_mock.assert_has_calls([
convert_call,
])
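The retry tests above encode a narrow policy: only memory-pressure failures (`qemu_thread_create` EAGAIN, `Cannot allocate memory`) trigger a `sync` and a retry, anything else fails immediately, and attempts are bounded. The pattern can be sketched standalone (illustrative; `ConvertError` stands in for `processutils.ProcessExecutionError`, and the callables are injected for testability):

```python
class ConvertError(Exception):
    """Stand-in for processutils.ProcessExecutionError (illustrative)."""

    def __init__(self, stderr=''):
        super().__init__(stderr)
        self.stderr = stderr


MEMORY_ERRORS = (
    'Resource temporarily unavailable',
    'Cannot allocate memory',
)


def convert_with_retries(run_convert, run_sync, attempts=3):
    """Retry only on memory-pressure errors, syncing between attempts;
    re-raise anything else, or the last error once attempts run out."""
    for attempt in range(attempts):
        try:
            return run_convert()
        except ConvertError as e:
            retryable = any(m in e.stderr for m in MEMORY_ERRORS)
            if not retryable or attempt == attempts - 1:
                raise
            run_sync()  # flush caches before retrying, as the tests expect


calls = []
outcomes = iter([True, True, False])  # fail, fail, then succeed


def fake_convert():
    calls.append('convert')
    if next(outcomes):
        raise ConvertError(
            'qemu: qemu_thread_create: Resource temporarily unavailable')


def fake_sync():
    calls.append('sync')


convert_with_retries(fake_convert, fake_sync)
```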


@@ -24,6 +24,7 @@ from oslo_utils import fileutils
from oslo_utils import uuidutils
from ironic.common import boot_devices
from ironic.common import checksum_utils
from ironic.common import exception
from ironic.common import faults
from ironic.common import image_service
@@ -604,9 +605,33 @@ class OtherFunctionTestCase(db_base.DbTestCase):
utils.fetch_images(None, mock_cache, [('uuid', 'path')])
mock_clean_up_caches.assert_called_once_with(None, 'master_dir',
[('uuid', 'path')])
mock_cache.fetch_image.assert_called_once_with('uuid', 'path',
ctx=None,
force_raw=True)
mock_cache.fetch_image.assert_called_once_with(
'uuid', 'path',
ctx=None,
force_raw=True,
expected_format=None,
expected_checksum=None,
expected_checksum_algo=None)
@mock.patch.object(image_cache, 'clean_up_caches', autospec=True)
def test_fetch_images_checksum(self, mock_clean_up_caches):
mock_cache = mock.MagicMock(
spec_set=['fetch_image', 'master_dir'], master_dir='master_dir')
utils.fetch_images(None, mock_cache, [('uuid', 'path')],
force_raw=True,
expected_format='qcow2',
expected_checksum='f00',
expected_checksum_algo='sha256')
mock_clean_up_caches.assert_called_once_with(None, 'master_dir',
[('uuid', 'path')])
mock_cache.fetch_image.assert_called_once_with(
'uuid', 'path',
ctx=None,
force_raw=True,
expected_format='qcow2',
expected_checksum='f00',
expected_checksum_algo='sha256')
@mock.patch.object(image_cache, 'clean_up_caches', autospec=True)
def test_fetch_images_fail(self, mock_clean_up_caches):
@@ -1683,13 +1708,18 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
deploy_interface='direct')
cfg.CONF.set_override('image_download_source', 'swift', group='agent')
@mock.patch.object(utils, 'cache_instance_image', autospec=True)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
@mock.patch.object(image_service, 'GlanceImageService', autospec=True)
def test_build_instance_info_for_deploy_glance_image(self, glance_mock,
validate_mock):
validate_mock,
mock_cache_image):
# NOTE(TheJulia): For humans later: This test is geared towards the
# swift-backed glance path where a tempurl will be used.
i_info = self.node.instance_info
i_info['image_source'] = '733d1c44-a2ea-414b-aca7-69decf20d810'
i_info['image_url'] = 'invalid'
driver_internal_info = self.node.driver_internal_info
driver_internal_info['is_whole_disk_image'] = True
self.node.driver_internal_info = driver_internal_info
@@ -1703,10 +1733,26 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
return_value=image_info)
glance_mock.return_value.swift_temp_url.return_value = (
'http://temp-url')
expected_info = {
'configdrive': 'TG9yZW0gaXBzdW0gZG9sb3Igc2l0IGFtZXQ=',
'foo': 'bar',
'image_checksum': 'aa',
'image_container_format': 'bare',
'image_disk_format': 'qcow2',
'image_os_hash_algo': 'sha512',
'image_os_hash_value': 'fake-sha512',
'image_properties': {},
'image_source': '733d1c44-a2ea-414b-aca7-69decf20d810',
'image_tags': [],
'image_type': 'whole-disk',
'image_url': 'http://temp-url'
}
mock_cache_image.return_value = ('fake', '/tmp/foo', 'qcow2')
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
utils.build_instance_info_for_deploy(task)
info = utils.build_instance_info_for_deploy(task)
glance_mock.assert_called_once_with(context=task.context)
glance_mock.return_value.show.assert_called_once_with(
@@ -1715,13 +1761,77 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
image_info)
validate_mock.assert_called_once_with(mock.ANY, 'http://temp-url',
secret=True)
self.assertEqual(expected_info, info)
mock_cache_image.assert_not_called()
@mock.patch.object(utils, 'cache_instance_image', autospec=True)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
@mock.patch.object(image_service, 'GlanceImageService', autospec=True)
def test_build_instance_info_for_deploy_glance_image_checked(
self, glance_mock, validate_mock, mock_cache_image):
# NOTE(TheJulia): For humans later: This test is geared towards the
# swift-backed glance path where a tempurl will be used.
cfg.CONF.set_override('conductor_always_validates_images', True,
group='conductor')
i_info = self.node.instance_info
i_info['image_source'] = '733d1c44-a2ea-414b-aca7-69decf20d810'
i_info['image_url'] = 'invalid'
driver_internal_info = self.node.driver_internal_info
driver_internal_info['is_whole_disk_image'] = True
self.node.driver_internal_info = driver_internal_info
self.node.instance_info = i_info
self.node.save()
image_info = {'checksum': 'aa', 'disk_format': 'qcow2',
'os_hash_algo': 'sha512', 'os_hash_value': 'fake-sha512',
'container_format': 'bare', 'properties': {}}
glance_mock.return_value.show = mock.MagicMock(spec_set=[],
return_value=image_info)
glance_mock.return_value.swift_temp_url.return_value = (
'http://temp-url')
expected_info = {
'configdrive': 'TG9yZW0gaXBzdW0gZG9sb3Igc2l0IGFtZXQ=',
'foo': 'bar',
'image_checksum': 'aa',
'image_container_format': 'bare',
'image_disk_format': 'qcow2',
'image_os_hash_algo': 'sha512',
'image_os_hash_value': 'fake-sha512',
'image_properties': {},
'image_source': '733d1c44-a2ea-414b-aca7-69decf20d810',
'image_tags': [],
'image_type': 'whole-disk',
'image_url': 'http://temp-url'
}
mock_cache_image.return_value = ('fake', '/tmp/foo', 'qcow2')
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
info = utils.build_instance_info_for_deploy(task)
glance_mock.assert_called_once_with(context=task.context)
glance_mock.return_value.show.assert_called_once_with(
self.node.instance_info['image_source'])
glance_mock.return_value.swift_temp_url.assert_called_once_with(
image_info)
validate_mock.assert_called_once_with(mock.ANY, 'http://temp-url',
secret=True)
self.assertEqual(expected_info, info)
mock_cache_image.assert_called_once_with(
mock.ANY, mock.ANY, force_raw=False, expected_format='qcow2')
@mock.patch.object(utils, 'cache_instance_image', autospec=True)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
@mock.patch.object(utils, 'parse_instance_info', autospec=True)
@mock.patch.object(image_service, 'GlanceImageService', autospec=True)
def test_build_instance_info_for_deploy_glance_partition_image(
self, glance_mock, parse_instance_info_mock, validate_mock):
self, glance_mock, parse_instance_info_mock, validate_mock,
mock_cache_image):
# NOTE(TheJulia): For humans later: This test is geared towards the
# swift backed glance path where tempurl will be used.
i_info = {}
i_info['image_source'] = '733d1c44-a2ea-414b-aca7-69decf20d810'
i_info['image_type'] = 'partition'
@ -1730,6 +1840,7 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
i_info['ephemeral_gb'] = 0
i_info['ephemeral_format'] = None
i_info['configdrive'] = 'configdrive'
i_info['image_url'] = 'invalid'
driver_internal_info = self.node.driver_internal_info
driver_internal_info['is_whole_disk_image'] = False
self.node.driver_internal_info = driver_internal_info
@ -1763,6 +1874,8 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
'image_os_hash_value': 'fake-sha512',
'image_container_format': 'bare',
'image_disk_format': 'qcow2'}
mock_cache_image.return_value = ('fake', '/tmp/foo', 'qcow2')
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
@ -1779,16 +1892,93 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
self.assertEqual('partition', image_type)
self.assertEqual(expected_i_info, info)
parse_instance_info_mock.assert_called_once_with(task.node)
mock_cache_image.assert_not_called()
@mock.patch.object(utils, 'cache_instance_image', autospec=True)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
@mock.patch.object(utils, 'parse_instance_info', autospec=True)
@mock.patch.object(image_service, 'GlanceImageService', autospec=True)
def test_build_instance_info_for_deploy_glance_partition_image_checked(
self, glance_mock, parse_instance_info_mock, validate_mock,
mock_cache_image):
# NOTE(TheJulia): For humans later: This test is geared towards the
# swift backed glance path where tempurl will be used.
cfg.CONF.set_override('conductor_always_validates_images', True,
group='conductor')
i_info = {}
i_info['image_source'] = '733d1c44-a2ea-414b-aca7-69decf20d810'
i_info['image_type'] = 'partition'
i_info['root_gb'] = 5
i_info['swap_mb'] = 4
i_info['ephemeral_gb'] = 0
i_info['ephemeral_format'] = None
i_info['configdrive'] = 'configdrive'
i_info['image_url'] = 'invalid'
driver_internal_info = self.node.driver_internal_info
driver_internal_info['is_whole_disk_image'] = False
self.node.driver_internal_info = driver_internal_info
self.node.instance_info = i_info
self.node.save()
image_info = {'checksum': 'aa', 'disk_format': 'qcow2',
'os_hash_algo': 'sha512', 'os_hash_value': 'fake-sha512',
'container_format': 'bare',
'properties': {'kernel_id': 'kernel',
'ramdisk_id': 'ramdisk'}}
glance_mock.return_value.show = mock.MagicMock(spec_set=[],
return_value=image_info)
glance_obj_mock = glance_mock.return_value
glance_obj_mock.swift_temp_url.return_value = 'http://temp-url'
parse_instance_info_mock.return_value = {'swap_mb': 4}
image_source = '733d1c44-a2ea-414b-aca7-69decf20d810'
expected_i_info = {'root_gb': 5,
'swap_mb': 4,
'ephemeral_gb': 0,
'ephemeral_format': None,
'configdrive': 'configdrive',
'image_source': image_source,
'image_url': 'http://temp-url',
'image_type': 'partition',
'image_tags': [],
'image_properties': {'kernel_id': 'kernel',
'ramdisk_id': 'ramdisk'},
'image_checksum': 'aa',
'image_os_hash_algo': 'sha512',
'image_os_hash_value': 'fake-sha512',
'image_container_format': 'bare',
'image_disk_format': 'qcow2'}
mock_cache_image.return_value = ('fake', '/tmp/foo', 'qcow2')
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
info = utils.build_instance_info_for_deploy(task)
glance_mock.assert_called_once_with(context=task.context)
glance_mock.return_value.show.assert_called_once_with(
self.node.instance_info['image_source'])
glance_mock.return_value.swift_temp_url.assert_called_once_with(
image_info)
validate_mock.assert_called_once_with(
mock.ANY, 'http://temp-url', secret=True)
image_type = task.node.instance_info['image_type']
self.assertEqual('partition', image_type)
self.assertEqual(expected_i_info, info)
parse_instance_info_mock.assert_called_once_with(task.node)
mock_cache_image.assert_called_once_with(
mock.ANY, mock.ANY, force_raw=False, expected_format='qcow2')
@mock.patch.object(utils, 'cache_instance_image', autospec=True)
@mock.patch.object(utils, 'get_boot_option', autospec=True,
return_value='kickstart')
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
@mock.patch.object(utils, 'parse_instance_info', autospec=True)
@mock.patch.object(image_service, 'GlanceImageService', autospec=True)
def test_build_instance_info_for_deploy_glance_anaconda(
self, glance_mock, parse_instance_info_mock, validate_mock,
boot_opt_mock, mock_cache_image):
i_info = {}
i_info['image_source'] = '733d1c44-a2ea-414b-aca7-69decf20d810'
i_info['kernel'] = '13ce5a56-1de3-4916-b8b2-be778645d003'
@ -1833,6 +2023,7 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
'image_os_hash_value': 'fake-sha512',
'image_container_format': 'bare',
'image_disk_format': 'qcow2'}
mock_cache_image.return_value = ('fake', '/tmp/foo', 'qcow2')
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
@ -1851,22 +2042,104 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
self.assertEqual('ramdisk', info['ramdisk'])
self.assertEqual(expected_i_info, info)
parse_instance_info_mock.assert_called_once_with(task.node)
mock_cache_image.assert_not_called()
@mock.patch.object(utils, 'cache_instance_image', autospec=True)
@mock.patch.object(utils, 'get_boot_option', autospec=True,
return_value='kickstart')
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
@mock.patch.object(utils, 'parse_instance_info', autospec=True)
@mock.patch.object(image_service, 'GlanceImageService', autospec=True)
def test_build_instance_info_for_deploy_glance_anaconda_img_checked(
self, glance_mock, parse_instance_info_mock, validate_mock,
boot_opt_mock, mock_cache_image):
cfg.CONF.set_override('conductor_always_validates_images', True,
group='conductor')
i_info = {}
i_info['image_source'] = '733d1c44-a2ea-414b-aca7-69decf20d810'
i_info['kernel'] = '13ce5a56-1de3-4916-b8b2-be778645d003'
i_info['ramdisk'] = 'a5a370a8-1b39-433f-be63-2c7d708e4b4e'
i_info['root_gb'] = 5
i_info['swap_mb'] = 4
i_info['ephemeral_gb'] = 0
i_info['ephemeral_format'] = None
i_info['configdrive'] = 'configdrive'
driver_internal_info = self.node.driver_internal_info
driver_internal_info['is_whole_disk_image'] = False
self.node.driver_internal_info = driver_internal_info
self.node.instance_info = i_info
self.node.save()
image_info = {'checksum': 'aa', 'disk_format': 'qcow2',
'os_hash_algo': 'sha512', 'os_hash_value': 'fake-sha512',
'container_format': 'bare',
'properties': {'kernel_id': 'kernel',
'ramdisk_id': 'ramdisk'}}
glance_mock.return_value.show = mock.MagicMock(spec_set=[],
return_value=image_info)
glance_obj_mock = glance_mock.return_value
glance_obj_mock.swift_temp_url.return_value = 'http://temp-url'
parse_instance_info_mock.return_value = {'swap_mb': 4}
image_source = '733d1c44-a2ea-414b-aca7-69decf20d810'
expected_i_info = {'root_gb': 5,
'swap_mb': 4,
'ephemeral_gb': 0,
'ephemeral_format': None,
'configdrive': 'configdrive',
'image_source': image_source,
'image_url': 'http://temp-url',
'kernel': 'kernel',
'ramdisk': 'ramdisk',
'image_type': 'partition',
'image_tags': [],
'image_properties': {'kernel_id': 'kernel',
'ramdisk_id': 'ramdisk'},
'image_checksum': 'aa',
'image_os_hash_algo': 'sha512',
'image_os_hash_value': 'fake-sha512',
'image_container_format': 'bare',
'image_disk_format': 'qcow2'}
mock_cache_image.return_value = ('fake', '/tmp/foo', 'qcow2')
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
info = utils.build_instance_info_for_deploy(task)
glance_mock.assert_called_once_with(context=task.context)
glance_mock.return_value.show.assert_called_once_with(
self.node.instance_info['image_source'])
glance_mock.return_value.swift_temp_url.assert_called_once_with(
image_info)
validate_mock.assert_called_once_with(
mock.ANY, 'http://temp-url', secret=True)
image_type = task.node.instance_info['image_type']
self.assertEqual('partition', image_type)
self.assertEqual('kernel', info['kernel'])
self.assertEqual('ramdisk', info['ramdisk'])
self.assertEqual(expected_i_info, info)
parse_instance_info_mock.assert_called_once_with(task.node)
mock_cache_image.assert_called_once_with(
mock.ANY, mock.ANY, force_raw=False, expected_format='qcow2')
@mock.patch.object(utils, 'cache_instance_image', autospec=True)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
def test_build_instance_info_for_deploy_nonglance_image(
self, validate_href_mock, mock_cache_image):
i_info = self.node.instance_info
driver_internal_info = self.node.driver_internal_info
i_info['image_source'] = 'http://image-ref'
i_info['image_checksum'] = 'aa'
i_info['root_gb'] = 10
i_info['image_url'] = 'prior_failed_url'
driver_internal_info['is_whole_disk_image'] = True
self.node.instance_info = i_info
self.node.driver_internal_info = driver_internal_info
self.node.save()
mock_cache_image.return_value = ('fake', '/tmp/foo', 'qcow2')
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
@ -1876,12 +2149,82 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
info['image_url'])
validate_href_mock.assert_called_once_with(
mock.ANY, 'http://image-ref', False)
mock_cache_image.assert_not_called()
@mock.patch.object(utils, 'cache_instance_image', autospec=True)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
def test_build_instance_info_for_deploy_nonglance_image_fmt_checked(
self, validate_href_mock, mock_cache_image):
cfg.CONF.set_override('conductor_always_validates_images', True,
group='conductor')
i_info = self.node.instance_info
driver_internal_info = self.node.driver_internal_info
i_info['image_source'] = 'http://image-ref'
i_info['image_checksum'] = 'aa'
i_info['root_gb'] = 10
i_info['image_url'] = 'prior_failed_url'
driver_internal_info['is_whole_disk_image'] = True
self.node.instance_info = i_info
self.node.driver_internal_info = driver_internal_info
self.node.save()
mock_cache_image.return_value = ('fake', '/tmp/foo', 'qcow2')
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
info = utils.build_instance_info_for_deploy(task)
self.assertEqual(self.node.instance_info['image_source'],
info['image_url'])
validate_href_mock.assert_called_once_with(
mock.ANY, 'http://image-ref', False)
mock_cache_image.assert_called_once_with(
mock.ANY, mock.ANY, force_raw=False, expected_format=None)
@mock.patch.object(utils, 'cache_instance_image', autospec=True)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
def test_build_instance_info_for_deploy_nonglance_image_fmt_not_checked(
self, validate_href_mock, mock_cache_image):
cfg.CONF.set_override('conductor_always_validates_images', True,
group='conductor')
cfg.CONF.set_override('disable_deep_image_inspection', True,
group='conductor')
i_info = self.node.instance_info
driver_internal_info = self.node.driver_internal_info
i_info['image_source'] = 'http://image-ref'
i_info['image_checksum'] = 'aa'
i_info['root_gb'] = 10
i_info['image_url'] = 'prior_failed_url'
driver_internal_info['is_whole_disk_image'] = True
self.node.instance_info = i_info
self.node.driver_internal_info = driver_internal_info
self.node.save()
mock_cache_image.return_value = ('fake', '/tmp/foo', 'qcow2')
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
info = utils.build_instance_info_for_deploy(task)
self.assertEqual(self.node.instance_info['image_source'],
info['image_url'])
validate_href_mock.assert_called_once_with(
mock.ANY, 'http://image-ref', False)
mock_cache_image.assert_not_called()
@mock.patch.object(utils, 'cache_instance_image', autospec=True)
@mock.patch.object(utils, 'parse_instance_info', autospec=True)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
def test_build_instance_info_for_deploy_nonglance_part_img_checked(
self, validate_href_mock, parse_instance_info_mock,
mock_cache_image):
cfg.CONF.set_override('conductor_always_validates_images', True,
group='conductor')
i_info = {}
driver_internal_info = self.node.driver_internal_info
i_info['image_source'] = 'http://image-ref'
@ -1890,6 +2233,7 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
i_info['image_checksum'] = 'aa'
i_info['root_gb'] = 10
i_info['configdrive'] = 'configdrive'
i_info['image_url'] = 'invalid'
driver_internal_info['is_whole_disk_image'] = False
self.node.instance_info = i_info
self.node.driver_internal_info = driver_internal_info
@ -1908,6 +2252,8 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
'root_gb': 10,
'swap_mb': 5,
'configdrive': 'configdrive'}
mock_cache_image.return_value = ('fake', '/tmp/foo', 'qcow2')
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
@ -1920,6 +2266,58 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
self.assertEqual('partition', info['image_type'])
self.assertEqual(expected_i_info, info)
parse_instance_info_mock.assert_called_once_with(task.node)
mock_cache_image.assert_called_once_with(
mock.ANY, mock.ANY, force_raw=False, expected_format=None)
@mock.patch.object(utils, 'cache_instance_image', autospec=True)
@mock.patch.object(utils, 'parse_instance_info', autospec=True)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
def test_build_instance_info_for_deploy_nonglance_partition_image(
self, validate_href_mock, parse_instance_info_mock,
mock_cache_image):
i_info = {}
driver_internal_info = self.node.driver_internal_info
i_info['image_source'] = 'http://image-ref'
i_info['kernel'] = 'http://kernel-ref'
i_info['ramdisk'] = 'http://ramdisk-ref'
i_info['image_checksum'] = 'aa'
i_info['root_gb'] = 10
i_info['configdrive'] = 'configdrive'
i_info['image_url'] = 'invalid'
driver_internal_info['is_whole_disk_image'] = False
self.node.instance_info = i_info
self.node.driver_internal_info = driver_internal_info
self.node.save()
validate_href_mock.side_effect = ['http://image-ref',
'http://kernel-ref',
'http://ramdisk-ref']
parse_instance_info_mock.return_value = {'swap_mb': 5}
expected_i_info = {'image_source': 'http://image-ref',
'image_url': 'http://image-ref',
'image_type': 'partition',
'kernel': 'http://kernel-ref',
'ramdisk': 'http://ramdisk-ref',
'image_checksum': 'aa',
'root_gb': 10,
'swap_mb': 5,
'configdrive': 'configdrive'}
mock_cache_image.return_value = ('fake', '/tmp/foo', 'qcow2')
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
info = utils.build_instance_info_for_deploy(task)
self.assertEqual(self.node.instance_info['image_source'],
info['image_url'])
validate_href_mock.assert_called_once_with(
mock.ANY, 'http://image-ref', False)
self.assertEqual('partition', info['image_type'])
self.assertEqual(expected_i_info, info)
parse_instance_info_mock.assert_called_once_with(task.node)
mock_cache_image.assert_not_called()
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
@ -1930,6 +2328,7 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
i_info = self.node.instance_info
i_info['image_source'] = 'http://img.qcow2'
i_info['image_checksum'] = 'aa'
i_info['image_url'] = 'invalid'
self.node.instance_info = i_info
self.node.save()
@ -1946,6 +2345,7 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
i_info = self.node.instance_info
driver_internal_info = self.node.driver_internal_info
i_info['image_source'] = 'http://image-url/folder/'
i_info['image_url'] = 'invalid'
driver_internal_info.pop('is_whole_disk_image', None)
driver_internal_info['is_source_a_path'] = True
self.node.instance_info = i_info
@ -1992,6 +2392,50 @@ class TestBuildInstanceInfoForDeploy(db_base.DbTestCase):
validate_href_mock.assert_called_once_with(
mock.ANY, 'http://image-url/folder', False)
@mock.patch.object(utils, 'cache_instance_image', autospec=True)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
def test_build_instance_info_for_deploy_source_redirect_not_path(
self, validate_href_mock, mock_cache_image):
cfg.CONF.set_override('conductor_always_validates_images', True,
group='conductor')
mock_cache_image.return_value = ('fake', '/tmp/foo', 'qcow2')
i_info = self.node.instance_info
driver_internal_info = self.node.driver_internal_info
url = 'http://image-url/file'
r_url = 'https://image-url/file'
i_info['image_source'] = url
driver_internal_info['is_whole_disk_image'] = True
self.node.instance_info = i_info
self.node.driver_internal_info = driver_internal_info
self.node.save()
validate_href_mock.side_effect = iter([
exception.ImageRefIsARedirect(
image_ref=url,
redirect_url=r_url),
None])
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
info = utils.build_instance_info_for_deploy(task)
self.assertNotEqual(self.node.instance_info['image_source'],
info['image_url'])
self.assertEqual(r_url, info['image_url'])
validate_href_mock.assert_has_calls([
mock.call(mock.ANY, 'http://image-url/file', False),
mock.call(mock.ANY, 'https://image-url/file', False)
])
mock_cache_image.assert_called_once_with(mock.ANY,
task.node,
force_raw=False,
expected_format=None)
self.assertEqual('https://image-url/file',
task.node.instance_info['image_url'])
self.assertEqual('https://image-url/file',
task.node.instance_info['image_source'])
class TestBuildInstanceInfoForHttpProvisioning(db_base.DbTestCase):
def setUp(self):
@ -2017,7 +2461,8 @@ class TestBuildInstanceInfoForHttpProvisioning(db_base.DbTestCase):
self.node.uuid)
self.cache_image_mock.return_value = (
'733d1c44-a2ea-414b-aca7-69decf20d810',
self.fake_path,
'qcow2')
self.ensure_tree_mock = self.useFixture(fixtures.MockPatchObject(
utils.fileutils, 'ensure_tree', autospec=True)).mock
self.create_link_mock = self.useFixture(fixtures.MockPatchObject(
@ -2039,19 +2484,25 @@ class TestBuildInstanceInfoForHttpProvisioning(db_base.DbTestCase):
autospec=True)
@mock.patch.object(image_service, 'GlanceImageService', autospec=True)
def _test_build_instance_info(self, glance_mock, validate_mock,
image_info={}, expect_raw=False,
expect_format='qcow2',
expect_checksum='fake-sha512',
expect_checksum_algo='sha512'):
glance_mock.return_value.show = mock.MagicMock(spec_set=[],
return_value=image_info)
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
instance_info = utils.build_instance_info_for_deploy(task)
glance_mock.assert_called_once_with(context=task.context)
glance_mock.return_value.show.assert_called_once_with(
self.node.instance_info['image_source'])
self.cache_image_mock.assert_called_once_with(
task.context,
task.node,
force_raw=expect_raw,
expected_format=expect_format,
expected_checksum=expect_checksum,
expected_checksum_algo=expect_checksum_algo)
symlink_dir = utils._get_http_image_symlink_dir_path()
symlink_file = utils._get_http_image_symlink_file_path(
self.node.uuid)
@ -2092,7 +2543,7 @@ class TestBuildInstanceInfoForHttpProvisioning(db_base.DbTestCase):
cfg.CONF.set_override('force_raw_images', True)
self.image_info['disk_format'] = 'raw'
image_path, instance_info = self._test_build_instance_info(
image_info=self.image_info, expect_raw=True, expect_format='raw')
self.assertEqual(instance_info['image_checksum'], 'aa')
self.assertEqual(instance_info['image_os_hash_algo'], 'sha512')
@ -2105,7 +2556,8 @@ class TestBuildInstanceInfoForHttpProvisioning(db_base.DbTestCase):
cfg.CONF.set_override('force_raw_images', True)
self.image_info['os_hash_algo'] = 'md5'
image_path, instance_info = self._test_build_instance_info(
image_info=self.image_info, expect_raw=True,
expect_checksum_algo='md5')
self.assertIsNone(instance_info['image_checksum'])
self.assertEqual(instance_info['image_disk_format'], 'raw')
@ -2117,7 +2569,8 @@ class TestBuildInstanceInfoForHttpProvisioning(db_base.DbTestCase):
self.image_info['os_hash_algo'] = 'md5'
self.image_info['disk_format'] = 'raw'
image_path, instance_info = self._test_build_instance_info(
image_info=self.image_info, expect_raw=True, expect_format='raw',
expect_checksum_algo='md5')
self.assertEqual(instance_info['image_checksum'], 'aa')
self.assertEqual(instance_info['image_disk_format'], 'raw')
@ -2149,7 +2602,10 @@ class TestBuildInstanceInfoForHttpProvisioning(db_base.DbTestCase):
self.assertEqual('fake-checksum', info['image_os_hash_value'])
self.assertEqual('raw', info['image_disk_format'])
self.cache_image_mock.assert_called_once_with(
task.context, task.node, force_raw=True,
expected_format=None,
expected_checksum='aa',
expected_checksum_algo=None)
self.checksum_mock.assert_called_once_with(
self.fake_path, algorithm='sha256')
validate_href_mock.assert_called_once_with(
@ -2181,7 +2637,10 @@ class TestBuildInstanceInfoForHttpProvisioning(db_base.DbTestCase):
self.assertEqual('sha256', info['image_os_hash_algo'])
self.assertEqual('fake-checksum', info['image_os_hash_value'])
self.cache_image_mock.assert_called_once_with(
task.context, task.node, force_raw=True,
expected_format=None,
expected_checksum='aa',
expected_checksum_algo=None)
self.checksum_mock.assert_called_once_with(
self.fake_path, algorithm='sha256')
validate_href_mock.assert_called_once_with(
@ -2198,6 +2657,7 @@ class TestBuildInstanceInfoForHttpProvisioning(db_base.DbTestCase):
i_info['image_checksum'] = 'aa'
i_info['root_gb'] = 10
i_info['image_download_source'] = 'local'
i_info['image_disk_format'] = 'qcow2'
driver_internal_info['is_whole_disk_image'] = True
self.node.instance_info = i_info
self.node.driver_internal_info = driver_internal_info
@ -2215,12 +2675,59 @@ class TestBuildInstanceInfoForHttpProvisioning(db_base.DbTestCase):
self.assertEqual('sha256', info['image_os_hash_algo'])
self.assertEqual('fake-checksum', info['image_os_hash_value'])
self.cache_image_mock.assert_called_once_with(
task.context, task.node, force_raw=True,
expected_format='qcow2',
expected_checksum='aa',
expected_checksum_algo=None)
self.checksum_mock.assert_called_once_with(
self.fake_path, algorithm='sha256')
validate_href_mock.assert_called_once_with(
mock.ANY, expected_url, False)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
def test_build_instance_info_remote_image_via_http_verified(
self, validate_href_mock):
cfg.CONF.set_override('stream_raw_images', False, group='agent')
cfg.CONF.set_override('image_download_source', 'http', group='agent')
cfg.CONF.set_override('conductor_always_validates_images', True,
group='conductor')
i_info = self.node.instance_info
driver_internal_info = self.node.driver_internal_info
i_info['image_source'] = 'http://image-ref'
i_info['image_checksum'] = 'aa'
i_info['root_gb'] = 10
i_info['image_download_source'] = 'local'
i_info['image_disk_format'] = 'qcow2'
# NOTE(TheJulia): This is the override ability, and we need to
# explicitly exercise the alternate path
del i_info['image_download_source']
del i_info['image_url']
driver_internal_info['is_whole_disk_image'] = True
self.node.instance_info = i_info
self.node.driver_internal_info = driver_internal_info
self.node.save()
expected_url = 'http://image-ref'
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
info = utils.build_instance_info_for_deploy(task)
self.assertEqual(expected_url, info['image_url'])
# Image is not extracted, checksum is not changed,
# due to not being forced to raw.
self.assertNotIn('image_os_hash_algo', info)
self.assertNotIn('image_os_hash_value', info)
self.cache_image_mock.assert_called_once_with(
mock.ANY, mock.ANY, force_raw=False,
expected_format='qcow2')
self.checksum_mock.assert_not_called()
validate_href_mock.assert_called_once_with(
mock.ANY, expected_url, False)
self.assertEqual(expected_url,
task.node.instance_info['image_url'])
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
def test_build_instance_info_local_image_via_dinfo(self,
@ -2251,7 +2758,10 @@ class TestBuildInstanceInfoForHttpProvisioning(db_base.DbTestCase):
self.assertEqual('sha256', info['image_os_hash_algo'])
self.assertEqual('fake-checksum', info['image_os_hash_value'])
self.cache_image_mock.assert_called_once_with(
task.context, task.node, force_raw=True,
expected_format=None,
expected_checksum='aa',
expected_checksum_algo=None)
self.checksum_mock.assert_called_once_with(
self.fake_path, algorithm='sha256')
validate_href_mock.assert_called_once_with(
@ -2280,18 +2790,182 @@ class TestBuildInstanceInfoForHttpProvisioning(db_base.DbTestCase):
self.context, self.node.uuid, shared=False) as task:
info = utils.build_instance_info_for_deploy(task)
self.assertEqual(expected_url, info['image_url'])
self.assertEqual('aa', info['image_checksum'])
self.assertEqual('raw', info['image_disk_format'])
self.assertIsNone(info['image_os_hash_algo'])
self.assertIsNone(info['image_os_hash_value'])
self.cache_image_mock.assert_called_once_with(
task.context, task.node, force_raw=True,
expected_format='raw', expected_checksum='aa',
expected_checksum_algo=None)
self.checksum_mock.assert_not_called()
validate_href_mock.assert_called_once_with(
mock.ANY, expected_url, False)
@mock.patch.object(image_service.HttpImageService, 'get',
autospec=True)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
def test_build_instance_info_remote_checksum_image(self,
validate_href_mock,
get_mock):
# Test case where we would download both the image and the checksum
# and ultimately convert the image.
get_mock.return_value = 'd8e8fca2dc0f896fd7cb4cb0031ba249'
cfg.CONF.set_override('image_download_source', 'local', group='agent')
i_info = self.node.instance_info
driver_internal_info = self.node.driver_internal_info
i_info['image_source'] = 'http://image-ref'
i_info['image_checksum'] = 'http://image-checksum'
i_info['root_gb'] = 10
i_info['image_disk_format'] = 'qcow2'
driver_internal_info['is_whole_disk_image'] = True
self.node.instance_info = i_info
self.node.driver_internal_info = driver_internal_info
self.node.save()
expected_url = (
'http://172.172.24.10:8080/agent_images/%s' % self.node.uuid)
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
info = utils.build_instance_info_for_deploy(task)
self.assertEqual(expected_url, info['image_url'])
self.assertEqual('raw', info['image_disk_format'])
self.cache_image_mock.assert_called_once_with(
task.context, task.node, force_raw=True,
expected_format='qcow2',
expected_checksum='d8e8fca2dc0f896fd7cb4cb0031ba249',
expected_checksum_algo=None)
self.checksum_mock.assert_called_once_with(
self.fake_path, algorithm='sha256')
validate_href_mock.assert_called_once_with(
mock.ANY, expected_url, False)
get_mock.assert_called_once_with('http://image-checksum')
@mock.patch.object(image_service.HttpImageService, 'get',
autospec=True)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
def test_build_instance_info_remote_checksum_sha256(self,
validate_href_mock,
get_mock):
# Test case where we would download both the image and the checksum
# and ultimately convert the image.
get_mock.return_value = 'a' * 64 + '\n'
cfg.CONF.set_override('image_download_source', 'local', group='agent')
i_info = self.node.instance_info
driver_internal_info = self.node.driver_internal_info
i_info['image_source'] = 'https://image-ref'
i_info['image_checksum'] = 'https://image-checksum'
i_info['root_gb'] = 10
i_info['image_disk_format'] = 'qcow2'
driver_internal_info['is_whole_disk_image'] = True
self.node.instance_info = i_info
self.node.driver_internal_info = driver_internal_info
self.node.save()
expected_url = (
'http://172.172.24.10:8080/agent_images/%s' % self.node.uuid)
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
info = utils.build_instance_info_for_deploy(task)
self.assertEqual(expected_url, info['image_url'])
self.assertEqual('raw', info['image_disk_format'])
self.cache_image_mock.assert_called_once_with(
task.context, task.node, force_raw=True,
expected_format='qcow2',
expected_checksum='a' * 64,
expected_checksum_algo='sha256')
self.checksum_mock.assert_called_once_with(
self.fake_path, algorithm='sha256')
validate_href_mock.assert_called_once_with(
mock.ANY, expected_url, False)
get_mock.assert_called_once_with('https://image-checksum')
@mock.patch.object(image_service.HttpImageService, 'get',
autospec=True)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
def test_build_instance_info_remote_checksum_sha512(self,
validate_href_mock,
get_mock):
# Test case where we would download both the image and the checksum
# and ultimately convert the image.
get_mock.return_value = 'a' * 128 + '\n'
cfg.CONF.set_override('image_download_source', 'local', group='agent')
i_info = self.node.instance_info
driver_internal_info = self.node.driver_internal_info
i_info['image_source'] = 'https://image-ref'
i_info['image_checksum'] = 'https://image-checksum'
i_info['root_gb'] = 10
i_info['image_disk_format'] = 'qcow2'
driver_internal_info['is_whole_disk_image'] = True
self.node.instance_info = i_info
self.node.driver_internal_info = driver_internal_info
self.node.save()
expected_url = (
'http://172.172.24.10:8080/agent_images/%s' % self.node.uuid)
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
info = utils.build_instance_info_for_deploy(task)
self.assertEqual(expected_url, info['image_url'])
self.assertEqual('raw', info['image_disk_format'])
self.cache_image_mock.assert_called_once_with(
task.context, task.node, force_raw=True,
expected_format='qcow2',
expected_checksum='a' * 128,
expected_checksum_algo='sha512')
self.checksum_mock.assert_called_once_with(
self.fake_path, algorithm='sha256')
validate_href_mock.assert_called_once_with(
mock.ANY, expected_url, False)
get_mock.assert_called_once_with('https://image-checksum')
@mock.patch.object(checksum_utils, 'get_checksum_from_url',
autospec=True)
@mock.patch.object(image_service.HttpImageService, 'validate_href',
autospec=True)
def test_build_instance_info_md5_not_permitted(
self,
validate_href_mock,
get_from_url_mock):
cfg.CONF.set_override('allow_md5_checksum', False, group='agent')
# Test case where we would download both the image and the checksum
# and ultimately convert the image.
cfg.CONF.set_override('image_download_source', 'local', group='agent')
i_info = self.node.instance_info
driver_internal_info = self.node.driver_internal_info
i_info['image_source'] = 'image-ref'
i_info['image_checksum'] = 'ad606d6a24a2dec982bc2993aaaf9160'
i_info['root_gb'] = 10
i_info['image_disk_format'] = 'qcow2'
driver_internal_info['is_whole_disk_image'] = True
self.node.instance_info = i_info
self.node.driver_internal_info = driver_internal_info
self.node.save()
with task_manager.acquire(
self.context, self.node.uuid, shared=False) as task:
self.assertRaisesRegex(exception.ImageChecksumAlgorithmFailure,
'The requested image checksum algorithm '
'cannot be loaded',
utils.build_instance_info_for_deploy,
task)
self.cache_image_mock.assert_not_called()
self.checksum_mock.assert_not_called()
validate_href_mock.assert_not_called()
get_from_url_mock.assert_not_called()
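The remote-checksum tests above pass a bare hex digest (or the body fetched from a checksum URL) and expect the algorithm to be selected from the digest length, e.g. 64 hex characters implying sha256 and 128 implying sha512. A minimal sketch of that inference, using a hypothetical helper rather than Ironic's actual checksum_utils API:

```python
def infer_checksum_algo(digest):
    """Guess a hash algorithm from a hex digest's length.

    Hypothetical illustration only: 32 hex chars -> md5,
    64 -> sha256, 128 -> sha512, anything else -> None.
    """
    digest = digest.strip()  # checksum files often end with a newline
    return {32: 'md5', 64: 'sha256', 128: 'sha512'}.get(len(digest))


# Mirrors the fixtures used in the tests above.
assert infer_checksum_algo('a' * 64 + '\n') == 'sha256'
assert infer_checksum_algo('a' * 128) == 'sha512'
assert infer_checksum_algo('d8e8fca2dc0f896fd7cb4cb0031ba249') == 'md5'
```

Note that md5-length digests are only usable when the `[agent]allow_md5_checksum` option permits them, which the last test exercises.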
class TestStorageInterfaceUtils(db_base.DbTestCase):
def setUp(self):


@@ -23,9 +23,11 @@ import time
from unittest import mock
import uuid
from oslo_config import cfg
from oslo_utils import uuidutils
from ironic.common import exception
from ironic.common import image_format_inspector
from ironic.common import image_service
from ironic.common import images
from ironic.common import utils
@@ -63,7 +65,8 @@ class TestImageCacheFetch(BaseTest):
self.cache.fetch_image(self.uuid, self.dest_path)
self.assertFalse(mock_download.called)
mock_fetch.assert_called_once_with(
None, self.uuid, self.dest_path, True)
None, self.uuid, self.dest_path, True,
None, None, None)
self.assertFalse(mock_clean_up.called)
mock_image_service.assert_not_called()
@@ -80,7 +83,8 @@ class TestImageCacheFetch(BaseTest):
self.uuid, self.dest_path)
self.assertFalse(mock_download.called)
mock_fetch.assert_called_once_with(
None, self.uuid, self.dest_path, True)
None, self.uuid, self.dest_path, True,
None, None, None)
self.assertFalse(mock_clean_up.called)
mock_image_service.assert_not_called()
@@ -155,7 +159,8 @@ class TestImageCacheFetch(BaseTest):
mock_download.assert_called_once_with(
self.cache, self.uuid, self.master_path, self.dest_path,
mock_image_service.return_value.show.return_value,
ctx=None, force_raw=True)
ctx=None, force_raw=True, expected_format=None,
expected_checksum=None, expected_checksum_algo=None)
mock_clean_up.assert_called_once_with(self.cache)
mock_image_service.assert_called_once_with(self.uuid, context=None)
mock_image_service.return_value.show.assert_called_once_with(self.uuid)
@@ -177,7 +182,8 @@ class TestImageCacheFetch(BaseTest):
mock_download.assert_called_once_with(
self.cache, self.uuid, self.master_path, self.dest_path,
mock_image_service.return_value.show.return_value,
ctx=None, force_raw=True)
ctx=None, force_raw=True, expected_format=None,
expected_checksum=None, expected_checksum_algo=None)
mock_clean_up.assert_called_once_with(self.cache)
def test_fetch_image_not_uuid(self, mock_download, mock_clean_up,
@@ -190,7 +196,8 @@ class TestImageCacheFetch(BaseTest):
mock_download.assert_called_once_with(
self.cache, href, master_path, self.dest_path,
mock_image_service.return_value.show.return_value,
ctx=None, force_raw=True)
ctx=None, force_raw=True, expected_format=None,
expected_checksum=None, expected_checksum_algo=None)
self.assertTrue(mock_clean_up.called)
def test_fetch_image_not_uuid_no_force_raw(self, mock_download,
@@ -199,11 +206,14 @@ class TestImageCacheFetch(BaseTest):
href = u'http://abc.com/ubuntu.qcow2'
href_converted = str(uuid.uuid5(uuid.NAMESPACE_URL, href))
master_path = os.path.join(self.master_dir, href_converted)
self.cache.fetch_image(href, self.dest_path, force_raw=False)
self.cache.fetch_image(href, self.dest_path, force_raw=False,
expected_checksum='f00',
expected_checksum_algo='sha256')
mock_download.assert_called_once_with(
self.cache, href, master_path, self.dest_path,
mock_image_service.return_value.show.return_value,
ctx=None, force_raw=False)
ctx=None, force_raw=False, expected_format=None,
expected_checksum='f00', expected_checksum_algo='sha256')
self.assertTrue(mock_clean_up.called)
@@ -211,7 +221,8 @@
class TestImageCacheDownload(BaseTest):
def test__download_image(self, mock_fetch):
def _fake_fetch(ctx, uuid, tmp_path, *args):
def _fake_fetch(ctx, uuid, tmp_path, force_raw, expected_format,
expected_checksum, expected_checksum_algo):
self.assertEqual(self.uuid, uuid)
self.assertNotEqual(self.dest_path, tmp_path)
self.assertNotEqual(os.path.dirname(tmp_path), self.master_dir)
@@ -233,7 +244,8 @@ class TestImageCacheDownload(BaseTest):
# Make sure we don't use any parts of the URL anywhere.
url = "http://example.com/image.iso?secret=%s" % ("x" * 1000)
def _fake_fetch(ctx, href, tmp_path, *args):
def _fake_fetch(ctx, href, tmp_path, force_raw, expected_format,
expected_checksum, expected_checksum_algo):
self.assertEqual(url, href)
self.assertNotEqual(self.dest_path, tmp_path)
self.assertNotEqual(os.path.dirname(tmp_path), self.master_dir)
@@ -517,7 +529,8 @@ class TestImageCacheCleanUp(base.TestCase):
@mock.patch.object(utils, 'rmtree_without_raise', autospec=True)
@mock.patch.object(image_cache, '_fetch', autospec=True)
def test_temp_images_not_cleaned(self, mock_fetch, mock_rmtree):
def _fake_fetch(ctx, uuid, tmp_path, *args):
def _fake_fetch(ctx, uuid, tmp_path, force_raw, expected_format,
expected_checksum, expected_checksum_algo):
with open(tmp_path, 'w') as fp:
fp.write("TEST" * 10)
@@ -754,82 +767,228 @@ class CleanupImageCacheTestCase(base.TestCase):
class TestFetchCleanup(base.TestCase):
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
@mock.patch.object(images, 'image_show', autospec=True)
@mock.patch.object(os, 'remove', autospec=True)
@mock.patch.object(images, 'converted_size', autospec=True)
@mock.patch.object(images, 'fetch', autospec=True)
@mock.patch.object(images, 'image_to_raw', autospec=True)
@mock.patch.object(images, 'force_raw_will_convert', autospec=True,
return_value=True)
@mock.patch.object(image_cache, '_clean_up_caches', autospec=True)
def test__fetch(
self, mock_clean, mock_will_convert, mock_raw, mock_fetch,
mock_size, mock_remove):
self, mock_clean, mock_raw, mock_fetch,
mock_size, mock_remove, mock_show, mock_format_inspector):
image_check = mock.MagicMock()
image_check.__str__.side_effect = iter(['qcow2', 'raw'])
image_check.safety_check.return_value = True
mock_format_inspector.return_value = image_check
mock_show.return_value = {}
mock_size.return_value = 100
image_cache._fetch('fake', 'fake-uuid', '/foo/bar', force_raw=True)
image_cache._fetch('fake', 'fake-uuid', '/foo/bar', force_raw=True,
expected_checksum='1234',
expected_checksum_algo='md5')
mock_fetch.assert_called_once_with('fake', 'fake-uuid',
'/foo/bar.part', force_raw=False)
'/foo/bar.part', force_raw=False,
checksum='1234',
checksum_algo='md5')
mock_clean.assert_called_once_with('/foo', 100)
mock_raw.assert_called_once_with('fake-uuid', '/foo/bar',
'/foo/bar.part')
mock_will_convert.assert_called_once_with('fake-uuid', '/foo/bar.part')
mock_remove.assert_not_called()
mock_show.assert_called_once_with('fake', 'fake-uuid')
mock_format_inspector.assert_called_once_with('/foo/bar.part')
image_check.safety_check.assert_called_once()
self.assertEqual(1, image_check.__str__.call_count)
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
@mock.patch.object(images, 'image_show', autospec=True)
@mock.patch.object(os, 'remove', autospec=True)
@mock.patch.object(images, 'converted_size', autospec=True)
@mock.patch.object(images, 'fetch', autospec=True)
@mock.patch.object(images, 'image_to_raw', autospec=True)
@mock.patch.object(image_cache, '_clean_up_caches', autospec=True)
def test__fetch_deep_inspection_disabled(
self, mock_clean, mock_raw, mock_fetch,
mock_size, mock_remove, mock_show, mock_format_inspector):
cfg.CONF.set_override(
'disable_deep_image_inspection', True,
group='conductor')
image_check = mock.MagicMock()
image_check.__str__.return_value = 'qcow2'
image_check.safety_check.return_value = True
mock_format_inspector.return_value = image_check
mock_show.return_value = {}
mock_size.return_value = 100
image_cache._fetch('fake', 'fake-uuid', '/foo/bar', force_raw=True)
mock_fetch.assert_called_once_with('fake', 'fake-uuid',
'/foo/bar.part', force_raw=False,
checksum=None, checksum_algo=None)
mock_clean.assert_called_once_with('/foo', 100)
mock_raw.assert_called_once_with('fake-uuid', '/foo/bar',
'/foo/bar.part')
mock_remove.assert_not_called()
mock_show.assert_not_called()
mock_format_inspector.assert_called_once_with('/foo/bar.part')
image_check.safety_check.assert_not_called()
self.assertEqual(1, image_check.__str__.call_count)
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
@mock.patch.object(images, 'image_show', autospec=True)
@mock.patch.object(os, 'remove', autospec=True)
@mock.patch.object(os.path, 'exists', autospec=True)
@mock.patch.object(images, 'converted_size', autospec=True)
@mock.patch.object(images, 'fetch', autospec=True)
@mock.patch.object(images, 'image_to_raw', autospec=True)
@mock.patch.object(images, 'force_raw_will_convert', autospec=True,
return_value=True)
@mock.patch.object(image_cache, '_clean_up_caches', autospec=True)
def test__fetch_part_already_exists(
self, mock_clean, mock_will_convert, mock_raw, mock_fetch,
mock_size, mock_exists, mock_remove):
self, mock_clean, mock_raw, mock_fetch,
mock_size, mock_exists, mock_remove, mock_image_show,
mock_format_inspector):
image_check = mock.MagicMock()
image_check.__str__.side_effect = iter(['qcow2', 'raw'])
image_check.safety_check.return_value = True
mock_format_inspector.return_value = image_check
mock_exists.return_value = True
mock_size.return_value = 100
image_cache._fetch('fake', 'fake-uuid', '/foo/bar', force_raw=True)
mock_image_show.return_value = {}
image_cache._fetch('fake', 'fake-uuid', '/foo/bar', force_raw=True,
expected_format=None, expected_checksum='f00',
expected_checksum_algo='sha256')
mock_fetch.assert_called_once_with('fake', 'fake-uuid',
'/foo/bar.part', force_raw=False)
'/foo/bar.part', force_raw=False,
checksum='f00',
checksum_algo='sha256')
mock_clean.assert_called_once_with('/foo', 100)
mock_raw.assert_called_once_with('fake-uuid', '/foo/bar',
'/foo/bar.part')
mock_will_convert.assert_called_once_with('fake-uuid', '/foo/bar.part')
self.assertEqual(1, mock_exists.call_count)
self.assertEqual(1, mock_remove.call_count)
mock_image_show.assert_called_once_with('fake', 'fake-uuid')
mock_format_inspector.assert_called_once_with('/foo/bar.part')
image_check.safety_check.assert_called_once()
self.assertEqual(1, image_check.__str__.call_count)
@mock.patch.object(os, 'rename', autospec=True)
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
@mock.patch.object(images, 'image_show', autospec=True)
@mock.patch.object(images, 'converted_size', autospec=True)
@mock.patch.object(images, 'fetch', autospec=True)
@mock.patch.object(images, 'image_to_raw', autospec=True)
@mock.patch.object(images, 'force_raw_will_convert', autospec=True,
return_value=False)
@mock.patch.object(image_cache, '_clean_up_caches', autospec=True)
def test__fetch_already_raw(
self, mock_clean, mock_will_convert, mock_raw, mock_fetch,
mock_size):
image_cache._fetch('fake', 'fake-uuid', '/foo/bar', force_raw=True)
self, mock_clean, mock_raw, mock_fetch,
mock_size, mock_show, mock_format_inspector,
mock_rename):
mock_show.return_value = {'disk_format': 'raw'}
image_check = mock.MagicMock()
image_check.__str__.return_value = 'raw'
image_check.safety_check.return_value = True
mock_format_inspector.return_value = image_check
image_cache._fetch('fake', 'fake-uuid', '/foo/bar', force_raw=True,
expected_checksum='e00',
expected_checksum_algo='sha256')
mock_fetch.assert_called_once_with('fake', 'fake-uuid',
'/foo/bar.part', force_raw=False)
'/foo/bar.part', force_raw=False,
checksum='e00',
checksum_algo='sha256')
mock_clean.assert_not_called()
mock_size.assert_not_called()
mock_raw.assert_called_once_with('fake-uuid', '/foo/bar',
'/foo/bar.part')
mock_will_convert.assert_called_once_with('fake-uuid', '/foo/bar.part')
mock_raw.assert_not_called()
mock_show.assert_called_once_with('fake', 'fake-uuid')
mock_format_inspector.assert_called_once_with('/foo/bar.part')
image_check.safety_check.assert_called_once()
self.assertEqual(1, image_check.__str__.call_count)
mock_rename.assert_called_once_with('/foo/bar.part', '/foo/bar')
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
@mock.patch.object(images, 'image_show', autospec=True)
@mock.patch.object(images, 'converted_size', autospec=True)
@mock.patch.object(images, 'fetch', autospec=True)
@mock.patch.object(images, 'image_to_raw', autospec=True)
@mock.patch.object(image_cache, '_clean_up_caches', autospec=True)
def test__fetch_format_does_not_match_glance(
self, mock_clean, mock_raw, mock_fetch,
mock_size, mock_show, mock_format_inspector):
mock_show.return_value = {'disk_format': 'raw'}
image_check = mock.MagicMock()
image_check.__str__.return_value = 'qcow2'
image_check.safety_check.return_value = True
mock_format_inspector.return_value = image_check
self.assertRaises(exception.InvalidImage,
image_cache._fetch,
'fake', 'fake-uuid',
'/foo/bar', force_raw=True,
expected_format=None,
expected_checksum='a00',
expected_checksum_algo='sha512')
mock_fetch.assert_called_once_with('fake', 'fake-uuid',
'/foo/bar.part', force_raw=False,
checksum='a00',
checksum_algo='sha512')
mock_clean.assert_not_called()
mock_size.assert_not_called()
mock_raw.assert_not_called()
mock_show.assert_called_once_with('fake', 'fake-uuid')
mock_format_inspector.assert_called_once_with('/foo/bar.part')
image_check.safety_check.assert_called_once()
self.assertEqual(1, image_check.__str__.call_count)
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
@mock.patch.object(images, 'image_show', autospec=True)
@mock.patch.object(images, 'converted_size', autospec=True)
@mock.patch.object(images, 'fetch', autospec=True)
@mock.patch.object(images, 'image_to_raw', autospec=True)
@mock.patch.object(image_cache, '_clean_up_caches', autospec=True)
def test__fetch_not_safe_image(
self, mock_clean, mock_raw, mock_fetch,
mock_size, mock_show, mock_format_inspector):
mock_show.return_value = {'disk_format': 'qcow2'}
image_check = mock.MagicMock()
image_check.__str__.return_value = 'qcow2'
image_check.safety_check.return_value = False
mock_format_inspector.return_value = image_check
self.assertRaises(exception.InvalidImage,
image_cache._fetch,
'fake', 'fake-uuid',
'/foo/bar', force_raw=True)
mock_fetch.assert_called_once_with('fake', 'fake-uuid',
'/foo/bar.part', force_raw=False,
checksum=None, checksum_algo=None)
mock_clean.assert_not_called()
mock_size.assert_not_called()
mock_raw.assert_not_called()
mock_show.assert_called_once_with('fake', 'fake-uuid')
mock_format_inspector.assert_called_once_with('/foo/bar.part')
image_check.safety_check.assert_called_once()
self.assertEqual(0, image_check.__str__.call_count)
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
@mock.patch.object(images, 'image_show', autospec=True)
@mock.patch.object(images, 'converted_size', autospec=True)
@mock.patch.object(images, 'fetch', autospec=True)
@mock.patch.object(images, 'image_to_raw', autospec=True)
@mock.patch.object(images, 'force_raw_will_convert', autospec=True,
return_value=True)
@mock.patch.object(image_cache, '_clean_up_caches', autospec=True)
def test__fetch_estimate_fallback(
self, mock_clean, mock_will_convert, mock_raw, mock_fetch,
mock_size):
self, mock_clean, mock_raw, mock_fetch,
mock_size, mock_show, mock_format_inspector):
mock_show.return_value = {'disk_format': 'qcow2'}
image_check = mock.MagicMock()
image_check.__str__.side_effect = iter(['qcow2', 'raw'])
image_check.safety_check.return_value = True
mock_format_inspector.return_value = image_check
mock_size.side_effect = [100, 10]
mock_clean.side_effect = [exception.InsufficientDiskSpace(), None]
image_cache._fetch('fake', 'fake-uuid', '/foo/bar', force_raw=True)
mock_fetch.assert_called_once_with('fake', 'fake-uuid',
'/foo/bar.part', force_raw=False)
'/foo/bar.part', force_raw=False,
checksum=None, checksum_algo=None)
mock_size.assert_has_calls([
mock.call('/foo/bar.part', estimate=False),
mock.call('/foo/bar.part', estimate=True),
@@ -840,4 +999,71 @@ class TestFetchCleanup(base.TestCase):
])
mock_raw.assert_called_once_with('fake-uuid', '/foo/bar',
'/foo/bar.part')
mock_will_convert.assert_called_once_with('fake-uuid', '/foo/bar.part')
mock_show.assert_called_once_with('fake', 'fake-uuid')
mock_format_inspector.assert_called_once_with('/foo/bar.part')
image_check.safety_check.assert_called_once()
self.assertEqual(1, image_check.__str__.call_count)
@mock.patch.object(os, 'rename', autospec=True)
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
@mock.patch.object(images, 'image_show', autospec=True)
@mock.patch.object(os, 'remove', autospec=True)
@mock.patch.object(images, 'converted_size', autospec=True)
@mock.patch.object(images, 'fetch', autospec=True)
@mock.patch.object(images, 'image_to_raw', autospec=True)
@mock.patch.object(image_cache, '_clean_up_caches', autospec=True)
def test__fetch_ramdisk_kernel(
self, mock_clean, mock_raw, mock_fetch,
mock_size, mock_remove, mock_show, mock_format_inspector,
mock_rename):
image_check = mock.MagicMock()
image_check.__str__.return_value = 'raw'
image_check.safety_check.return_value = True
mock_format_inspector.return_value = image_check
mock_show.return_value = {'disk_format': 'aki'}
mock_size.return_value = 100
image_cache._fetch('fake', 'fake-uuid', '/foo/bar', force_raw=True)
mock_fetch.assert_called_once_with('fake', 'fake-uuid',
'/foo/bar.part', force_raw=False,
checksum=None, checksum_algo=None)
mock_clean.assert_not_called()
mock_raw.assert_not_called()
mock_remove.assert_not_called()
mock_show.assert_called_once_with('fake', 'fake-uuid')
mock_format_inspector.assert_called_once_with('/foo/bar.part')
image_check.safety_check.assert_called_once()
self.assertEqual(1, image_check.__str__.call_count)
mock_rename.assert_called_once_with('/foo/bar.part', '/foo/bar')
@mock.patch.object(os, 'rename', autospec=True)
@mock.patch.object(image_format_inspector, 'detect_file_format',
autospec=True)
@mock.patch.object(images, 'image_show', autospec=True)
@mock.patch.object(os, 'remove', autospec=True)
@mock.patch.object(images, 'converted_size', autospec=True)
@mock.patch.object(images, 'fetch', autospec=True)
@mock.patch.object(images, 'image_to_raw', autospec=True)
@mock.patch.object(image_cache, '_clean_up_caches', autospec=True)
def test__fetch_ramdisk_image(
self, mock_clean, mock_raw, mock_fetch,
mock_size, mock_remove, mock_show, mock_format_inspector,
mock_rename):
image_check = mock.MagicMock()
image_check.__str__.return_value = 'raw'
image_check.safety_check.return_value = True
mock_format_inspector.return_value = image_check
mock_show.return_value = {'disk_format': 'ari'}
mock_size.return_value = 100
image_cache._fetch('fake', 'fake-uuid', '/foo/bar', force_raw=True)
mock_fetch.assert_called_once_with('fake', 'fake-uuid',
'/foo/bar.part', force_raw=False,
checksum=None, checksum_algo=None)
mock_clean.assert_not_called()
mock_raw.assert_not_called()
mock_remove.assert_not_called()
mock_show.assert_called_once_with('fake', 'fake-uuid')
mock_format_inspector.assert_called_once_with('/foo/bar.part')
image_check.safety_check.assert_called_once()
self.assertEqual(1, image_check.__str__.call_count)
mock_rename.assert_called_once_with('/foo/bar.part', '/foo/bar')


@@ -0,0 +1,108 @@
---
security:
- |
Ironic now checks the supplied image format value against the detected
format of the image file, and will prevent deployments should the
values mismatch. If being used with Glance and a mismatch in metadata
is identified, it will require images to be re-uploaded with a new image
ID to represent corrected metadata.
This is the result of CVE-2024-44082 tracked as
`bug 2071740 <https://bugs.launchpad.net/ironic/+bug/2071740>`_.
- |
Ironic *always* inspects the supplied user image content for safety prior
to deployment of a node should the image pass through the conductor,
even if the image is supplied in ``raw`` format. This is utilized to
identify the format of the image and the overall safety
of the image, such that source images with unknown or unsafe feature
usage are explicitly rejected. This can be disabled by setting
``[conductor]disable_deep_image_inspection`` to ``True``.
This is the result of CVE-2024-44082 tracked as
`bug 2071740 <https://bugs.launchpad.net/ironic/+bug/2071740>`_.
- |
Ironic can also inspect images which would normally be provided as a URL
for direct download by the ``ironic-python-agent`` ramdisk. This is not
enabled by default as it will increase the overall network traffic and
disk space utilization of the conductor. This level of inspection can be
enabled by setting ``[conductor]conductor_always_validates_images`` to
``True``. Once the ``ironic-python-agent`` ramdisk has been updated,
it will perform similar image security checks independently, should an
image conversion be required.
This is the result of CVE-2024-44082 tracked as
`bug 2071740 <https://bugs.launchpad.net/ironic/+bug/2071740>`_.
- |
Ironic now explicitly enforces a list of permitted image types for
deployment via the ``[conductor]permitted_image_formats`` setting,
which defaults to "raw", "qcow2", and "iso".
While the project has classically always declared permissible
images as "qcow2" and "raw", it was previously possible to supply other
image formats known to ``qemu-img``, and the utility would attempt to
convert the images. The "iso" support is required for "boot from ISO"
ramdisk support.
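The permitted-formats check described above can be sketched as a simple allowlist; this is an illustrative sketch only, and the `PERMITTED_FORMATS` constant and `check_image_format` helper are hypothetical names, not Ironic's actual internals:

```python
# Hypothetical sketch of an image-format allowlist check; the names
# here are illustrative, not Ironic's real implementation.
PERMITTED_FORMATS = {'raw', 'qcow2', 'iso'}


def check_image_format(detected_format):
    """Reject any image whose detected format is not explicitly permitted.

    The format is expected to come from content fingerprinting of the
    image file, not from its file extension.
    """
    if detected_format not in PERMITTED_FORMATS:
        raise ValueError(
            'Image format %s is not permitted' % detected_format)
    return detected_format
```

The key design point is that the decision is made on the detected format, so renaming a malicious image cannot bypass the check.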
- |
Ironic now explicitly passes the source input format to executions of
``qemu-img`` to limit the permitted qemu disk image drivers which may
evaluate an image to prevent any mismatched format attacks against
``qemu-img``.
- |
The ``ansible`` deploy interface example playbooks now supply an input
format to execution of ``qemu-img``. If you are using customized
playbooks, please add "-f {{ ironic.image.disk_format }}" to your
  invocations of ``qemu-img``. If you do not do so, ``qemu-img`` will
  automatically try to guess the format, which can lead to known security
  issues with the incorrect source format driver.
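As a rough illustration of the guidance above, a conversion command can be built with the source format passed explicitly via ``-f`` so ``qemu-img`` never guesses the input driver. The helper name and paths below are placeholders, not code from Ironic or the ansible playbooks:

```python
# Illustrative construction of a qemu-img convert invocation with an
# explicit source format (-f); names and paths are placeholders.
def qemu_img_convert_cmd(source, dest, source_format, out_format='raw'):
    # Passing -f pins the input driver, preventing qemu-img from
    # guessing the format from the file contents.
    return ['qemu-img', 'convert',
            '-f', source_format,
            '-O', out_format,
            source, dest]
```

For example, ``qemu_img_convert_cmd('in.qcow2', 'out.raw', 'qcow2')`` yields a command line equivalent to ``qemu-img convert -f qcow2 -O raw in.qcow2 out.raw``.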
- |
Operators who have implemented any custom deployment drivers or additional
functionality like machine snapshot, should review their downstream code
to ensure they are properly invoking ``qemu-img``. If there are any
questions or concerns, please reach out to the Ironic project developers.
- |
Operators are reminded that they should utilize cleaning in their
environments. Disabling any security features such as cleaning or image
inspection are at **your** **own** **risk**. Should you have any issues
with security related features, please don't hesitate to open a bug with
the project.
- |
The ``[conductor]disable_deep_image_inspection`` setting is
conveyed to the ``ironic-python-agent`` ramdisks automatically, and
will prevent those operating ramdisks from performing deep inspection
of images before they are written.
- The ``[conductor]permitted_image_formats`` setting is conveyed to the
``ironic-python-agent`` ramdisks automatically. Should a need arise
to explicitly permit an additional format, that should take place in
the Ironic service configuration.
fixes:
- |
Fixes multiple issues in the handling of images as it relates to the
execution of the ``qemu-img`` utility, which is used for image format
conversion, where a malicious user could craft a disk image to potentially
extract information from an ``ironic-conductor`` process's operating
environment.
Ironic now explicitly enforces a list of approved image
formats as a ``[conductor]permitted_image_formats`` list, which mirrors
the image formats the Ironic project has historically tested and expressed
as known working. Testing is not based upon file extension, but upon
content fingerprinting of the disk image files.
This is tracked as CVE-2024-44082 via
`bug 2071740 <https://bugs.launchpad.net/ironic/+bug/2071740>`_.
upgrade:
- |
When upgrading Ironic to address the ``qemu-img`` image conversion
security issues, the ``ironic-python-agent`` ramdisks will also need
to be upgraded.
- |
When upgrading Ironic to address the ``qemu-img`` image conversion
security issues, the ``[conductor]conductor_always_validates_images``
setting may be set to ``True`` as a short term remedy while
``ironic-python-agent`` ramdisks are being updated. Alternatively it
may be advisable to also set the ``[agent]image_download_source``
setting to ``local`` to minimize redundant network data transfers.
- |
As a result of security fixes to address ``qemu-img`` image conversion
security issues, a new configuration parameter has been added to
Ironic, ``[conductor]permitted_image_formats`` with a default value of
"raw,qcow2,iso". Raw and qcow2 format disk images are the image formats
the Ironic community has consistently stated as what is supported
and expected for use with Ironic. These formats also match the formats
  which the Ironic community tests. Operators who leverage other disk image
  formats may need to modify this setting further.


@@ -0,0 +1,5 @@
---
fixes:
- |
Fixes inspection failure when ``bmc_address`` or ``bmc_v6address`` is
``null`` in the inventory received from the ramdisk.


@@ -0,0 +1,44 @@
---
security:
- |
An issue in Ironic has been resolved where image checksums would not be
checked prior to the conversion of an image to a ``raw`` format image from
another image format.
  With default settings, this conversion normally would not take place;
  it occurs when the ``image_download_source`` option is set to ``local``.
  That option can be set at a ``node`` level for a single deployment, as a
  default for that baremetal node in all cases, or via the
  ``[agent]image_download_source`` configuration option. By default, this
  setting is ``http``.
  This was in concert with the ``[DEFAULT]force_raw_images`` option which,
  when set to ``True``, caused Ironic to download and convert the file.
In a fully integrated context of Ironic's use in a larger OpenStack
deployment, where images are coming from the Glance image service, the
previous pattern was not problematic. The overall issue was introduced as
a result of the capability to supply, cache, and convert a disk image
provided as a URL by an authenticated user.
Ironic will now validate the user supplied checksum prior to image
conversion on the conductor. This can be disabled using the
``[conductor]disable_file_checksum`` configuration option.
fixes:
- |
Fixes a security issue where Ironic would fail to checksum disk image
files it downloads when Ironic had been requested to download and convert
the image to a raw image format. This required the
``image_download_source`` to be explicitly set to ``local``, which is not
the default.
  This fix can be disabled by setting
  ``[conductor]disable_file_checksum`` to ``True``; however, this
  option will be removed in future major Ironic releases.
As a result of this, parity has been introduced to align Ironic to
Ironic-Python-Agent's support for checksums used by ``standalone``
users of Ironic. This includes support for remote checksum files to be
supplied by URL, in order to prevent breaking existing users which may
have inadvertently been leveraging the prior code path. This support can
be disabled by setting
``[conductor]disable_support_for_checksum_files`` to ``True``.


@@ -0,0 +1,7 @@
---
fixes:
- |
    The fix for CVE-2024-47211 results in an image checksum being required
    in all cases. However, there is no checksum requirement for ``file://``
    based images. When the checksum is missing for a ``file://`` based
    ``image_source``, it is now calculated on-the-fly.
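Calculating a checksum on-the-fly for a local file amounts to hashing its contents in chunks, along these lines; this is a minimal sketch using the standard library, and the function name, algorithm choice, and chunk size are illustrative rather than Ironic's actual code:

```python
import hashlib


# Minimal sketch of an on-the-fly checksum for a local file; the
# helper name, default algorithm, and chunk size are illustrative.
def compute_checksum(path, algorithm='sha256', chunk_size=65536):
    digest = hashlib.new(algorithm)
    with open(path, 'rb') as fp:
        # Read in fixed-size chunks so large disk images do not need
        # to fit in memory.
        for chunk in iter(lambda: fp.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest()
```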


@@ -15,6 +15,7 @@ setenv = VIRTUAL_ENV={envdir}
OS_STDERR_CAPTURE={env:OS_STDERR_CAPTURE:true}
PYTHONUNBUFFERED=1
SQLALCHEMY_WARN_20=true
TOX_CONSTRAINTS_FILE=upper-constraints.txt
deps =
-c{env:TOX_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master}
-r{toxinidir}/requirements.txt
@@ -167,5 +168,7 @@ paths = ./ironic/hacking/
[testenv:bandit]
usedevelop = False
deps = -r{toxinidir}/test-requirements.txt
deps =
-c{env:TOX_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master}
-r{toxinidir}/test-requirements.txt
commands = bandit -r ironic -x tests -n5 -ll -c tools/bandit.yml

upper-constraints.txt Normal file

@@ -0,0 +1,619 @@
voluptuous===0.14.2
chardet===5.2.0
enum-compat===0.0.3
rsa===4.9
restructuredtext-lint===1.4.0
netmiko===4.1.2
sshtunnel===0.4.0
PasteDeploy===3.1.0
typing===3.7.4.3
python-saharaclient===4.2.0
Routes===2.5.1
rtslib-fb===2.1.76
oslo.limit===2.4.0
tzdata===2024.1
smmap===5.0.1
confget===5.1.2
XStatic-Angular-Bootstrap===2.5.0.0
WebOb===1.8.7
sphinxcontrib-actdiag===3.0.0
pecan===1.5.1
os-api-ref===3.0.0
python-ldap===3.4.4
oslo.concurrency===6.0.0
websocket-client===1.7.0
osprofiler===4.1.0
os-resource-classes===1.1.0
tabulate===0.9.0
python-ironic-inspector-client===5.1.0
lxml===5.1.0
vintage===0.4.1
rst2txt===1.1.0
setproctitle===1.3.3
pytest===8.0.1
python-slugify===8.0.4
cursive===0.2.3
oslo.service===3.4.0
django-appconf===1.0.6
ntc_templates===4.3.0
sphinxcontrib-nwdiag===2.0.0
rbd-iscsi-client===0.1.8
requests-aws===0.1.8
alabaster===0.7.13;python_version=='3.8'
alabaster===0.7.16;python_version>='3.9'
pbr===6.0.0
munch===4.0.0
waiting===1.4.1
attrs===23.2.0
microversion-parse===1.0.1
jwcrypto===1.5.4
Pint===0.21.1;python_version=='3.8'
Pint===0.23;python_version>='3.9'
oslo.i18n===6.3.0
jsonpath-rw-ext===1.2.2
python-mistralclient===5.2.0
oslo.context===5.5.0
python-senlinclient===3.1.0
rcssmin===1.1.1
pycadf===3.1.1
grpcio===1.60.1
pysendfile===2.0.1
sniffio===1.3.0
fixtures===4.1.0
neutron-lib===3.11.0
XStatic-FileSaver===1.3.2.0
oslo.metrics===0.8.0
storage-interfaces===1.0.5
persist-queue===0.8.1
pystache===0.6.5
XStatic-Font-Awesome===4.7.0.0
nose===1.3.7
nosehtmloutput===0.0.7
waitress===3.0.0
os-refresh-config===13.2.0
pysnmp===4.4.12
Mako===1.3.2
pyScss===1.4.0
sphinxcontrib-htmlhelp===2.0.1;python_version=='3.8'
sphinxcontrib-htmlhelp===2.0.5;python_version>='3.9'
XStatic-jQuery===3.5.1.1
ddt===1.7.1
XStatic-Graphlib===2.1.7.0
pyserial===3.5
moto===5.0.1
infi.dtypes.wwn===0.1.1
python-freezerclient===5.2.0
python-vitrageclient===5.0.0
py-pure-client===1.47.0
nosexcover===1.0.11
krest===1.3.6
psycopg2===2.9.9
networkx===3.1;python_version=='3.8'
networkx===3.2.1;python_version>='3.9'
XStatic-Angular===1.8.2.2
pyngus===2.3.1
zuul-sphinx===0.7.0
Tempita===0.5.2
ply===3.11
google-api-core===2.17.1
requests-toolbelt===1.0.0
simplejson===3.19.2
types-paramiko===3.4.0.20240205
immutables===0.20
python-swiftclient===4.5.0
pyOpenSSL===24.0.0
monasca-common===3.8.0
zeroconf===0.131.0
scipy===1.10.1;python_version=='3.8'
scipy===1.12.0;python_version>='3.9'
opentelemetry-exporter-otlp===1.22.0
python-gnupg===0.5.2
mypy-extensions===1.0.0
rsd-lib===1.2.0
XStatic-Jasmine===2.4.1.2
googleapis-common-protos===1.62.0
python-glanceclient===4.5.0
prometheus_client===0.20.0
jaraco.classes===3.3.1
pyinotify===0.9.6
debtcollector===3.0.0
requests-unixsocket===0.3.0
responses===0.25.0
croniter===2.0.1
horizon===24.0.0
octavia-lib===3.5.0
python-watcherclient===4.4.0
MarkupSafe===2.1.5
types-python-dateutil===2.8.19.20240106
ruamel.yaml.clib===0.2.8
doc8===1.1.1
pymongo===4.6.1
python-cloudkittyclient===5.0.0
soupsieve===2.5
sqlparse===0.4.4
oslotest===5.0.0
jsonpointer===2.4
defusedxml===0.7.1
opentelemetry-sdk===1.22.0
netaddr===0.10.1
pyghmi===1.5.67
sphinxcontrib-blockdiag===3.0.0
thrift===0.16.0
gnocchiclient===7.0.8
backoff===2.2.1
wcwidth===0.2.13
sphinxcontrib.datatemplates===0.11.0
jsonpath-rw===1.4.0
prettytable===3.9.0
vine===5.1.0
taskflow===5.6.0
traceback2===1.4.0
arrow===1.3.0
semantic-version===2.10.0
async-timeout===4.0.3;python_version=='3.10'
async-timeout===4.0.3;python_version=='3.8'
async-timeout===4.0.3;python_version=='3.9'
virtualbmc===3.1.0
SQLAlchemy===1.4.51
pyroute2===0.7.12
google-auth===2.28.0
pyasn1-lextudio===1.1.2
kazoo===2.10.0
pyspnego===0.10.2
XStatic-roboto-fontface===0.5.0.0
pyudev===0.24.1
eventlet===0.35.1
openstack-doc-tools===3.3.1
oslo.messaging===14.7.0
jira===3.6.0
extras===1.0.0
PyJWT===2.8.0
typing_extensions===4.9.0
XStatic-lodash===4.16.4.2
zVMCloudConnector===1.6.3
paramiko===3.4.0
ifaddr===0.2.0
reno===4.1.0
ncclient===0.6.15
imagesize===1.4.1
pydot===2.0.0
urllib3===1.26.18
graphviz===0.20.1
PyKMIP===0.10.0
python-observabilityclient===0.1.1
whereto===0.4.0
pywbem===1.6.2
python-subunit===1.4.4
tornado===6.4
pycparser===2.21
mock===5.1.0
PyYAML===6.0.1
beautifulsoup4===4.12.3
ovs===3.1.2
cryptography===42.0.4
httpcore===1.0.3
URLObject===2.4.3
nocasedict===2.0.1
psycopg2-binary===2.9.9
openstack-release-test===5.0.0
pylxd===2.3.2
pycryptodomex===3.20.0
requests-mock===1.11.0
os-apply-config===13.2.0
oslosphinx===4.18.0
gunicorn===21.2.0
storpool===7.3.0
textfsm===1.1.2
python-3parclient===4.2.13
unittest2===1.1.0
django-compressor===4.4
libvirt-python===10.0.0
python-zunclient===5.0.0
tzlocal===5.2
sphinxcontrib-jsmath===1.0.1
python-novaclient===18.6.0
pact===1.12.0
bcrypt===4.1.2
exceptiongroup===1.2.0;python_version=='3.10'
exceptiongroup===1.2.0;python_version=='3.8'
exceptiongroup===1.2.0;python_version=='3.9'
os-client-config===2.1.0
XStatic-Angular-Gettext===2.4.1.0
Deprecated===1.2.14
h11===0.14.0
Pygments===2.17.2
XStatic-Hogan===2.0.0.3
XStatic-objectpath===1.2.1.0
python-manilaclient===4.8.0
sphinxcontrib-serializinghtml===1.1.5;python_version=='3.8'
sphinxcontrib-serializinghtml===1.1.10;python_version>='3.9'
requests===2.31.0
snowballstemmer===2.2.0
Jinja2===3.1.3
XStatic-Bootstrap-SCSS===3.4.1.0
pyzabbix===1.3.1
ptyprocess===0.7.0
threadloop===1.0.2
amqp===5.2.0
ruamel.yaml===0.18.6
websockify===0.11.0
gssapi===1.8.3
XStatic-JQuery.quicksearch===2.0.3.2
mpmath===1.3.0
python-binary-memcached===0.31.2
django-debreach===2.1.0
sphinx-feature-classification===1.1.0
django-pymemcache===1.0.0
XStatic-JQuery-Migrate===3.3.2.1
pytest-html===4.1.1
appdirs===1.4.4
google-auth-httplib2===0.2.0
pkgutil_resolve_name===1.3.10;python_version=='3.8'
daiquiri===3.2.5.1
influxdb===5.3.1
funcparserlib===1.0.1
passlib===1.7.4
dib-utils===0.0.11
cliff===4.6.0
os-brick===6.7.0
ansible-runner===2.3.5
scp===0.14.5
python-zaqarclient===2.7.0
lockfile===0.12.2
ldappool===3.0.0
termcolor===2.4.0
joblib===1.3.2
google-api-python-client===2.118.0
castellan===5.0.0
oslo.versionedobjects===3.3.0
enmerkar===0.7.1
webcolors===1.13
aodhclient===3.5.1
autobahn===23.1.2;python_version=='3.8'
autobahn===23.6.2;python_version>='3.9'
SQLAlchemy-Utils===0.41.1
retryz===0.1.9
pluggy===1.4.0
coverage===7.4.1
freezegun===1.4.0
toml===0.10.2
pycdlib===1.14.0
pyperclip===1.8.2
cassandra-driver===3.29.0
XStatic-Angular-Schema-Form===0.8.13.0
opentelemetry-exporter-otlp-proto-http===1.22.0
gabbi===2.4.0
nwdiag===3.0.0
XStatic-bootswatch===3.3.7.0
pytest-xdist===3.5.0
XStatic-JS-Yaml===3.8.1.0
XStatic-term.js===0.0.7.0
oslo.log===5.5.1
nodeenv===1.8.0
gossip===2.4.0
suds-community===1.1.2
importlib-metadata===6.2.1;python_version=='3.8'
importlib-metadata===6.2.1;python_version=='3.9'
importlib-metadata===6.11.0;python_version>='3.10'
oslo.middleware===6.1.0
XStatic-mdi===1.6.50.2
django-pyscss===2.0.3
uritemplate===4.1.1
docutils===0.20.1
threadpoolctl===3.3.0
os-ken===2.8.1
ujson===5.9.0
selenium===3.141.0
mistral-lib===2.10.0
dogtag-pki===11.2.1
XStatic-Angular-UUID===0.0.4.0
purestorage===1.19.0
sphinxcontrib-seqdiag===3.0.0
os-win===5.9.0
capacity===1.3.14
retrying===1.3.4
XStatic-Dagre===0.6.4.1
platformdirs===4.2.0
pydotplus===2.0.2
boto3===1.34.44
jeepney===0.8.0
stestr===4.1.0
pillow===9.5.0
infoblox-client===0.6.0
pysmi-lextudio===1.1.13
oslo.serialization===5.4.0
warlock===2.0.1
exabgp===4.2.21
sphinxcontrib-httpdomain===1.8.1
metalsmith===2.1.1
s3transfer===0.10.0
text-unidecode===1.3
sphinxcontrib-svg2pdfconverter===1.2.2
murano-pkg-check===0.3.0
oslo.vmware===4.4.0
XStatic-moment===2.8.4.3
autopage===0.5.2
sqlalchemy-migrate===0.13.0
gitdb===4.0.11
python-monascaclient===2.8.0
ldap3===2.9.1
opentelemetry-api===1.22.0
requests-ntlm===1.2.0
automaton===3.2.0
os-service-types===1.7.0
keyring===24.3.0
elementpath===4.2.1
jsonschema-specifications===2023.12.1
testscenarios===0.5.0
sphinxcontrib-pecanwsme===0.11.0
sadisplay===0.4.9
infinisdk===240.1.2
packaging===23.2
opentelemetry-exporter-otlp-proto-grpc===1.22.0
XStatic-Dagre-D3===0.4.17.0
nose-exclude===0.5.0
psutil===5.9.8
txaio===23.1.1
elasticsearch===2.4.1
django-nose===1.4.7
asgiref===3.7.2
XStatic-JQuery.TableSorter===2.14.5.2
pifpaf===3.1.5
pysmi===0.3.4
blockdiag===3.0.0
testtools===2.7.1
infi.dtypes.iqn===0.4.0
XStatic-tv4===1.2.7.0
XStatic-JSEncrypt===2.3.1.1
python-cinderclient===9.5.0
keystonemiddleware===10.6.0
django-formtools===2.5.1
XStatic-Spin===1.2.5.3
tap-as-a-service===13.0.0.0rc1
os-traits===3.0.0
typepy===1.3.2
SecretStorage===3.3.3
opentracing===2.4.0
XStatic-Rickshaw===1.5.1.0
iso8601===2.1.0
tooz===6.1.0
linecache2===1.0.0
oauth2client===4.1.3
idna===3.6
yamlloader===1.3.2
protobuf===4.25.3
sushy===5.0.0
python-neutronclient===11.2.0
pika===1.3.2
oslo.cache===3.7.0
WebTest===3.0.0
openstack.nose-plugin===0.11
os-collect-config===13.2.0
edgegrid-python===1.3.1
python-qpid-proton===0.39.0
python-octaviaclient===3.7.0
pysaml2===7.3.1;python_version=='3.8'
pysaml2===7.4.2;python_version>='3.9'
requests-oauthlib===1.3.1
oslo.reports===3.3.0
pysnmp-lextudio===5.0.33
bitmath===1.3.3.1
ceilometermiddleware===3.3.1
pyasn1-modules-lextudio===0.2.9
testrepository===0.0.20
sympy===1.12
Logbook===1.7.0.post0
PyNaCl===1.5.0
osc-lib===3.0.1
python-consul===1.1.0
more-itertools===10.2.0
seqdiag===3.0.0
numpy===1.24.4;python_version=='3.8'
numpy===1.26.4;python_version>='3.9'
msgpack===1.0.7
Sphinx===7.1.2;python_version=='3.8'
Sphinx===7.2.6;python_version>='3.9'
oslo.config===9.4.0
openstackdocstheme===3.2.0
osc-placement===4.3.0
rpds-py===0.18.0
zake===0.2.2
python-rsdclient===1.0.2
flux===1.3.5
python-solumclient===3.8.0
pysnmpcrypto===0.0.4
krb5===0.5.1
PyMySQL===1.1.0
uhashring===2.3
kubernetes===29.0.0
httplib2===0.22.0
betamax===0.9.0
construct===2.10.70
pytest-metadata===3.1.1
pyparsing===3.1.1
geomet===0.2.1.post1
opentelemetry-exporter-otlp-proto-common===1.22.0
distlib===0.3.8
XStatic-Moment-Timezone===0.5.22.0
dogpile.cache===1.3.1
python-barbicanclient===5.7.0
salt===3006.6
opentelemetry-semantic-conventions===0.43b0
api-object-schema===2.0.0
blinker===1.7.0
WSME===0.12.1
tomli===2.0.1;python_version=='3.10'
tomli===2.0.1;python_version=='3.8'
tomli===2.0.1;python_version=='3.9'
proboscis===1.2.6.0
backports.zoneinfo===0.2.1;python_version=='3.8'
oslo.upgradecheck===2.3.0
stevedore===5.2.0
pywinrm===0.4.3
botocore===1.34.44
xmltodict===0.13.0
pyasn1===0.5.1
oslo.rootwrap===7.2.0
Django===4.2.10
pexpect===4.9.0
contextvars===2.4
cmd2===2.4.3
python-json-logger===2.0.7
redis===5.0.1
jmespath===1.0.1
click===8.1.7
XStatic-smart-table===1.4.13.2
kuryr-lib===3.0.0
scrypt===0.8.20
jsonpatch===1.33
python-daemon===3.0.1
os-testr===3.0.0
cotyledon===1.7.3
xattr===1.1.0
systemd-python===235
python-memcached===1.62
openstacksdk===3.0.0
infi.dtypes.nqn===0.1.0
looseversion===1.3.0
six===1.16.0
dulwich===0.21.7
dfs-sdk===1.2.27
sentinels===1.0.0
kombu===5.3.5
distro===1.9.0
zstd===1.5.5.1
yaql===3.0.0
requestsexceptions===1.4.0
testresources===2.0.1
falcon===3.1.3
tomlkit===0.12.3
etcd3gw===2.4.0
Flask-RESTful===0.3.10
GitPython===3.1.42
python-ironicclient===5.5.0
XStatic===1.0.3
XStatic-Angular-FileUpload===12.2.13.0
python-openstackclient===6.6.0
pyzmq===25.1.2
nocaselist===2.0.0
oslo.db===15.0.0
simplegeneric===0.8.1
python-pcre===0.7
yappi===1.6.0
mbstrdecoder===1.1.3
pymemcache===4.0.0
wrapt===1.16.0
oslo.privsep===3.3.0
sphinxcontrib-apidoc===0.5.0
oslo.policy===4.3.0
python-muranoclient===2.8.0
hvac===2.1.0
pyeclib===1.6.1
wsgi-intercept===1.13.0
ndg-httpsclient===0.5.1
repoze.lru===0.7
rfc3986===2.0.0
tenacity===8.2.3
python-designateclient===6.0.1
future===0.18.3
pytest-cov===4.1.0
reactivex===4.0.4
Paste===3.7.1
pytest-django===4.8.0
jaeger-client===4.8.0
XStatic-Json2yaml===0.1.1.0
boto===2.49.0
os-vif===3.5.0
hyperlink===21.0.0
mitba===1.1.1
python-masakariclient===8.4.0
Werkzeug===3.0.1
pyasn1-modules===0.3.0
APScheduler===3.10.4
xmlschema===2.5.1
python-troveclient===8.4.0
cachez===0.1.2
XStatic-Bootstrap-Datepicker===1.4.0.0
CouchDB===1.2
netifaces===0.11.0
cachetools===5.3.2
ws4py===0.5.1
sphinxcontrib-qthelp===1.0.3;python_version=='3.8'
sphinxcontrib-qthelp===1.0.7;python_version>='3.9'
keystoneauth1===5.6.0
statsd===4.0.1
python-keystoneclient===5.4.0
ceilometer===22.0.0.0rc1
diskimage-builder===3.32.0
heat-translator===3.0.0
python-magnumclient===4.4.0
docker===7.0.0
storops===1.2.11
anyio===4.2.0
XStatic-Angular-lrdragndrop===1.0.2.6
ovsdbapp===2.6.0
aniso8601===9.0.1
rjsmin===1.2.1
icalendar===5.0.11
decorator===5.1.1
DateTimeRange===2.2.0
cffi===1.16.0
python-cyborgclient===2.3.0
futurist===3.0.0
jsonschema===4.19.2
sphinxcontrib-devhelp===1.0.2;python_version=='3.8'
sphinxcontrib-devhelp===1.0.6;python_version>='3.9'
python-blazarclient===4.0.1
alembic===1.9.4
execnet===2.0.2
glance-store===4.7.0
sphinxcontrib-programoutput===0.17
storpool.spopenstack===3.2.0
sphinx-testing===1.0.1
dnspython===2.5.0
oauthlib===3.2.2
Babel===2.14.0
logutils===0.3.5
zipp===3.17.0
greenlet===3.0.3
XStatic-Angular-Vis===4.16.0.0
iniconfig===2.0.0
referencing===0.33.0
confluent-kafka===2.3.0
xvfbwrapper===0.2.9
influxdb-client===1.40.0
tosca-parser===2.10.0
charset-normalizer===3.3.2
Flask===3.0.2
httpx===0.26.0
sqlalchemy-filters===0.13.0
marathon===0.13.0
sphinxcontrib-runcmd===0.2.0
confspirator===0.3.0
fasteners===0.19
sortedcontainers===2.4.0
python-linstor===1.21.0
filelock===3.13.1
python-tackerclient===2.0.0
python-heatclient===3.5.0
kafka-python===2.0.2
oslo.utils===7.1.0
gitdb2===4.0.2
requests-kerberos===0.14.0
itsdangerous===2.1.2
XStatic-jquery-ui===1.13.0.1
monasca-statsd===2.7.0
python-dateutil===2.8.2
virtualenv===20.25.0
colorama===0.4.6
confetti===2.5.3
ironic-lib===6.0.0
pytz===2024.1
opentelemetry-proto===1.22.0
XStatic-D3===3.5.17.0
actdiag===3.0.0
sysv-ipc===1.1.0
sphinxcontrib-applehelp===1.0.4;python_version=='3.8'
sphinxcontrib-applehelp===1.0.8;python_version>='3.9'
scikit-learn===1.3.2;python_version=='3.8'
scikit-learn===1.4.0;python_version>='3.9'
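The pin list above uses the `===` arbitrary-equality operator, with PEP 508 environment markers (e.g. `python_version=='3.8'`) selecting per-Python-version pins. As a minimal sketch, assuming only the `name===version` / `name===version;marker` shapes seen above, such a line can be split like this (the helper name is illustrative, not part of any tool):

```python
def parse_constraint(line: str):
    """Split an upper-constraints entry into (name, version, marker).

    Assumes the 'name===version' or 'name===version;marker' shape
    used throughout the list above; marker is None when absent.
    """
    spec, _, marker = line.partition(";")
    name, _, version = spec.partition("===")
    return name.strip(), version.strip(), marker.strip() or None

# A marker-qualified pin and an unconditional pin from the list:
print(parse_constraint("scipy===1.10.1;python_version=='3.8'"))
print(parse_constraint("horizon===24.0.0"))
```

Real resolvers (pip with `-c upper-constraints.txt`) evaluate the marker against the running interpreter, so only one of the duplicate `scipy`/`Sphinx`-style rows applies in any given environment.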

@@ -40,8 +40,13 @@
         - ironic-tempest-bios-ipmi-direct-tinyipa
         - ironic-tempest-bfv
         - ironic-tempest-ipa-partition-uefi-pxe-grub2
-        - metalsmith-integration-glance-centos9-legacy
-        - metal3-integration
+        # NOTE(rpittau): Currently broken because of an issue with parted
+        - metalsmith-integration-glance-centos9-legacy:
+            voting: false
+        # NOTE(TheJulia): At present, metal3 doesn't leverage
+        # stable branches, and as far as we are aware these jobs
+        # can be removed once this branch is made stable.
+        # - metal3-integration
         # Non-voting jobs
         - ironic-inspector-tempest:
             voting: false
@@ -86,8 +91,12 @@
         - ironic-tempest-bios-ipmi-direct-tinyipa
         - ironic-tempest-bfv
         - ironic-tempest-ipa-partition-uefi-pxe-grub2
-        - metalsmith-integration-glance-centos9-legacy
-        - metal3-integration
+        # NOTE(rpittau): Currently broken because of an issue with parted
+        #- metalsmith-integration-glance-centos9-legacy
+        # NOTE(TheJulia): At present, metal3 doesn't leverage
+        # stable branches, and as far as we are aware these jobs
+        # can be removed once this branch is made stable.
+        # - metal3-integration
     experimental:
       jobs:
         # TODO(dtantsur): these jobs are useful but currently hopelessly