[Bug 2019460] Re: nova-compute 23.2.2-0ubuntu1~cloud2 unable to detach volumes

Chuan Li 2019460 at bugs.launchpad.net
Thu Jun 15 09:27:27 UTC 2023


I also encountered the issue of not being able to detach the volume. At
first, I thought I had reproduced this problem, but it turned out not to
be the case.

The symptom: launch a VM that boots from a Ceph volume, with the volume
set to be deleted when the VM is deleted. When I delete the instance,
the volume remains stuck in the "in-use" state instead of being removed.

nova-compute.log

ERROR nova.volume.cinder  Error: The server could not comply with the
request since it is either malformed or otherwise incorrect. (HTTP 406)
(Request-ID: req-759ad2ed-22f9-4286-81ba-2b543d089b41) Code: 406:
cinderclient.exceptions.NotAcceptable: The server could not comply with
the request since it is either malformed or otherwise incorrect. (HTTP
406) (Request-ID: req-759ad2ed-22f9-4286-81ba-2b543d089b41)

WARNING nova.compute.manager [instance:
04544489-dfd2-4c0c-b8c8-a07acbee2b58] Ignoring unknown cinder exception
for volume d0a58ecc-e63a-49e4-8785-fb34d113e0f2: The server could not
comply with the request since it is either malformed or otherwise
incorrect. (HTTP 406) (Request-ID:
req-759ad2ed-22f9-4286-81ba-2b543d089b41):
cinderclient.exceptions.NotAcceptable: The server could not comply with
the request since it is either malformed or otherwise incorrect. (HTTP
406) (Request-ID: req-759ad2ed-22f9-4286-81ba-2b543d089b41)

apache log

"DELETE
/v3/c1524133a19945fc9f59708819277bc9/attachments/aa8bc30a-b22b-409c-8472-351b73fdd1ea
HTTP/1.1" 406 5462 "-" "python-cinderclient"
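For context on that 406: OpenStack APIs such as cinder's return 406 Not
Acceptable when the client requests an API microversion outside the range
the server supports. The sketch below (hypothetical version numbers, not
cinder's actual bounds) illustrates that mechanism, which is one way a
well-formed DELETE like the one above can still be rejected.

```python
# Minimal sketch of microversion negotiation: the server advertises a
# supported range and answers 406 when the requested version falls
# outside it. Version bounds here are illustrative only.
MIN_VERSION = (3, 0)
MAX_VERSION = (3, 27)

def check_microversion(requested):
    """Return an HTTP status for a requested microversion string."""
    major, minor = (int(part) for part in requested.split("."))
    if MIN_VERSION <= (major, minor) <= MAX_VERSION:
        return 200
    return 406  # Not Acceptable: outside the supported range

print(check_microversion("3.27"))  # within range, accepted
print(check_microversion("3.44"))  # too new for this server, 406
```

A client built against a newer API can therefore trigger 406s against an
older service even though its request body is perfectly valid.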

To narrow down the cause, I ran some more tests.

Test 1:

Deploy an environment with Focal + Victoria.
cinder package version on the cinder-ceph units: 2:17.4.0-0ubuntu1~cloud4
nova package version on the nova-compute units: 2:22.4.0-0ubuntu1~cloud4

Result: I could NOT reproduce the issue.

Test 2:

Deploy an environment with Focal + Victoria.
2 nova-compute units with different nova package versions: one with 2:22.4.0-0ubuntu1~cloud4, the other with 2:22.4.0-0ubuntu1~cloud3.
cinder package version on the cinder-ceph units: 2:17.4.0-0ubuntu1~cloud3
Launch a VM on each nova-compute node.

Result: I could reproduce the issue with both VMs.

Test 3:

Deploy an environment with Focal + Victoria.
2 nova-compute units with different nova package versions: one with 2:22.4.0-0ubuntu1~cloud4, the other with 2:22.4.0-0ubuntu1~cloud3.
cinder package version on the cinder-ceph units: 2:17.4.0-0ubuntu1~cloud4
Launch a VM on each nova-compute node.

Result: I could NOT reproduce the issue with either VM.

Thus, my conclusion is that the issue is caused by cinder version
2:17.4.0-0ubuntu1~cloud3 and is unrelated to the nova version.

-- 
You received this bug notification because you are a member of Ubuntu
OpenStack, which is subscribed to Ubuntu Cloud Archive.
https://bugs.launchpad.net/bugs/2019460

Title:
  nova-compute 23.2.2-0ubuntu1~cloud2 unable to detach volumes

Status in Ubuntu Cloud Archive:
  Invalid
Status in Ubuntu Cloud Archive victoria series:
  Fix Released
Status in Ubuntu Cloud Archive wallaby series:
  Fix Released
Status in OpenStack Compute (nova):
  Invalid
Status in nova package in Ubuntu:
  Invalid
Status in nova source package in Focal:
  Fix Released

Bug description:
  The following packages were updated on Wallaby compute nodes to fix
  https://security.openstack.org/ossa/OSSA-2023-003.html:

  python3-nova:amd64 (3:23.2.2-0ubuntu1~cloud1, 3:23.2.2-0ubuntu1~cloud2),
  python3-os-brick:amd64 (4.3.3-0ubuntu1~cloud0, 4.3.3-0ubuntu1~cloud1),
  nova-compute-libvirt:amd64 (3:23.2.2-0ubuntu1~cloud1, 3:23.2.2-0ubuntu1~cloud2),
  nova-common:amd64 (3:23.2.2-0ubuntu1~cloud1, 3:23.2.2-0ubuntu1~cloud2),
  os-brick-common:amd64 (4.3.3-0ubuntu1~cloud0, 4.3.3-0ubuntu1~cloud1),
  nova-compute-kvm:amd64 (3:23.2.2-0ubuntu1~cloud1, 3:23.2.2-0ubuntu1~cloud2),
  nova-compute:amd64 (3:23.2.2-0ubuntu1~cloud1, 3:23.2.2-0ubuntu1~cloud2)

  nova-compute is now unable to detach volumes from instances:

  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server [req-470d3e0e-e59c-40c5-9597-6649c08add16 046191f8ebfd4695b3387a5ead3a9a55 85945271df8b4a6f9d37c37e4e52958d - default default] Exception during message handling: TypeError: disconnect_volume() got an unexpected keyword argument 'force'
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/exception_wrapper.py", line 71, in wrapped
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification(
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server self.force_reraise()
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server raise self.value
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/exception_wrapper.py", line 63, in wrapped
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw)
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/utils.py", line 1434, in decorated_function
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 211, in decorated_function
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context,
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server self.force_reraise()
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server raise self.value
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 200, in decorated_function
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 7195, in detach_volume
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server do_detach_volume(context, volume_id, instance, attachment_id)
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py", line 360, in inner
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server return f(*args, **kwargs)
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 7192, in do_detach_volume
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server self._detach_volume(context, bdm, instance,
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 7143, in _detach_volume
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server driver_bdm.detach(context, instance, self.volume_api, self.driver,
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/virt/block_device.py", line 476, in detach
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server self._do_detach(context, instance, volume_api, virt_driver,
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/virt/block_device.py", line 408, in _do_detach
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server self.driver_detach(context, instance, volume_api, virt_driver)
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/virt/block_device.py", line 347, in driver_detach
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server volume_api.roll_detaching(context, volume_id)
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server self.force_reraise()
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server raise self.value
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/virt/block_device.py", line 328, in driver_detach
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server virt_driver.detach_volume(context, connection_info, instance, mp,
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/virt/libvirt/driver.py", line 2592, in detach_volume
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server self._disconnect_volume(context, connection_info, instance,
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/virt/libvirt/driver.py", line 1862, in _disconnect_volume
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server vol_driver.disconnect_volume(
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server TypeError: disconnect_volume() got an unexpected keyword argument 'force'
  2023-05-13 05:53:00.128 3219193 ERROR oslo_messaging.rpc.server

  Looks like the volume driver doesn't know about the "force" keyword
  that's being passed down to disconnect_volume(). This breaks basic
  volume detach functionality.
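  The failure mode is a plain Python version skew: the updated caller
  passes force=..., but the older disconnect_volume() signature has no
  such parameter, so the call raises TypeError. The sketch below uses
  hypothetical connector classes (not nova's or os-brick's real code) to
  show the mismatch, and one defensive pattern: inspecting the callee's
  signature before passing the new keyword.

  ```python
  import inspect

  class OldConnector:
      # Older signature: no 'force' parameter.
      def disconnect_volume(self, connection_info, device_info):
          return "disconnected"

  class NewConnector:
      # Newer signature: accepts 'force'.
      def disconnect_volume(self, connection_info, device_info, force=False):
          return "disconnected (force=%s)" % force

  def disconnect(connector, connection_info, device_info, force=False):
      """Pass 'force' only if the connector's signature supports it.

      Illustrative compatibility shim, not the actual upstream fix.
      """
      params = inspect.signature(connector.disconnect_volume).parameters
      if "force" in params:
          return connector.disconnect_volume(connection_info, device_info,
                                             force=force)
      return connector.disconnect_volume(connection_info, device_info)

  # Calling OldConnector().disconnect_volume({}, {}, force=True) directly
  # raises: TypeError: disconnect_volume() got an unexpected keyword
  # argument 'force' -- exactly the error in the traceback above.
  print(disconnect(OldConnector(), {}, {}, force=True))  # keyword dropped
  print(disconnect(NewConnector(), {}, {}, force=True))  # keyword passed
  ```

  The real remedy, as the fix-released status above indicates, is to
  upgrade the packages in lockstep so both sides agree on the signature.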

To manage notifications about this bug go to:
https://bugs.launchpad.net/cloud-archive/+bug/2019460/+subscriptions



